Hi! We're using the Airflow Job in Fabric.
We need to trigger a DAG when a specific event (webhook) happens. For that, we should use the Airflow Job in the Fabric API.
I haven't been successful with this. I see the documentation is very recent (https://learn.microsoft.com/en-us/fabric/data-factory/apache-airflow-jobs-api-capabilities). Could someone kindly share a curl command or similar that can trigger a DAG in my Airflow Job in Fabric?
Hi @mmmall,
Thank you for reaching out to Microsoft Fabric Community.
In Fabric, the supported way to trigger a DAG or job from an external webhook is through the Fabric REST API. Please follow the steps below.
First, acquire an access token using the client-credentials flow:
curl -s -X POST "https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials&client_id={CLIENT_ID}&client_secret={CLIENT_SECRET}&scope=https://api.fabric.microsoft.com/.default"
Then call the Run On Demand Item Job API to start the Airflow Job:
curl -i -X POST "https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items/{ITEM_ID}/jobs/instances?jobType..." \
  -H "Authorization: Bearer {ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{ "executionData": { "triggeredBy": "webhook", "webhookPayload": {"eventId":"12345"} } }'
Replace the placeholder values with the correct IDs.
If the request succeeds, you will receive a 202 Accepted response with a job instance ID.
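The two steps above can also be sketched in Python using only the standard library. Note that the IDs are placeholders and, as discussed later in this thread, the correct jobType value for an ApacheAirflowJob item is exactly what is in question here, so treat this as a sketch of the request shape rather than a verified call:

```python
import json
import urllib.parse
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def get_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Step 1: client-credentials token scoped to the Fabric API."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://api.fabric.microsoft.com/.default",
    }).encode()
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]

def job_instances_url(workspace_id: str, item_id: str, job_type: str) -> str:
    """URL of the Run On Demand Item Job endpoint for a given item."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}"
            f"/jobs/instances?jobType={job_type}")

def trigger_job(token: str, workspace_id: str, item_id: str,
                job_type: str, execution_data: dict) -> int:
    """Step 2: POST the on-demand run; 202 means the job was accepted."""
    req = urllib.request.Request(
        job_instances_url(workspace_id, item_id, job_type),
        data=json.dumps({"executionData": execution_data}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```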
Thanks and regards,
Anjan Kumar Chippa
Hi Anjan, thanks for your input.
I can find the Airflow Job item using the Fabric API:
curl -X GET "https://api.fabric.microsoft.com/v1/workspaces/$FABRIC_WORKSPACE_ID/items/$FABRIC_AIRFLOW_JOB_ID" \
-H "Authorization: Bearer $ACCESS_TOKEN" \
-H "Content-Type: application/json"
{"id":"99999-9999-9999-9999-9999999","type":"ApacheAirflowJob","displayName":"dev_airflow","description":"Airflow","workspaceId":"99999-9999-9999-9999-9999999"}
But subsequent API calls, such as the Run On Demand Item Job API you mentioned, always return "The requested job type is invalid". I also noticed we would need to specify the DAG name (e.g. dag_run_ingestion) somewhere. I have tried several different jobType values, but each one returns the same error:
curl -i -X POST "https://api.fabric.microsoft.com/v1/workspaces/$FABRIC_WORKSPACE_ID/items/$FABRIC_AIRFLOW_JOB_ID/jobs/instances?jobType=DefaultJob" \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{ "executionData": { "triggeredBy": "webhook", "webhookPayload": {"eventId":"12345"} } }'
HTTP/2 400
cache-control: no-store, must-revalidate, no-cache
pragma: no-cache
content-type: application/json; charset=utf-8
x-ms-public-api-error-code: InvalidJobType
strict-transport-security: max-age=31536000; includeSubDomains
x-frame-options: deny
x-content-type-options: nosniff
requestid: 9999999-9999-9999-9999-9999999999
access-control-expose-headers: RequestId
request-redirected: true
home-cluster-uri: https://wabi-us-north-central-j-primary-redirect.analysis.windows.net/
date: Thu, 30 Oct 2025 19:42:45 GMT
{"requestId":"9999999-9999-9999-9999-9999999999","errorCode":"InvalidJobType","message":"The requested job type is invalid"}
Would you know:
- Does this work with the Fabric Airflow Job (it differs from Azure Managed Airflow)?
- If the Airflow Job is in a Starter Pool (auto-pausing), will the call start the Airflow Job, wait for it to be ready, and then trigger the DAG run?
- If this is the proper way to trigger an Airflow Job DAG?
Hi @mmmall,
Fabric’s Run On Demand Item Job API currently does not accept a jobType that directly starts a DAG inside an ApacheAirflowJob item; that is why you are getting the InvalidJobType error.
To trigger a DAG run, there are two supported options, described in the document below:
https://learn.microsoft.com/en-us/fabric/data-factory/apache-airflow-jobs-api-capabilities
Thanks and regards,
Anjan Kumar Chippa
Hi @mmmall,
As we haven’t heard back from you, we wanted to kindly follow up and check whether the solution provided resolved the issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hello Anjan,
When grabbing the powerBIAccessToken from the browser's Developer Tools console, we are able to use the Airflow API directly; the example below simply lists all DAGs:
curl -X 'GET' 'https://aeexxxxxxxxxxxxxx.northcentralus.airflow.svc.datafactory.azure.com/api/v1/dags?limit=100&only_active=true' \
-H 'accept: application/json' \
-H "Authorization: Bearer [TOKEN_GOES_HERE]"
But we were unable to generate a token that behaves the same way as the powerBIAccessToken. We always get a 302 Found (which suggests a redirect to the login page) when using our own token. Below is a sample of how we tried to generate a token, using the client_id and client_secret of an app registration that has the permissions shown in the image below, as per the documentation.
import msal

tenant_id = "xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx"
client_id = "xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx"
client_secret = "G63xxxxxxx"
authority = f"https://login.microsoftonline.com/{tenant_id}"

app = msal.ConfidentialClientApplication(
    client_id,
    authority=authority,
    client_credential=client_secret,
)

scopes = ["https://analysis.windows.net/powerbi/api/.default"]
token_response = app.acquire_token_for_client(scopes=scopes)
print("Bearer token:", token_response["access_token"])
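For what it's worth, a quick way to compare the audience of a generated token against the portal's token is to decode the JWT's (unverified) payload segment. A minimal sketch:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the middle (claims) segment of a JWT without verifying it."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Usage: compare the "aud" (audience) and "scp" (scope) claims of your
# MSAL token with those of the token captured from the browser.
# claims = jwt_payload(access_token); print(claims.get("aud"), claims.get("scp"))
```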
Hi @mmmall,
The reason your MSAL token does not work is that it is issued for the Power BI API, not for the Airflow API. The token you see in the developer tools works because it is issued specifically for the Airflow resource in Fabric. To fix this, please follow the steps below:
This way the generated token will match what Fabric expects, and your Airflow API calls will work the same way they do with the powerBIAccessToken.
Thanks and regards,
Anjan Kumar Chippa
Hello Anjan,
I did that before sending the previous message. The scope I see when decoding the powerBIAccessToken is the one I've used.
Hi @mmmall,
Thank you for the response. Even though the powerBIAccessToken shows the Power BI audience when you decode it, this might not be the same token that the Fabric portal is actually sending to the Airflow API.
The error with your token indicates that it is not valid for the Airflow resource. To find the correct audience, the only way is to capture the actual request the portal makes to the Airflow endpoint:
If you use this token, the Airflow API call will work the same way it does in the portal.
Thanks and regards,
Anjan Kumar Chippa
Hello Anjan,
Have you tried out your proposed solutions before posting them? Do they work for you? When I try to capture the portal's request here, it apparently uses a cookie instead, which is not a decodable JWT.
Hi @mmmall,
Thank you for the response. Based on the screenshot, it looks like the Fabric portal is calling the Airflow API using a session cookie, not a bearer token. Because of this internal authentication flow, it is not possible to reproduce the same call externally with a client-credentials token or any token we generate ourselves.
Currently, Fabric does not provide a public API to directly trigger a DAG inside an Apache Airflow Job; that is why your direct API calls fail even with a valid token.
The only supported approach is to trigger a Fabric item, such as a Notebook or Pipeline, using the Fabric REST API and run the logic there.
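A minimal sketch of that workaround in Python, assuming the webhook starts a Notebook item via the Run On Demand Item Job API. The `RunNotebook` jobType and the `webhook_event` parameter name are assumptions to verify against the current docs, not values confirmed in this thread:

```python
import json
import urllib.request

API = "https://api.fabric.microsoft.com/v1"

def notebook_run_request(workspace_id: str, notebook_id: str, event: dict):
    """Build the URL and body for an on-demand notebook run carrying the event."""
    url = (f"{API}/workspaces/{workspace_id}/items/{notebook_id}"
           f"/jobs/instances?jobType=RunNotebook")
    body = {"executionData": {"parameters": {
        # Hypothetical notebook parameter holding the webhook payload.
        "webhook_event": {"value": json.dumps(event), "type": "string"},
    }}}
    return url, body

def run_notebook(token: str, workspace_id: str, notebook_id: str,
                 event: dict) -> int:
    """POST the run; the notebook itself would then contain the DAG logic."""
    url, body = notebook_run_request(workspace_id, notebook_id, event)
    req = urllib.request.Request(
        url, data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # expect 202 Accepted
```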
Thanks and regards,
Anjan Kumar Chippa
Hi @mmmall,
We wanted to kindly follow up and check whether the solution provided resolved the issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hello Anjan,
Ideally we should be able to reach the Fabric Airflow Job's API directly from external apps; the workaround of using a Fabric item that calls the API is a bit cumbersome.
Hi @mmmall,
I understand the expectation. Currently, Fabric's Airflow Job does not expose a public API that external applications can call directly. The portal uses an internal authentication flow, which is why external requests cannot access the Airflow API.
Because of this limitation, the only supported approach is to trigger a Fabric item through the Fabric REST API and run your logic there. If direct external API access is added in the future, the documentation will be updated.
Thanks and regards,
Anjan Kumar Chippa