Has anyone used Apache Airflow in Fabric? Is it possible to run an Airflow job that triggers a pipeline in another workspace?
Short answer
• There is no native Airflow service inside Microsoft Fabric.
• Use Airflow outside Fabric, then call Fabric with REST to run a pipeline or notebook.
• Cross-workspace works: you target the workspace that owns the pipeline.
What to set up
1. Create an Entra app registration.
2. Give it Fabric API permissions for the items you will run, and grant it access to the target workspace. Service principal calls also require the "Service principals can use Fabric APIs" tenant setting to be enabled.
3. Use the client credentials flow to get an access token.
4. From Airflow, call the Fabric REST endpoint to start the run, then poll the run status.
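For step 2, workspace access can be granted in the portal or programmatically. A minimal sketch, assuming the Fabric Core API's Add Workspace Role Assignment endpoint, an admin token for the target workspace, and placeholder ids:
import requests
ADMIN_TOKEN = "token-of-a-workspace-admin"  # assumption: caller is an admin of the target workspace
TARGET_WORKSPACE_ID = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
SP_OBJECT_ID = "zzzzzzzz-zzzz-zzzz-zzzz-zzzzzzzzzzzz"  # the app's service principal object id
# Add the service principal as a Contributor on the target workspace
# (POST /v1/workspaces/{workspaceId}/roleAssignments).
r = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{TARGET_WORKSPACE_ID}/roleAssignments",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    json={"principal": {"id": SP_OBJECT_ID, "type": "ServicePrincipal"},
          "role": "Contributor"},
    timeout=30,
)
r.raise_for_status()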
Airflow example:
from airflow import DAG
from airflow.operators.python import PythonOperator
import requests, datetime, time

TENANT = "your-tenant-id"
CLIENT_ID = "app-id"
CLIENT_SECRET = "app-secret"
WORKSPACE_ID = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"  # workspace that owns the pipeline
PIPELINE_ID = "yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy"  # the Fabric pipeline item id

AUTH_URL = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"
SCOPE = "https://api.fabric.microsoft.com/.default"
API_BASE = "https://api.fabric.microsoft.com/v1"

def _token():
    # Client credentials flow against Entra ID; returns a bearer token
    # scoped to the Fabric REST API.
    r = requests.post(
        AUTH_URL,
        data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET,
              "grant_type": "client_credentials", "scope": SCOPE},
        timeout=30,
    )
    r.raise_for_status()
    return r.json()["access_token"]

def run_fabric_pipeline(**_):
    tok = _token()
    h = {"Authorization": f"Bearer {tok}"}
    # Start an on-demand run (Job Scheduler API: "Run on demand item job").
    # The 202 response body is empty; the Location header points at the
    # new job instance.
    start = requests.post(
        f"{API_BASE}/workspaces/{WORKSPACE_ID}/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline",
        headers=h, json={}, timeout=30,
    )
    start.raise_for_status()
    status_url = start.headers["Location"]
    # Poll the job instance until it reaches a terminal state.
    while True:
        s = requests.get(status_url, headers=h, timeout=30)
        s.raise_for_status()
        state = s.json()["status"]
        if state in {"Completed", "Failed", "Cancelled"}:
            if state != "Completed":
                raise RuntimeError(f"Fabric pipeline status {state}")
            return
        time.sleep(15)

with DAG(
    dag_id="trigger_fabric_pipeline",
    start_date=datetime.datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="run_fabric", python_callable=run_fabric_pipeline)
Notes for your scenario
• To trigger a pipeline in another workspace, supply that workspace’s id in the URL.
• To pass parameters, add them to the POST body where your pipeline expects them (see the sketch after this list).
• For notebooks or Dataflows, use their item id and job type in the same pattern.
• If you need network isolation, host Airflow on AKS, Azure Container Apps, or MWAA, and lock down Fabric access with Entra app roles.
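Building on the example above (same API_BASE, WORKSPACE_ID, PIPELINE_ID, and headers h), a minimal sketch of a parameterized start request, assuming your pipeline declares a parameter named run_date; pipeline parameters ride in the executionData block of the job body:
start = requests.post(
    f"{API_BASE}/workspaces/{WORKSPACE_ID}/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline",
    headers=h,
    # Assumption: the pipeline defines a "run_date" parameter.
    json={"executionData": {"parameters": {"run_date": "2025-01-01"}}},
    timeout=30,
)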
There is an Apache Airflow integration in Fabric. I want to know whether, from within Fabric, it can work across workspaces.
Run a Fabric pipeline and notebook using Apache Airflow DAG. - Microsoft Fabric | Microsoft Learn
Hi @DiKi-I
Thank you for reaching out to the Microsoft Fabric Forum Community.
Currently, there’s no official documentation confirming native cross-workspace orchestration support in Fabric (i.e., triggering an item in Workspace B from a DAG in Workspace A). It might be achievable using REST API calls that specify the workspace id, provided the necessary permissions and authentication (via Microsoft Entra ID) are properly configured and both workspaces exist within the same tenant.
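If the Fabric-hosted Airflow item uses the plugin from the linked Learn doc, the workspace id is just an operator argument, so a DAG in one workspace can point at an item in another. A minimal sketch, assuming the FabricRunItemOperator from apache-airflow-microsoft-fabric-plugin and a Fabric connection configured as in that doc:
from airflow import DAG
from apache_airflow_microsoft_fabric_plugin.operators.fabric import FabricRunItemOperator
import datetime

with DAG(
    dag_id="run_item_in_other_workspace",
    start_date=datetime.datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # workspace_id names the workspace that owns the target item, which
    # need not be the workspace hosting this Airflow job.
    FabricRunItemOperator(
        task_id="run_remote_pipeline",
        fabric_conn_id="fabric_conn_id",      # assumption: connection set up per the Learn doc
        workspace_id="<other-workspace-id>",  # the other workspace's id
        item_id="<pipeline-item-id>",
        job_type="Pipeline",
        wait_for_termination=True,
    )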
I encourage you to submit detailed feedback via the ideas forum: Fabric Ideas - Microsoft Fabric Community.
Thanks.
Hi @DiKi-I
Could you please confirm if you've submitted this as an idea in the Ideas Forum? If so, sharing the link here would be helpful for other community members who may have similar feedback.
Thanks.