DiKi-I
Post Partisan

Apache Airflow in Fabric

Has anyone used Apache Airflow in Fabric? Is it possible to run an Airflow job that triggers a pipeline from another workspace?

3 ACCEPTED SOLUTIONS
MJParikh
Resolver III

Short answer

• There is no native Airflow service inside Microsoft Fabric.
• Use Airflow outside Fabric, then call Fabric with REST to run a pipeline or notebook.
• Cross-workspace runs work: you target the workspace that owns the pipeline.

What to set up

  1. Create an Entra app registration.

  2. Give it Fabric API permissions for the items you will run, and tenant access to the target workspace.

  3. Use client-credentials to get an access token.

  4. From Airflow, call the Fabric REST endpoint to start the run, then poll run status.

Airflow example:

from airflow import DAG
from airflow.operators.python import PythonOperator
import requests, datetime, time

TENANT = "your-tenant-id"
CLIENT_ID = "app-id"
CLIENT_SECRET = "app-secret"
WORKSPACE_ID = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
PIPELINE_ID = "yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy"  # the Fabric pipeline item id

AUTH_URL = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"
SCOPE = "https://api.fabric.microsoft.com/.default"
API_BASE = "https://api.fabric.microsoft.com/v1"

def _token():
    r = requests.post(
        AUTH_URL,
        data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET,
              "grant_type": "client_credentials", "scope": SCOPE},
        timeout=30,
    )
    r.raise_for_status()
    return r.json()["access_token"]

def run_fabric_pipeline(**_):
    tok = _token()
    h = {"Authorization": f"Bearer {tok}"}
    # start an on-demand run; the API responds 202 Accepted with a
    # Location header pointing at the new job instance
    start = requests.post(
        f"{API_BASE}/workspaces/{WORKSPACE_ID}/items/{PIPELINE_ID}/jobs/instances",
        params={"jobType": "Pipeline"},
        headers=h, json={}, timeout=30
    )
    start.raise_for_status()
    status_url = start.headers["Location"]

    # poll the job instance until it reaches a terminal state
    while True:
        s = requests.get(status_url, headers=h, timeout=30)
        s.raise_for_status()
        state = s.json()["status"]
        if state in {"Completed", "Failed", "Cancelled", "Deduped"}:
            if state != "Completed":
                raise RuntimeError(f"Fabric pipeline run ended with status {state}")
            return
        time.sleep(15)

with DAG(
    dag_id="trigger_fabric_pipeline",
    start_date=datetime.datetime(2025, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="run_fabric", python_callable=run_fabric_pipeline)

 

Notes for your scenario

• To trigger a pipeline in another workspace, supply that workspace’s id in the URL.
• To pass parameters, add them to the POST body where your pipeline expects them.
• For notebooks or Dataflows, use their item id in the same pattern.
• If you need network isolation, host Airflow on AKS, Azure Container Apps, or MWAA, and lock Fabric access with Entra app roles.
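On the parameter note above, a minimal sketch of what the job-start body could look like. The `executionData`/`parameters` envelope is an assumption based on the Fabric on-demand job API, and the parameter names are hypothetical; they must match what your pipeline actually declares.

```python
# Hedged sketch: attaching pipeline parameters to the job-start POST body.
# The "executionData" / "parameters" envelope is an assumption to verify
# against the Fabric job scheduler docs; the parameter names below are
# hypothetical placeholders.
def build_run_body(params: dict) -> dict:
    """Wrap pipeline parameters in the job-start request envelope."""
    return {"executionData": {"parameters": params}}

# Pass this dict as the json= argument of the POST that starts the run.
body = build_run_body({"source_table": "sales_raw", "run_date": "2025-01-01"})
```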


DiKi-I
Post Partisan

There is Apache Airflow integration in Fabric; I want to know whether it is possible to go cross-workspace from within Fabric.

Run a Fabric pipeline and notebook using Apache Airflow DAG. - Microsoft Fabric | Microsoft Learn


v-priyankata
Community Support

Hi @DiKi-I 

Thank you for reaching out to the Microsoft Fabric Forum Community.

Currently, there’s no official documentation confirming native cross-workspace orchestration support in Fabric (i.e., triggering an item in Workspace B from a DAG in Workspace A). It might be achievable using REST API calls by specifying the workspace ID, provided the necessary permissions and authentication (via Microsoft Entra ID) are properly configured and both workspaces exist within the same tenant.
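As a sketch of the REST approach described above: assuming the on-demand job endpoint follows the `/workspaces/{workspaceId}/items/{itemId}/jobs/instances` shape, the only cross-workspace change is which workspace ID goes into the URL (the IDs below are placeholders).

```python
# Minimal sketch: building the on-demand job URL for an item that lives in a
# different workspace. The endpoint shape follows the Fabric REST API's
# "run on demand item job" call; the IDs used are placeholders.
API_BASE = "https://api.fabric.microsoft.com/v1"

def job_start_url(workspace_id: str, item_id: str, job_type: str = "Pipeline") -> str:
    """URL to start an on-demand job for an item in the given workspace."""
    return (f"{API_BASE}/workspaces/{workspace_id}/items/{item_id}"
            f"/jobs/instances?jobType={job_type}")

# Targeting Workspace B's pipeline from a DAG is just a different workspace_id:
url = job_start_url("workspace-b-id", "pipeline-item-id")
```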

 

I encourage you to submit detailed feedback via the Ideas forum: Fabric Ideas - Microsoft Fabric Community.

Thanks.

 


4 REPLIES
v-priyankata
Community Support

Hi @DiKi-I 

Could you please confirm if you've submitted this as an idea in the Ideas Forum? If so, sharing the link here would be helpful for other community members who may have similar feedback.

Thanks.
