Hi Fabric community, I'm wondering if anyone has had success building metadata-driven DAGs in Apache Airflow in Fabric?
I love the idea of using Airflow for orchestration; however, my orchestration is currently all metadata driven, and I haven't found a way to replicate that in Airflow.
Hi @tayloramy ,
Thanks for reaching out to the Microsoft Fabric Community.
At this time, Microsoft Fabric does not include native support for fully metadata-driven DAGs within its Apache Airflow integration. Airflow in Fabric uses standard Python-based DAG definitions, and while you can parameterize them using variables or external configuration, a fully metadata-driven orchestration approach requires custom implementation.
You may want to review this open-source project: Metadata-Driven ETL Framework for Complex Workflows in Apache Airflow. It demonstrates how metadata tables (stored in SQL Server or Postgres) can define job configurations, task sequences, and ETL logic, with Airflow reading that metadata and executing accordingly. This design pattern could be adapted in Fabric by sourcing the metadata from a Lakehouse or Data Warehouse.
Some useful docs for reference:
What is Apache Airflow job? - Microsoft Fabric | Microsoft Learn
Dynamic Dag Generation — Airflow 3.1.0 Documentation
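To make the pattern above concrete, here is a minimal sketch of the metadata-driven idea: a control table (shown here as inline JSON, but in practice read from a Lakehouse or Warehouse table) defines each task and its upstream dependencies, and the code derives a valid execution order from it. The table contents and task names are hypothetical, purely for illustration; the comment at the bottom shows where the same loop would create Airflow operators in a real DAG file.

```python
import json
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical metadata rows, as they might live in a Lakehouse/Warehouse
# control table: each row defines one task and its upstream dependencies.
METADATA = json.loads("""
[
  {"task_id": "extract_orders",  "depends_on": []},
  {"task_id": "extract_items",   "depends_on": []},
  {"task_id": "transform_sales", "depends_on": ["extract_orders", "extract_items"]},
  {"task_id": "load_warehouse",  "depends_on": ["transform_sales"]}
]
""")


def build_task_graph(metadata):
    """Turn metadata rows into a dependency graph {task: set(upstream tasks)}."""
    return {row["task_id"]: set(row["depends_on"]) for row in metadata}


def execution_order(metadata):
    """Topologically sort tasks so each task runs after all of its dependencies."""
    return list(TopologicalSorter(build_task_graph(metadata)).static_order())


# Inside an actual Airflow DAG file, the same graph would drive operator
# creation instead of a plain sort, e.g. (sketch only):
#   with DAG("metadata_driven", ...) as dag:
#       tasks = {t: PythonOperator(task_id=t, python_callable=...) for t in graph}
#       for task_id, deps in graph.items():
#           for dep in deps:
#               tasks[dep] >> tasks[task_id]

if __name__ == "__main__":
    print(execution_order(METADATA))
```

Because the DAG file only reads metadata, adding or reordering tasks becomes a data change in the control table rather than a code change, which is the core of the metadata-driven approach.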
Hope this helps. Please reach out if you need further assistance.
Thank you.
Hi @tayloramy ,
Just wanted to check if the response provided was helpful. If further assistance is needed, please reach out.
Thank you.