rabbyn
Regular Visitor

Stuck with MS Learn tutorial for Apache Airflow Jobs for dbt-fabric orchestration

Hello, has anyone managed to implement the sample/tutorial Transform data using dbt - Microsoft Fabric | Microsoft Learn?
For us, with a Starter Pool running Apache Airflow version 2.10.5, only astronomer-cosmos==1.5.1 and dbt-fabric==1.5.0 validate.

(screenshot attached)

...but when running the DAG dags/my_cosmos_dag.py we systematically get a "Failed to start" error. Thanks.

(screenshot attached)

7 REPLIES
v-csrikanth
Community Support

Hi @rabbyn 
Sorry for the late response.

Please try the troubleshooting steps below to resolve the issue:

  • Please try adding the ExecutionConfig parameter explicitly within your DbtDag definition to ensure the correct execution path for dbt is used (a minimal sketch follows this list). This has helped in several environments where the default dbt path isn’t properly picked up.
  • It may help to validate that both dbt_project.yml and profiles.yml are present in the expected directories and have the correct structure and file permissions. Ensuring these files are accessible to the Airflow environment can prevent runtime load failures.
  • As a quick debug step, you might consider testing with a minimal Python-based DAG (without dbt) just to confirm that the Airflow cluster is functional. This can help isolate whether the issue lies with the DAG structure or the environment setup.
  • Try renaming the DAG file (e.g., changing dbt_fabric_dag.py to main_dag.py); this helps avoid conflicts when the filename matches the DAG ID. It’s a small change but worth trying in case of DAG registration issues during parsing.
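
For the first bullet, here is a minimal sketch of an explicit ExecutionConfig. The dbt executable path, project locations, and dag_id below are illustrative assumptions, not confirmed values for the Fabric Starter Pool; adjust them to your environment.

from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig

# Assumption: dbt is installed into the image's default bin directory;
# point dbt_executable_path at wherever `which dbt` resolves in your pool.
execution_config = ExecutionConfig(dbt_executable_path="/usr/local/bin/dbt")

dag = DbtDag(
    project_config=ProjectConfig(dbt_project_path="/usr/local/airflow/dags/nyc_taxi_green"),
    profile_config=ProfileConfig(
        profile_name="nyc_taxi_green",
        target_name="fabric-dev",
        profiles_yml_filepath="/usr/local/airflow/dags/nyc_taxi_green/profiles.yml",
    ),
    execution_config=execution_config,  # pins the dbt path instead of relying on discovery
    dag_id="my_cosmos_dag",
)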

If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.

Hi @v-csrikanth, a stupid question, but why are you not replying to my replies (it seems you're replying to my original thread post)? It's just making a mess of this discussion thread. Anyway, thanks for your new tips, but I will wait for an update on my case #2505201420002935; apparently a fix is supposed to be released by the product group, let's see...


Update on case #2505201420002935, received the 7th of June:

""Hi Guillaume,  I wanted to let you know that I’ve received confirmation from the product engineering team that the fix is scheduled for release next week. Once deployed, you’ll be able to use the following versions with DBT:

  • astronomer-cosmos==1.10.1
  • dbt-fabric==1.9.5

I’ll keep you informed once the fix is officially rolled out.""

Hi @rabbyn 

Thanks for your response. I will make sure to reply directly to your follow-ups moving forward to keep the conversation clearer. I appreciate you sharing the update from your ticket (#2505201420002935).

It’s great to hear that the product engineering team has scheduled the fix for release next week and that the compatible versions will be astronomer-cosmos==1.10.1 and dbt-fabric==1.9.5.


Once the fix is deployed and you've had a chance to test it successfully, it would be incredibly helpful for others in the community if you could share the working configuration and confirm that the issue is resolved, just to close the loop on this topic.

Thanks again for your collaboration.

Best Regards,
Cheri Srikanth.

v-csrikanth
Community Support

Hi @rabbyn 
Thanks for reaching out to the Fabric Community.
Here are a few checkpoints that might resolve your issue:

  • Ensure the required environment variables (FABRIC_WORKSPACE_ID, FABRIC_WAREHOUSE_ID, FABRIC_CAPACITY_ID) are explicitly defined in the "Environment variables" section of the Airflow configuration in Fabric.
  • Even if the requirements.txt file validates successfully with astronomer-cosmos==1.5.1 and dbt-fabric==1.5.0, re-deploy the environment from scratch to eliminate issues caused by corrupted or cached dependencies.
  • Check the Airflow logs for any Python-related import errors such as No module named cosmos or dbt not found, which indicate module loading issues during DAG startup.
  • Verify that your my_cosmos_dag.py file is located directly under the /dags directory and follows the correct DAG declaration syntax required by Airflow (a dbt-free sanity-check DAG is sketched after this list).
  • The Starter Pool in Fabric may impose limits on compute or task parallelism, so reduce task concurrency if your DAG is calling resource-heavy operations like dbt transformations.
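
Regarding the sanity check mentioned above, a minimal DAG with no dbt or cosmos dependency might look like this (a sketch; the dag_id and schedule are arbitrary choices):

from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def sanity_check_dag():
    @task
    def ping():
        # If this task runs, the Airflow cluster itself is healthy and the
        # failure is specific to the dbt/cosmos setup, not the environment.
        print("Airflow environment is up")

    ping()

sanity_check_dag()

If even this DAG fails to start, the problem lies with the environment (dependencies, pool limits) rather than with the cosmos DAG definition.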

Suggested troubleshooting steps:
Add your .env or DAG-level configuration using:

import os

# Expose the Fabric identifiers as environment variables so the DAG and
# the dbt profile can reference them at runtime
os.environ["FABRIC_WORKSPACE_ID"] = "<your-workspace-id>"
os.environ["FABRIC_CAPACITY_ID"] = "<your-capacity-id>"
os.environ["FABRIC_WAREHOUSE_ID"] = "<your-warehouse-id>"

If you are using DbtDag, validate all dbt_kwargs and project paths:

from cosmos import DbtDag, ProjectConfig

DbtDag(
    # the dbt project path goes inside ProjectConfig; profiles.yml is
    # supplied through a separate ProfileConfig object
    project_config=ProjectConfig(dbt_project_path="/usr/local/airflow/dags/dbt"),
    profile_config=profile_config,
    # other parameters
)
Rebuild the environment or delete and recreate the Airflow workspace to clear possible internal caching.


If the above information helps you, please give us Kudos and mark it as the accepted solution.

Best Regards,
Community Support Team _ C Srikanth.

Hi @v-csrikanth, thanks for your support. I did follow all the troubleshooting steps you mentioned, but no success: despite the requirements.txt being "validated", the Airflow cluster is not starting up (timeout after 18 min). I tried with the starter cluster, then recreated a brand-new Apache Airflow job from another workspace, and the outcome is the same. Below the screenshot you'll find a code snippet from my latest DAG script (based on your instructions; if something is wrong, let me know). Thanks.

(screenshot attached)

import os
from pathlib import Path
from datetime import datetime
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig

# Add environment variables here
os.environ["FABRIC_WORKSPACE_ID"] = "18584879-394b-****-8d34-61e864c0bd1c"
os.environ["FABRIC_CAPACITY_ID"] = "FAC80AA7-5E69-****-88E7-DE388FC23422"
os.environ["FABRIC_WAREHOUSE_ID"] = "60a45649-7e92-****-90b3-237243a35114"

# Resolve the dbt project folder relative to this DAG file
DEFAULT_DBT_ROOT_PATH = Path(__file__).parent.parent / "dags" / "nyc_taxi_green"
DBT_ROOT_PATH = Path(os.getenv("DBT_ROOT_PATH", DEFAULT_DBT_ROOT_PATH))

profile_config = ProfileConfig(
    profile_name="nyc_taxi_green",
    target_name="fabric-dev",
    profiles_yml_filepath=DBT_ROOT_PATH / "profiles.yml",
)

dbt_fabric_dag = DbtDag(
    # ProjectConfig expects the dbt project path via dbt_project_path
    project_config=ProjectConfig(dbt_project_path=DBT_ROOT_PATH),
    operator_args={"install_deps": True},
    profile_config=profile_config,
    schedule_interval="@daily",
    start_date=datetime(2024, 9, 10),
    catchup=False,
    dag_id="dbt_fabric_dag",
)


rabbyn
Regular Visitor

Hi @v-csrikanth, did you manage to make it work? Thanks.

v-csrikanth
Community Support

Hi @rabbyn 
Thank you for being part of the Microsoft Fabric Community.
I am trying to implement the sample tutorial in my workspace; once it is successfully created, I will post the steps in detail, which might help you resolve your issue.

Best Regards,
Cheri Srikanth.
