Hello, has anyone managed to implement the sample tutorial Transform data using dbt - Microsoft Fabric | Microsoft Learn?
For us, with a Starter Pool running Apache Airflow version 2.10.5, only the versions astronomer-cosmos==1.5.1 and dbt-fabric==1.5.0 are valid.
...but when running the DAG 'dags/my_cosmos_dag.py' we systematically get a Failed to start error.
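For reference, the pins in our Apache Airflow requirements file (the only combination we found to validate, per the above) are simply:

astronomer-cosmos==1.5.1
dbt-fabric==1.5.0

Thanks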
Hi @rabbyn
Sorry for the late response.
Please try the troubleshooting steps below to resolve the issue:
If the above information is helpful, please give us Kudos and mark the response as Accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
Hi @v-csrikanth, a silly question, but why aren't you replying to my replies? (It seems you're replying to my original thread post.) It's just making a mess of this discussion thread. Anyway, thanks for your new tips, but I will wait for an update on my case #2505201420002935; apparently a fix is supposed to be released by the PG. Let's see...
Update on case #2505201420002935, received the 7th of June:
"Hi Guillaume, I wanted to let you know that I've received confirmation from the product engineering team that the fix is scheduled for release next week. Once deployed, you'll be able to use the following versions with dbt:
astronomer-cosmos==1.10.1
dbt-fabric==1.9.5
I'll keep you informed once the fix is officially rolled out."
Hi @rabbyn
Thanks for your response. I will make sure to reply directly to your follow-ups moving forward to keep the conversation clearer. I appreciate you sharing the update from your ticket (#2505201420002935).
It’s great to hear that the product engineering team has scheduled the fix for release next week and that the compatible versions will be astronomer-cosmos==1.10.1 and dbt-fabric==1.9.5.
Once the fix is deployed and you've had a chance to test it successfully, it would be very helpful for others in the community if you could share the working configuration and confirm that the issue is resolved, just to close the loop on this topic.
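For anyone following along, once the fix is rolled out the requirements pins should presumably read as follows (unverified until the release actually lands):

astronomer-cosmos==1.10.1
dbt-fabric==1.9.5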
Thanks again for your collaboration.
Best Regards,
Cheri Srikanth.
Hi @rabbyn
Thanks for reaching out to Fabric Community.
Here are a few checkpoints that might resolve your issue.
Suggested troubleshooting steps:
Add your .env or DAG-level configuration using the snippet below (a quick sanity check follows these steps):
import os

# Fabric resource identifiers for the target workspace, capacity, and warehouse
os.environ["FABRIC_WORKSPACE_ID"] = "<your-workspace-id>"
os.environ["FABRIC_CAPACITY_ID"] = "<your-capacity-id>"
os.environ["FABRIC_WAREHOUSE_ID"] = "<your-warehouse-id>"
If you are using DbtDag, validate all dbt_kwargs and project paths:
DbtDag(
    project_dir="/usr/local/airflow/dags/dbt",
    profiles_dir="/usr/local/airflow/dags/dbt",
    # other parameters
)
Rebuild the environment or delete and recreate the Airflow workspace to clear possible internal caching.
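As an optional sanity check (a small sketch of my own, not part of the official tutorial), you can fail fast at DAG parse time if any of the environment variables from the first step are missing:

import os

# Guard clause: raise early if any Fabric ID is absent from the environment
required = ["FABRIC_WORKSPACE_ID", "FABRIC_CAPACITY_ID", "FABRIC_WAREHOUSE_ID"]
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise RuntimeError(f"Missing Fabric environment variables: {missing}")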
If the above information helps you, please give us Kudos and mark the reply as Accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
Hi @v-csrikanth, thanks for your support. I followed all the troubleshooting steps you mentioned, but without success: despite the requirements.txt being "validated", the Airflow cluster does not start up (timeout after 18 min). I tried with a Starter Pool cluster, then recreated a brand-new Apache Airflow instance from another workspace, and the outcome is the same. Below is a code snippet from my latest DAG script (based on your instructions; if something is wrong, let me know). Thanks
import os
from pathlib import Path
from datetime import datetime
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig

# Add environment variables here
os.environ["FABRIC_WORKSPACE_ID"] = "18584879-394b-****-8d34-61e864c0bd1c"
os.environ["FABRIC_CAPACITY_ID"] = "FAC80AA7-5E69-****-88E7-DE388FC23422"
os.environ["FABRIC_WAREHOUSE_ID"] = "60a45649-7e92-****-90b3-237243a35114"

DEFAULT_DBT_ROOT_PATH = Path(__file__).parent.parent / "dags" / "nyc_taxi_green"
DBT_ROOT_PATH = Path(os.getenv("DBT_ROOT_PATH", DEFAULT_DBT_ROOT_PATH))

profile_config = ProfileConfig(
    profile_name="nyc_taxi_green",
    target_name="fabric-dev",
    profiles_yml_filepath=DBT_ROOT_PATH / "profiles.yml",
)

dbt_fabric_dag = DbtDag(
    project_config=ProjectConfig(
        project_dir="/usr/local/airflow/dags/dbt",
        profiles_dir="/usr/local/airflow/dags/dbt",
    ),
    operator_args={"install_deps": True},
    profile_config=profile_config,
    schedule_interval="@daily",
    start_date=datetime(2024, 9, 10),
    catchup=False,
    dag_id="dbt_fabric_dag",
)
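For completeness, the profiles.yml that the ProfileConfig above points at follows the dbt-fabric profile layout from the Microsoft Learn tutorial; a minimal sketch with placeholder values (not my actual settings) looks like:

nyc_taxi_green:
  target: fabric-dev
  outputs:
    fabric-dev:
      type: fabric
      driver: "ODBC Driver 18 for SQL Server"
      server: <your-endpoint>.datawarehouse.fabric.microsoft.com
      database: <your-warehouse-name>
      schema: dbo
      authentication: CLI
      threads: 4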
Hi @rabbyn
Thank you for being part of the Microsoft Fabric Community.
I am trying to implement the sample tutorial in my workspace; once it has been successfully created, I will post detailed steps that might help you resolve your issue.
Best Regards,
Cheri Srikanth.