Hey,
I created a Python notebook to install and run dbt, but when I call it from a pipeline I cannot get the execution logs, and the run stays stuck in the loading state.
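For context, the notebook does roughly the following (a simplified sketch; the project path and the dbt adapter shown here are placeholders, not my exact setup):

```python
# Notebook cell: install dbt inline, then run the project.
# The adapter (dbt-fabric) and paths below are placeholders for illustration.
%pip install dbt-core dbt-fabric

import subprocess

# Run the dbt project and print its output so it appears in the notebook logs
result = subprocess.run(
    ["dbt", "run", "--project-dir", "/lakehouse/default/Files/my_dbt_project"],
    capture_output=True,
    text=True,
)
print(result.stdout)
print(result.stderr)
```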
Thanks
Hi @nerminyehia,
The pipeline seems to be stuck because inline %pip install is disabled by default during notebook runs. As a result, installing dbt blocks execution and prevents logs from being streamed. Pre-install dbt in a Fabric Environment and attach it to your notebook or pipeline. This will allow for smooth execution and visible logs.
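For illustration, once dbt is pre-installed in the attached Environment, the notebook only needs to invoke dbt and no longer does any inline install (a minimal sketch; the project and profiles paths are placeholders):

```python
import subprocess

# No %pip install here: dbt-core and the adapter are assumed to be
# pre-installed in the Fabric Environment attached to this notebook.
result = subprocess.run(
    [
        "dbt", "run",
        "--project-dir", "/lakehouse/default/Files/my_dbt_project",   # placeholder path
        "--profiles-dir", "/lakehouse/default/Files/my_dbt_project",  # placeholder path
    ],
    capture_output=True,
    text=True,
)

# Print dbt's output so it shows up in the notebook run logs/snapshot
print(result.stdout)
print(result.stderr)

# Fail the pipeline activity if dbt reported an error
if result.returncode != 0:
    raise RuntimeError(f"dbt run failed with exit code {result.returncode}")
```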
Below is the link for your reference:
Manage Apache Spark libraries - Microsoft Fabric | Microsoft Learn
Thank you.
Hey,
Thank you so much for the suggestion, but I don't think it is related to the notebook content: I tried running an otherwise empty notebook with a single print statement and hit the same issue. I need to see the notebook run details whenever it is triggered by the pipeline.
Hi @nerminyehia,
Have you had a chance to review the solution we shared earlier? If the issue persists, feel free to reply so we can help further.
Thank you.
Hi @nerminyehia,
The notebook might seem stuck because every notebook run triggered through a pipeline needs a Spark session to start, even if the notebook itself is empty. To check what's happening and see the logs, go to Run → All Runs and use the Related Notebook tab in the monitoring view. Turning on high concurrency mode for pipelines lets notebooks share sessions more efficiently and ensures the logs are visible while they run.
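As a quick sanity check that output is being captured once the session starts, you could add a cell like the one below to the notebook; its output should appear in the Related Notebook snapshot for the pipeline run (plain Python, nothing Fabric-specific assumed):

```python
import logging
import sys
from datetime import datetime, timezone

# Log to stdout so the messages are captured in the notebook snapshot
# that the pipeline's monitoring view links to.
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

logging.info("Notebook started at %s", datetime.now(timezone.utc).isoformat())
print("If you can see this in the Related Notebook snapshot, logs are streaming correctly.")
```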
Configure high concurrency mode for notebooks in pipelines - Microsoft Fabric | Microsoft Learn
Configure high concurrency mode for notebooks - Microsoft Fabric | Microsoft Learn
Thank you.