I am looking for a better or alternative method to using Dataflow Gen 2 to load data into my Fabric Lakehouse. I looked into using Spark and/or notebooks, but am struggling to use my saved connection to access my source data. Is it possible to use a saved connection in a notebook (much like you can with dataflows, pipelines, semantic models, etc.)? Or are dataflows/Copy data the best way to ingest data if a saved connection must be used?
Hi @parkergeis, thank you for reaching out to the Microsoft Fabric Community, and thanks to @suparnababu8 for their helpful input.
Currently, saved connections in Microsoft Fabric are not directly accessible within notebooks; they are primarily supported in data pipelines, dataflows, and semantic models. If you require this functionality within notebooks, I recommend submitting feedback through Microsoft’s Ideas portal, where features with strong community support are considered for future updates: Fabric Ideas - Microsoft Fabric Community, as mentioned by suparnababu8.
As an alternative, you can still use notebooks to load data into your Fabric Lakehouse by manually configuring the connection within the notebook. This can be achieved using Spark APIs (e.g., spark.read.format()) or Python libraries like pyodbc or requests for other data sources.
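For illustration, here is a minimal sketch of that approach, assuming an Azure SQL source read over JDBC. The server, database, table, and credential values are placeholders, and spark is the session object Fabric notebooks provide automatically:

# Minimal sketch of loading data without a saved connection, assuming an
# Azure SQL source reached over JDBC. Server, database, table, and credential
# values below are placeholders, not real names.
jdbc_url = (
    "jdbc:sqlserver://<your-server>.database.windows.net:1433;"
    "database=<your-database>"
)
sql_user = "<user>"          # don't hardcode real values -- resolve them from
sql_password = "<password>"  # a secret store instead (see the next sketch)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.SourceTable")  # hypothetical source table
    .option("user", sql_user)
    .option("password", sql_password)
    .load()
)

# Land the result in the attached Lakehouse as a managed Delta table.
df.write.mode("overwrite").saveAsTable("source_table_raw")

The same pattern works for other JDBC-compatible sources by changing the URL and driver options; for REST sources you would use requests instead and write the parsed result out with Spark.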
For security, it’s essential to avoid hardcoding credentials; instead, use Azure Key Vault or Fabric’s built-in secret management. For more details, refer to the official documentation on using notebooks in Microsoft Fabric and explore community guidance on managing connections.
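As a sketch of that secret lookup, Fabric notebooks ship with the built-in notebookutils helper, which can read a Key Vault secret under the notebook user's identity. The vault URL and secret names below are placeholders you would replace:

# Sketch of reading credentials from Azure Key Vault in a Fabric notebook.
# notebookutils is built into the notebook session (no install or import
# needed); the vault URL and secret names are placeholders. The lookup runs
# under the notebook user's identity, which needs read access to the vault's
# secrets.
key_vault_url = "https://<your-vault>.vault.azure.net/"

sql_user = notebookutils.credentials.getSecret(key_vault_url, "sql-user")
sql_password = notebookutils.credentials.getSecret(key_vault_url, "sql-password")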
I hope these suggestions give you a good idea of the options; if you need any further assistance, feel free to reach out.
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hello @parkergeis
Unfortunately, saved connections in the Fabric environment are not directly accessible in notebooks. These direct connections are currently supported in data pipelines. Submit your idea to Microsoft; if it gets enough votes, they may consider it in a future update. Raise your idea here: [Submit Your Idea]
If you need additional info, please read the blogs below.
How to use notebooks - Microsoft Fabric | Microsoft Learn
Solved: Use connection from Manage connections and gateway... - Microsoft Fabric Community
Thank you!!!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!