I’ve been given a task by my team (and my manager mentioned it could be a proving point for me). The task is to connect SharePoint to Fabric, but not through Dataflows, since that approach keeps failing and isn’t feasible. They want the connection to be made either through notebooks or some other alternative. Is there a way to do that?
Yes, there is definitely a way: you can connect SharePoint to Microsoft Fabric without using Dataflows. Here are your main alternatives:
If you can use notebooks in Fabric, this is the cleanest and most flexible approach: you can connect to SharePoint using Python libraries.
Steps:
Create a new Notebook in Fabric under your Workspace.
In the first cell, install and import the needed packages:
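The original post did not include the package list. As an illustration only, assuming you use the Office365-REST-Python-Client package (one common choice for reaching SharePoint from Python; not the only option), the first cell could look like:

```python
# Install the SharePoint client library inside the notebook session.
# Office365-REST-Python-Client is an assumed choice, not prescribed by the post:
%pip install Office365-REST-Python-Client

from office365.sharepoint.client_context import ClientContext
from office365.runtime.auth.client_credential import ClientCredential
```

The `%pip` magic installs the package for the current notebook session only, so rerun the cell after the session restarts.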
Then use the following sample code:
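The sample code itself was not included in the post. A minimal sketch, assuming the Office365-REST-Python-Client package and an Azure AD app registration that has been granted access to the site (the site URL, credentials, and file path below are all placeholders):

```python
from office365.sharepoint.client_context import ClientContext
from office365.runtime.auth.client_credential import ClientCredential
from office365.sharepoint.files.file import File

# Placeholder values: substitute your own site and app registration.
site_url = "https://yourtenant.sharepoint.com/sites/YourSite"
client_id = "<app-client-id>"
client_secret = "<app-client-secret>"

# Authenticate with app-only (client credential) auth.
ctx = ClientContext(site_url).with_credentials(
    ClientCredential(client_id, client_secret)
)

# Download a file by its server-relative path (placeholder path).
file_url = "/sites/YourSite/Shared Documents/data.csv"
response = File.open_binary(ctx, file_url)
file_bytes = response.content  # raw bytes of the SharePoint file
```

In practice you would store the client secret in Azure Key Vault or a workspace secret rather than hard-coding it in the notebook.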
From there, you can write the data into a Lakehouse table:
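A minimal sketch of that step. A tiny in-memory CSV stands in for the bytes downloaded from SharePoint, and the Lakehouse write is shown as a comment because it relies on the `spark` session that Fabric notebooks pre-define (the table name is a placeholder):

```python
import io
import pandas as pd

# In the real notebook, file_bytes would be the content downloaded from
# SharePoint in the previous step; a small sample CSV stands in for it here.
file_bytes = b"id,name\n1,alpha\n2,beta\n"
df = pd.read_csv(io.BytesIO(file_bytes))

# In a Fabric notebook, `spark` is available by default. Convert the pandas
# DataFrame and save it as a Lakehouse table (placeholder table name):
# spark.createDataFrame(df).write.mode("overwrite").saveAsTable("sharepoint_data")

print(df.shape)  # → (2, 2)
```

With `mode("overwrite")` each run replaces the table; use `mode("append")` instead if you want incremental loads.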
This method avoids Dataflows entirely and lets you automate pulling data from SharePoint directly into your Lakehouse.
If you have Power Automate available, you can:
Set up a flow that copies files from SharePoint to OneLake or Azure Blob Storage.
Then connect Fabric to that location (which Fabric can read natively).
This is good for scheduled or triggered updates (e.g., when a file changes in SharePoint).
If your data isn’t changing too frequently, download the SharePoint file and upload it manually to your Lakehouse or OneLake.
You can then use Shortcuts in Fabric to reference that file as a dataset. Not ideal for automation, but good for one-time or proof-of-concept runs.