My team and I are relatively new to Dataflows, and very new to Dataflows Gen2. It now pops up as the recommended starting place for new dataflows (if you select Gen1).
However, even on my first attempt, I immediately run into an error after saving the flow. The preview in the editor works fine; it's only when publishing and doing a full refresh (I assume) that the error pops up:
""An external error occurred while refreshing the dataflow: Unhandled exception while provisioning Lakehouse artifact (Session ID:xxxxxx)""
Other team members have reported similar difficulties with Gen2 and are sticking with Gen1, which tends to "just work."
I'm wondering if we're doing something wrong.
Hi @bryanrubink,
Thank you for reaching out to the Microsoft Fabric Community Forum.
The error you are seeing “Unhandled exception while provisioning Lakehouse artifact” typically indicates an issue with the Lakehouse association or provisioning process within your workspace. This is a common scenario when getting started with Dataflows Gen2, and it can usually be resolved by verifying a few key settings.
Could you please check the following:
1. Confirm that the workspace is assigned to a Fabric capacity, since Dataflows Gen2 requires one.
2. Confirm that a Lakehouse exists in the workspace and is selected as the dataflow's data destination, so provisioning has a valid target.
If the issue continues, please try using a different browser or Incognito mode to eliminate any potential caching problems.
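If you want to script the second check, here is a minimal sketch using the public Fabric REST API's List Lakehouses endpoint. The workspace GUID and the bearer token are placeholders you would supply yourself; this is an illustration of the check, not an official tool.

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def lakehouses_url(workspace_id: str) -> str:
    """Build the List Lakehouses endpoint URL for a workspace."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/lakehouses"

def list_lakehouses(workspace_id: str, token: str) -> list:
    """Return the Lakehouse items in the workspace. An empty list means
    the dataflow has no local Lakehouse to provision its output into,
    which matches the provisioning error above."""
    req = urllib.request.Request(
        lakehouses_url(workspace_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("value", [])
```

If `list_lakehouses(...)` comes back empty, create a Lakehouse in the workspace (or pick one as the data destination) before publishing the dataflow.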
If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.
Thank you
Sorry for the delay in responding - you responded too quickly, and I have this account hooked up to my personal email, which I don't check often enough, apparently. Probably TMI for you 🙂
1. Yes, we are on Fabric capacity.
2. I created (just now) a Lakehouse for my workspace, selected it, and the flow runs.
I have a follow-up question, though: should we create just one Lakehouse to share between workspaces, or is there a reason to create a Lakehouse in each workspace?
Hi @bryanrubink,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.