bryanrubink
Regular Visitor

Dataflow Gen2: "Unhandled exception while provisioning Lakehouse artifact"

My team and I are relatively new to Dataflows, and very new to Dataflows Gen2. Gen2 now pops up as the recommended starting place for new dataflows (if you select Gen1).

However, even on my first attempt, I immediately run into an error after saving the flow. The preview in the editor works fine; it's only when publishing and doing a full refresh (I assume) that the error pops up:
"An external error occurred while refreshing the dataflow: Unhandled exception while provisioning Lakehouse artifact (Session ID: xxxxxx)"

Other team members have reported similar difficulties with Gen2 and are sticking with creating Gen1 dataflows, which tend to "just work".

I'm wondering if we're doing something wrong.

1 ACCEPTED SOLUTION
v-ssriganesh
Community Support

Hi @bryanrubink,

Thank you for reaching out to the Microsoft Fabric Community Forum.

The error you are seeing “Unhandled exception while provisioning Lakehouse artifact” typically indicates an issue with the Lakehouse association or provisioning process within your workspace. This is a common scenario when getting started with Dataflows Gen2, and it can usually be resolved by verifying a few key settings.

Could you please check the following:

  • Kindly verify that your workspace is backed by a Fabric Capacity (e.g., F64, F128, etc.). Go to your Workspace Settings → Premium/Capacity Settings. Confirm that the capacity type is Fabric. If it shows Pro instead, Dataflows Gen2 features (including Lakehouse integration) will not work.

  • Check whether there is an existing Lakehouse in your workspace: navigate to your workspace and look for a Lakehouse item. Then open your Dataflow Gen2, click Settings (top right), and under Output Settings confirm that a valid Lakehouse is selected (a scripted way to check the first two points is sketched after this list).

  • If you just created a Lakehouse in the previous bullet, select that one as your destination.

  • Please ensure you have at least Member or Contributor access to the workspace. This is necessary to create artifacts such as Lakehouses and Dataflows Gen2.
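
If it helps, here is a rough, unofficial sketch of checking the first two points from a script using the Fabric REST API ("Get Workspace" and "List Items" endpoints). It assumes you already have an Entra ID access token with the Fabric API scope; WORKSPACE_ID and ACCESS_TOKEN below are placeholders you would fill in yourself.

# Hedged sketch: verify capacity assignment and look for a Lakehouse item
# via the Fabric REST API. WORKSPACE_ID and ACCESS_TOKEN are placeholders
# (token acquired e.g. with MSAL or `az account get-access-token`).
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
WORKSPACE_ID = "<your-workspace-guid>"          # placeholder
ACCESS_TOKEN = "<token-with-fabric-api-scope>"  # placeholder

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1) Capacity check: a workspace assigned to a Fabric capacity reports a capacityId.
ws = requests.get(f"{FABRIC_API}/workspaces/{WORKSPACE_ID}", headers=headers)
ws.raise_for_status()
workspace = ws.json()
print("Workspace:", workspace.get("displayName"))
print("capacityId:", workspace.get("capacityId", "<none - not on a Fabric capacity>"))

# 2) Lakehouse check: list items of type Lakehouse in the workspace.
items = requests.get(
    f"{FABRIC_API}/workspaces/{WORKSPACE_ID}/items",
    headers=headers,
    params={"type": "Lakehouse"},
)
items.raise_for_status()
lakehouses = items.json().get("value", [])
if lakehouses:
    for lh in lakehouses:
        print("Lakehouse found:", lh["displayName"], lh["id"])
else:
    print("No Lakehouse in this workspace yet - create one and select it as the destination.")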

If the issue continues, please try using a different browser or Incognito mode to eliminate any potential caching problems.

If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.
Thank you


REPLIES
bryanrubink
Regular Visitor

Sorry for the delay in responding - you responded too quickly, and I have this account hooked up to my personal email, which I apparently don't check often enough. Probably TMI for you 🙂
1. Yes - we are on Fabric capacity.

[screenshot: bryanrubink_0-1740457122472.png]
2. I just created a Lakehouse for my workspace, selected it as the destination, and the flow runs. (A scripted way to do the same thing is sketched below, for reference.)
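
For reference, creating the Lakehouse can also be scripted; here is a rough sketch using the Fabric REST API "Create Item" endpoint, under the same assumptions as the earlier snippet (an access token with the Fabric API scope and a placeholder workspace GUID). The display name used here is just an example.

# Rough sketch: create a Lakehouse in the workspace via the Fabric REST API
# Core "Create Item" endpoint. WORKSPACE_ID and ACCESS_TOKEN are placeholders.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
WORKSPACE_ID = "<your-workspace-guid>"          # placeholder
ACCESS_TOKEN = "<token-with-fabric-api-scope>"  # placeholder

resp = requests.post(
    f"{FABRIC_API}/workspaces/{WORKSPACE_ID}/items",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"displayName": "dataflow_destination_lakehouse", "type": "Lakehouse"},  # example name
)
resp.raise_for_status()
# The service may answer 201 (created) or 202 (provisioning continues asynchronously).
print(resp.status_code, resp.json() if resp.content else "accepted, provisioning in progress")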

I have a follow-up question though. Should we create just one Lakehouse to share between workspaces, or is there a reason to create a Lakehouse in each workspace?



v-ssriganesh
Community Support

Hi @bryanrubink,

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.

Thank you.

