balafbm
Advocate I

Dataflow Gen2 Not loading and refreshing data

We were on a Fabric trial capacity, and I had created a Dataflow Gen2 to load all 10 of our dimension tables, which come from our Azure Synapse DB. The data loaded properly, and the Power BI semantic model referenced the dataflow and imported the data. All good.

 

The structure was SQL DB -> Dataflow Gen2 -> Semantic model import


Recently we moved to a Fabric capacity, and I created the same dataflow. However, an error showed up:

"An external error occurred while refreshing the dataflow: DataflowLakehouseInternalSqlNotProvisioned"

 

(Screenshot attached: balafbm_0-1738420639963.png)

The data did not load into the dataflow and the semantic model does not detect the dataflow tables.

Any solution to this?

 

1 ACCEPTED SOLUTION
nilendraFabric
Super User

Hello @balafbm 

 

"DataflowLakehouseInternalSqlNotProvisioned" is associated with SQL analytics endpoint provisioning failures during Dataflow Gen2 refreshes, particularly in migration scenarios from a Fabric trial to paid capacity.

 

Dataflow Gen2 automatically creates a staging lakehouse, but prior to January 2025, a known issue caused the SQL analytics endpoint for this lakehouse to fail provisioning. This led to refresh errors like:
`Refresh failed. The staging lakehouse is not configured correctly` .

 

While this was officially fixed in January 2025, migration from trial to paid capacity can reintroduce configuration mismatches if the workspace or lakehouse isn't properly reassigned to the new capacity.

Workspaces migrated from trial to paid capacity may retain outdated references to trial resources, causing conflicts in SQL endpoint provisioning.

 

https://learn.microsoft.com/en-us/fabric/get-started/known-issues/known-issue-809-dataflow-gen2-refr...

 

Please check:

 

Verify that the workspace is assigned to the new paid capacity in the Fabric Admin Portal.

 

https://community.fabric.microsoft.com/t5/Dataflow/Dataflow-Gen2-Couldn-t-refresh-the-entity-because...
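As a quick way to check the workspace-to-capacity assignment outside the Admin Portal, here is a minimal Python sketch against the Power BI REST API's `GET /v1.0/myorg/groups` endpoint. This assumes you already have an Azure AD access token with workspace read permissions (token acquisition is out of scope here), and treats the `isOnDedicatedCapacity` / `capacityId` field names as assumptions to double-check against the API reference:

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_groups_url(base: str = API_BASE) -> str:
    # Endpoint that lists the workspaces the caller can access.
    return f"{base}/groups"

def find_workspace_capacity(groups: list, name: str) -> dict:
    # Pick a workspace by display name and report its capacity fields.
    # isOnDedicatedCapacity / capacityId are the fields the API returns
    # for workspaces assigned to a Fabric (F) or Premium (P) capacity.
    for g in groups:
        if g.get("name") == name:
            return {
                "on_capacity": g.get("isOnDedicatedCapacity", False),
                "capacity_id": g.get("capacityId"),
            }
    raise LookupError(f"workspace {name!r} not found")

def check_workspace(token: str, workspace_name: str) -> dict:
    # Live call: requires a valid Azure AD bearer token.
    req = urllib.request.Request(
        build_groups_url(),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        payload = json.load(resp)
    return find_workspace_capacity(payload["value"], workspace_name)
```

If `on_capacity` comes back False, or `capacity_id` doesn't match your new paid capacity, that points at the reassignment problem described above.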

 

Possible solutions:

 

Existing Lakehouse artifacts in the new Fabric Capacity workspace may lack proper SQL endpoint provisioning.
• Delete the current Lakehouse and create a new one, then republish the Dataflow Gen2 to reference the new Lakehouse.

https://github.com/MicrosoftDocs/fabric-docs/blob/main/docs/get-started/known-issues/known-issue-809...

 

 

• Disable staging in Power Query.
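For the delete-and-recreate workaround, the stale Lakehouse can also be removed through the Fabric REST API's Items endpoints rather than the UI. This is a hedged sketch assuming the documented `GET`/`DELETE /v1/workspaces/{workspaceId}/items` shapes; the workspace and item IDs are placeholders:

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def list_items_url(workspace_id: str, item_type: str = None) -> str:
    # Items - List Items; the optional `type` query filters by item kind
    # (e.g. "Lakehouse"), so you can locate the staging lakehouse.
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items"
    return f"{url}?type={item_type}" if item_type else url

def delete_item_url(workspace_id: str, item_id: str) -> str:
    # Items - Delete Item.
    return f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}"

def delete_lakehouse(token: str, workspace_id: str, item_id: str) -> int:
    # Live call: removes the Lakehouse item from the workspace.
    req = urllib.request.Request(
        delete_item_url(workspace_id, item_id),
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status
```

After deleting, create a fresh Lakehouse and republish the Dataflow Gen2 so a new SQL analytics endpoint is provisioned.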

Please try these and let us know if this works.

 

Thanks

 

 


5 REPLIES
v-nmadadi-msft
Community Support

Hi @balafbm,

Could you please confirm if the issue has been resolved after raising a support case? If a solution has been found, it would be greatly appreciated if you could share your insights with the community. This would be helpful for other members who may encounter similar issues.

Thank you for your understanding and assistance.

balafbm
Advocate I

Thanks for the detailed explanation, @nilendraFabric.
Here's what worked and what didn't.

 

When we were on the Fabric trial capacity, I had created the Dataflow Gen2 in a workspace called "BI Production".
Before transitioning to a Fabric capacity, I manually deleted all Fabric artifacts (including the Dataflow Gen2) from this workspace.
(We had to delete Fabric artifacts from workspaces because we were moving from one region to another.)

 

After moving to a Fabric capacity (F64), I recreated the same Dataflow Gen2 manually from scratch in that same "BI Production" workspace, and I got the error I mentioned in my original post:

"An external error occurred while refreshing the dataflow: DataflowLakehouseInternalSqlNotProvisioned"

 

I am still getting the error after trying the steps mentioned above.

 

Here's what worked:

I created the Dataflow Gen2 from scratch in a different workspace, one that did not have any Fabric artifacts when we were on the trial capacity.

The data loaded perfectly and the refresh works perfectly.

 

So now, we are still not able to create and load a Dataflow Gen2 in the "BI Production" workspace, which had Fabric artifacts when we were on the trial capacity.

 

Hoping this can be fixed by Microsoft.

Hi @balafbm ,
Sorry to know it didn’t help. Please consider reaching out to Microsoft Support. You can provide them with all the troubleshooting steps you've already taken, which will help them understand the issue better and provide a resolution. They might be able to identify something specific about your admin account setup or provide a solution that isn't immediately obvious. 

Below is the link to create Microsoft Support ticket:

How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

Thanks and Regards

v-nmadadi-msft
Community Support

Hi @balafbm,
Thanks for reaching out to the Microsoft Fabric community forum.

 

Is there any update regarding your issue? Are you still encountering the same error after trying @nilendraFabric's suggestions?
In addition to their points, if you check the Known issue - Dataflow Gen2 refresh fails due to missing SQL analytics endpoint - Microsoft Fabric |... document linked in the solution, the "Solutions and workarounds" section mentions that "The issue only affects existing staging lakehouses. If you create a new dataflow, it won't have this issue."
If removing the lakehouse and creating a new one doesn't fix it, you may consider creating a new dataflow with the same transformations as the existing one to mitigate the issue.

If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS.
Thanks and Regards



