RadkaProefo
Regular Visitor

Dataflow is not refreshing

Hello fellows,

 

I have a big, big problem with a dataflow refresh. The dataflow is connected to files in SharePoint (json, xlsx). I needed to make some changes, so I made a copy. Then I loaded a new column in one query and replaced one column in the merge with this column. Everything looks correct and the dataflow saves, but it does not refresh (the connections are correct). It shows this error:

 

Error: Request ID: aa73004d-f5bf-6778-7765-c692ff61b117 Activity ID: c4cb787a-758e-4148-ba6a-fa734dc17081
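For context, the change was roughly of this shape (a minimal sketch only; the file URL, query, and column names are made up):

// Sketch: load a new column in one query, then use it as the merge key.
// OtherQuery stands in for another query in the same dataflow (hypothetical).
let
    Source = Excel.Workbook(Web.Contents("https://contoso.sharepoint.com/sites/Demo/Shared%20Documents/Data.xlsx"), null, true),
    Sheet = Source{[Item = "Sheet1", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true]),
    // the newly loaded column
    WithKey = Table.AddColumn(Promoted, "NewKey", each Text.Upper([Region]), type text),
    // the merge now joins on the new column instead of the old one
    Merged = Table.NestedJoin(WithKey, {"NewKey"}, OtherQuery, {"NewKey"}, "OtherQuery", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Merged, "OtherQuery", {"Amount"}, {"Amount"})
in
    Expanded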

 

Can anybody help me please?

 

Radka

5 REPLIES
Anonymous
Not applicable

Hi @RadkaProefo ,

 

We can't check what the error is from the IDs you provided. Are there any other error messages?

 

You can provide the specific error messages from the refresh history so that we can figure out the error. I have a related dataflow refresh troubleshooting article here that you can refer to.

Troubleshooting dataflow issue - get data from dataflow - Power Query | Microsoft Learn

 

If that does not help, please provide more details about the error.

 

Best Regards,

Yifan Wang

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

I got the exact same problem. It's been like this for weeks now. I tried everything, and even though the on-prem data loads into Power Query/DF Gen2, the automatic refresh that starts after you click Publish ALWAYS FAILS!

 

What's the solution?

Hello,

 

I am still trying to find a solution. I had a call with Microsoft Power BI support and they told me that they cannot solve it, because it is a problem with Excel and I would have to ask Microsoft Excel support.

I tried something else yesterday. I created 2 Dataflows Gen2: DF1 and DF2. In DF1, connect to your data source, do transformations, etc., AND DO NOT specify a destination. Then click Publish. This should create a table in the DataflowsStagingLakehouse (in my case all the source column names are lost and replaced by generic column names such as Column1, Column2, etc., but I get them back at the end of this process). Wait until the refresh is finished.
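A minimal sketch of what DF1 might contain, assuming an on-prem SQL source (the server, database, table, and filter are all hypothetical):

// DF1 (sketch): connect to the source and transform; no destination is set,
// and staging stays enabled, so the output lands in DataflowsStagingLakehouse
let
    Source = Sql.Database("onprem-sql01", "SalesDB"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    Filtered = Table.SelectRows(Orders, each [OrderDate] >= #date(2023, 1, 1))
in
    Filtered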

 

Connect DF2 to DF1 AND, this time, specify a destination in Fabric (it could be the default staging lakehouse or... actually I haven't tested with a destination type other than the default staging lakehouse, so I don't know whether it will work for other destination types too). Click Publish. Wait until the refresh is finished.
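And a sketch of the DF2 side, assuming the usual navigation steps generated by the Dataflows connector (the GUIDs, entity name, and renames below are placeholders; the steps your connector generates may differ):

// DF2 (sketch): read DF1's staged output, then restore the real column names
// that were replaced by generic ones (Column1, Column2, ...)
let
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Entity = Dataflow{[entity = "Orders", version = ""]}[Data],
    Renamed = Table.RenameColumns(Entity, {{"Column1", "OrderId"}, {"Column2", "OrderDate"}})
in
    Renamed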

 

Go to your destination and you should find a table with the data and all the correct column names. In my case, the default staging lakehouse contains 3 copies of the same data: one because of DF1 (even though no destination was specified, the Enable staging option was on), a second because of DF2 (same reason as DF1), and a third copy being the final result, which is the output from DF2. That's a lot of wasted storage if you ask me, but so far it is the only solution I have found to ingest data from an on-prem DB.

 

Hello,

 

That is the problem: there were no other error messages. When the dataflow was open, all queries loaded and all column profiles showed no errors (profiling is set to all rows).

[screenshot: RadkaProefo_0-1702648277853.png]

 

The error message in the refresh history is the one I sent above.

[screenshot: RadkaProefo_1-1702648328489.png]

 
