Hi,
Currently, I am working on dataflows.
Two of my three dataflows refresh perfectly.
However, one of them cannot be refreshed. Even when I try the very first refresh (Refresh now), I get the error below:
"Error: Internal error Request ID: a76ee73d-2e6b-e6e6-37ec-cdb42fa7d5aa Activity ID: 8f483eba-e4e2-4fce-8266-897b522b049d"
I would like to understand the root cause of this error so I can figure out how to mitigate the problem.
Suggestions appreciated.
Hey, @collinq
Thank you so much for your suggestion. I will split it into 3 flows as per your suggestion! 🙂
Hey @Anonymous ,
Is the third dataflow using a different Data Source? Or a different Gateway? Are both of those correctly configured? Is this an error after a long time (like maybe you are timing out)?
Proud to be a Datanaut!
Private message me for consulting or training needs.
Hi,
Q: Is the third dataflow using a different Data Source? Or different Gateway?
Ans: Basically, the 3rd dataflow combines the tables from 3 Excel files into a single table. The Excel files are located in the same SharePoint location and are accessed through my connection. (A rough sketch of this kind of query is at the end of this post.)
Q: Are both of those correctly configured?
Ans: Given my answer above, I believe both of them are correctly configured. Please advise if they are not.
Q: Is this an error after a long time (like maybe you are timing out)?
Ans: Yes, the refresh takes a very long time (more than 2 hours).
Do you have any idea of the root cause, given the responses above?
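For reference, here is a minimal sketch of the kind of Power Query (M) a dataflow like my 3rd one might use. The SharePoint site URL, file names, and sheet name below are placeholders, not my actual values:

let
    // Connect to the SharePoint document library (placeholder site URL)
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Finance", [ApiVersion = 15]),
    // Keep only the three workbooks that feed the combined table (placeholder names)
    TargetFiles = Table.SelectRows(Source, each List.Contains(
        {"Sales2023.xlsx", "Sales2024.xlsx", "Sales2025.xlsx"}, [Name])),
    // Pull the same sheet out of each workbook
    AddData = Table.AddColumn(TargetFiles, "Data", each
        Excel.Workbook([Content], true){[Item = "Sheet1", Kind = "Sheet"]}[Data]),
    // Append the three sheets into one table
    Combined = Table.Combine(AddData[Data])
in
    Combined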
Hey @Anonymous ,
I am betting the issue is the third one - the time limit. There is a 2-hour time limit on refreshes with a Pro license. I am wondering if you can maybe get the three files separately? Or make your query more efficient somehow? Or maybe make the source file smaller (i.e., if it holds many years of data and you only need the last 5, then trim the source before you try to hit it). A sketch of that kind of early filter is below.
But I think that is the first place to look - getting the refresh time under 2 hours. Preferably way under 2 hours, so that if the data grows a bit you aren't back in the same position quickly.
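For example, here is a minimal sketch of that kind of early trim, assuming the same placeholder SharePoint site and file names as the sketch above and a hypothetical "OrderDate" column - the idea is to filter the combined rows down to the last 5 years before any heavier transformation steps run:

let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Finance", [ApiVersion = 15]),
    TargetFiles = Table.SelectRows(Source, each List.Contains(
        {"Sales2023.xlsx", "Sales2024.xlsx", "Sales2025.xlsx"}, [Name])),
    AddData = Table.AddColumn(TargetFiles, "Data", each
        Excel.Workbook([Content], true){[Item = "Sheet1", Kind = "Sheet"]}[Data]),
    Combined = Table.Combine(AddData[Data]),
    // Trim old history as early as possible so fewer rows flow through the rest of the query
    Recent = Table.SelectRows(Combined, each
        Date.From([OrderDate]) >= Date.AddYears(Date.From(DateTime.LocalNow()), -5))
in
    Recent

The earlier that filter happens, the less data the refresh has to move and transform.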
Proud to be a Datanaut!
Private message me for consulting or training needs.