I created a Dataflow that calls my Azure Function about 300 times. The Azure Function works perfectly: it never errors. The Dataflow is also correct: in the designer it loads my data 100% of the time. But I can't get the Dataflow to refresh successfully very often, maybe only 10% of refreshes complete. Sometimes it fails after 14 seconds, sometimes after 3 minutes, and at every point in between. The error Power BI reports is an "Internal Server Error" in my Azure Function. However, this is not true: there are no errors logged in my Azure Function. I can only assume there is some rate limit being hit, too many calls in too short a period. Not that ~300 calls over a 3-4 minute period is so high that it should cause any problems for Azure or Power BI, as far as I know.
I have no idea what the real cause of the issue is, or how to find out what the real error is. The Dataflow doesn't seem to have a setting to run one query at a time the way a Semantic Model does; that kind of throttling might have helped. I also tried a Fail-Wait-Retry algorithm around the Dataflow's "Web.Contents" calls, but Microsoft came up with a reason such a Dataflow cannot be saved, apparently it's too dynamic for their liking. I've also added a 5-second delay after every Web.Contents call to slow the Dataflow down, with mixed results.
Maybe there's a way I can get around this "One or more tables references a dynamic data source" error and actually use a Fail-Wait-Retry algorithm on the Dataflow's "Web.Contents"? Any other ideas or suggestions would be appreciated.
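[Editor's note: a commonly documented workaround for the "dynamic data source" refresh error is to keep the first argument of `Web.Contents` a static base URL and move all variable parts into the `RelativePath` and `Query` options. A minimal sketch, assuming a hypothetical function app host and an `id` parameter:

```
// Hypothetical example: the base URL must be a literal so the
// refresh engine can statically identify the data source.
GetRecord = (id as text) =>
    Web.Contents(
        "https://example.azurewebsites.net",          // static base URL (assumed name)
        [
            RelativePath = "api/MyFunction",          // variable path goes here
            Query = [id = id]                          // variable query string goes here
        ]
    )
```

With this shape, the refresh engine sees one fixed data source even though 300 different requests are made, which is typically what unblocks saving and scheduled refresh.]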
I did eventually find a way to create a 'Fail-Wait-Retry' algorithm that can be saved and doesn't get blocked by the "One or more tables references a dynamic data source" error. The 'Fail-Wait-Retry' algorithm has, so far, enabled my Dataflow to refresh successfully 100% of the time.
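[Editor's note: the poster didn't share their implementation. One well-known way to build a Fail-Wait-Retry in Power Query M combines `Value.WaitFor` (retry loop with backoff), `ManualStatusHandling` (so HTTP errors surface as inspectable status codes instead of hard failures), and the static-base-URL pattern above. A sketch under those assumptions, with a hypothetical host name:

```
// Hypothetical Fail-Wait-Retry wrapper around Web.Contents.
GetWithRetry = (relPath as text, query as record) =>
    Value.WaitFor(
        (iteration) =>
            let
                result = Web.Contents(
                    "https://example.azurewebsites.net",   // static base URL (assumed name)
                    [
                        RelativePath = relPath,
                        Query = query,
                        // Handle these statuses ourselves instead of erroring:
                        ManualStatusHandling = {429, 500, 502, 503},
                        IsRetry = iteration > 0            // bypass the request cache on retries
                    ]
                ),
                status = Value.Metadata(result)[Response.Status],
                // Returning null tells Value.WaitFor to wait and try again.
                outcome = if status >= 500 or status = 429 then null else result
            in
                outcome,
        // Exponential backoff between attempts: 1s, 2s, 4s, 8s...
        (iteration) => #duration(0, 0, 0, Number.Power(2, iteration)),
        5   // give up after 5 attempts
    )
```

Because the base URL stays a literal and only `RelativePath`/`Query` vary, the query remains saveable, while transient 429/5xx responses are retried with backoff instead of failing the whole refresh.]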
Hi @shanerowley-ds ,
I'm glad to hear you've resolved the issue. You can accept your own reply as the solution; an answered thread is easier to find than an open one, and others will learn more from your answer.
Best Regards
Community Support Team _ Rongtie
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.