I have a Power Query that calls a function to get Power BI activity data with an API call. The function keeps calling itself, combining the data, until the continuation token is null. This works fine in Power BI Desktop, but in a Power BI Dataflow I get a circular dependency error.
Does anyone know how to get the dataflow to not error with the circular dependency? The function name is GET Activity.
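For reference, here is a minimal sketch of that recursive shape. It is not the original query: FetchActivityPage is a hypothetical helper standing in for the actual API call, assumed to return a record with a [Data] table and a [ContinuationToken] that is null on the last page.

```
// Function query "GET Activity" - illustrative sketch only.
// FetchActivityPage is a hypothetical helper (assumption): one API call,
// returning [Data] (a table of rows) and [ContinuationToken] (null at the end).
(token as nullable text) as table =>
let
    Response = FetchActivityPage(token),
    Page     = Response[Data],
    Next     = Response[ContinuationToken],
    // The self-call below (@ is M's inclusive scoping operator) is what
    // Desktop evaluates fine but a Dataflow reports as a circular dependency.
    Result   = if Next = null then Page
               else Table.Combine({Page, @#"GET Activity"(Next)})
in
    Result
```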
Hi, @alan2468
Power Query in Power BI Desktop and in Power BI Dataflows differs in how queries are processed. Dataflows are designed to discourage or prevent circular references, which can cause sync issues in a cloud environment; Power BI Desktop is more lenient with such dependencies because the processing is local.
Instead of using recursion, rework the logic to be iterative. Power Query M has no while loop, but List.Generate can process pages of data until the continuation token is null, appending the results of each call (see the sketch below). Alternatively, if possible, pre-load the data into intermediate storage (like Azure Blob Storage) and have the dataflow read it from there. If the data size is manageable, you might also retrieve all the data in one call and rely on scheduled refresh.
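As a minimal sketch of that iterative rework, assuming the same hypothetical FetchActivityPage helper as in the sketch above, List.Generate drives the paging loop inside a single query, so nothing refers to itself:

```
let
    // Each state is one API response; the loop ends when the previous
    // response carried a null continuation token.
    Pages = List.Generate(
        () => FetchActivityPage(null),                    // first page, no token
        each _ <> null,                                   // stop once the next step returns null
        each if [ContinuationToken] = null then null
             else FetchActivityPage([ContinuationToken]),
        each [Data]                                       // keep only the page's rows
    ),
    Combined = Table.Combine(Pages)
in
    Combined
```

Because the whole loop lives in one query, the dataflow's dependency graph stays acyclic, which is exactly what its validation requires.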
Proud to be a Super User!