We're running a daily Azure Synapse pipeline that loops over our Power BI datasets (currently 9). For each dataset we call the refresh API with a POST request. After the POST call we have a check-refresh step that waits for the refresh to complete, either successfully or in failure.
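The per-dataset step described above can be sketched roughly as follows. This is a minimal illustration, not the actual pipeline: the group/dataset IDs, the authenticated `session` object, and the timeout values are placeholders, and the endpoint shapes follow the documented Power BI REST API (Refreshes - Refresh Dataset / Get Refresh History).

```python
import time

POLL_SECONDS = 30  # illustrative polling interval, not from the thread


def trigger_refresh(session, group_id, dataset_id):
    """POST a refresh request for one dataset.

    `session` stands in for an HTTP client (e.g. a requests.Session)
    that already carries a bearer token; Power BI answers 202 Accepted.
    """
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    resp = session.post(url, json={"notifyOption": "NoNotification"})
    resp.raise_for_status()


def wait_for_refresh(get_latest_status, timeout_s=3600, poll_s=POLL_SECONDS):
    """Poll until the most recent refresh leaves the in-progress state.

    `get_latest_status` is injected (it would wrap a GET on the refresh
    history endpoint and return the newest entry's status string), so the
    waiting logic itself has no HTTP dependency.
    """
    waited = 0
    while waited <= timeout_s:
        status = get_latest_status()  # "Unknown" while still running
        if status in ("Completed", "Failed", "Disabled"):
            return status
        time.sleep(poll_s)
        waited += poll_s
    raise TimeoutError("refresh did not finish within the timeout")
```

Injecting the status lookup keeps the "wait for completion" logic testable and mirrors the check-refresh step in the pipeline.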
What we observe is that the datasets often fail, but it appears (to me) somewhat random. The number and combination of failing datasets varies day to day (one day just two fail, another day four, the next day none, the day after five...). After monitoring for a number of days, no pattern has emerged yet.
The error messages also look a bit off, since a manual rerun in Power BI usually succeeds. The messages often hint at corrupt data rows, but we did not find actual issues in the data itself. To me it looks like the data is not being fetched correctly, most likely incompletely.
An example of such a message (value reference removed for privacy) is below.
My questions regarding this issue:
- While looping over the datasets to POST the refresh command, could there be a timing issue, i.e. should we implement a delay between the refresh calls for each dataset?
- Is there a data limit, e.g. cache or in-memory, that could cause this issue?
- Any other possible cause?
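On the first question, one mitigation that can be sketched (function and parameter names here are illustrative, not from the thread) is to stagger the refresh calls with a gap and retry a failed dataset a couple of times with a growing delay, since the failures described look transient:

```python
import time


def refresh_with_retry(refresh_fn, dataset_id, attempts=3, base_delay_s=60):
    """Run `refresh_fn(dataset_id)` (trigger the refresh and block until it
    finishes), retrying transient failures with a growing delay.
    Returns the number of attempts used; re-raises on final failure."""
    for attempt in range(1, attempts + 1):
        try:
            refresh_fn(dataset_id)
            return attempt
        except RuntimeError:
            if attempt == attempts:
                raise
            time.sleep(base_delay_s * attempt)  # 60s, 120s, ... between tries


def refresh_all(refresh_fn, dataset_ids, gap_s=30, base_delay_s=60):
    """Refresh datasets sequentially with a gap between them, instead of
    firing all POSTs back-to-back against the same gateway/source."""
    results = {}
    for ds in dataset_ids:
        results[ds] = refresh_with_retry(refresh_fn, ds,
                                         base_delay_s=base_delay_s)
        time.sleep(gap_s)
    return results
```

The same staggering can of course be expressed directly in the Synapse pipeline with a Wait activity inside the ForEach loop; the sketch just makes the timing explicit.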
Error sample:
{"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2147467259"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"The key didn't match any rows in the table."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2147467259"}},{"code":"Microsoft.Data.Mashup.ValueError.Key","detail":{"type":1,"value":"
No, we have 9 datasets and the POST refresh API call is made on each of them...
I mean, what is the data source for these semantic models?
The data source is a SQL database; we have a delta lake database set up using Azure Synapse.
{synapse resource}-ondemand.sql.azuresynapse.net
Are you suspending that after inactivity? It may need too much time to spin back up.
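If the pool does pause after inactivity, the first connection attempts during a refresh could fail until it has resumed. A minimal sketch of a warm-up retry (the `connect_fn` is a stand-in for the real driver call, e.g. `pyodbc.connect`; the attempt counts and delay are assumptions):

```python
import time


def connect_with_warmup(connect_fn, attempts=4, delay_s=15):
    """Retry the initial connection a few times, since an auto-paused SQL
    endpoint can reject connections while it is resuming."""
    last_err = None
    for _ in range(attempts):
        try:
            return connect_fn()
        except ConnectionError as err:
            last_err = err
            time.sleep(delay_s)
    raise last_err
```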
Are these refreshes all hitting the same data source?