Hi All,
I am getting an error and wonder if anyone else has come across the same issue and managed to solve it.
Here is a snippet of the current composite model that I have.
I have disabled both refresh and load for the Excel files, as they are just historical data.
Incremental refresh is configured only on the API table, which is later combined with the historical data.
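For context, the incremental refresh filter on the API query follows the standard RangeStart/RangeEnd pattern. The sketch below is only illustrative; the endpoint, column name, and intermediate steps are placeholders, not my actual query:

let
    // Placeholder endpoint - the real API call and authentication are omitted here
    Source = Json.Document(Web.Contents("https://api.example.com/v1/records")),
    AsTable = Table.FromRecords(Source),
    Typed = Table.TransformColumnTypes(AsTable, {{"LoadDateTime", type datetime}}),
    // RangeStart and RangeEnd are the datetime parameters defined for incremental refresh;
    // the service uses this filter to generate the partitions
    Filtered = Table.SelectRows(Typed, each [LoadDateTime] >= RangeStart and [LoadDateTime] < RangeEnd)
in
    Filtered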
After uploading the .pbix file to the Premium capacity and refreshing it (so it gets partitioned), I receive the following error:
error:{"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2147467259"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"The operation has timed out."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2147467259"}},{"code":"Microsoft.Data.Mashup.ValueError.Reason","detail":{"type":1,"value":"DataSource.Error"}}],"exceptionCulprit":1}}}Cluster URI:WABI-XXXXXXX-G-PRIMARY-redirect.analysis.windows.netActivity ID:XXXXXXXXXXXXXXXXXXXXXXXXXXXX
Looking at the gateway config, I can see the following info is missing:
Do I need to do any more configuration?
Thanks
Save yourself some grief and move your ETL from the dataset to dataflows. Create dataflows by source and avoid mixing sources that don't need the gateway with sources that do. Also, that way it's easy to manage dataflows with different refresh schedules (or no refreshes at all because they're a historical partition).
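To illustrate: once the historical Excel data and the API data live in separate dataflows, the dataset side just consumes them with the Dataflows connector. The navigation the connector generates looks roughly like this; the workspace/dataflow IDs and the entity name below are placeholders filled in when you pick the dataflow in the navigator:

let
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    // Placeholder GUIDs - replaced with your real workspace and dataflow IDs by the navigator
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
    // Hypothetical entity holding the historical Excel data; its dataflow never needs a refresh schedule
    HistoricalExcel = Dataflow{[entity = "HistoricalSales", version = ""]}[Data]
in
    HistoricalExcel

That separation is what lets you give the API dataflow its own refresh schedule while leaving the historical dataflow alone.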
Thanks for the swift response. I had a feeling I would be going in that direction.