Hello folks!
First-time poster but long-time visitor of the forums.
Recently I migrated one of my larger data models (about 16 entities from roughly 6 source files) into a dataflow so it can be reused across multiple datasets. However, my refresh times have since skyrocketed. When working with the dataset in Desktop, I got a refresh time of roughly 30 seconds; now the dataflow takes about 15 minutes to refresh. I looked at the refresh log, and there isn't one entity holding the whole thing up; the bigger entities just take 2-3 minutes each, and the smaller ones take a minute or less.
Some details:
- All queries read from a SharePoint folder via the web source connector, both now and before
- The largest chunk of data is only 25k rows
- Some entities are derived from other entities (one query loads a table of data, and then multiple other queries use that query as their source)
- There are a lot of expansion transformations (splitting into rows, not just columns)
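To make the referencing setup concrete, here's roughly what the queries look like (the site URL, file filter, and entity names below are placeholders, not my real ones):

```
// BaseData: loads and combines the SharePoint files once;
// several other entities reference this query as their source
let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Data", [ApiVersion = 15]),
    CsvFiles = Table.SelectRows(Source, each [Extension] = ".csv"),
    Parsed = Table.AddColumn(CsvFiles, "Data", each Csv.Document([Content])),
    Combined = Table.Combine(Parsed[Data])
in
    Combined

// Entity1: one of several queries that start from BaseData
let
    Source = BaseData,
    Promoted = Table.PromoteHeaders(Source)
in
    Promoted
```

My understanding is that in a dataflow each entity like Entity1 may re-evaluate BaseData on refresh unless the referenced query is staged (e.g. via the enhanced compute engine or loading BaseData as its own entity), which could explain the multiplied refresh time, but I'm not certain.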
From what I've read, dataflows are supposedly faster than Power Query in Desktop, so I was wondering if anyone else has run into this or has any advice.