Hello folks!
First time poster but long time visitor of the forums.
Recently I migrated one of my larger data models (about 16 entities from roughly 6 source files) into a dataflow so it can be used with multiple datasets. However, my refresh times have since skyrocketed. When working with the dataset in Desktop, I got a refresh time of roughly 30 seconds; now a refresh takes about 15 minutes. I took a look at the refresh log, and there isn't one entity holding the whole thing up; it's just that the bigger entities take 2-3 minutes each, and the smaller ones take a minute or less.
Some details:
- All queries are coming from a "web source" SharePoint folder, both now and before
- The largest chunk of data is only 25k rows
- Some entities are built from other entities (one query loads a table of data, and then multiple other queries use that query as their source)
- There are a lot of expansion transformations (splitting rows, not just columns)
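To make the layout above concrete, here is a hypothetical sketch (all names invented) of the two patterns described: a base entity loaded from a SharePoint folder, a downstream entity that references it, and a row-level expansion. These would be two separate entities in the dataflow, not one script.

```
// Base entity "Orders": reads one file from a SharePoint folder.
// The site URL and file name are placeholders, not the actual sources.
let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/data", [ApiVersion = 15]),
    File = Source{[Name = "orders.csv"]}[Content],
    Imported = Csv.Document(File, [Delimiter = ","]),
    Promoted = Table.PromoteHeaders(Imported)
in
    Promoted

// Downstream entity: references "Orders" instead of re-reading the
// source file, then expands a list column into multiple rows
// (the row-splitting expansion mentioned above).
let
    Source = Orders,
    ExpandedRows = Table.ExpandListColumn(Source, "LineItems")
in
    ExpandedRows
```

Note that in a dataflow, a referenced entity like `Orders` may be recomputed for each consumer unless computed entities (the enhanced compute engine, a Premium feature) are available, which is one common reason this pattern refreshes slower in the service than in Desktop.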
From what I've read, dataflows are supposedly faster than Power Query in Desktop, so I was wondering if anyone else has run into this, or has any advice.