I was going to post a /rant about the ecosystem, but I think I'm past that. Now I'm just resigned to the futility of it.
TL;DR: Power BI development is cumbersome ... what are your secrets?
How do people actually develop code in this "wonderful new Power BI world"? What's your toolset, your methodology, your approach? I personally am finding it so cumbersome, so quirky, and so unbelievably sloooooow that I don't think I will produce anything useful in a month - no, make that a year - of Sundays.
Here's my reference point:
There's always been a cycle of tension between hardware and software maturity. My reference is Intel/Windows. The desktop PC was created, and the world opened up. Hardware made rapid advances, as it always has, going right from the original desktop PC up to the Pentium, becoming faster at every turn. At the same time, the OS evolved from text to graphical, and Windows went the route of '95, '98, NT, Vista, etc., all the way to the current Win11. (I have little exposure to other systems like Linux.)
The hardware got faster, clever new OSes and apps were created, and the hardware took strain; then the hardware got better, and things started flying again; then the software got more demanding, and so on. The cycle continues.
So I remember, decades ago, how we used to use SQL Server with various coding front ends. Whatever they were... the one thing I miss is that they were fast. Fast to create, fast to run, fast to debug. Query a database of 20 million rows? 10 seconds, no problem. Run some code to create analytics? Under 5 minutes to analyse the entire problem and produce several million rows of analysis, detailed and summarised.
And now? The visuals are getting elegant and beautiful, sure, but the prep/clean steps of data engineering? It's effectively an unavoidable waste of time.
I work with Power BI, and here's the only way I know how.
1. Develop the Power BI dataflow queries in Excel, because at least there you can save the code even if it isn't running yet.
2. Then copy and paste the code to Power BI online once it's working in Excel.
3. Don't try to do it in Power BI Desktop, because there's no direct data view like Excel's to explore the interim data produced.
4. But of course, Excel Power Query is not the same as Power BI Power Query, so the code will have to change.
5. Try to limit the data source calls. If I'm lucky, the API offers a filter and I can retrieve 5 records for testing purposes. If I'm not lucky, it doesn't, and I'll have to wait 20 minutes to pull 9,000 rows.
6. Be very, very careful about what I click, and when. Power Query is the world's biggest macro. If I click on the wrong step, it starts refreshing and everything hangs. If I try to interrupt it, the entire application hangs.
7. Underneath it all, remember that the files are stored on OneDrive. So best do a temporary workaround and keep the workspace in a local disk folder that is not synced with OneDrive.
8. And what about gateways? No, not all data is just accessible; "gateways" have to be set up first.
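For what it's worth, the pain in step 5 can be softened a little with a single "dev mode" flag that caps row counts while developing. A minimal Power Query M sketch - the server, database, and table names here are placeholders, not anything from my actual setup:

```
let
    // Flip to false before publishing; caps row counts while developing
    DevMode = true,

    // Hypothetical SQL source - substitute your own connector
    Source = Sql.Database("myserver", "mydb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],

    // Against a foldable source, Table.FirstN should fold to a
    // server-side TOP 100 rather than pulling everything locally
    Result = if DevMode then Table.FirstN(Orders, 100) else Orders
in
    Result
```

It doesn't fix the underlying slowness, but at least refreshes during development stay short.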
I'm well aware that hardware plays a part. Of course there's a difference between server-based resources and local hardware. There's a network, security with firewalls, and local antivirus software. All that's happening in the background and plays a part. Sure.
Maybe I'm supposed to use Copilot. Sure, having a verbose, dyslexic three-year-old tossing things at me is useful.
So, at the end of the day, I have spent a month working on 3 dataflows of maybe a hundred lines each - a full month?!?! With data from APIs, databases, and SharePoint. So let's say 160 man-hours.
I estimate 10% has been learning some new things, 40% decoding and analysing the data, and 50% waiting.
Of course, estimates are skewed by my frustration, and highly subjective.
So... how do you people do it?
Hi rajendraongole1
Considering what an abortion of a datastore SharePoint is, the supreme irony is that SharePoint is actually the easiest to use here - it gives the least hassles.
Creating sample datasets, building mock APIs, breaking dataflows up into modular components - these are all doable, sure. But all of that is only required because Power BI is like a coal-powered truck in the data engineering world. It trundles along with massive inertia, but no elegance. It cannot dance. It's lipstick on a pig.
Thanks for the pointers, and for acknowledging the problem.
Hi @Netrelemo - I totally get where you're coming from—the Power BI development process can indeed feel painfully slow and convoluted, especially when dealing with large datasets, API integrations, and Power Query quirks. You're not alone in feeling like the ecosystem is geared toward a “build it slowly” rather than “rapidly iterate” kind of workflow.
When dealing with large datasets, try creating sample datasets whenever possible. You can create a separate query that limits rows (e.g., only top 1000 rows) or filters data for testing. This minimizes load time as you refine your transformations.
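One way to wire that up, sketched in Power Query M (the query name `FactSales` is hypothetical): leave the full query untouched and do all development against a small referencing query:

```
// A separate "FactSales_Sample" query that references the full one
let
    // Reference the full query rather than duplicating its source steps
    Base = FactSales,

    // Work against only the first 1000 rows while building transformations
    Sample = Table.FirstN(Base, 1000)
in
    Sample
```

Once the transformations behave on the sample, point the downstream steps back at the full query.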
If you're using an API, you can often filter on the API side to pull in fewer rows for testing. If the API doesn't support filtering, it might be worth creating a mock API with a subset of data for development.
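Where the API does support it, the server-side filter can be expressed through the `Query` option of `Web.Contents`. A hedged sketch - the URL and the `$top` parameter are assumptions, and your API may spell its paging or filter parameters differently:

```
let
    // Hypothetical endpoint; check your API's documentation for the
    // real paging/filter parameter names
    Response = Web.Contents(
        "https://api.example.com",
        [
            RelativePath = "v1/orders",
            Query = [#"$top" = "5"]   // ask the server for just 5 records
        ]
    ),
    Parsed = Json.Document(Response)
in
    Parsed
```

If the API offers no filtering at all, the same query can point at a local JSON sample during development, e.g. `Json.Document(File.Contents("C:\dev\orders-sample.json"))` - one way to fake the "mock API" idea without standing up a server.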
Power BI Dataflows can take on some of the data transformation tasks, and because they're hosted in the Power BI service, they offload the processing load from your desktop machine.
Break up complex transformations into multiple dataflows or “staging” dataflows. This makes it easier to debug and speeds up refreshes when each stage is smaller.
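The same staging idea can be mirrored with query references: a raw staging query that only lands the data, and downstream queries that reference it, so each stage can be inspected and debugged on its own. A sketch with hypothetical names:

```
// Query 1: "Orders_Staging" - lands the raw data, no transformations
let
    Source = Sql.Database("myserver", "mydb"),   // hypothetical source
    Raw = Source{[Schema = "dbo", Item = "Orders"]}[Data]
in
    Raw

// Query 2: "Orders_Clean" - references the staged table
let
    Staged = Orders_Staging,
    // Each stage stays small, so a failing step is easy to isolate
    Cleaned = Table.SelectRows(Staged, each [Amount] <> null)
in
    Cleaned
```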
With a Premium Per User (PPU) license, you get more flexibility and performance for larger dataflows.
Your frustrations are valid, and honestly, they’re common among Power BI developers—particularly when dealing with complex data and sources like SharePoint or APIs. The learning curve for Power BI development is steep, and it’s not always intuitive for those of us used to traditional, faster database tools. By incorporating some of these best practices and leaning on external tools, you can hopefully make the process a bit more manageable.
You're not alone, and with time, you’ll likely find a rhythm and toolset that works for you. If there’s a silver lining, it’s that Power BI’s flexibility does allow you to piece together the approach that best suits your workflow—even if it’s a bit of a patchwork solution!
I hope I covered all the points raised. Reference links:
Understanding query evaluation and query folding in Power Query - Power Query | Microsoft Learn
Using incremental refresh with dataflows - Power Query | Microsoft Learn