Hello everyone,
I'm working on a fairly complex Power BI report that pulls data from multiple surveys, including one particularly large dataset with over 200 questions. All data modeling is done in Power Query, and I've created over 100 DAX measures, along with several field parameters used across the report pages.
The report currently has 12 pages (with more potentially to come), each packed with visualizations that rely on these complex DAX measures. Unfortunately, this setup is pushing Power BI to its limits.
While performance in Power BI Service is acceptable (around 5–6 seconds to load a page or switch bookmarks), working locally in Power BI Desktop is quite painful — switching between pages can take up to 20 seconds.
To address this, I'm considering splitting the report into several smaller reports, each containing only a few related pages, DAX measures, and tables. I would then use a dashboard in Power BI Service as a central "repository" or navigation hub. I already have a main landing page in the report that users use to navigate to other pages, so this could be adapted into a dashboard linking to the smaller reports.
I know this topic has come up before, but each scenario is a bit different. I’d really appreciate your thoughts, experiences, or suggestions on improving performance and making development more manageable.
Thanks in advance!
Hi @mdm2025 ,
It sounds like you’ve built quite a substantial report, and it’s easy to see why performance might be struggling a bit given the amount of data and complexity involved. Before going down the route of splitting the report, it’s really worth focusing on optimizing the model and DAX first, since that’s usually where the biggest improvements come from. If your survey data is set up with one column per question, try restructuring it so each row represents a single response to a question. That approach tends to compress much better in Power BI and makes calculations more efficient. It also helps to remove any unused columns and double-check that your data types and relationships are as lean as possible, ideally following a clean star schema.
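A minimal Power Query sketch of that unpivot, assuming an illustrative source with a `RespondentID` key column and one column per question (the file path, sheet name, and column names here are placeholders, not your actual setup):

```
let
    // Placeholder source; substitute your own query's earlier steps.
    Source = Excel.Workbook(File.Contents("C:\Data\survey.xlsx"), null, true),
    Sheet = Source{[Item = "Responses", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true]),
    // Keep the key column fixed and turn every question column into
    // (Question, Answer) rows: a tall, narrow table that VertiPaq
    // compresses far better than 200 wide columns.
    Unpivoted = Table.UnpivotOtherColumns(Promoted, {"RespondentID"}, "Question", "Answer")
in
    Unpivoted
```

You can then relate the unpivoted table to a small Question dimension on the `Question` column, which fits the star-schema shape mentioned above.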
On the DAX side, review the measures that are doing the heavy lifting and see if they can be simplified or rewritten using variables to avoid repeated calculations. Creating base measures that other measures can build on can also make a noticeable difference. You might also want to check which visuals are taking longest to render by using the Performance Analyzer in Desktop; sometimes a few visuals or interactions are the main cause of the lag.
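As a sketch of the base-measure-plus-variables pattern (table and column names are made up for illustration, not taken from your model):

```
Total Responses = COUNTROWS ( Responses )

Response Rate % =
-- Each VAR is evaluated once and reused, instead of repeating
-- the same CALCULATE expression in several places.
VAR Answered =
    CALCULATE ( [Total Responses], NOT ISBLANK ( Responses[Answer] ) )
VAR Invited = [Total Responses]
RETURN
    DIVIDE ( Answered, Invited )
```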
The difference you’re seeing between Desktop and the Service is fairly normal with larger models, since the Service is running on more powerful infrastructure. For development, it can help to work with a smaller sample of your data and turn off background data or auto-refresh while you make changes.
If, after optimization, it still feels too slow, then splitting it into smaller, focused reports is a perfectly valid approach. Just be sure to keep a single shared dataset in the Service so your DAX and data model stay centralized. That way you can manage everything in one place while giving users faster, lighter reports to work with.
In most cases, though, a round of careful model and DAX tuning will give you a noticeable boost before you need to consider breaking it apart.
Best Regards,
Tejaswi.
Community Support
@mdm2025 @MFelix has provided great feedback. To add more: 1 or 2 seconds for a DAX measure is a lot, given you have such a small dataset. Having said that, performance tuning is specific to the model you are working on, so no matter how much info you provide here, it is hard to pinpoint the causes, and a lot of this is guesswork based on experience. Good luck!
Subscribe to the @PowerBIHowTo YT channel for an upcoming video on List and Record functions in Power Query!!
Learn Power BI and Fabric - subscribe to our YT channel - Click here: @PowerBIHowTo
If my solution proved useful, I'd be delighted to receive Kudos. When you put effort into asking a question, it's equally thoughtful to acknowledge and give Kudos to the individual who helped you solve the problem. It's a small gesture that shows appreciation and encouragement! ❤
Did I answer your question? Mark my post as a solution. Proud to be a Super User! Appreciate your Kudos 🙂
Feel free to email me with any of your BI needs.
Hi @mdm2025
This is a very common question about optimization (it comes up in Power BI interviews as well). The approach I'd suggest is to create the data model and measures in a single Power BI file and publish it to the Service without any visuals in it.
Then open a new Power BI file, connect to the data using the Power BI semantic model (Power BI Datasets) option, and follow the steps below:
1. Keep your existing .pbix file with the data model + measures only (no visuals).
2. Publish this to the Power BI Service as your certified semantic model (say, SurveyAnalyticsDataset).
3. Open a new Power BI file → Get Data → Power BI Datasets → connect to your published dataset.
4. Build each new report with only the required visuals (2–3 pages per report).
5. Keep related measures and field parameters grouped logically (e.g. "Engagement", "Satisfaction", "Demographics").
6. These thin reports load almost instantly in Desktop because they don't contain the heavy model.
7. In the Power BI Service, create a dashboard or App that links to these reports.
8. Your existing landing page can become the Dashboard Home, with tiles or buttons linking to each smaller report.
This achieves:
✅ Faster development
✅ Easier testing and maintenance
✅ Reuse of the same data model (no duplication)
✅ Better governance (centralized dataset)
Hope this Helps!
@MFelix you are the best. Cheers!
@MFelix sorry buddy, seems like we both replied at the same time.
@parry2k
No problem, I have no issue with that; we were both typing at the same time. I believe that in the end we are saying the same thing: performance starts at the model side, then DAX, then visualizations, and only then splitting.
Regards
Miguel Félix
Proud to be a Super User!
Check out my blog: Power BI em Português

@mdm2025 to add more: Justin did a session at our UG recently and talked about optimization; check the recording here. Hope you find some useful tips in it:
Hi @parry2k ,
Thank you for the input. I am using a company laptop so can't install open source or dedicated testing tools.
I have used the Performance Analyzer within Power BI and spotted a few 10-second DAX measures which I can reduce to 1–2 seconds, depending on the conditions. But other than that, most measures, cards, and other visual elements are within 1 second. The issue is that each report page will have dozens of them: minimum 20, and it can go up to 40 or more due to the complexity of the visualization needs.
@mdm2025 — how large is your dataset? Have you gone through a full optimization of the semantic model and DAX performance tuning? I’d recommend making sure you’ve explored all optimization options before deciding to split it into smaller reports.
If your main fact table has fewer than 200 million rows, I’d definitely focus on optimization first — regardless of how complex your DAX measures are. In many cases, once the model is optimized properly, everything performs as expected.
That said, optimization techniques vary depending on the model design, DAX logic, and overall objectives. However, there are some general best practices you can apply to improve performance in most scenarios.
Hi @mdm2025 ,
This depends on a lot of things, but I believe the main point is how you have set up your model and measures; having more than 100 DAX measures seems fairly complex. The way you have set up the questions in your model is also important.
Report performance in Desktop should not be impacted much unless you have a very low-performance computer with little RAM.
Can you give a little more context on how the model, relationships, and tables are set up?
Regards
Miguel Félix
Proud to be a Super User!
Check out my blog: Power BI em Português

Hi @MFelix ,
The main data file that I'm importing is a 1 GB xlsx file with around 7 million rows. The relationship model is very simple: just the main table created from the main data file, plus 2 others created in Power Query from data in the main table (these are very small and used for USERELATIONSHIP functions across the majority of DAX measures). So the relationships in the model are one (small table) to many (main large table).
I did not split the main table data into dedicated question tables because I am using too many questions from the main data file, so the star schema would have too many dimension tables if I did.
I am bringing several other xlsx files into the report, but these don't have a relationship with the main table and have their own DAX measures unrelated to the main table's measures.
Regarding the hardware, what I am using is not the greatest (an AMD Ryzen 5 laptop with 32 GB of RAM), but I don't see the CPU going full throttle when navigating between report pages.
Hi @mdm2025 ,
A 32 GB computer is not bad. Concerning the model, it seems like a simple one. Why do you have the USERELATIONSHIP calls? Do you have a lot of inactive relationships between tables?
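For reference, USERELATIONSHIP only does anything when the relationship it names is inactive; the measure activates it for that one calculation. A sketch of the typical pattern, with placeholder table, column, and measure names:

```
Responses by Question Group =
-- Assumes a base measure [Total Responses] and an inactive
-- relationship between the two columns named below.
CALCULATE (
    [Total Responses],
    USERELATIONSHIP ( Responses[QuestionGroupKey], QuestionGroups[GroupKey] )
)
```

If the relationships involved are already active, the USERELATIONSHIP calls are redundant and can be dropped.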
Allow me to give some points on the response you gave below:
"The issues is that each report will have dozens of them. Minumum 20 or can go up to 40 or more due to the complexity of the visualization needs."
Having several visualizations load on a page makes more requests to the semantic model, which causes slow rendering times.
Since you are using cards, I suggest you try the new card visual: you can place several measures or values in it at once and it works as a single visual, which can help reduce the time.
Regards
Miguel Félix
Proud to be a Super User!
Check out my blog: Power BI em Português