dallass
New Member

Data Modeling Help Needed

Hi All,

I need some help improving the speed of my dashboard. I've set refresh rates, caching, etc., based on what I found in other forums, but the response time is still very slow. I have 35 tables with hundreds of columns, and I am using DirectQuery to pull from two different data sources: Google BigQuery and SQL. I am pulling large data sets with purchase order information and retail information down to the store level by week. If I could get someone to guide me via Teams or Zoom, that would be ideal, as there are probably multiple ways to improve things based on the data structure.

 

Thanks all!

6 REPLIES
v-prasare
Community Support

We would like to follow up to see if the solution provided by the super user resolved your issue. Please let us know if you need any further assistance. If your query has been resolved, we will go ahead and close the thread.

 

 


Thanks,

Prashanth Are

MS Fabric community support

v-prasare
Community Support

Hi @dallass,

We would like to follow up to see if the solution provided by the super user resolved your issue. Please let us know if you need any further assistance.

 

 


Thanks,

Prashanth Are

MS Fabric community support

v-prasare
Community Support

Hi @dallass,

We would like to follow up to see if the solution provided by the super user resolved your issue. Please let us know if you need any further assistance.

 


@burakkaragoz & @FBergamaschi , thanks for your prompt response.


Thanks,

Prashanth Are

MS Fabric community support


If our super user's response resolved your issue, please mark it as "Accept as solution"; this helps other users find similar answers quickly.

burakkaragoz
Community Champion
Accepted Solution

Hi @dallass ,

 

You're dealing with a perfect storm for slow performance - 35 tables, cross-source DirectQuery, and BigQuery. No wonder it's crawling.

The brutal truth: DirectQuery with BigQuery is often a nightmare. BigQuery is designed for big analytical queries, not the hundreds of tiny queries Power BI fires off. Plus, mixing BigQuery and SQL Server means Power BI can't push anything down to the source - it's doing all the heavy lifting locally.

What's probably happening: Every time someone clicks a slicer, Power BI is generating separate queries to both sources, pulling data back, then trying to merge it. That's why your caching isn't helping much.

Real fixes:

Split the thing - Separate reports for BigQuery vs SQL data. Stop trying to make them play nice together in one model.

Import what you can - Weekly retail data? That's probably small enough to import instead of DirectQuery. Save DirectQuery for the massive transactional stuff.

Pre-aggregate in BigQuery - Create summary tables for your common views (store/week rollups); see the sketch after this list. BigQuery loves big scans, hates small lookups.

Kill unused columns - Those "hundreds of columns" are slowing every query even if you're not using them.
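
To make the pre-aggregation idea concrete, here is a minimal sketch of a BigQuery summary table. The dataset, table and column names (my_dataset, retail_sales, store_id, and so on) are made up for illustration, not taken from your model. It keeps only the handful of columns the report actually uses, which also covers the "kill unused columns" point, and a rollup this small is usually a good candidate for Import mode rather than DirectQuery.

-- Hypothetical store/week rollup in BigQuery (adjust dataset, table and column names).
-- Rebuild it on a schedule so Power BI only ever scans the small summary table.
CREATE OR REPLACE TABLE my_dataset.retail_sales_store_week AS
SELECT
  store_id,
  DATE_TRUNC(sale_date, WEEK) AS week_start,
  SUM(sales_amount)           AS total_sales,
  SUM(units_sold)             AS total_units,
  COUNT(DISTINCT order_id)    AS order_count
FROM my_dataset.retail_sales            -- wide source table with hundreds of columns
GROUP BY store_id, week_start;          -- only the columns the report needs survive

Point the report at retail_sales_store_week (ideally in Import mode) and keep DirectQuery only for the detail-level tables that are genuinely too big to import.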

The webcall thing makes sense - this kind of mess usually needs someone to look at the actual model and say "why the hell are you doing it that way?"

What's your biggest pain point - the initial load time or when users interact with filters?


If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was assisted by AI for translation and formatting purposes.

This is great info. I think I need to rebuild the model based on what you've said. I initially was just trying to see if I could get the data models to connect and visualize the 2 data sources in 1 visual. I will work on the recommendations and see how that helps with performance. Thank you so much!

FBergamaschi
Solution Sage

I cannot guide you via a web call, but just to inform you: DirectQuery is inherently slow. The best thing to do is optimize the data sources, as the performance depends on them.

 

Are the data sources star schema databases? That is very important for performance.
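
For reference, "star schema" here means one narrow fact table holding the measures, joined on key columns to small dimension tables holding the descriptive attributes. A minimal, hypothetical sketch in BigQuery SQL (table and column names are illustrative, not taken from the post above):

-- Hypothetical star schema: one fact table plus two dimensions (illustrative names only).
CREATE TABLE my_dataset.dim_store (
  store_key  INT64,
  store_name STRING,
  region     STRING
);

CREATE TABLE my_dataset.dim_date (
  date_key   INT64,
  full_date  DATE,
  week_start DATE
);

CREATE TABLE my_dataset.fact_sales (
  store_key    INT64,    -- joins to dim_store
  date_key     INT64,    -- joins to dim_date
  sales_amount NUMERIC,
  units_sold   INT64
);

In Power BI you then relate fact_sales to each dimension on the key columns, which keeps the generated queries simple joins rather than wide many-table merges.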

 

Are the data sources dedicated to BI (a DWH), or do they also have to serve ERPs? A dedicated DWH would perform better.

 

Of course it also depends on your DAX, as that needs to be translated into SQL, so please stay away from complex DAX and try to keep it optimized (see the sketch below).

These are just the first things to check; if you want, I can give you some hints on how to keep DAX optimized.
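
To illustrate the translation point: with a star schema and a simple measure (for example, a plain SUM over the fact table), DirectQuery can usually fold a visual into a single grouped SQL statement roughly like the sketch below. The names reuse the hypothetical star schema above and this is not the actual SQL Power BI generates; complex DAX with iterators or context transitions tends to break this folding and produces heavier or more numerous queries.

-- Roughly the kind of query a simple SUM measure sliced by store can fold into
-- (illustrative only; the real generated SQL will differ).
SELECT
  d.store_name,
  SUM(f.sales_amount) AS total_sales
FROM my_dataset.fact_sales AS f
JOIN my_dataset.dim_store AS d
  ON d.store_key = f.store_key
GROUP BY d.store_name;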

 

If this helped, please consider giving kudos and marking it as a solution.

@me in replies or I'll lose your thread

consider voting this Power BI idea

Francesco Bergamaschi

MBA, M.Eng, M.Econ, Professor of BI
