Anonymous
Not applicable

Power BI / SQL Best Practice for a Large Number of Different Values

Hi Everyone,

 

I'm relatively new to Power BI and SQL, so please explain in as much depth as possible.

 

Implementation: I've created a SQL database that stores data counters from units we have; there are ~9,000 of these counters. As seen below, I have separated the counters into different tables by type, each with roughly 500 columns. Currently, we use Python to bring the data into Power BI. We will be performing DAX calculations on these columns and then presenting them in graphs by month, day, and hour. There will be a minimum of 10 graphs.
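For reference, a minimal self-contained sketch of the setup described above. The table and counter names are assumptions, and an in-memory SQLite database stands in for the real server; the point is just the shape: one "type" table with one column per counter, pulled into Python for Power BI.

```python
import sqlite3
import pandas as pd

# Hypothetical miniature of one "type" table: in the real database each
# such table has ~500 counter columns; two are enough to illustrate.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE counters_type_a (
        ts TEXT,           -- 15-minute sample timestamp
        counter_a REAL,    -- one column per counter
        counter_b REAL
    )
""")
con.executemany(
    "INSERT INTO counters_type_a VALUES (?, ?, ?)",
    [("2022-04-01 00:00", 10.0, 5.0),
     ("2022-04-01 00:15", 12.0, 7.0)],
)

# The Python step that feeds Power BI: pull the table into a DataFrame.
df = pd.read_sql("SELECT * FROM counters_type_a", con, parse_dates=["ts"])
print(df)
```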

 

Q1) What is the best implementation of the SQL database? (Performance in Power BI is currently very slow.) Should I instead create one single table with a counter column and a value column, rather than multiple tables?

Q2) We are considering moving the SQL database to Azure. Does that change the architecture?
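To make Q1 concrete, the single-table alternative is an unpivot of the wide layout: instead of one column per counter, one row per (timestamp, counter, value). A minimal sketch with pandas (column names are assumptions), since the pipeline already uses Python:

```python
import pandas as pd

# Hypothetical wide table: one row per 15-minute timestamp,
# one column per counter (the real tables have ~500 such columns).
wide = pd.DataFrame({
    "timestamp": pd.to_datetime(["2022-04-01 00:00", "2022-04-01 00:15"]),
    "counter_a": [10, 12],
    "counter_b": [5, 7],
})

# Unpivot to the narrow shape Q1 describes: (timestamp, counter, value).
narrow = wide.melt(id_vars="timestamp", var_name="counter", value_name="value")
print(narrow)
```

The trade-off is row count: every counter column becomes a set of rows, which is what the follow-up reply below quantifies.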

 

[Screenshot: database table layout]

 

2 Replies
amitchandak
Super User

@Anonymous , first of all: why are there 500 columns? Do you really need that many?

 

Import mode is best unless you have huge data; a dataset size of up to 35–40 GB on P2 Premium should be fine.

 

You should always prefer a star schema:

https://www.sqlshack.com/implementing-star-schemas-in-power-bi-desktop/
https://docs.microsoft.com/en-us/power-bi/guidance/star-schema
https://www.youtube.com/watch?v=vZndrBBPiQc&feature=youtu.be

 

Also check:

https://maqsoftware.com/expertise/powerbi/power-bi-best-practices
https://docs.microsoft.com/en-us/power-bi/guidance/power-bi-optimization
https://www.c-sharpcorner.com/article/power-bi-best-practices-part-1/

 

Anonymous
Not applicable

Hi @amitchandak ,

 

Thank you for your reply. The reason I have that many columns is that each column represents a distinct value at a certain time; there are actually close to 9,000 of these distinct values, which I have split into different tables.

I could create fewer columns by having a "Counter ID" column and a corresponding value column. However, the data is stored in 15-minute intervals, so I'm concerned the table would grow quite fast given we will be working with half a year to a full year of data (9,000 counters × 4 samples/hour × 24 hours × 31 days × 12 months = 321,408,000 rows). I'm also not sure how DAX would approach this, since we would need to find all the rows with the same ID in a time interval and sum them.

In terms of the star schema, I guess I would create a dimension table with Counter ID and Counter Name?
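That dimension-table idea can be sketched quickly. A minimal example in pandas (the counter IDs and names are hypothetical) showing a narrow fact table joined to a counter dimension, then rolled up per hour; the groupby-and-sum is conceptually what a DAX SUM measure does when sliced by an hour level of a date hierarchy:

```python
import pandas as pd

# Hypothetical star-schema sketch: narrow fact table plus counter dimension.
fact = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2022-04-01 00:00", "2022-04-01 00:15",
         "2022-04-01 00:00", "2022-04-01 00:15"]),
    "counter_id": [1, 1, 2, 2],
    "value": [10, 12, 5, 7],
})
dim_counter = pd.DataFrame({
    "counter_id": [1, 2],
    "counter_name": ["voltage", "current"],  # assumed names
})

# The roll-up the poster worries about: find all rows with the same
# counter within an hour and sum their values.
hourly = (
    fact.merge(dim_counter, on="counter_id")
        .groupby([pd.Grouper(key="timestamp", freq="h"), "counter_name"])["value"]
        .sum()
        .reset_index()
)
print(hourly)
```

In Power BI itself this aggregation would live in the model (a relationship from fact to dimension plus a simple SUM measure), so the 321-million-row concern is about storage and refresh, not about the DAX being hard to write.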
