Hi Everyone,
I'm relatively new to Power BI and SQL, so please explain in as much depth as possible.
Implementation: I've created a SQL database that stores data counters from units we have; there are ~9,000 of these counters. As seen below, I have separated the counters into different tables by type, with roughly 500 columns each. Currently, we use Python to bring the data into Power BI. We will be performing DAX calculations on these columns and then presenting them in graphs by month, day, and hour. There will be a minimum of 10 graphs.
Q1) What is the best implementation of the SQL database? (We are seeing very slow performance in Power BI.) Should I instead create one single table with a counter column and a value column, rather than multiple wide tables?
Q2) We are considering moving the SQL database to Azure. Does that change the architecture?
@Anonymous, first of all, why are there 500 columns? Do you need that many?
Import mode is best unless you have huge data; a dataset size of up to 35-40 GB on P2 Premium capacity should be fine.
You should always prefer a star schema (see the sketch below the links):
https://www.sqlshack.com/implementing-star-schemas-in-power-bi-desktop/
https://docs.microsoft.com/en-us/power-bi/guidance/star-schema
https://www.youtube.com/watch?v=vZndrBBPiQc&feature=youtu.be
Also check
https://maqsoftware.com/expertise/powerbi/power-bi-best-practices
https://docs.microsoft.com/en-us/power-bi/guidance/power-bi-optimization
https://www.c-sharpcorner.com/article/power-bi-best-practices-part-1/
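As a concrete illustration of the star-schema suggestion for this counter scenario, here is a minimal sketch. All table and column names (DimCounter, FactCounterReading, and so on) are hypothetical, not taken from your actual database:

```sql
-- Hypothetical star schema for the counter data (names are illustrative).
-- One narrow fact table replaces the ~500-column-wide tables.

CREATE TABLE DimCounter (
    CounterID   INT          NOT NULL PRIMARY KEY,
    CounterName VARCHAR(100) NOT NULL,
    CounterType VARCHAR(50)  NOT NULL   -- the "type" you currently split tables by
);

CREATE TABLE FactCounterReading (
    ReadingTime  DATETIME2(0) NOT NULL,  -- the 15-minute interval timestamp
    CounterID    INT          NOT NULL REFERENCES DimCounter (CounterID),
    ReadingValue FLOAT        NOT NULL,
    CONSTRAINT PK_FactCounterReading PRIMARY KEY (CounterID, ReadingTime)
);
```

The idea is that each 15-minute reading becomes one row keyed by counter and timestamp, and DimCounter carries the descriptive attributes you slice by. In Power BI you would relate DimCounter (plus a date/time dimension) to the fact table, and your measures aggregate the single value column instead of 500 separate ones.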
Hi @amitchandak,
Thank you for your reply. The reason I have that many columns is that each column represents a distinct and unique value at a certain time. There are actually close to 9,000 of these distinct values, which I have split into different tables.

I could create fewer columns by having a "Counter ID" column and a corresponding value column. However, the data is stored in 15-minute intervals, so I'm concerned the table would grow quite fast, given that we will be working with half a year to a full year of data (9,000 × 4 × 24 × 31 × 12 = 321,408,000 rows). I'm also not sure how DAX would approach this, since we would need to find all rows with the same ID in a time interval and sum them (see the SQL sketch below).

In terms of the star schema, I guess I would create a dimension table with Counter ID and Counter name?
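On the "sum all the same IDs in a time interval" concern: with a narrow fact table this is just a grouped aggregation, which both SQL and the Power BI engine handle natively. A minimal SQL sketch of the idea, assuming the hypothetical table names from the star-schema example above:

```sql
-- Hypothetical example: total value per counter per hour.
-- Table and column names follow the sketch above and are illustrative.
SELECT
    d.CounterName,
    DATEADD(HOUR, DATEDIFF(HOUR, 0, f.ReadingTime), 0) AS ReadingHour,
    SUM(f.ReadingValue)                                AS TotalValue
FROM FactCounterReading AS f
JOIN DimCounter AS d
    ON d.CounterID = f.CounterID
GROUP BY
    d.CounterName,
    DATEADD(HOUR, DATEDIFF(HOUR, 0, f.ReadingTime), 0);
```

In DAX the equivalent would be a simple SUM measure over the value column, with the grouping driven by the counter and date/time dimensions on the visual, so the engine handles the "find and sum" step for you. Narrow, repetitive columns like CounterID also tend to compress well in import mode, so 321 million rows is large but not necessarily unworkable.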