Hello everybody,
It seems my previous post got lost, so here it is again. My question is whether I should perform an additional normalization step during the import process to improve system performance.
The fact table has about 100k rows (perhaps 300k one day) and 12 columns; the values to calculate with sit in a single column. The table represents the financial reports (P&L, balance sheet) for 3 years, including actuals, budget, and forecast, on a monthly level and for all items a P&L and balance sheet have to offer. 6 of the columns represent organizational levels (legal entity, plant, and profit center), each as a pair of key (integer) and name (text).
Would it be better to remove the name columns from the fact table and move them into dimension tables, or to keep them where they are?
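For anyone following along, the normalization step in question can be sketched in pandas: split each key/name column pair out of the fact table into a small dimension table, keeping only the integer key in the fact table. The column names and sample values below are hypothetical, just to show the shape of the change.

```python
import pandas as pd

# Hypothetical denormalized fact table: key and name stored side by side
fact = pd.DataFrame({
    "PlantKey":  [10, 10, 20],
    "PlantName": ["Hamburg", "Hamburg", "Atlanta"],
    "Amount":    [100.0, 250.0, 80.0],
})

# Dimension table: one row per key, the name lives only here
dim_plant = fact[["PlantKey", "PlantName"]].drop_duplicates().reset_index(drop=True)

# Slim fact table keeps only the integer key
fact_slim = fact.drop(columns=["PlantName"])

# A report resolves the name through the key, which is what a
# star-schema relationship does in the model
report = fact_slim.merge(dim_plant, on="PlantKey")
```

The same split would be repeated for legal entity and profit center, turning the 6 key/name columns into 3 keys plus 3 dimension tables.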
Many thanks for your opinion in advance.
Daniel
@danielboi , it is always better to have dimension tables and use a star schema.
For reference:
https://www.youtube.com/watch?v=vZndrBBPiQc&feature=youtu.be
https://www.sqlbi.com/articles/the-importance-of-star-schemas-in-power-bi/
Thank you Amit. I had read that "over-normalizing" could be detrimental to read performance, which is what triggered my question. 🙂
@danielboi , I think you can work very well with a single table. But when you need to use ALL and ALLSELECTED, having dimension tables helps: you can clear the filter on one dimension without disturbing the others. There are many advantages to a star schema; once complex requirements start coming in, you will find most of the solutions are built around it.
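The ALL/ALLSELECTED point can be illustrated outside DAX as well: with attributes in separate dimensions, you can lift the filter on one attribute (say, plant) while keeping the filter context of another (say, year). A rough pandas analogue with hypothetical data:

```python
import pandas as pd

fact = pd.DataFrame({
    "Year":   [2023, 2023, 2023, 2024],
    "Plant":  ["A", "B", "A", "A"],
    "Amount": [100.0, 300.0, 50.0, 80.0],
})

# Analogue of ALL on the plant dimension: total per year ignoring plant,
# so each row can be expressed as a share of its year's total
fact["YearTotal"] = fact.groupby("Year")["Amount"].transform("sum")
fact["ShareOfYear"] = fact["Amount"] / fact["YearTotal"]
```

In a proper star schema the equivalent DAX measure would remove the filter from the plant dimension only, leaving the date and scenario dimensions untouched.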
@amitchandak I followed your advice, and what can I say: it is worth gold. The income statements were downloaded per year into one workbook, with actuals, forecast, and budget each on a separate sheet. It used to take 25 minutes to process 3 years' worth of data. Now that I have moved everything out into dimension tables, it finishes in under 30 seconds!
I got the comment about the detriment of over-normalizing from Ferrari and Russo's book Analyzing Data with Microsoft Power BI and Power Pivot for Excel. I looked it up again, and it was my mistake: they were talking about the snowflake schema, which can slow things down.
So many thanks again for pointing me in the right direction.