My Power BI model contains a table of 8 rows and 14 columns (10 integer, 4 text of 7 characters each): 112 data items. According to DAX Studio it's easily the largest object in the model at over 2 MB, taking up over "40% DB". (I'm not sure whether that means exactly what it seems to.)
It contrasts sharply with the fact table, which has nearly 600,000 data items yet is half the size.
My question: why is this tiny table so large?
Background on the large-size table (‘Year’):
It's a look-up table used to attach an informative label to the performance information in the fact table. Every performance line has a Year it is reported in, but the performance may relate to an earlier period. The Year table has a label for each of the different kinds of performance information. Two of the labels are calculated columns, so that they can change depending on when the report is viewed. For example:
YearTypeA
= IF (
MONTH ( TODAY () ) < 11
&& MONTH ( TODAY () ) > 8
&& 'Year'[MaxYrMinus1] = 'Year'[AcYrStart] - 1,
BLANK (),
'Year'[CurrentYear]
)
These two calculated columns account for 98% of the 'Year' table's size in the VertiPaq Analyzer report.
The SQLBI link is helpful. Evaluating the Year table takes only a couple of milliseconds so I guess I don't have to worry about it even if its size is still a mystery.
Thanks for your input, @davehus .
Hi @Anonymous ,
Have a look at this link. https://www.youtube.com/watch?v=V_AHYAFuFs8&ab_channel=SQLBI
Basically, Server Timings will show which part of the query is the biggest, how long it took to run, etc. Column size usually relates to how unique the data in your model is: low-cardinality columns compress well, high-cardinality ones don't. This might help you narrow down why your table is so large.
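A minimal way to try this (a sketch; the table name 'Year' is taken from the original post): open DAX Studio, connect to the model, tick Server Timings on the ribbon, and run a query that materialises the suspect table:

```
// Run in DAX Studio with Server Timings enabled.
// Materialises the whole 'Year' table so the timings pane
// shows the Formula Engine vs Storage Engine split for it.
EVALUATE 'Year'
```

For an 8-row table this should complete in a few milliseconds; if it does, the size reported by VertiPaq Analyzer is unlikely to be causing your report's performance issues on its own.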
Thanks for your reply @davehus . Apologies, I don't quite understand what you are asking. My question was really about the table size.
The PBI report does have some performance issues but I don't know if they are related to this inflated table. I'm not sure how to use DAX Studio to do what you are suggesting. Generating the calculated column is reported as taking 2ms in the Formula Engine. But I don't know where to look for knock-on effects, if that's what you mean.
Hi @Anonymous , The Server Timings pane in DAX Studio will give you an idea of the storage-engine (xmSQL) queries being run for a query. In terms of improving performance, are you in a position to apply these IF statements upstream, in Power Query?
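A sketch of what that upstream move could look like in Power Query (M), assuming the column names MaxYrMinus1, AcYrStart and CurrentYear from the DAX example above, and a query named Year as the source (both are assumptions based on the post, not confirmed names):

```
// Power Query (M): the same logic as the DAX calculated column YearTypeA,
// computed at refresh time in the query instead of in the model.
let
    Source = Year,
    AddYearTypeA = Table.AddColumn(
        Source,
        "YearTypeA",
        each
            if Date.Month(DateTime.LocalNow()) < 11
                and Date.Month(DateTime.LocalNow()) > 8
                and [MaxYrMinus1] = [AcYrStart] - 1
            then null
            else [CurrentYear]
    )
in
    AddYearTypeA
```

Note that a DAX calculated column is also only recomputed on dataset refresh, so the "changes depending on when the report is viewed" behaviour depends on refresh frequency either way; the difference is that a Power Query column is compressed like a regular imported column.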
Did I help you today? Please accept my solution and hit the Kudos button.