Hi folks,
We are facing issues in 2 of our PBI reports. These reports are built on a Data Lake semantic model, with some DAX measures coming directly from there.
We also have certain ad-hoc files, imported from SharePoint as Excel files. We are creating time intelligence and other DAX measures in Power BI using the Data Lake and SharePoint combined model.
Power BI service: our semantic model is in one tenant and the report is in another, so we are using a cross-tenant architecture.
Visuals on our pages: 5 matrices with 13 KPIs each, 2 bookmarks, 3 KPI cards, and 4 slicers.
Problem we are facing:
Our reports either load very slowly or sometimes fail to load altogether with the error message "Visual exceeded available resources".
Can anybody please guide us on how we can improve the performance of our reports?
Thanks in advance. It would truly be a game-changing help; all suggestions are heartily welcome.
@nandic @lbendlin
Answers to your questions:
1. We are using a snowflake schema, i.e. our manual files (SharePoint tables) are connected to the dimension tables via many-to-many relationships.
2. There are some transformations done on top of these manual files in the Power Query editor. The transformations merge some tables, create a new table which we then use, and populate columns using aggregate operations such as addition, division, and multiplication.
3. The merged files also have relationships with the dimension tables and are used in the DAX calculations.
4. The DAX we are using is similar to the MTD and YTD measures of the previous one (which I pasted in another reply); just the columns come from the tables of the manual files.
@lbendlin Bro, the client is unwilling to change the number/type of visuals, so that is not negotiable.
What else could we do to optimize the performance of our model?
@nandic @hackcrr @lbendlin Hi guys, I did a PoC and checked the performance of the report after removing the ad-hoc files coming from the SharePoint source. My reports are super fast when I use the Data Lake as the only source.
But the issue is, the files coming from SharePoint are also vital for the reports, as they supply certain columns used in the DAX.
Please help me with this. What can I do to make the reports fast even when I want to use the SharePoint files?
@rohandixit , it is great that you managed to find the main issue. That is our starting point. Now the next stage comes: data model optimization.
For that we would need at least this information:
1) How are these SharePoint files (tables) connected to the existing model? Are they related directly to the main (fact) tables? Is it a one-to-many or many-to-many relationship, and is the cross-filter direction single or both?
2) How are these SharePoint tables used in the above-mentioned DAX? Show, using one measure as an example, the full lineage of how these SharePoint files impact the DAX logic.
3) Are these SharePoint tables used in a star or snowflake schema? A screenshot of the data model with the fact tables and SharePoint tables marked would be very beneficial.
@nandic Yeah, I agree. But this is to fulfil a business demand; they are not willing to compromise on it.
Could you suggest something?
Hi, @rohandixit
Can you use the DAX Studio tool to test your measure performance? Here are some articles you can refer to:
https://www.sqlbi.com/articles/importing-performance-analyzer-data-in-dax-studio/
https://daxstudio.org/docs/features/load-powerbi-perf-data/
If I have answered your question, please mark my reply as the solution and give kudos to this post, thank you!
@hackcrr Already did that. What should I do after this?
Presently in a catch-22 situation; feeling stuck, man.
Could you please help?
@tackytechtom @tackytechtom2 @Icey @GilbertQ @Sahir_Maharaj @Anonymous @Ritaf1983 @rajendraongole1 @SamWiseOwl @lbendlin @Greg_Deckler @VahidDM
@Ashish_Mathur @ryan_mayu @AmiraBedh @NaveenGandhi @hackcrr @Anonymous @v-jiascu-msft Could you please help with your valuable inputs?
Hi @rohandixit , I would start with the Performance Analyzer: https://www.youtube.com/watch?v=VFcDCls0la4&t=7s
As a starting point, it will show you where the issue is, i.e. which visual is the problem. Then we can focus on the measure that is the main offender (99% of the time the issue is data modelling, surfacing through DAX).
You could post the DAX of the slowest measure and a screenshot of the data model here.
Cheers,
Nemanja
Tables with the most rows:
Facts Sales: 16 M
Sellst: 25 M
1) Sellin_UC_MTD_MIX_all% =
var MTD_Sales_445_Sellin_uc = CALCULATE([Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname2] = MAX('tablename1'[columnname2]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
)
)
var Total_MTD_Sales_445_Sellin_uc = CALCULATE([Sellin Unit Cases (UC)],
    ALL(tablename2[columnname4], tablename2[columnname5]),
    ALL(tablename3[columnname6], tablename3[columnname7], tablename3[columnname8]))
var total_percent_sellin_uc = DIVIDE(MTD_Sales_445_Sellin_uc,Total_MTD_Sales_445_Sellin_uc,0)
return total_percent_sellin_uc
2) Sellin_UC_YTD_Mix_ALL% =
var YTD_Sales_445_Sellin_uc = CALCULATE(
[Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
)
)
var Total_YTD_Sales_445_Sellin_uc = CALCULATE(
[Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
),
ALL(tablename2[columnname4], tablename2[columnname5]),
ALL(tablename3[columnname6], tablename3[columnname7], tablename3[columnname8]))
var total_percent_sellin_uc = DIVIDE(YTD_Sales_445_Sellin_uc,Total_YTD_Sales_445_Sellin_uc,0)
RETURN
total_percent_sellin_uc
3) Sellin_UC_YoY % =
var sellin_uc_cytd= CALCULATE(
[Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
)
)
var sellin_uc_pytd = CALCULATE(
[Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname3] = MAX('tablename1'[columnname3]) - 1 &&
'tablename1'[columnname2] <= MAX('tablename1'[columnname2])
)
)
var sellin_uc_ytd = DIVIDE(sellin_uc_cytd,sellin_uc_pytd,0)-1
RETURN
sellin_uc_ytd
4) MTD UC Target =
var MTD_target_raw = CALCULATE(SUM('tablename4'[columnname9]),
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname2] = MAX('tablename1'[columnname2]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
))
var MTD_target = IF(
ISBLANK(MTD_target_raw),
[3months avg sellin_UC], MTD_target_raw
)
Return
MTD_Target
5) Sellin_MTD_UC =
CALCULATE([Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname2] = MAX('tablename1'[columnname2]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
)
)
6) Sellin_MTD UC Achievement % = DIVIDE([Sellin_MTD_UC],[MTD UC Target],0)
7) Total Month UC Target =
var mtd_target_raw = CALCULATE(SUM('tablename4'[columnname9]),
FILTER(
ALL('tablename1'),
'tablename1'[columnname2] = MAX('tablename1'[columnname2]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
))
var sellin_uc_target = IF(
ISBLANK(MTD_target_raw),
[3months avg sellin_UC], MTD_target_raw
)
return
sellin_uc_target
8) Total_Month_UC Achievement % =
DIVIDE([Sellin_MTD_UC],[Total Month UC Target],0)
9) Sellin_NSR per UC = DIVIDE(SUM('tablename5'[columnname10]),SUM('tablename5'[columnname11]),0)
10) YTD UC Target =
VAR SelectedMonth = SELECTEDVALUE('tablename1'[columnname2])
VAR SelectedYear = SELECTEDVALUE('tablename1'[columnname3])
VAR MTD_Target =
CALCULATE(
[MTD UC Target],
'tablename1'[columnname2] = SelectedMonth,
'tablename1'[columnname3] = SelectedYear
)
VAR Total_Upto_Selected_Month_Target =
CALCULATE(
[Total Month UC Target_ForYTD],
FILTER(
ALL('tablename1'),
'tablename1'[columnname2] < SelectedMonth &&
'tablename1'[columnname3] = SelectedYear
)
)
VAR Total_Previous_Months_Target = Total_Upto_Selected_Month_Target
RETURN
IF(
NOT ISBLANK(MTD_Target),
Total_Previous_Months_Target + MTD_Target,
BLANK()
)
11) Sellin_UC_YTD =
CALCULATE(
[Sellin Unit Cases (UC)],
FILTER(
ALL('tablename1'),
'tablename1'[columnname1] <= MAX('tablename1'[columnname1]) &&
'tablename1'[columnname3] = MAX('tablename1'[columnname3])
)
)
12) Sellin_YTD UC Achievement % = DIVIDE([Sellin_UC_YTD],[YTD UC Target],0)
13) Sellin_UC_YTD BTG = [YTD UC Target] - [Sellin_UC_YTD]
@nandic These are the DAX measures we are using, and the tables which have the most rows.
Hi @rohandixit ,
Based on these measures: in all of them you use the ALL function and then 2-3 comparisons at row level.
That means each measure is scanning at least 15-25 M rows.
Ways to improve it:
1) It would be beneficial to add columns to the data model (in these largest tables) which already precalculate MAX(column1), MAX(column2), or even hold the complete logic in those columns (instead of in the measures), as sketched below.
By doing it via columns, the refresh may take a little longer, but the measures will be much faster for sure.
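For illustration, such a precalculated flag column might look like this. It is a minimal sketch using the placeholder table/column names from this thread; IsCurrentMTD is a hypothetical column name, and it assumes "current MTD" means the latest month and year loaded into the table (not a slicer selection):

IsCurrentMTD =
// Computed once per refresh; ALL() makes it explicit that we scan the whole table, not the current row
VAR LatestYear =
    MAXX(ALL('tablename1'), 'tablename1'[columnname3])
VAR LatestMonth =
    MAXX(
        FILTER(ALL('tablename1'), 'tablename1'[columnname3] = LatestYear),
        'tablename1'[columnname2]
    )
RETURN
    // 1 for rows in the latest month of the latest year, 0 otherwise
    IF(
        'tablename1'[columnname3] = LatestYear
            && 'tablename1'[columnname2] = LatestMonth,
        1,
        0
    )

A similar column could flag YTD rows (year = latest year), so measures can filter on the flag instead of repeating the ALL() + MAX() scan.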
Or at least try using variables, something like this:
Sellin_MTD_UC =
VAR maxColumn1 = MAX('tablename1'[columnname1])
VAR maxColumn2 = MAX('tablename1'[columnname2])
VAR maxColumn3 = MAX('tablename1'[columnname3])
RETURN
    CALCULATE(
        [Sellin Unit Cases (UC)],
        'tablename1'[columnname1] <= maxColumn1,
        'tablename1'[columnname2] = maxColumn2,
        'tablename1'[columnname3] = maxColumn3
    )
These measures, combining ALL(table) with MAX conditions, are the issue. (One caveat on the variable version: boolean filters in CALCULATE only replace the filters on those specific columns, unlike ALL('tablename1') which clears filters from the whole table, so double-check that both versions return the same values.)
@nandic Should I create these columns in Power BI as DAX calculated columns?
Because my semantic model mostly comes in via DirectQuery from the data lake.
The data lake is providing some measures created over there as well.
And some measures I am creating in Power BI where the ad-hoc files from SharePoint come in.
Could you help me with this?
Thanks a lot in advance @nandic
@rohandixit If these DAX measures are in your semantic model (dataset), then you should make the upgrades in that pbix file.
What I would do:
1) Put each measure which contains ALL(Table) + conditions in a separate card visual on the same page. Then run the Performance Analyzer to see which one consumes the most DAX time.
2) Then create a duplicate of that measure, but using variables as mentioned above. Add this new measure to a new card visual on the page and run the Performance Analyzer again. Compare the original vs. the new measure to see whether it has any impact on DAX time (and make sure the values match, i.e. that the rewrite didn't change the calculation logic).
3) If that doesn't have enough impact, then try adding new columns to the model that precalculate MAX(column1), MAX(column2), or even the full logic, e.g. IF(column1 >= MAX(column2), 1, 0). Then, in the measure, just filter on that column, e.g. FILTER(table, columnN = 1 ...).
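For step 3, the measure side could then be as simple as this sketch (hypothetical names again; IsCurrentMTD is the example flag column from earlier in this thread):

Sellin_MTD_UC (flag version) =
CALCULATE(
    [Sellin Unit Cases (UC)],
    'tablename1'[IsCurrentMTD] = 1 // cheap single-column filter; no ALL() + MAX() scan at query time
)

The MAX comparisons then run once per refresh instead of at every visual render.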
This is how I would approach the problem: start from the one measure which takes the most time and try to optimize it.