MAli_Ch
Regular Visitor

SSAS Tabular Model performance Issue

Hi, I have an SSAS tabular model of almost 2 GB. I refresh its data on Azure Analysis Services on the S1 tier (I have also tried the S2 tier, but it takes the same amount of time), and the refresh request is triggered through ADF. Previously it worked fine, refreshing all requested partitions within 40 minutes. Then someone added a few new DAX measures to my model, and since then the refresh time has increased to 3 hours. I want to optimize my model and reduce the refresh time. I am only refreshing the relevant data (current data, not historical data). How can I reduce the refresh time and optimize my tabular model?

I tried reducing the data, but it still takes a long time to process, which is not ideal. My guess is that the slowdown is caused by one of the added DAX measures, but it is only a guess. I don't know how to identify the cause, as I have too many DAX measures in my model.

3 REPLIES
Ankur04
Resolver II

Hi @MAli_Ch ,

 

As rightly said by @lbendlin, measures can have an impact on refresh time, since they are calculated on the fly.

Please check your source query: if the partition is based on a SQL view or query, run it in SSMS first and monitor how long a successful run takes. If the query has joins or WHERE conditions, consider creating a nonclustered index on those columns.
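As a rough illustration of that check, here is a minimal Python sketch (using pyodbc instead of SSMS) that times the source query and then adds a nonclustered index. The connection string, view, table, and column names are all placeholders to adapt to your own source:

```python
import time

import pyodbc

# Placeholder connection string - point it at the source database your partitions read from.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server.database.windows.net;DATABASE=YourDW;"
    "UID=your_user;PWD=your_password"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()

    # Time the partition's source query, just as you would in SSMS.
    start = time.perf_counter()
    cur.execute("SELECT COUNT(*) FROM dbo.vw_FactSales WHERE SaleDate >= '2024-11-01'")
    row_count = cur.fetchone()[0]
    print(f"{row_count} rows scanned in {time.perf_counter() - start:.1f}s")

    # If joins or WHERE filters dominate, a nonclustered index on those
    # columns (hypothetical table/column names here) can speed up the source side.
    cur.execute(
        "CREATE NONCLUSTERED INDEX IX_FactSales_SaleDate "
        "ON dbo.FactSales (SaleDate) INCLUDE (CustomerKey)"
    )
    conn.commit()
```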

 

Thanks,

 

Ankur

v-fenling-msft
Community Support

Thanks to @lbendlin for weighing in on this issue.

 

Hi, @MAli_Ch 

Since the increase in refresh time coincides with the addition of the new DAX measures, it is possible that some of them are computationally intensive. Review these measures to see if there is room for optimisation, and consider whether certain calculations can be pre-computed during the ETL process rather than in the model.
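For example (a hedged sketch only, with hypothetical staging table and column names), an expensive per-row calculation can be persisted during ETL so the model simply imports a plain column instead of evaluating the logic itself:

```python
import pyodbc

# Placeholder connection string for the staging / source database.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server.database.windows.net;DATABASE=YourDW;"
    "UID=your_user;PWD=your_password"
)

# Hypothetical ETL step: materialise the calculation in the staging table,
# so the tabular model loads GrossMargin as an ordinary column.
PRECOMPUTE_SQL = """
UPDATE stg.FactSales
SET    GrossMargin = SalesAmount - TotalCost
WHERE  LoadBatchId = ?;
"""

def precompute_margin(batch_id: int) -> None:
    with pyodbc.connect(CONN_STR) as conn:
        conn.cursor().execute(PRECOMPUTE_SQL, batch_id)
        conn.commit()
```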

 

Try to avoid calculated columns and calculated tables, as they are evaluated during processing and can increase refresh time.

 

Monitor resource usage during processing to see whether any limits are being reached. You can also use SQL Server Profiler or an Extended Events trace to track processing events; this can help you understand where the time is going during processing.
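As a lighter-weight complement to a Profiler or Extended Events trace, you can also poll the Azure Analysis Services asynchronous refresh REST API (the same API an ADF Web activity typically calls) to watch how long each refresh run takes. This is only a sketch: the server, model, token acquisition, and the exact response fields are assumptions to verify against your environment:

```python
import time

import requests

# Placeholders: region rollout, server name, and model name. The bearer token
# must be an Azure AD token issued for the https://*.asazure.windows.net resource.
BASE = "https://westeurope.asazure.windows.net/servers/your-aas-server/models/YourModel"
HEADERS = {"Authorization": "Bearer <token>"}

def wait_for_refresh(refresh_id: str, poll_seconds: int = 60) -> dict:
    """Poll one refresh operation until it leaves the in-progress states."""
    while True:
        resp = requests.get(f"{BASE}/refreshes/{refresh_id}", headers=HEADERS)
        resp.raise_for_status()
        detail = resp.json()
        # Inspect the full payload for whatever per-object detail your endpoint returns.
        print(detail.get("status"), detail)
        if detail.get("status") not in ("notStarted", "inProgress"):
            return detail
        time.sleep(poll_seconds)
```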

 

If you are refreshing only the current data, be sure to use the Process Add operation instead of Process Full. Process Add is more efficient when adding new data, because it does not reprocess the entire model.
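If your ADF pipeline triggers the refresh through the Azure AS asynchronous refresh REST API, the refresh type and target objects are set in the request body. A hedged sketch follows, scoped to a single hypothetical current partition; whether your endpoint accepts the TMSL-aligned "Add" type (as opposed to a "Full" or "DataOnly" refresh scoped to that partition) is an assumption you should verify for your API version:

```python
import requests

# Same placeholder server/model as above; the body is sent as JSON.
BASE = "https://westeurope.asazure.windows.net/servers/your-aas-server/models/YourModel"
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

# "Add" mirrors the TMSL refresh type that only appends new rows; verify your
# endpoint supports it, otherwise scope "Full"/"DataOnly" to just this partition.
# Table and partition names are hypothetical.
body = {
    "Type": "Add",
    "CommitMode": "transactional",
    "MaxParallelism": 4,
    "Objects": [
        {"table": "FactSales", "partition": "FactSales_Current"},
    ],
}

resp = requests.post(f"{BASE}/refreshes", headers=HEADERS, json=body)
resp.raise_for_status()
# The URL of the new refresh operation comes back in the Location header;
# its last segment is the refresh id you can pass to a status-polling helper.
print(resp.headers.get("Location"))
```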

I hope my suggestions give you some good ideas. If you have any more questions, please clarify in a follow-up reply.
Best regards,
Fen Ling
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

lbendlin
Super User

Measures should not have an impact - they only consume resources when they are used. Did they maybe add new calculated columns or calculated tables?
