Dellis81
Continued Contributor

Calc table Performance issues in composite models

Hello

 

I have an "ugly baby": poor performance. I am using calculated tables, and in the primary (top) model performance is slow but acceptable. However, when I implement them in a thin model and publish as an app, I often get memory errors. I have read in several places that DirectQuery models and calculated tables are not recommended together. Is that rule of thumb generally accepted?


Are there things I can do to enhance performance? 

a) I am currently on a Pro license; would moving up to PPU be a viable option?
b) Is performance impacted by whether the calculated table is built in the base model or within the chained composite model?
c) I have tried field parameters; performance is still poor, and I lose control of formatting.

Any references or thoughts would be greatly appreciated!

1 ACCEPTED SOLUTION
lbendlin
Super User

The single most important factor is the cardinality of the join columns.  Examine some of your measures in DAX Studio and you will find that the queries include enumerations of the values in the join column.  If the cardinality is more than a couple thousand then you very quickly end up in situations where the query is slow not because it is complex, but because it is big, physically big.  You can have scenarios where the query text is multiple gigabytes in size, and these gigabytes have to travel through the network for every single query.

 

Keep the link cardinality as low as possible.
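As a concrete way to check this, you can run a quick query in DAX Studio against the model before tracing any measures. This is a minimal sketch; the table and column names are placeholders for your own relationship columns:

```dax
// Sketch (placeholder names): count the distinct values in a
// relationship (join) column. If this number runs into the tens of
// thousands, cross-source queries in a composite model must enumerate
// all of those values, which is what makes the query text physically large.
EVALUATE
ROW ( "JoinColumnCardinality", DISTINCTCOUNT ( 'Dim'[JoinKey] ) )
```

You can also watch the Server Timings / query text panes in DAX Studio while running a slow visual's query to see the enumerated value lists directly.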


3 REPLIES
Dellis81
Continued Contributor

Thank you @lbendlin for your thoughts. I will dig into DAX Studio to follow up on your comments about cardinality. I was working late Saturday evening on this issue, and responsiveness improved to the point that queries at least returned a result. The one visual maintained its normal 8-10 seconds, but the problematic one that had been producing errors became almost instantaneous after I simplified one of my measures. However, today (Sunday afternoon) I am back to the slow responsiveness.

 

I am also wondering about refreshes of the local (chained) file: the refresh seems to take an extended period of time, even longer than a refresh of the remote (base) file. Yesterday afternoon, the refresh of the local file completed within 30 seconds.

 

I am also attempting to understand the aggregations feature. I am using import mode from dataflows, and seem to get hung up on creating a live query connection from a dataflow. Unfortunately, I have multiple transformations within the dataflow, so pulling live from the source file would be of low value (transitioning those transformations to a live basis would be disastrously slow). All my reading and video watching suggest aggregations require a DirectQuery connection that is then converted to dual mode, and I am not finding how to do that with dataflow imports.

 

Thank you again for your thoughts - I will definitely dig into the DAX Studio suggestion.


Dellis81
Continued Contributor

As I continue to learn, I suspect my cardinality issue comes from attempting to do this with a "day" calendar column. When I flipped to a "year" column, things started working much better.
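That lines up with the cardinality point: a day-grain key enumerates thousands of values, while a year-grain key enumerates only a handful. A quick comparison you could run in DAX Studio (a sketch; 'Calendar'[Date] and 'Calendar'[Year] are placeholder names for your own calendar columns):

```dax
// Sketch (placeholder names): a ~10-year daily calendar has roughly
// 3,650 distinct dates, but only 10 distinct years. Joining on the
// lower-cardinality column keeps cross-source query text small.
EVALUATE
ROW (
    "DayCardinality",  DISTINCTCOUNT ( 'Calendar'[Date] ),
    "YearCardinality", DISTINCTCOUNT ( 'Calendar'[Year] )
)
```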

 

I do think there are times when the service is more efficient than normal. I know that late Saturday afternoon things were working great, and then Sunday morning the old memory errors seemed to pop up again.

Thanks for the input, I am slowly learning!
