sivarajan21
Post Prodigy

Calculated Table causes performance issue and dataset refresh fails

Hi,

 

I have a calculated table called 'Invoice Issues', shown below, which is derived from the existing Data table:

[screenshot: 'Invoice Issues' calculated table definition]

The main purpose of this table is to create a calculated column called 'Status', which is used in a visual:

 

Calculated column:

[screenshot: 'Status' calculated column definition]
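
In case the screenshots are not visible, the shape of the logic is roughly as follows (the column names and the Status condition here are simplified placeholders, not the exact definitions from the screenshots):

Invoice Issues =
'Data'  // placeholder: roughly a full copy of the Data table

Status =
IF (
    'Invoice Issues'[InvoicedAmount] <> 'Invoice Issues'[ReceivedAmount],
    "Issue",
    "OK"
)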

 

Visual:

[screenshot: visual using the 'Status' column]

 

This works perfectly and our customers like the report, but it causes a lot of performance issues and the dataset refresh fails in the workspace.

 

Is there a way to create this as a DAX measure? Any other alternative solution would also be much appreciated.

Please find the file attached: PR-419 - Data Coverage RLS ADB (1).pbix

 

@amitchandak @Ahmedx @marcorusso @Greg_Deckler @Ashish_Mathur @Anonymous 

 


3 REPLIES
Ashish_Mathur
Super User

Hi,

Share a much smaller sample dataset with only the relevant tables.  Explain your requirement and show the expected result.


Regards,
Ashish Mathur
http://www.ashishmathur.com
https://www.linkedin.com/in/excelenthusiasts/
marcorusso
Most Valuable Professional

When you create a calculated table, the entire uncompressed table must be materialized in memory and then compressed. This could be a memory-intensive operation that works on your PC and fails on Power BI Service.

You should reduce that table by removing unused columns and unnecessary rows - e.g., keep only the rows that have an issue.
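
For example, a pattern like the following keeps only the columns the visual needs and only the problem rows (the column names and the issue condition are placeholders, to be replaced with your actual logic):

Invoice Issues =
FILTER (
    SELECTCOLUMNS (
        'Data',
        "InvoiceID", 'Data'[InvoiceID],
        "InvoicedAmount", 'Data'[InvoicedAmount],
        "ReceivedAmount", 'Data'[ReceivedAmount]
    ),
    [InvoicedAmount] <> [ReceivedAmount]  // keep only rows that have an issue
)

The smaller the materialized table, the less memory is needed at refresh time.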

Or compute the column outside of Power BI, so it's processed like other tables segment by segment, reducing the memory requirement at refresh time.

 

Hi @marcorusso 

 

Many thanks, Sir, for your quick response!

Apologies for the delay in responding! We will follow this going forward in our reports.

Just want to confirm: when you say outside of Power BI, does that mean in our case a dataflow, where the transformations can be done?

When you say 'keep only the rows that have an issue', does that mean applying date range parameters in Power Query?

Is there a way to measure the performance of the data model and dataflow, similar to how DAX Studio is used to measure Power BI report performance?


 

 Thanks in advance!

 

@amitchandak @Ahmedx @marcorusso @Greg_Deckler @Ashish_Mathur @v-cgao-msft @Daniel29195 
