Hi,
I'm contemplating creating a few DAX tables, which are quite lengthy, to take the load off some SQL queries (which are proving to slow the process down somewhat).
I know we can time how long DAX items take to run, etc. But my question is: if I have a DAX table in my data model with 100,000+ rows (and growing), will this slow down my data model and overall report performance? Is Power BI constantly re-running the DAX behind the table to answer my subsequent measures, etc.? Or does it get created once when the report is produced and then sit there, static, like an imported table?
I'm working with a sizeable report with a large data model, so I need to tread carefully with regard to performance.
Thanks for any insights on the above. I really appreciate hearing about any similar scenarios people have dealt with.
Not sure if this is possible for you, but a possible solution would be to push any transformations back to the data source and bring in aggregated views of the data you need.
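As a rough sketch of what I mean, you could define a view at the source that does the grouping before Power BI ever sees the rows. All table and column names here (dbo.Sales, OrderDate, Amount) are assumptions for illustration, not from your model:

```sql
-- Hypothetical pre-aggregation at the source (T-SQL).
-- Power BI then imports the view instead of the detail rows.
CREATE VIEW dbo.vw_SalesMonthly AS
SELECT
    CustomerID,
    DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1) AS MonthStart,
    SUM(Amount) AS TotalAmount,
    COUNT(*)    AS OrderCount
FROM dbo.Sales
GROUP BY
    CustomerID,
    DATEFROMPARTS(YEAR(OrderDate), MONTH(OrderDate), 1);
```

Importing something like this keeps the row count in the model down to one row per customer per month, so the heavy lifting happens on the database engine rather than at refresh or query time in Power BI.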
If I answered your question, please mark my post as solution, Appreciate your Kudos 👍
Thanks for both your replies; it's much appreciated. I'll mark your other answer as an accepted solution. We've pushed calculations back when memory issues have arisen on pages. However, in this case the pivoting of large data sets in SQL is terribly slow, and I'm trying to approach the issue either with a DAX table or in Power Query; for the latter, please see my other post this morning.
Hi,
DAX calculated tables are not re-evaluated every time a query or measure runs. The expression is evaluated when the model is refreshed (and again whenever a table it depends on is refreshed), and the result is materialized in memory, much like an imported table. So the cost is paid in model size and refresh time rather than per query: measures reading from a 100,000-row calculated table behave like measures over an imported table of the same size. With large datasets, the things to watch are the extra memory the table consumes and the added refresh duration.
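For example, a calculated table defined along these lines (table, column, and measure names are made up for illustration) is computed once per refresh and then sits statically in the model:

```dax
-- Hypothetical calculated table: evaluated at refresh, stored like an
-- imported table, not recomputed when report visuals query it.
Sales Summary =
SUMMARIZECOLUMNS (
    'Customer'[CustomerID],
    'Date'[MonthStart],
    "Total Amount", SUM ( 'Sales'[Amount] )
)
```

Whether this beats doing the same aggregation upstream mostly comes down to where you can better afford the work: the source database at query time, or the Power BI service at refresh time.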
If I answered your question, please mark my post as solution, Appreciate your Kudos 👍