Hi folks,
I have a requirement to create a Power BI report where the data source is Azure Data Lake Gen2 with a very large volume of data. I applied some transformation steps in Power Query, but now that I am working on the visualizations the report is running very slowly.
I am looking into ways to achieve better performance and reduce the load on the report, which is why I concluded that I want to apply the logic at the data source itself.
I came across a new feature released in May called datamarts. If I use datamarts and apply the transformation steps in the Azure SQL database component provided by the feature, will that fulfil what I want to achieve here? I have read many articles about datamarts, but none of them mention anything about enhancing performance. Can someone help me finalize the approach or suggest alternative solutions?
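As a general note on reducing the load before visuals are built, one common approach is to keep the Power Query steps limited to early filtering and column reduction so that less data reaches the model. Below is a minimal Power Query M sketch; the storage account URL, file name, and column names (`Year`, `Region`, `Amount`) are hypothetical placeholders, not from the original post.

```
// Minimal sketch (hypothetical account, container, file, and column names).
// Filtering rows and dropping unused columns as early as possible reduces
// the amount of data Power BI has to load and process.
let
    Source = AzureStorage.DataLake("https://mystorageaccount.dfs.core.windows.net/mycontainer"),
    SalesFile = Source{[Name = "sales.csv"]}[Content],
    Imported = Csv.Document(SalesFile, [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Imported, [PromoteAllScalars = true]),
    // Filter early so later steps operate on fewer rows
    Filtered = Table.SelectRows(Promoted, each [Year] = "2022"),
    // Keep only the columns the report actually uses
    Reduced = Table.SelectColumns(Filtered, {"Year", "Region", "Amount"})
in
    Reduced
```

This does not move the logic to the data source the way a datamart would, but it is a low-effort first step to try before restructuring the architecture.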
P.S. Pushing the data into a SQL database from Data Lake Gen2 is out of scope here.
Hoping to get an answer asap.
Hi @monishamathew.
Which types of operations are you applying in the query editor? Are any advanced operations used on your query tables (merge, combine, invoking a custom function, looping calculations through table records)? Can you please share some more detail about these?
Datamarts use proactive caching to handle data refresh; you can take a look at the following link to see if it helps:
Understand datamarts (preview) - Power BI | Microsoft Learn
Regards,
Xiaoxin Sheng