Hi folks,
I have a requirement to create a Power BI report whose data source is Azure Data Lake Gen2 containing a very large volume of data. I applied some transformation steps in Power Query, but now, while I am working on the visualizations, the report is very slow.
I am looking into ways to achieve better performance and reduce the load on the report, which is why I concluded that I want to apply the logic at the data source itself.
I came across a new feature released in May called datamarts. If I use datamarts and apply the transformation steps in the Azure SQL database component provided with the feature, will that achieve what I want here? I have read many articles about datamarts, but none of them mention anything about enhancing performance. Can someone help me finalize the approach, or suggest alternative solutions?
P.S. Pushing the data from Data Lake Gen2 into a SQL database is out of scope here.
Hoping to get an answer asap.
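To illustrate the "apply the logic at the source" idea described above: one common way to lighten the load on Power Query is to pre-aggregate the lake data once, upstream, so the report only loads a small summary table instead of transforming the full row-level data on every refresh. Below is a minimal, hypothetical Python/pandas sketch of such a pre-aggregation step; the column names (`date`, `category`, `amount`) and the grouping logic are illustrative assumptions, not part of the original question.

```python
import pandas as pd


def pre_aggregate(df: pd.DataFrame) -> pd.DataFrame:
    """Collapse row-level fact data to one row per (date, category).

    Running this upstream (e.g. in a pipeline that writes back to the
    lake) means Power BI only reads the small aggregated output.
    Column names here are hypothetical placeholders.
    """
    return (
        df.groupby(["date", "category"], as_index=False)
          .agg(total_amount=("amount", "sum"),
               row_count=("amount", "size"))
    )


if __name__ == "__main__":
    # Tiny illustrative sample standing in for the lake data.
    sample = pd.DataFrame({
        "date": ["2022-06-01", "2022-06-01", "2022-06-02"],
        "category": ["A", "A", "B"],
        "amount": [10.0, 5.0, 7.5],
    })
    print(pre_aggregate(sample))
```

The same shape of logic could equally be expressed in the datamart's SQL layer; the point is that the heavy grouping happens once at the source rather than per-refresh in Power Query.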
Hi @monishamathew.
Which types of operations are you applying in the query editor? Are any advanced operations used on your query tables (merge, combine, invoking a custom function, looping calculations through table records)? Could you please share some more detail about these?
Datamarts use proactive caching to handle data refresh; you can take a look at the following link if it helps:
Understand datamarts (preview) - Power BI | Microsoft Learn
Regards,
Xiaoxin Sheng