Hi,
I'm facing a problem while trying to get data through the OData connector: the dataset has 100M+ rows.
After about 45M rows are loaded, my RAM is exhausted (see screenshot).
How can I analyze such a high-volume dataset? (I'm looking at both free and paid options.)
Regards,
Adry
You can try using DirectQuery instead of Import mode, or add more memory to your computer.
Best Regards,
Herbert
To reduce the amount of RAM your data model uses, start by casting columns to the appropriate data types. Very often a forgotten text column that actually holds numbers will consume all of your RAM in no time.
It's also worth mentioning that the data types in your data model are not necessarily the same as those returned by your query. That is why checking how your data is typed in the data model is a quick win for RAM consumption.
Secondly, depending on where you get your data from, consider changing your M query. The quick wins here are streaming the data, or, if your query performs any intensive calculations, buffering the intermediary steps.
And finally, consider loading the data one query at a time.
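As a rough sketch of the type-casting and buffering tips above (the feed URL and column names are hypothetical, so adjust them to your own source):

```m
let
    // Hypothetical OData feed; replace with your actual service URL
    Source = OData.Feed("https://example.com/odata/Sales"),
    // Cast a text column that actually holds numbers to a numeric type,
    // so the engine stores it compactly instead of as text
    Typed = Table.TransformColumnTypes(Source, {
        {"Amount", type number},
        {"OrderDate", type date}
    }),
    // Buffer the intermediate table once so later steps reuse it
    // instead of re-evaluating the whole upstream query
    Buffered = Table.Buffer(Typed),
    Filtered = Table.SelectRows(Buffered, each [Amount] > 0)
in
    Filtered
```

Note that Table.Buffer holds the buffered table in memory, so it helps when a step is re-evaluated repeatedly, but buffering a very large table early in the query can itself exhaust RAM; buffer only the smaller intermediary results.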
Thanks for the tips