Hello friends.
I have the following situation:
I imported some .XML files into Power Query and transformed the data into the layout I need. This procedure worked fine with around a thousand files.
However, I now have a folder with 70 GB of .XML files that I need to import, transform, and cross-reference with another table.
As you may have noticed, this is a very high volume of files.
Yesterday I ran a test with around 1 million files, and after 10 hours of loading the procedure still hadn't finished.
How can I work with this volume of data? I know nothing about MySQL or PostgreSQL.
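One general technique for XML at this scale (regardless of the tool) is streaming the parse so a file is never held in memory all at once. Below is a minimal sketch in Python using the standard library's `xml.etree.ElementTree.iterparse`; the element and attribute names (`record`, `id`, `value`) are placeholders, not the poster's actual schema, so adjust them to your files.

```python
# Sketch: stream-parse a large XML file with iterparse so the whole
# document tree never sits in memory. "record", "id" and "value" are
# assumed placeholder names -- replace them with your real schema.
import xml.etree.ElementTree as ET


def stream_records(path):
    """Yield (id, value) pairs from each <record> element, one at a time."""
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == "record":
            yield elem.get("id"), elem.findtext("value")
            elem.clear()  # release the parsed subtree to keep memory flat
```

Because `stream_records` is a generator, you can filter and aggregate while reading, instead of materializing millions of rows first.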
Hi @jeanrozin ,
Please try to filter and aggregate the data before loading it into Power Query. This will reduce the size of the dataset and improve performance.
Best regards,
Community Support Team_Binbin Yu
If this post helps, please consider accepting it as the solution so other members can find it more quickly.
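The "filter, then aggregate, then join" advice can be illustrated with a small sketch. This is not Power Query M, just an equivalent in plain Python; the column roles (a key, a category to filter on, an amount to sum) are assumed for illustration. The point is the order of operations: discard unwanted rows first, collapse to one row per key second, and only then cross-reference with the other table.

```python
# Sketch of "filter and aggregate before joining". Row layout
# (key, category, amount) is a placeholder, not the poster's schema.
from collections import defaultdict


def filter_and_aggregate(rows, keep_category):
    """Keep only rows of one category, then sum the amount per key."""
    totals = defaultdict(float)
    for key, category, amount in rows:
        if category == keep_category:  # filter first: fewer rows downstream
            totals[key] += amount      # aggregate second: one row per key
    return dict(totals)


def join_with_lookup(totals, lookup):
    """Cross-reference the small aggregated result with another table last."""
    return {key: (total, lookup.get(key)) for key, total in totals.items()}
```

Doing the join on the aggregated result instead of the raw rows keeps the expensive cross-reference proportional to the number of keys, not the number of source files.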