Hello friends.
I have the following situation:
I imported some .XML files into Power Query and transformed the data into the layout I need. I have done this procedure with around a thousand files.
However, I have a folder with 70 GB of .XML files that I need to import, transform, and cross-reference with another table.
As you may have already noticed, this is a very high volume of files.
Yesterday I ran a test with around 1 million files, and after 10 hours of loading the procedure still hadn't finished.
How can I work with this volume of data? I have no experience with MySQL or PostgreSQL.
Hi @jeanrozin ,
Please try to filter and aggregate the data before loading it into Power Query. This will help you reduce the size of the dataset and improve performance.
Best regards,
Community Support Team_Binbin Yu
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
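One way to apply this advice at 70 GB scale is to pre-aggregate the XML outside Power Query and then load a single compact file. Below is a minimal Python sketch that streams each .xml file with `xml.etree.ElementTree.iterparse` (so no file is fully loaded into memory), keeps only the fields you need, and writes everything into one CSV that Power Query can then import quickly. The element name `record` and the field names `id` and `value` are placeholders — substitute the tags from your own XML layout.

```python
import csv
import xml.etree.ElementTree as ET
from pathlib import Path

def preaggregate_xml(folder, out_csv, tag="record", fields=("id", "value")):
    """Stream every .xml file in `folder`, keep only the listed child
    fields of each `tag` element, and append one row per element to a
    single CSV. Streaming keeps memory flat even for very large inputs."""
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(fields)  # header row for Power Query
        for path in sorted(Path(folder).glob("*.xml")):
            # iterparse yields elements as their closing tags are read,
            # so we never build the full document tree in memory.
            for _, elem in ET.iterparse(path):
                if elem.tag == tag:
                    writer.writerow([elem.findtext(field, "") for field in fields])
                    elem.clear()  # release the processed subtree
```

You can also add a filter condition inside the loop (skip rows you don't need) so the CSV Power Query sees is already reduced, which is exactly the "filter and aggregate before loading" suggestion above.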