Hello friends.
I have the following situation:
I imported some .XML files into Power Query and transformed the data into the layout I need. I followed this procedure with around a thousand files.
However, I now have a folder with 70 GB of .XML files that I need to import, transform, and cross-reference with another table.
As you may have already noticed, this is a very high volume of files.
Yesterday I ran a test with around 1 million files, and after 10 hours of loading the procedure still hadn't finished.
How can I work with this volume of data? I don't know anything about MySQL or PostgreSQL.
Hi @jeanrozin ,
Please try to filter and aggregate the data before loading it into Power Query. This will help you reduce the size of the dataset and improve performance.
Best regards,
Community Support Team_Binbin Yu
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
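One way to apply this advice is to do the heavy parsing and aggregation outside Power Query, then load only the small aggregated result. Below is a minimal Python sketch of that idea using streaming XML parsing; the tag names (`record`, `category`, `amount`) are assumptions for illustration and would need to match your actual XML schema.

```python
# Hypothetical sketch: pre-aggregate a folder of large XML files so that
# Power Query only has to load the small aggregated output.
# Tag names ("record", "category", "amount") are assumed, not taken from
# the original post -- adjust them to your own schema.
import os
import xml.etree.ElementTree as ET
from collections import defaultdict

def aggregate_folder(folder):
    """Stream-parse every .xml file in `folder` and sum `amount` per `category`."""
    totals = defaultdict(float)
    for name in os.listdir(folder):
        if not name.lower().endswith(".xml"):
            continue
        path = os.path.join(folder, name)
        # iterparse streams each file instead of loading it whole into memory,
        # which matters when the folder holds tens of gigabytes of XML.
        for _, elem in ET.iterparse(path, events=("end",)):
            if elem.tag == "record":
                category = elem.findtext("category", default="unknown")
                amount = float(elem.findtext("amount", default="0"))
                totals[category] += amount
                elem.clear()  # release the parsed element to keep memory flat
    return dict(totals)
```

You could write the resulting totals to a single small CSV and point Power Query at that file instead of the raw 70 GB folder; the cross-reference with your other table then runs against a dataset that is orders of magnitude smaller.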