IanDavies
Helper III

Help with best approach requested

I am looking for suggestions on how I might solve a problem that I have been given. I am new(ish) to data analysis and Power BI/Power Query, but am comfortable with M code and have used PQ in Excel in the past.

 

I have five files, all CSV, and all are emailed to a shared Exchange mailbox. Each of these needs to be a source for a new query (I think), and the queries would then be used to produce a single, definitive, de-duped list of all the computers from all five sources, to which I can merge certain detail from each of the queries to form a dashboard showing performance metrics.


The column naming is different in each source.

 

I have tried to import these files directly from the mailbox, and it works, but everything is so slow, and when I created the definitive list of "All Computers" it took hours to look up detail from the other queries and import it.

 

Even without the "All Computers" table the imported data is slow to load, and a refresh of any of the queries after any change can take several minutes.
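Roughly what I am trying to build, as a minimal M sketch (the query names Source1–Source5 and all column names below are placeholders, not the real ones):

// Minimal sketch: normalise the key column name in each source, append,
// de-duplicate, and buffer the result so later merges don't re-read the
// slow mailbox sources. All query and column names are placeholders.
let
    S1 = Table.RenameColumns(Source1, {{"DeviceName", "Computer"}}),
    S2 = Table.RenameColumns(Source2, {{"ComputerName", "Computer"}}),
    S3 = Table.RenameColumns(Source3, {{"Hostname", "Computer"}}),
    S4 = Table.RenameColumns(Source4, {{"Machine", "Computer"}}),
    S5 = Table.RenameColumns(Source5, {{"AssetName", "Computer"}}),
    AllNames = Table.Combine({
        Table.SelectColumns(S1, {"Computer"}),
        Table.SelectColumns(S2, {"Computer"}),
        Table.SelectColumns(S3, {"Computer"}),
        Table.SelectColumns(S4, {"Computer"}),
        Table.SelectColumns(S5, {"Computer"})
    }),
    AllComputers = Table.Distinct(AllNames),
    Buffered = Table.Buffer(AllComputers)
in
    Buffered

Detail from each source would then be merged onto that list with Table.NestedJoin and JoinKind.LeftOuter rather than per-row lookups, which I understand should be far quicker than the hours it took before.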

     
Is there a recommended approach to bring this data into Power BI for transformation that is not tied to a single user for updating, for example not using my personal Power Automate account to create flows that save the files to SharePoint?

 

The data changes weekly at the minute; there is a desire to make this daily, so updating the report should be as automated as possible.

The current imported table sizes are showing in PBI as:

Table 1: 12 columns, 199+ rows
Table 2: 11 columns, 228 rows
Table 3: 26 columns, 999+ rows
Table 4: 34 columns, 999+ rows
Table 5: 91 columns, 999+ rows

 

1 ACCEPTED SOLUTION
Greg_Deckler
Community Champion

@IanDavies One method, if possible, would be to change the delivery mechanism from email to something like FTP. If that is not possible, you could create a service account that would be used to create and publish your Power Automate flow that would take the data from the mailbox and put it in SharePoint/OneDrive or a network file folder. 
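As a rough illustration of the Power BI side (not the exact query, since the real site URL and file names will differ), once the flow lands the files in SharePoint each query could read its CSV with the SharePoint connector instead of the mailbox. A minimal M sketch, with a placeholder site URL and file name:

// Minimal sketch, assuming the flow saves the CSVs to a SharePoint
// document library; the site URL and file name are placeholders.
let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Reporting", [ApiVersion = 15]),
    TargetFile = Table.SelectRows(Source, each [Name] = "computers_export.csv"),
    Imported = Csv.Document(TargetFile{0}[Content], [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Imported, [PromoteAllScalars = true])
in
    Promoted

A scheduled refresh in the Power BI service can then pick up whatever the flow last dropped in the library, with no personal mailbox in the chain.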





4 REPLIES
Anonymous
Not applicable

Hi @IanDavies 

 

Greg_Deckler offers great advice, so if it has helped you solve your problem, please consider accepting his reply as the solution.
One thing I'd like to add is that if you have a large amount of data, you may want to consider using incremental refresh to update only the data that changes, which improves refresh efficiency.

You can refer to the following link to learn about incremental refresh:

Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn
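At its core, incremental refresh works by filtering the source query with two reserved datetime parameters, RangeStart and RangeEnd, which the service moves forward on each refresh. A minimal M sketch of that filter step, assuming a hypothetical LastUpdated datetime column in the data:

// Minimal sketch of the filter step incremental refresh relies on.
// "LastUpdated" is a placeholder column name; RangeStart and RangeEnd
// are datetime parameters defined in the model.
let
    Source = AllComputersDetail,  // placeholder for the existing query
    Filtered = Table.SelectRows(Source, each [LastUpdated] >= RangeStart and [LastUpdated] < RangeEnd)
in
    Filtered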

 

Best Regards,
Jarvis Tang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.


 

Whilst I don't think I can use this suggestion on this model and data, I have given you a thumbs up because I didn't know this was possible and in the future it may well be helpful.

Thanks Greg,

 

I have used this method before; I assume from your comment that pulling CSVs directly from mailboxes is not recommended.

 

Sadly FTP is not an option. The aim is to automate the process as much as possible and decouple the report from an individual/personal account, so that any member of the team can update/access it and it isn't tied to me (currently all flows are on my account and mailbox). I'll see if they will licence a service account to get access to Power Automate.

