BryceBicknell
Frequent Visitor

Managing Data Best Practices

Hi all,

 

I receive data on a weekly basis in CSV format, which I save into a folder and combine using Power BI. However, as I have well over three years of weekly data (100 MB per week), this makes my refresh extremely slow. I'm after some ideas on how best to manage this data, and whether there is a way to refresh only the latest file while keeping all my historical data.
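For context, the combine step is roughly the standard combine-from-folder pattern in Power Query; a sketch of what it looks like is below (the folder path, delimiter and encoding are just illustrative):

let
    // Point at the folder that holds the weekly CSV drops (path is illustrative)
    Source = Folder.Files("C:\Data\WeeklyCSVs"),
    // Keep only the CSV files in case anything else lands in the folder
    CsvFiles = Table.SelectRows(Source, each Text.Lower([Extension]) = ".csv"),
    // Parse each file's content and promote its header row
    Parsed = Table.AddColumn(CsvFiles, "Data",
        each Table.PromoteHeaders(Csv.Document([Content], [Delimiter = ",", Encoding = 65001]))),
    // Append all the weekly tables into one table
    Combined = Table.Combine(Parsed[Data])
in
    Combined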

 

Any ideas would be greatly appreciated.


3 REPLIES
bcdobbs
Community Champion

Hi, 

Sounds like you need incremental refresh. Have a look at https://www.fourmoo.com/2020/06/10/how-you-can-incrementally-refresh-any-power-bi-data-source-this-e...
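The core pattern is to define RangeStart and RangeEnd datetime parameters, filter the query on them, and then configure the incremental refresh policy on the table. A minimal sketch, assuming your combined table has a datetime column to filter on (the names CombinedWeeklyData and TransactionDate are placeholders, not from your setup):

// RangeStart and RangeEnd must be defined as datetime parameters;
// Power BI supplies their values once the incremental refresh policy is configured.
let
    // Start from your existing combined query (placeholder name)
    Source = CombinedWeeklyData,
    // Load only rows inside the current refresh window;
    // use >= on one boundary and < on the other so rows aren't loaded twice
    Filtered = Table.SelectRows(
        Source,
        each [TransactionDate] >= RangeStart and [TransactionDate] < RangeEnd
    )
in
    Filtered

You then set the policy on the table in Power BI Desktop (for example, archive three years and refresh only the last few weeks).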

 

That said, can you give a little more detail:

1) Do you have a single file that contains 3 years of data, or are you merging each week's file?

2) Do you have PPU or Premium licences?

3) Do you have a SQL Server?

 



Ben Dobbs

LinkedIn | Twitter | Blog


Hi,

 

I get a file with one week's data at a time and then merge them together using Power BI, so at the moment I have a folder with over 180 CSVs.

 

I have a PPU licence, and no, I don't have a SQL Server.

 

I will have a read through the article you sent; that might be my answer.

bcdobbs
Community Champion
(Accepted solution)

My background is heavily SQL, so my default suggestion would be to ingest each week's file into a SQL database; you could spin one up in Azure.

The other way to go is to look at a data lake in Azure, which I suspect is cheaper. I'm no expert in that area, but the following might give you a starting point:

Create a storage account for Azure Data Lake Storage Gen2 | Microsoft Docs

Drop the CSV files directly in there.

You could then hook the dataflow storage layer directly into it:
Configuring dataflow storage to use Azure Data Lake Gen 2 - Power BI | Microsoft Docs

Hope that helps.
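P.S. If you go the data lake route, pointing Power Query at the container is almost the same query as the local folder version. A rough sketch (the storage account URL and container name are placeholders):

let
    // List the files sitting in the ADLS Gen2 container (URL is a placeholder)
    Source = AzureStorage.DataLake("https://yourstorageaccount.dfs.core.windows.net/weekly-csvs"),
    // Parse each weekly CSV just as you would from a local folder
    Parsed = Table.AddColumn(Source, "Data",
        each Table.PromoteHeaders(Csv.Document([Content], [Delimiter = ",", Encoding = 65001]))),
    // Append all the weekly tables into one table
    Combined = Table.Combine(Parsed[Data])
in
    Combined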

 



Ben Dobbs

LinkedIn | Twitter | Blog

