Hi all,
I receive data weekly in CSV format, which I save to a folder and combine using Power BI. However, as I have well over three years of weekly data (about 100 MB per week), my refresh is extremely slow. I'm after ideas on how best to manage this data, and whether there is a way to refresh only the latest file while keeping all my historical data.
Any ideas would be greatly appreciated.
My background is heavily SQL, so my default suggestion would be to ingest each week's file into a SQL database; you could spin one up in Azure.
The other way to go is a data lake in Azure, which I suspect is cheaper. I'm no expert in that area, but the following might give you a starting point:
Create a storage account for Azure Data Lake Storage Gen2 | Microsoft Docs
Drop the CSV files directly in there.
You could then hook the dataflow storage layer directly into it:
Configuring dataflow storage to use Azure Data Lake Gen 2 - Power BI | Microsoft Docs
Hope that helps.
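To make the SQL-database idea concrete, here is a minimal Python sketch of the ingestion step, using SQLite as a stand-in for an Azure SQL database. The table and column names (`weekly_data`, `week`, `value`) are assumptions for illustration; the key pattern is tracking which files have already been loaded so a re-run only ingests new weeks.

```python
import csv
import sqlite3
from pathlib import Path

def ingest_weekly_csv(db_path, csv_path):
    """Load one week's CSV into a SQLite table, skipping files already loaded."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS weekly_data (source_file TEXT, week TEXT, value REAL)"
    )
    conn.execute(
        "CREATE TABLE IF NOT EXISTS loaded_files (source_file TEXT PRIMARY KEY)"
    )
    name = Path(csv_path).name
    # Skip files that were already ingested, so re-runs only pick up new weeks.
    already = conn.execute(
        "SELECT 1 FROM loaded_files WHERE source_file = ?", (name,)
    ).fetchone()
    if already:
        conn.close()
        return 0
    with open(csv_path, newline="") as f:
        rows = [(name, r["week"], float(r["value"])) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO weekly_data VALUES (?, ?, ?)", rows)
    conn.execute("INSERT INTO loaded_files VALUES (?)", (name,))
    conn.commit()
    conn.close()
    return len(rows)
```

With the data in a database, Power BI can query it directly instead of re-combining 180 CSVs on every refresh.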
Hi,
Sounds like you need incremental refresh. Have a look at https://www.fourmoo.com/2020/06/10/how-you-can-incrementally-refresh-any-power-bi-data-source-this-e...
That said, can you give a little more detail?
1) Do you have a single file that contains three years of data, or are you merging each week's file?
2) Do you have PPU or Premium licences?
3) Do you have a SQL Server?
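The core of incremental refresh is filtering the source down to a date window so only new partitions are processed, rather than re-reading everything. A small Python sketch of that idea, assuming a hypothetical file-naming convention of `data_YYYY-MM-DD.csv` (Power BI does the equivalent with its RangeStart/RangeEnd parameters rather than file names):

```python
from datetime import date
from pathlib import Path

def files_to_refresh(folder, last_refreshed):
    """Return only the weekly CSVs dated after the stored watermark.

    Assumes files are named 'data_YYYY-MM-DD.csv' (hypothetical naming);
    older files are left untouched, mirroring how incremental refresh
    keeps historical partitions and only reprocesses the latest window.
    """
    new_files = []
    for f in sorted(Path(folder).glob("data_*.csv")):
        file_date = date.fromisoformat(f.stem.split("_")[1])
        if file_date > last_refreshed:
            new_files.append(f)
    return new_files
```

With 180 weekly files, a refresh filtered this way touches one file instead of all of them, which is where the speed-up comes from.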
Hi,
I get a file with one week's data at a time, then I merge them together using Power BI, so at the moment I have a folder with over 180 CSVs.
I have a PPU licence, and no, I don't have a SQL Server.
I will have a read through the article you sent; that might be my answer.