Hello Fabric Community,
I manage a Power BI dashboard (PPU account) with around 100 CSV files, each visualized on a separate tab. Each CSV file must be updated at a specific frequency (e.g., daily, weekly, monthly).
Manually monitoring and refreshing these datasets is time-consuming, and refreshing all at once generates large logs that are hard to analyze when failures occur.
I’m looking for an efficient way to:
I’d appreciate your suggestions or best practices to handle this scenario effectively.
Thank you in advance!
Hi @data_team
Are these 100 CSV files all included in the same dataset (semantic model), or does each file represent a separate dataset? I guess it might be the former. If so, then splitting them into different dataflows based on the refresh frequency requirements is indeed a good approach, as @ALLUREAN suggested. In this way, you can schedule specific refresh times and frequencies for different dataflows. When refreshing a dataflow, it will load data from the source CSV files into the dataflow's built-in storage. Then when you refresh the dataset of a report, it will load data from these dataflows directly, rather than from the source CSV files.
To automate the refresh of dataflows or datasets, you can also consider using the Power BI REST APIs and PowerShell for programmatic refresh. Here is some documentation for your reference:
Understand and optimize dataflows refresh - Power BI | Microsoft Learn
For Dataflow:
Dataflows - Refresh Dataflow - REST API (Power BI REST APIs) | Microsoft Learn
Dataflows - Update Refresh Schedule - REST API (Power BI REST APIs) | Microsoft Learn
powerbi-powershell/examples/dataflows at master · microsoft/powerbi-powershell · GitHub
For Dataset:
Datasets - Refresh Dataset In Group - REST API (Power BI REST APIs) | Microsoft Learn
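To make the REST API route concrete, here is a minimal Python sketch of triggering an on-demand dataset refresh. The workspace ID, dataset ID, and access token are placeholders you must supply yourself (for example, acquired through an MSAL auth flow, which is not shown here); treat this as an illustration of the endpoint shape, not a complete solution.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Build the REST endpoint that starts an on-demand dataset refresh."""
    return f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(group_id: str, dataset_id: str, token: str) -> None:
    """POST to the refreshes endpoint; notifyOption sends mail on failure."""
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode("utf-8")
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=body,
        headers={
            "Authorization": f"Bearer {token}",  # token is a placeholder
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # HTTP 202 means the refresh was queued

# Example: the URL the call above would POST to (IDs are made up)
print(refresh_url("ws-1234", "ds-5678"))
```

The dataflow refresh endpoint has the same shape, with `dataflows/{dataflowId}/refreshes` in place of the dataset segment, so a script looping over your daily/weekly/monthly groups only needs the IDs and a valid token.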
For monitoring, you can set up email notifications for a dataflow's scheduled refresh. There are also some Power Automate templates available that you can use for monitoring: Power Automate templates for the dataflows connector - Power Query | Microsoft Learn
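Since refreshing everything at once makes the logs hard to analyze, one lightweight option is to filter the refresh history down to just the failed runs. The sketch below assumes the JSON shape returned by the Get Refresh History endpoint (GET .../datasets/{datasetId}/refreshes); the sample data is made up for illustration.

```python
from typing import Dict, List

def failed_refreshes(history: Dict) -> List[Dict]:
    """Return refresh-history entries with status 'Failed', newest first."""
    failures = [r for r in history.get("value", [])
                if r.get("status") == "Failed"]
    return sorted(failures, key=lambda r: r.get("startTime", ""), reverse=True)

# Hypothetical response body for demonstration only
sample = {
    "value": [
        {"status": "Completed", "startTime": "2024-06-01T02:00:00Z"},
        {"status": "Failed", "startTime": "2024-06-02T02:00:00Z",
         "serviceExceptionJson": '{"errorCode":"ModelRefreshFailed"}'},
    ]
}

print(len(failed_refreshes(sample)))  # number of failed runs to investigate
```

Running this against each dataset's history on a schedule gives you a short list of failures to troubleshoot instead of one large combined log.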
For troubleshooting, usually we need to determine where the refresh failed based on the refresh history, and then analyze the possible causes and find solutions.
Hope this is helpful.
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
One way is to create 3 dataflows (Daily, Weekly, Monthly) and move the respective csv files based on their refresh sequence there and then set the refresh schedule of the dataflows. Another way is using Power Automate.
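As a sketch of the bookkeeping behind that first approach, you could keep a simple mapping from each CSV file to its required cadence and group the files into the three dataflow buckets. The file names and mapping below are hypothetical; yours would have ~100 entries.

```python
from collections import defaultdict

# Hypothetical mapping from CSV file name to required refresh cadence
file_frequency = {
    "sales.csv": "Daily",
    "orders.csv": "Daily",
    "inventory.csv": "Weekly",
    "budget.csv": "Monthly",
}

def group_by_frequency(mapping):
    """Bucket file names under their cadence (Daily/Weekly/Monthly)."""
    buckets = defaultdict(list)
    for name, freq in mapping.items():
        buckets[freq].append(name)
    return dict(buckets)

print(group_by_frequency(file_frequency))
```

Each resulting bucket then corresponds to one dataflow, and you set that dataflow's refresh schedule to match its cadence.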
Proud to be a Super User!
Instructions for creating dataflows like these for CSV files that are already in Power BI seem quite confusing. Can you direct me to specific pages on the web to follow?