Hi all,
I have a problem with the refresh of one of my dataflows.
Use case:
Our robot saves an Excel file to a SharePoint folder each day.
The dataflow is refreshed automatically each day and should pick up the new file with the new data.
The refresh completes without errors, but the new data from the Excel file does not show up in the dataflow.
If I open the Excel file, save it under the same name in the same place, and run the refresh again, the new data becomes available.
Do you have any ideas what the problem with the Excel file / refresh could be?
BR
Sebastian
The issue may be due to file locking, metadata caching, or sync delays. Here's how to fix it:
1. Ensure Proper File Release by the Robot: confirm the robot closes the workbook and releases any lock before the scheduled refresh starts.
2. Validate the File Format: make sure the robot writes a genuine .xlsx workbook that Power Query can open, not a renamed or partially written file.
3. Force Metadata Update: re-saving the file (as you did manually) updates its modified date in SharePoint; have the robot perform an equivalent final save or metadata update.
4. Clear SharePoint Cache: cached folder metadata can hide the newest file, so allow some time between the upload and the refresh.
5. Use a File Monitoring Trigger: trigger the refresh from Power Automate when the new file arrives instead of relying only on a fixed schedule.
6. Debugging Steps: compare the dataflow refresh history with the file's created/modified timestamps to see whether the refresh ran before the file was fully available.
This will help ensure Power BI picks up the new data during automatic refreshes (see the query sketch after this list).
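If it helps, here is a minimal Power Query (M) sketch of how the dataflow could always read the newest workbook from the folder. The site URL, folder path, and sheet selection are placeholders you would need to adapt to your own environment:

let
    // Connect to the SharePoint site (placeholder URL - replace with your own)
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/YourSite", [ApiVersion = 15]),
    // Keep only Excel files in the robot's folder (placeholder folder path)
    RobotFiles = Table.SelectRows(Source, each Text.Contains([Folder Path], "/RobotExports/") and [Extension] = ".xlsx"),
    // Sort newest first and open the most recent workbook
    Sorted = Table.Sort(RobotFiles, {{"Date created", Order.Descending}}),
    LatestWorkbook = Excel.Workbook(Sorted{0}[Content], true),
    // Take the first sheet/table in the workbook (adjust to your sheet name)
    Data = LatestWorkbook{0}[Data]
in
    Data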
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Best Regards,
HSathwara.
Hi @sewi - all of the suggestions listed by @HarshSathwara19 are good. In your current process, it sounds like the dataflow refresh may just be occurring before the new file is added to the document library.
You can investigate this by comparing the created date of the new SharePoint file added by the robot to the dataflow refresh start time. If the refresh start time is equal to or before the file's created date, or even 1-2 minutes after it, the new file may not yet be readable by the dataflow. If the refresh started more than 1-2 minutes after the file's created date, check whether the file was checked out - that can sometimes make it unreadable.
You can also include the source file info the dataflow is using in the dataflow's result, so you can see which file was included in each run.
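For example, here is a sketch that carries the file name and modified date through to the output; it assumes the standard SharePoint.Files columns, and the site URL and folder filter are placeholders:

let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/YourSite", [ApiVersion = 15]),
    // Placeholder folder filter - adjust to the folder the robot writes to
    RobotFiles = Table.SelectRows(Source, each Text.Contains([Folder Path], "/RobotExports/") and [Extension] = ".xlsx"),
    // Keep the metadata columns you want to see in the result
    Selected = Table.SelectColumns(RobotFiles, {"Content", "Name", "Date modified"}),
    // Read the first sheet of each workbook (adjust if your data sits elsewhere)
    WithData = Table.AddColumn(Selected, "Data", each Excel.Workbook([Content], true){0}[Data]),
    // Expand the data while keeping Name and Date modified as extra columns
    Expanded = Table.ExpandTableColumn(WithData, "Data", Table.ColumnNames(WithData{0}[Data]))
in
    Table.RemoveColumns(Expanded, {"Content"})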
If the objective is for the dataflow to be refreshed whenever a new file is added to the document library, I agree with @HarshSathwara19 that the best option is Power Automate; you can build the flow manually or describe it and let Power Automate create it for you.
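As a rough outline (the exact trigger and action names may differ slightly in your environment), such a flow could look like this:
- Trigger: SharePoint "When a file is created in a folder", pointing at the robot's folder.
- Optional: a short delay so SharePoint has finished processing the upload.
- Action: Power BI "Refresh a dataflow", selecting your workspace and dataflow.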
Please let us know if we can further assist.
If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.
Proud to be a Microsoft Fabric Super User
Hi sewi,
We are following up to see if your query has been resolved. Should you have identified a solution, we kindly request you to share it with the community to assist others facing similar issues.
If our response was helpful, please mark it as the accepted solution and provide kudos, as this helps the broader community.
Thank you.
Hi sewi,
We have not received a response from you regarding the query and were following up to check if you have found a resolution. If you have identified a solution, we kindly request you to share it with the community, as it may be helpful to others facing a similar issue.
If you find the response helpful, please mark it as the accepted solution and provide kudos, as this will help other members with similar queries.
Thank you.
Hi @sewi,
We would like to inquire if the solution offered by @HarshSathwara19 and @jennratten has resolved your issue. If you have discovered an alternative approach, we encourage you to share it with the community to assist others facing similar challenges.
Should you find the response helpful, please mark it as the accepted solution and add kudos. This recognition benefits other members seeking solutions to related queries.
Thank you.