Hi,
I have had an automated solution in place for about 10 months, but a few days ago the scheduled refresh of the report published to the Power BI service started failing with the error: "DataFormat.Error: We were unable to load this Excel file because we couldn't understand its format. File contains corrupted data.. Microsoft.Data.Mashup.ErrorCode = 10942. ;We were unable to load this Excel file because we couldn't understand its format. File contains corrupted data.. The exception was raised by the IDbCommand interface.".
The solution uses scheduled exports from the company application; the exported files are in xlsx format and are saved to a SharePoint folder via Power Automate. The exported data file is not corrupted, has no password or sensitivity label, and its formatting does not appear to have changed at all (I compared older exports and they looked the same). I have tried exporting the file many times and changing data in the report. Power BI Desktop works fine with that xlsx file: I can reload the data and all steps in Power Query work. I tried republishing to another workspace, but no change. The scheduled refresh on the service simply stops with that error. It worked, I think, until Friday the 13th, 2025 (fitting date? :-). I noticed a new Power BI version was released on June 10th with some new features around sensitivity labeling, but that should not be the cause. Why does it work in Desktop but not after publishing? Any good advice from anyone? Thank you in advance.
@JoeAnalyst Managed to get a workaround that works for me: running an empty Office Script in the Power Automate flow to act as an open-and-save.
After the create file action, add an Excel "Run script" action.
function main(workbook: ExcelScript.Workbook) {
  // This script doesn't need to make any explicit changes or call a 'save' method.
  // You can add a simple log to confirm it ran if you wish, but it's not strictly necessary.
  console.log("Office Script ran successfully for workbook: " + workbook.getName());
}
Thanks for the reply again. The XML from my file has all the custom Override entries shown in your screenshot. I also tried copying and pasting the same content into a new Excel workbook, but that doesn't remove the custom entries from the content types XML on my end.
And if you open it, do a Save As, and re-inspect, are those custom entries still there?
Unfortunately I no longer work with Salesforce data and no longer have access to these files to check, but I am certain that triggering a save rebuilt the XML with the custom overrides removed.
You could try, for testing purposes, manually removing the custom overrides:
customXml/itemProps1.xml
customXml/itemProps2.xml
customXml/itemProps3.xml
and I think also the Override PartName="/docProps/custom.xml" entry.
Save it and try running it in the web service again. If it works, which I suspect it will, then we just need to work out why the save is keeping those attributes; maybe the metadata is embedded.
If you are copying the worksheet to a new workbook and the custom entries are still there, then try pasting the data as values and inspect the new copy to see whether the custom data is gone; that should remove any chance of embedded data being copied over.
If that works, you could just change your Office Script to copy the data and paste it as values.
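If you want to test the removal without hand-editing the package in a zip tool, here is a rough sketch of the idea in Python: an xlsx file is just a zip archive, so you can copy it while dropping the custom parts and their Override entries in [Content_Types].xml. This is illustrative only, under the assumption that the problem parts are the ones listed above; it is not a supported repair method.

```python
import re
import zipfile

def strip_custom_parts(src: str, dst: str) -> None:
    """Copy an xlsx package (a zip archive), dropping customXml parts,
    docProps/custom.xml, and their Override entries in [Content_Types].xml.
    Illustrative sketch only."""
    with zipfile.ZipFile(src) as zin, \
         zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zout:
        for item in zin.infolist():
            name = item.filename
            if name.startswith("customXml/") or name == "docProps/custom.xml":
                continue  # skip the custom parts themselves
            data = zin.read(name)
            if name == "[Content_Types].xml":
                # remove <Override> entries pointing at the dropped parts
                text = data.decode("utf-8")
                text = re.sub(
                    r'<Override PartName="/(?:customXml/|docProps/custom)[^"]*"[^>]*/>',
                    "", text)
                data = text.encode("utf-8")
            zout.writestr(item, data)
```

If the service refresh succeeds against the stripped copy, that would confirm the custom overrides are what the connector chokes on.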
@JoeAnalyst I'm getting a "script not found" error. Maybe it's a data loss prevention policy at my company?
{"message":"Script not found. It may have been unshared or deleted.\r\nclientRequestId: 8d9d057d-ca17-4960-b135-c0be8acf2ebd","logs":null}
I have been utilizing this Run Script step in my Power Automate flows and it has been working great for months. Today they all failed. Can't figure out why they are all of a sudden failing.
It worked for me once, but has not worked since. I still have an open ticket, but Microsoft is not really giving me a solution. They are just asking for log trace after log trace. I provided them this topic as reference and sent the issue to the Excel team I believe.
Hi @pb1051 ,
I have made some adjustments to the flow as suggested by others, and it has been working fine so far. I have added "Get file Metadata" in the flow. You might consider replicating the same approach. My flow is as follows:
This didn't work for me, but I have a ticket open with the Power Automate team now.
Hi,
Could you please share more details on what you entered in the get file metadata action, and also provide the script?
I'd appreciate it.
Thanks.
Could you please describe the flow in more detail, preferably with a screenshot? Unfortunately, I don't know where to enter the code after adding the additional action.
@Olze-Kay
Start in Excel: go to the Automate tab and click on "New Script".
In the editor that pops up on the left, delete everything and replace it with:
function main(workbook: ExcelScript.Workbook) {}
Then save the script with a name of your choice; it will be saved in the Office Scripts folder of your OneDrive.
In your Power Automate flow, add a new action, Excel Online (Business) "Run script", after your create file action, which is likely a SharePoint or OneDrive "Create file" action.
Fill in the four fields of the Run script action as follows:
Location: the location where you created the file. Choose OneDrive for Business if you created the file on your OneDrive, otherwise the SharePoint location where you created it.
Document Library: should be Documents
File: use dynamic content and add body/Id. When you hover over it, it should show something like the below, assuming your previous create file action is labelled Create file:
outputs('Create_file')?['body/Id']
Script: you should find the script you just created here; select it.
That is it.. good luck..
Hi @fr_jc ,
I'm facing the same issue and changed my flow as per your screenshot. However, I still encounter the error at Run script: "The file format is not recognized". My flow is triggered when an email with an attachment (a file from Salesforce) is received, and the file is saved to a SharePoint folder. Is there any workaround I can follow?
Hi @90_eryka
Assuming your save action is storing an exact copy of the attachment, that copy will still have the custom override entries in the XML. At that point, if you open it and save it, the Power Query connector will read it OK; the Run script action automates that for you.
Your flow should look something like the below; the for-each loop will iterate through all the email's attachments.
Confirm that what the flow saves to your SharePoint is actually what you are expecting, i.e. the xlsx file.
Your condition logic should select only the attachment you are after; you can add something like item()?['name'] ends with .xlsx, or use the exact name of the attachment to be more precise.
If there are images such as logos in the signature, these are also attachments, so make sure you are not picking those up and storing them. For example, if the body of the email had two images in the signature and one Excel attachment, then without that logic your create file action would save each one under the same name in the same location, so only the last one would exist, and that could actually be an image, which would cause your problem.
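One quick way to sanity-check a suspect file is its container format: a valid xlsx is always a zip package, while a signature logo saved under a .xlsx name is typically a PNG or JPEG. A minimal illustrative check in Python (the file names here are hypothetical):

```python
import zipfile

def looks_like_xlsx(path: str) -> bool:
    """True if the file is a zip package, as every valid xlsx is.
    An image accidentally saved under a .xlsx name will fail this check.
    This is a quick sanity check, not full workbook validation."""
    return zipfile.is_zipfile(path)
```

If the saved file fails this check, the flow stored something other than the Excel attachment.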
Hi @fr_jc ,
I’ve confirmed that there is no extra image, and I’m storing the exact file as per the attachment.
I’ve also added the condition you suggested, along with an additional step to delay the flow by 5 seconds (as recommended by Co-pilot) to ensure the file is fully created before the script runs. However, the Run Script step still fails. Is there any step I might be missing?
Thank you.
@90_eryka That looks right from what I can see of the actions.
Can you share the actual error description you are getting?
Just checking on the Run script action's location and document library:
The Location should be the same site you used in the previous create file action.
If you look at the code view, the source name for the Run script should match the dataset name of the create file action.
The Document Library normally defaults to Documents, but I see you have Analytics. Did you select this from the available drop-down, or did you add it as a custom value?
Hi @fr_jc ,
The error message is "Flow run failed". My script in Excel is
Also, I did select "Documents" for Document Library, but the flow still failed.
That all looks ok to me, but let’s go through it again.
The delay action is good if the file is slow to create or is stuck in a locked status; normally, though, you would receive a file-is-locked error.
Using a Do Until action can work well here if the file is locked.
The Office Script is correct; you do have a space between the curly braces, but that shouldn't make any difference. You should be able to test it by simply running it in Excel: it will look like it does nothing, but it should run successfully without error.
function main(workbook: ExcelScript.Workbook) {}
Your Create File action is storing the attachment in a SharePoint location using the same name as the attachment.
Your Run Script action is,
For the error, we should see a more descriptive message than "Flow run failed"; we need to pinpoint where it fails.
Are you able to share a screenshot of where in the flow the red x’s appear? On that action it should show the error that we are interested in.
Does the run script have any output log or does it fail before generating one?
When I run the flow I get the error 404 Script not found. I even resaved the script under a different title. Is there a workaround for this?
This worked perfectly for me! Thanks.
@JoeAnalyst When I add the script in my flow I get an error about the file format not being recognized. "The file format is not recognized. Please check that the selected workbook is valid."
I hypothesize that since I'm not using the "Create file" action and my trigger is that a SharePoint file is "Created or modified", my action with the script requires an Apply to each with the script inside. The error still occurs after selecting the dynamic "ID" from the trigger event. I'm lost at this point.
I've been having the users open the file manually and re-save and this has helped with the error, but I would like to return to an automated solution for them.
Any help from anyone that is more familiar with PowerAutomate would be greatly appreciated.
@pb1051 You need to use "Identifier" instead of "ID". I also recommend adding a condition so the flow runs only on creation (not modification) of the file; as the condition formula you can compare timestamps, Modified = Created. Otherwise the file will be touched twice: first when the file is uploaded to the folder, and again when the flow modifies it.