
Asmatte
New Member

Fabric Dataflow Gen2 – Refresh succeeds despite Power Query error

Hi,

 

I’ve built a Data Pipeline that refreshes a Dataflow Gen2 and logs the result of this refresh into a log Delta table.

 

In this pipeline, I’ve added an if condition that checks the status of the activity (activity('myDataflow').status).

If the status is "Failed", then I log the errorCode and errorMessage into the log table via a notebook.

 

To simulate a failure scenario, I deliberately introduced an error in the Power Query of the Dataflow (e.g. renaming a non-existent column).

In the Dataflow Gen2, I can clearly see the error under the “Check Validation” section.

 

The problem:

When I trigger the refresh from the pipeline (or even manually), the refresh status still returns “Succeeded”, and no error is thrown, so my log never captures the failure.

 

My question:

Is it expected behavior that a Dataflow Gen2 with a Power Query error still returns a “Succeeded” status when refreshed?

How can I catch Power Query errors (from “Check Validation”) during the pipeline execution, so they can be properly logged and flagged?

 

This is critical for me to ensure that failed transformations don’t silently pass as successful.

 

Thanks!


4 REPLIES
miguel
Community Admin

Could you check on Git your pq file and see if it contains the step that introduces the error?

 

In principle, the save validation should prevent you from saving your Dataflow, so that version of the Dataflow shouldn't be used for refresh purposes. Therefore, what you're running in the pipeline is not actually the Dataflow with the error you introduced, but probably a prior version of it.

 

Please do let us know what you see in Git.

About Git:

I checked for Git integration as you suggested, but unfortunately Git is not enabled in my organization. Is there any other way to see the published version history of a Dataflow Gen2?

 

I wanted also to follow up with a clearer explanation of my issue, along with some screenshots to help illustrate what I’m observing.

 

Dataflow Gen2 – Power Query:

 

let
    Source = CommonDataService.Database("mycompany.crm.dynamics.com"),
    dbo_account = Source{[Schema = "dbo", Item = "account"]}[Data],
    // Deliberate error: the column "xxxx" does not exist in the account table
    #"Renamed columns" = Table.RenameColumns(dbo_account, {{"xxxx", "Error"}})
in
    #"Renamed columns"

 

Asmatte_0-1747820891407.png

This error is also detected in the "Check Validation" section:

 

Asmatte_1-1747821188408.png

Despite the Power Query error, the pipeline activity returns "Succeeded". I get this status whether I do a Save or a manual Refresh of the Dataflow. However, I do get a "Failed" when I do a Save & Run. This seems to confirm that there is a versioning issue.

 

My Pipeline Design

Asmatte_3-1747822661756.png

Here’s how it works:

Step 1 – Set Start Time: @utcNow()

Step 2 – Until: @or(equals(activity('Bronze - Dynamics - Accounts').Status, 'Succeeded'), greaterOrEquals(variables('Counter'), 3))

Step 3 – Set End Time: @utcNow()

Step 4 – Log Outcome:

  • Succeeded:
    • errorCode = null
    • errorMessage = null
  • Failed:
    • errorCode = @activity('Bronze - Dynamics - Accounts').error.code
    • errorMessage = @activity('Bronze - Dynamics - Accounts').error.message

Step 5 – Append the log entry to my log Delta table via a notebook (sketched below).
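For reference, a minimal PySpark sketch of what the logging notebook in Step 5 could look like, assuming a Delta table named pipeline_log and that the pipeline injects the values as notebook parameters (all names and values here are illustrative):

from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()

# Parameter cell -- in practice these values are injected by the pipeline's
# notebook activity; hard-coded here for illustration
activity_name = "Bronze - Dynamics - Accounts"
status = "Failed"
error_code = "UserError"                       # null when the status is Succeeded
error_message = "Illustrative error message"   # null when the status is Succeeded
start_time = "2025-05-21T10:00:00Z"
end_time = "2025-05-21T10:02:00Z"

# Append one row to the log Delta table
log_row = Row(activity_name=activity_name, status=status,
              error_code=error_code, error_message=error_message,
              start_time=start_time, end_time=end_time)
spark.createDataFrame([log_row]).write.format("delta") \
    .mode("append").saveAsTable("pipeline_log")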

 

My concerns:

If the query has changed (e.g. a renamed or missing column) but the Dataflow still returns "Succeeded" due to versioning or cached logic, then errors could go unnoticed.

  • How can I guarantee the latest version of the query is what gets executed in the pipeline?
  • How can I catch schema drift or query errors reliably at runtime?
  • Is there a way to log these errors, even if the pipeline says "Succeeded"?

I’m just getting started with Fabric and want to make sure I’m approaching this the right way.

When you do the "save & run", if the save operation fails then you'll be able to check it through the "Check validation". If it fails, it means that the version that you tried to save didn't actually save, so you are using the previous version of your Dataflow.

 

You now have a few ways to check what your published Dataflow looks like, from easiest to most complex:

  1. Opening the Dataflow: When you use Dataflow Gen2 with CI/CD capabilities, you can always discard any previously unsaved changes and reopen the Dataflow; what you see should then match the version of the Dataflow that the pipeline triggers for refresh.
  2. Checking Git: As this is not an option for you I'll skip it, but it's also one of the easiest: you could just check Git and see exactly what the M code looks like.
  3. REST API: This is a bit more complex, but you can leverage the REST API endpoint for getting the Dataflow definition. The actual M code for your Dataflow is in the part with path mashup.pq, and you'll need to decode its base64 payload to see the full mashup script (see the sketch after this list).
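For option 3, a minimal Python sketch, assuming the generic Fabric "Get Item Definition" endpoint and a pre-acquired bearer token; all IDs and the token are placeholders, and the endpoint may answer with a 202 long-running operation that this sketch does not handle:

import base64
import requests

# Placeholders -- substitute your own values
WORKSPACE_ID = "<workspace-id>"
DATAFLOW_ID = "<dataflow-item-id>"
TOKEN = "<bearer-token>"  # e.g. acquired via MSAL or azure-identity

url = (f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
       f"/items/{DATAFLOW_ID}/getDefinition")
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# The definition is returned as parts; mashup.pq carries the base64-encoded M code
for part in resp.json()["definition"]["parts"]:
    if part["path"] == "mashup.pq":
        print(base64.b64decode(part["payload"]).decode("utf-8"))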

 

In other words, if we find any issues with your Dataflow during the "save" operation, it simply won't be committed at all.

 

If the intent is to trigger an error to see what things look like in Dataflows when an error occurs, you could take a different path: create a Fabric item (like a Lakehouse), create a Dataflow that connects to one of its tables, save and run the Dataflow, then rename the table in the Lakehouse and run the Dataflow again. It should fail, because you changed the table name in the Fabric item after the Dataflow was correctly saved and no validations failed.
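If you go that route, the rename itself can be done from a notebook attached to the Lakehouse with one Spark SQL statement (table names are illustrative; the Lakehouse UI works just as well):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Rename the table the Dataflow reads from, so the next refresh fails at runtime
spark.sql("ALTER TABLE account RENAME TO account_renamed")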

OK, very clear!

 

I now fully understand how Dataflow Gen2 works, and it makes sense, especially compared to Gen1, which I was more familiar with before. That explains the differences I was seeing.

 

To confirm, I ran a test where I didn’t introduce an error in the Power Query itself (since that would prevent saving), but instead I introduced an error in the data source. That way, the Dataflow was successfully saved and then failed at runtime, exactly as expected. The error was properly raised and surfaced in my Data Pipeline.

 

Appreciate the clarification!
