useruserhi91
Helper I

Dataflow Gen2 error - "Error in replacing table's content with new data in a version"

When loading a single table from D365 BC to my Fabric Lakehouse using Dataflow Gen2, I get this error:

"[TABLENAME]_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Error in replacing table's content with new data in a version: #{0}., Underlying error: OData: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. Details: Reason = DataSource.Error;ErrorCode = Lakehouse036;Message = OData: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.;Detail = [DataSourceKind = "Dynamics365BusinessCentral", DataSourcePath = "Dynamics365BusinessCentral"];Message.Format = OData: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.;Microsoft.Data.Mashup.Error.Context = User'. Error code: Mashup Exception Data Source Error."

 

The table I am trying to load is the "generalLedgerEntries" table from the Standard BC API (v2). I have successfully loaded tables using the exact same logic for other countries (we have 3 countries/tenants with the same table structures for generalLedgerEntries).

 

In my Dataflow Gen2 transformations, I do the following main actions: 1) drill down on "dimensionSetLines", 2) expand dimensionSetLines to get all of its columns, 3) remove columns I don't need, 4) filter out nulls, 5) change data types to text, 6) add a custom column that concatenates two columns, and 7) remove some duplicates.
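For illustration, a minimal Power Query M sketch of those steps (the endpoint, company ID, and all column names below are placeholders, not my real ones):

    let
        // Placeholder Business Central API v2.0 endpoint (tenant, environment and company ID elided)
        Source = OData.Feed("https://api.businesscentral.dynamics.com/v2.0/<tenant>/<environment>/api/v2.0/companies(<companyId>)/generalLedgerEntries"),
        // 1-2) drill into and expand the dimensionSetLines navigation column (expanded names are placeholders)
        Expanded = Table.ExpandTableColumn(Source, "dimensionSetLines", {"code", "valueCode"}, {"dimensionCode", "dimensionValueCode"}),
        // 3) remove columns that are not needed (placeholder column name)
        RemovedCols = Table.RemoveColumns(Expanded, {"lastModifiedDateTime"}),
        // 4) filter out rows with null dimension values
        NoNulls = Table.SelectRows(RemovedCols, each [dimensionValueCode] <> null),
        // 5) change data types to text
        AsText = Table.TransformColumnTypes(NoNulls, {{"dimensionCode", type text}, {"dimensionValueCode", type text}}),
        // 6) add a custom column concatenating two columns
        WithKey = Table.AddColumn(AsText, "DimensionKey", each [dimensionCode] & "-" & [dimensionValueCode], type text),
        // 7) remove duplicates on the new key
        Deduped = Table.Distinct(WithKey, {"DimensionKey"})
    in
        Deduped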

 

I don't understand the error - it's nearly impossible to debug something like this.

 

I have no existing table with the same name in my lakehouse destination.

 

Thanks!

 

 

1 ACCEPTED SOLUTION
SuryaTejaK
Advocate II

Hi @useruserhi91 

 

This error often occurs due to a combination of network instability, data size, or data structure inconsistencies when using Dataflow Gen2 with Dynamics 365 Business Central (D365 BC).

Steps you can try:

Confirm Connectivity and Stability

  • Ensure a stable network connection between Microsoft Fabric and the D365 BC API.
  • If the error mentions "transport connection forcibly closed", it could be caused by a timeout or API rate limits. Consider these actions:
    • Test the connection to D365 BC using a smaller dataset to confirm whether the issue persists (see the sketch after this list).
    • Contact your D365 BC administrator to confirm that no rate limits or API restrictions are being hit.
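For example, a throwaway test query that keeps only the first 1,000 rows can confirm whether small loads succeed before you touch the full table (a sketch; the endpoint placeholder and the row count are illustrative):

    // Hypothetical smoke test: same source, capped at 1,000 rows, no other transformations
    let
        Source  = OData.Feed("<your existing Business Central endpoint>"),
        Limited = Table.FirstN(Source, 1000)
    in
        Limited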

Review the Data Source

  • Verify that the generalLedgerEntries table does not contain problematic data, particularly in fields like dimensionSetLines that you’re expanding.
  • Test loading the raw data without transformations to see if the issue lies in the data extraction step (a sketch follows below).
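One low-risk way to do both checks is a separate query that pulls the raw table and profiles it, with no transformations applied (a sketch; the endpoint is a placeholder):

    // Hypothetical raw-data check: per-column profile (min, max, null count, distinct count)
    let
        Source  = OData.Feed("<your existing Business Central endpoint>"),
        Profile = Table.Profile(Source)
    in
        Profile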

Modify Transformation Steps

Some of your transformations may contribute to the issue. To isolate the problem:

  • Step-by-step testing: Incrementally apply transformations and test the load after each step to pinpoint where the error occurs (see the sketch after this list).
    • Custom Columns: Simplify or temporarily remove any custom column logic to verify if it’s causing issues.
    • Filters: Apply filters to limit the data being processed during testing.
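As a sketch of that step-by-step approach, you could keep a cut-down copy of the query that stops right after the expand step and re-add the later steps one at a time until the error reappears (the endpoint and column names are placeholders):

    // Hypothetical cut-down query: source plus expand only; re-add later steps one by one
    let
        Source   = OData.Feed("<your existing Business Central endpoint>"),
        Expanded = Table.ExpandTableColumn(Source, "dimensionSetLines",
            {"code", "valueCode"}, {"dimensionCode", "dimensionValueCode"})
    in
        Expanded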

Validate Destination Compatibility

  • Even if there is no existing table with the same name in your lakehouse, make sure the column names and data types in the transformed table are ones the lakehouse can accept (see the sketch below).
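For instance, Delta tables in the lakehouse typically reject column names containing spaces or characters such as , ; { } ( ) =, so a final cleanup step along these lines can help (a sketch; the renaming rule and the typed column are illustrative):

    // Hypothetical cleanup: replace spaces in column names and set an explicit text type
    let
        Source  = OData.Feed("<your existing Business Central endpoint>"),
        Renamed = Table.TransformColumnNames(Source, each Text.Replace(_, " ", "_")),
        Typed   = Table.TransformColumnTypes(Renamed, {{"description", type text}})
    in
        Typed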

Enable Diagnostics

In Microsoft Fabric Dataflow Gen2, enable verbose diagnostics: open the Dataflow settings → Enable Tracing to capture detailed logs. Review the logs for specific details about the failure.

 

If the issue persists, try exporting the data from D365 BC manually to a CSV or Parquet file and loading it into your lakehouse directly. This bypasses potential API issues.

 

If this helped to resolve the issue, kindly accept it as the solution.

 

Thank you

 


3 REPLIES
Anonymous
Not applicable

Hi @useruserhi91 ,

 

Thanks for the reply from SuryaTejaK.

 

My follow-up is just to ask whether the problem has been solved.

 

If so, can you accept the correct answer as a solution or share your solution to help other members find it faster?

 

Thank you very much for your cooperation!

 

Best Regards,
Yang
Community Support Team

 

If any post helped, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!


Anonymous
Not applicable

Hi @useruserhi91 ,

 

The error message indicates a problem when refreshing the dataflow.

 

I have a few suggestions below:

 

  • First, make sure your internet connection is stable, especially during a Dataflow refresh.
  • Second, try clearing your browser cache and then signing in again.
  • Third, make sure the account you use with the Dataflow has the required permissions to access the Dynamics 365 BC data source, and double-check that the Dynamics 365 BC data source configuration is correct and up to date.
  • Finally, try recreating the Dataflow from scratch to check whether the problem persists.

 

If you have any other questions please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 

If any post helped, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
