When loading a single table from D365 BC to my Fabric Lakehouse using Dataflow Gen2, I get this error:
"[TABLENAME]_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Error in replacing table's content with new data in a version: #{0}., Underlying error: OData: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. Details: Reason = DataSource.Error;ErrorCode = Lakehouse036;Message = OData: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.;Detail = [DataSourceKind = "Dynamics365BusinessCentral", DataSourcePath = "Dynamics365BusinessCentral"];Message.Format = OData: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.;Microsoft.Data.Mashup.Error.Context = User'. Error code: Mashup Exception Data Source Error."
The table I am trying to load is the "generalLedgerEntries" table from the standard BC API (v2). I have successfully loaded tables using the exact same logic for other countries (we have three countries/tenants with the same table structure for generalLedgerEntries).
In my Dataflow Gen2 transformations, the main actions are: 1) drill down on "dimensionSetLines", 2) expand dimensionSetLines to get all of its columns, 3) remove columns I don't need, 4) filter away nulls, 5) change data types to text, 6) add a custom column that concatenates two columns, and 7) remove some duplicates. (A sketch of the pipeline is below.)
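For reference, a minimal M sketch of that pipeline. The source step stands in for the BC API (v2) connection, and the expanded dimension column names ("code", "valueCode") and the kept-column list are hypothetical placeholders:

let
    // Stand-in for the BC API (v2) connection; "Source" is assumed to be
    // the generalLedgerEntries table with a nested dimensionSetLines column.
    Source = generalLedgerEntries,

    // 1-2) Expand the nested dimensionSetLines tables into columns
    //      (expanded column names here are hypothetical).
    Expanded = Table.ExpandTableColumn(
        Source, "dimensionSetLines",
        {"code", "valueCode"},
        {"dimensionCode", "dimensionValueCode"}
    ),

    // 3) Keep only the columns that are needed (placeholder list).
    Kept = Table.SelectColumns(
        Expanded,
        {"entryNumber", "postingDate", "dimensionCode", "dimensionValueCode"}
    ),

    // 4) Filter away null dimension rows.
    NonNull = Table.SelectRows(Kept, each [dimensionCode] <> null),

    // 5) Change all column types to text.
    AsText = Table.TransformColumnTypes(
        NonNull,
        List.Transform(Table.ColumnNames(NonNull), each {_, type text})
    ),

    // 6) Add a custom column concatenating two columns.
    WithKey = Table.AddColumn(
        AsText, "dimensionKey",
        each [dimensionCode] & "|" & [dimensionValueCode],
        type text
    ),

    // 7) Remove duplicates on the new key.
    Distinct = Table.Distinct(WithKey, {"dimensionKey"})
in
    Distinct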
I don't understand the error, and it's nearly impossible to debug something like this.
I have no existing table with the same name in my lakehouse destination.
Thanks!
This error often stems from network instability, large data volumes, or data structure inconsistencies when using Dataflow Gen2 with Dynamics 365 Business Central (D365 BC).
Steps you can try:
1) Confirm connectivity and stability.
2) Review the data source.
3) Modify your transformation steps. Some of your transformations may contribute to the issue; to isolate the problem, cut the query down and re-add steps one at a time (see the M sketch after this list).
4) Validate destination compatibility.
5) Enable diagnostics. In Microsoft Fabric Dataflow Gen2, enable verbose diagnostics: open the Dataflow settings → Enable Tracing to capture detailed logs, then review the logs for specific details about the failure.
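As a starting point for step 3, a minimal sketch of the isolation idea, assuming the query is named generalLedgerEntries; the row cap and the postingDate cutoff are illustrative values:

let
    Source = generalLedgerEntries,

    // Cap the row count while debugging. Against an OData source such as
    // the BC API this typically folds to $top, so only a small page is
    // transferred - useful for separating volume/network problems from
    // transformation problems.
    Sample = Table.FirstN(Source, 1000),

    // Alternatively, filter as early as possible (postingDate is a real
    // BC field; the cutoff is illustrative) so the predicate can fold to
    // an OData $filter and the payload shrinks at the source.
    Filtered = Table.SelectRows(Sample, each [postingDate] >= #date(2024, 1, 1))
in
    Filtered

If the capped query refreshes cleanly, re-add your original steps one at a time until the failure reappears.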
If the issue persists, try exporting the data from D365 BC manually to a CSV or Parquet file and loading it into your lakehouse directly. This bypasses potential API issues (see the sketch below).
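For the manual-export route, a sketch of reading such a CSV back in a dataflow; the file path is a hypothetical placeholder:

let
    // Hypothetical path to a manual export of generalLedgerEntries.
    RawFile = File.Contents("C:\exports\generalLedgerEntries.csv"),

    // Parse the CSV (encoding 65001 = UTF-8) and promote the first row
    // to column headers.
    Parsed = Csv.Document(RawFile, [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Parsed, [PromoteAllScalars = true])
in
    Promoted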
--If this helped to resolve the issue, kindly accept it as the solution--
Thank you
Hi @useruserhi91,
Thanks to SuryaTejaK for the reply.
This is just a follow-up to ask whether the problem has been solved.
If so, could you accept the correct answer as the solution, or share your own solution, to help other members find it faster?
Thank you very much for your cooperation!
Best Regards,
Yang
Community Support Team
If any post helped, please consider accepting it as the solution so that other members can find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hi @useruserhi91,
The error message indicates a problem when refreshing the dataflow.
I have a few suggestions below:
If you have any other questions, please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helped, please consider accepting it as the solution so that other members can find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!