Hi all,
I am trying to configure incremental refresh on a data source that contains financial entries (general ledger entries). I am using a Dataflow Gen 2 and am publishing to a Fabric Lakehouse. When I publish the dataflow, the refresh fails with the error "[TableName]+WriteToDataDestination: There was a problem refreshing the dataflow. Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: The value does not support versioning..."
If I disable incremental refresh, I am able to publish the data to the lakehouse, but the data volume is too high to do a full replace on every refresh.
I confirmed that my query is fully folded. I am using Transaction Date as my date/time column. I have also tried publishing to a Warehouse, but I get the same error.
Can anyone suggest additional things to check?
Hi @craigmday ,
Could you please confirm if the issue has been resolved after raising a support case? If a solution has been found, it would be greatly appreciated if you could share your insights with the community. This would be helpful for other members who may encounter similar issues.
Thank you for your understanding and assistance.
Hi,
I was not able to get incremental refresh to work, but we are applying a workaround of daily append updates for now. I will advise if/when we get the issue properly resolved.
Hi @craigmday ,
We really appreciate your efforts, and thank you for the update on the issue.
Please continue to use the Fabric community forum for any further assistance.
Thank you
Hi craigmday,
That error usually comes up because of how Dataflow Gen2 handles versioning when writing to a Lakehouse/Warehouse. With incremental refresh, Fabric tries to write only the changed data, but the destination table doesn't support the versioning that this requires. That's why a full refresh works but incremental refresh fails.
One approach you can try: instead of relying on incremental refresh in the dataflow, have the dataflow load only a small, recent slice of the ledger and append it to the Lakehouse, then manage the history on the Lakehouse side (see the sketch below).
This way, the Dataflow is only handling small, fast loads, and your Lakehouse manages history. It usually performs much better and avoids this type of error.
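For illustration, here is a minimal sketch of that pattern in a Fabric notebook (PySpark), assuming the dataflow appends its slice to a staging table. The table and column names (GL_Staging, GL_Entries, TransactionDate) are placeholders, not your actual schema, and `spark` is the session a Fabric notebook provides:

```python
# Sketch only: replace the overlapping date window in the history table with
# the slice the dataflow just appended. Table/column names are hypothetical.
from pyspark.sql import functions as F
from delta.tables import DeltaTable

staging = spark.read.table("GL_Staging")   # small slice loaded by the dataflow

# Date window covered by the new slice.
window = staging.agg(
    F.min("TransactionDate").alias("lo"),
    F.max("TransactionDate").alias("hi"),
).first()

history = DeltaTable.forName(spark, "GL_Entries")

# Remove the overlapping window from history, then append the fresh rows.
history.delete(
    (F.col("TransactionDate") >= window["lo"]) &
    (F.col("TransactionDate") <= window["hi"])
)
staging.write.mode("append").saveAsTable("GL_Entries")
```

The point is that the dataflow never has to rewrite history; the notebook replaces only the date window covered by the latest load.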
Hope this helps — let us know how it goes 👍
Hi Tanveer,
Thanks very much for the feedback. The data source has a compound primary key consisting of six different fields. I tried including all of those fields so the full key value was present, but got the same error. I also tried adding a merged column that combines the six columns into one; however, that step does not fold, so I can't publish with incremental refresh.
Do you know if I need a single unique key column in my data source for this to work?
Thanks.
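On the compound-key point: if the history ends up being managed in a Lakehouse notebook, as suggested above, a Delta MERGE can match on several key columns directly, so the six fields would not need to be combined into one. A minimal sketch, assuming staging and history tables as before and with purely hypothetical key column names:

```python
# Sketch only: upsert the staged slice into the history table on a six-part
# business key. The column names below are placeholders for the real key fields.
from delta.tables import DeltaTable

updates = spark.read.table("GL_Staging")
target = DeltaTable.forName(spark, "GL_Entries")

key_cols = ["Company", "Ledger", "Account", "FiscalYear", "Period", "EntryNo"]
condition = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)

(target.alias("t")
    .merge(updates.alias("s"), condition)
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```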
Hi @craigmday ,
Please consider reaching out to Microsoft Support. You can provide them with all the troubleshooting steps you've already taken, which will help them understand the issue and work toward a resolution. They may be able to identify something specific to your environment or suggest a fix that isn't immediately obvious.
Below is the link to create Microsoft Support ticket:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
Thank you
Hi @craigmday ,
Thanks for reaching out to the Microsoft Fabric community forum.
Always ensure the basic refresh requirements are met before troubleshooting further.
Additionally, as mentioned by @BhaveshPatel, this error can also be caused by a data type issue within your Power BI Desktop file or Excel workbook, or by an out-of-date Power BI Desktop version.
I hope this information helps. Please do let us know if you have any further queries.
Thank you
Hi @craigmday,
Incremental refresh in Dataflow Gen 2 sometimes fails when the column used for incremental refresh is not stored as an integer column in Delta Lake. In that case, first load the data in full with Dataflow Gen 2, then handle the incremental logic in a Lakehouse notebook (below is a sample DimDate incremental-refresh notebook). This will succeed because the Year column is an integer column in Delta Lake.
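The original sample notebook is not reproduced here; the following is a rough sketch of what such a DimDate incremental load could look like in a Fabric notebook (PySpark). The table names (DimDate, DimDate_Staging) are assumptions, not the poster's actual code:

```python
# Sketch only: append DimDate rows for years newer than what is already in the
# Delta table. Table names (DimDate, DimDate_Staging) are assumptions.
from pyspark.sql import functions as F

existing = spark.read.table("DimDate")
incoming = spark.read.table("DimDate_Staging")   # freshly loaded by the dataflow

# Year is stored as an integer in Delta Lake, so the watermark is a simple
# integer comparison.
max_year = existing.agg(F.max("Year")).first()[0] or 0

new_rows = incoming.where(F.col("Year") > F.lit(max_year))
new_rows.write.mode("append").saveAsTable("DimDate")
```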