Hi folks,
Hoping someone can help me. My dataflows are failing with the message: MashupException.Error: The value does not support versioning
I can't find any reference to this particular error message in other answers or elsewhere on the web.
Some detail:
These flows were copied, using templates, from another workspace. The sink is a lakehouse that is also a copy of one in another workspace. The combination works perfectly in the original workspace, and the new one should be an exact copy.
The purpose of these dataflows is to append a delta of new rows to tables in the lakehouse after extracting them from another database.
Thank you for your help
I have the same issue, and I don't see this mentioned on the Fabric Known Issues list. @Anonymous can you help?
Steps followed: Check source table schema, create destination table schema, copy data.
Fast copy and staging are disabled in the Power Query settings.
1. Check source table schema using Table.Schema()
2. Create a destination table with the same schema. Note that I specify the database schema bronze, not dbo:
%%sql
CREATE TABLE lakehouse.bronze.table(
objecttypecode STRING,
useradditionalinfo STRING,
additionalinfo STRING,
createdon TIMESTAMP,
attributemask STRING,
auditid STRING,
_userid_value STRING,
versionnumber BIGINT,
changedata STRING,
_regardingobjectid_value STRING,
transactionid STRING,
timetoliveinseconds INT,
operation INT,
action INT,
_objectid_value STRING,
_callinguserid_value STRING
)
USING DELTA
3. Attempt to run the dataflow. Source and destination mapping shown below
Outcome: fail
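As a cross-check on steps 1 and 2, the destination table's schema and its registration under the bronze schema can be inspected from a notebook (a sketch using the table names from the post; adjust to your lakehouse):

```sql
%%sql
-- Show the column names and types the dataflow must match.
DESCRIBE TABLE lakehouse.bronze.table;

-- Confirm the table is registered under bronze, not the default dbo schema.
SHOW TABLES IN lakehouse.bronze;
```

Comparing this output against the Table.Schema() result from step 1 should surface any type mismatch before the dataflow runs.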
I've had to work around this in the end by just using the default schema for any dataflows. Helpfully, pipelines and notebooks work well, so I can manage until Microsoft works out the last bugs in lakehouse schemas.
Hi @jackcapel
Based on the information you have provided, I speculate that the error may be caused by a mismatch between the data types of the new table and the original table. Here are some suggestions:
Ensure that the schema of the table you're appending to in the new lakehouse matches exactly with the schema in the original lakehouse.
Verify that the column data type in the data flow matches the data type in the target table.
Try simplifying the dataflow by removing any complex transformations or steps that precede the append operation. This can help isolate whether a particular transformation is causing the error.
As a temporary solution, you mentioned that creating new tables in the default schema works; you can use that method to ensure the data is appended correctly.
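The last suggestion can be combined with a notebook step so the rows still end up in the intended schema: let the dataflow append into a table in the default dbo schema, then move the delta across. A sketch, assuming a staging table named dbo.table_staging (that name is not from the original posts):

```sql
%%sql
-- The dataflow appends into the default schema; a notebook then moves the delta.
INSERT INTO lakehouse.bronze.table
SELECT * FROM lakehouse.dbo.table_staging;

-- Clear the staging table ready for the next run.
DELETE FROM lakehouse.dbo.table_staging;
```

This keeps the bronze schema as the system of record while sidestepping the dataflow's failure on non-default schemas.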
Regards,
Nono Chen
If this post helps, then please consider Accepting it as the solution to help the other members find it more quickly.
I should add that the new lakehouse uses schemas, but I was able to successfully select the correct table to append to in the output selector.
Also, it appears that the dataflows work if I create a new table in the default schema as the output; they fail when I try to append to or overwrite a table in a different schema.
I have exactly the same issue. Writing to a new table in the default dbo schema works, but replacing data in an existing table in another schema does not.