I am encountering an issue when using Dataflow Gen2 to write data into a Warehouse located in another workspace. Specifically, when trying to use a Dataflow from Workspace A, I cannot select a Warehouse from Workspace B as the destination. The message displayed is: 'For performance reasons, only Warehouses in the current workspace are shown.'
To work around this, I created a Shortcut in Workspace A's Lakehouse that points to the Warehouse in Workspace B. This allowed me to see and select the table I wanted from the Warehouse in Workspace B. However, when I try to publish the Dataflow, I receive the following error:
"Mashup Exception Expression Error: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: Expression.Error: The value does not support versioning. Details: Reason = Expression.Error; Detail = #table({"id", "batch_count", "date_id"}, {}); Microsoft.Data.Mashup.Error.Context = User"
I am wondering whether using Shortcuts in this scenario is supported, or whether the issue is related to my Power Query transformations. Can anyone confirm? Any help would be greatly appreciated.
Hi @saradada-xtivia ,
Below are some of my thoughts on your question:
1. Writing Data into a Warehouse in Another Workspace:
Currently, Dataflow Gen2 does not support writing data directly into a Warehouse located in a different workspace. This is a known limitation for performance reasons. The message you received, "For performance reasons, only Warehouses in the current workspace are shown," confirms this limitation.
2. Using Shortcuts to Work Around the Limitation:
While creating a Shortcut in Workspace A's Lakehouse that points to the Warehouse in Workspace B lets you see and select the table, this approach does not appear to be fully supported as a Dataflow destination. The "Mashup Exception Expression Error" you're encountering suggests an issue either with the Power Query transformations or with how the Shortcut is handled at publish time.
3. You can look at these documents below:
Solved: Re: DataFlow with DataWarehouse in other Workspace... - Microsoft Fabric Community
Dataflow Gen2 data destinations and managed settings - Microsoft Fabric | Microsoft Learn
So, as far as I can tell, it's best to keep the Dataflow and the destination Warehouse in the same workspace; this avoids the problem you are experiencing. You can also keep an eye on Data Factory updates, as future releases may introduce features that remove this limitation.
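As an interim alternative, one pattern I've seen used is loading through the Warehouse's SQL (TDS) connection string instead of a Dataflow destination, since a SQL connection is not restricted to the current workspace. Below is a minimal, hedged sketch: the endpoint, database, and table names (`dbo.batches`) are placeholders, not anything from your tenant. The helpers only build the connection string and a parameterized INSERT; actually executing them would require `pyodbc`, the ODBC Driver 18 for SQL Server, and valid Entra ID credentials.

```python
def warehouse_conn_str(server: str, database: str) -> str:
    """Assemble an ODBC connection string for a Fabric Warehouse SQL endpoint.

    server/database are placeholders; copy the real values from the
    Warehouse's settings page in Workspace B.
    """
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
    )


def insert_stmt(table: str, columns: list[str]) -> str:
    """Build a parameterized INSERT statement for the target table."""
    placeholders = ", ".join("?" for _ in columns)
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"


# Using the columns that appear in the error message; the table name is hypothetical:
sql = insert_stmt("dbo.batches", ["id", "batch_count", "date_id"])

# To run it (not executed here), something like:
#   import pyodbc
#   conn = pyodbc.connect(warehouse_conn_str("<endpoint>", "WarehouseB"))
#   conn.cursor().execute(sql, (1, 10, 20240101))
#   conn.commit()
```

This sidesteps both the destination picker and the Shortcut entirely, at the cost of moving the load step out of the Dataflow.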
Best Regards
Yilong Zhou
If this post helps, please consider accepting it as the solution to help other members find it more quickly.