Hi,
I have read that one can use dataflows across multiple workspaces. This would be ideal if reports in different workspaces use the same queries from the same dataflows, and if one has a complete overview of all dataflows used in the different reports and workspaces. Otherwise it would be impossible, or at least very time-consuming, to find and correct the reports if something is wrong with a dataflow that is used in multiple reports across workspaces. Could this be one of the reasons not to use linked dataflows / dataflows from other workspaces? Or am I missing something here?
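For the "complete overview" part of the question, the Power BI REST API can enumerate workspaces and the dataflows they contain, which helps when tracing where a shared dataflow is used. This is only a minimal sketch, assuming an Azure AD access token with Power BI API scopes is already available; the POWERBI_ACCESS_TOKEN variable name and the token acquisition step are assumptions, not anything stated in the thread.

```python
import os
import requests

# Assumption: an AAD access token with Power BI API scopes is supplied via an
# environment variable; acquiring it (e.g. with MSAL) is not shown here.
TOKEN = os.environ["POWERBI_ACCESS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://api.powerbi.com/v1.0/myorg"

def list_dataflows_per_workspace() -> None:
    """Print every dataflow in every workspace the caller can access."""
    groups = requests.get(f"{BASE}/groups", headers=HEADERS).json()["value"]
    for group in groups:
        dataflows = requests.get(
            f"{BASE}/groups/{group['id']}/dataflows", headers=HEADERS
        ).json()["value"]
        for df in dataflows:
            print(f"{group['name']}: {df['name']} ({df['objectId']})")

if __name__ == "__main__":
    list_dataflows_per_workspace()
```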
Hi @Anonymous ,
Links between workspaces
Refresh for links from entities in different workspaces behaves like an external data source. When the dataflow refreshes, it takes the latest data for the entity from the source dataflow. If the source dataflow refreshes, it doesn't automatically impact the data in the destination dataflow.
Based on the official documentation above, if something goes wrong with the source entities, the problem is not pushed to the destination dataflow automatically; the destination only picks up the error the next time it refreshes. The same applies to fixes: a correction in the source dataflow reaches the destination dataflow only after the destination refreshes again.
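In practice this means refresh order matters: the linked (destination) dataflow only picks up changes, including fixes, when it refreshes after the source dataflow. Below is a hedged sketch, not an official pattern, that queues the two refreshes in order via the documented Dataflows - Refresh Dataflow REST endpoint; the workspace and dataflow IDs are placeholders, the access token is assumed to be available, and a real pipeline would poll the source dataflow's refresh transactions instead of sleeping for a fixed interval.

```python
import os
import time
import requests

TOKEN = os.environ["POWERBI_ACCESS_TOKEN"]  # assumption: AAD token with Power BI scopes
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_dataflow(group_id: str, dataflow_id: str) -> None:
    """Queue a refresh for one dataflow (Dataflows - Refresh Dataflow endpoint)."""
    resp = requests.post(
        f"{BASE}/groups/{group_id}/dataflows/{dataflow_id}/refreshes",
        headers=HEADERS,
        json={"notifyOption": "MailOnFailure"},  # assumed request body shape
    )
    resp.raise_for_status()

# Placeholder IDs for illustration only.
SOURCE_WORKSPACE, SOURCE_DATAFLOW = "<source-workspace-id>", "<source-dataflow-id>"
LINKED_WORKSPACE, LINKED_DATAFLOW = "<linked-workspace-id>", "<linked-dataflow-id>"

# Refresh the source dataflow first...
refresh_dataflow(SOURCE_WORKSPACE, SOURCE_DATAFLOW)

# ...then the linked (destination) dataflow, since it does not update automatically.
# A fixed sleep stands in for polling the source dataflow's refresh status.
time.sleep(600)
refresh_dataflow(LINKED_WORKSPACE, LINKED_DATAFLOW)
```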
Best Regards,
Icey
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Anonymous ,
Is this problem solved?
If it is solved, please accept the reply that resolves it as the solution, so that people who have the same question can find the answer directly.
If not, please let me know.
Best Regards,
Icey
Hi @Anonymous ,
Are you referring to "linked entities"?
Or do you mean connecting to one dataflow in Power BI Desktop, creating different reports, and then publishing them to different workspaces?
Best Regards,
Icey