I just got confirmation from Microsoft support that any semantic model originating from Fabric data cannot be deployed to another workspace, even if it was created with Power BI Desktop (which was their first suggestion).
Have any of you been able to overcome this issue? How?
Thanks to all for your help 🙂
What I ended up doing is creating a script in dev to extract the schema of the semantic model (that's the only part that matters) and then recreating it with another script in prod, so that a colleague can more easily reproduce the Power BI report. At least they now have the right tables, columns, and data types.
The data in the original semantic model is not relevant (it is anonymized).
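In case it helps anyone doing something similar, here is a minimal sketch of what the dev-side extraction could look like (not my exact script), assuming a Fabric notebook with the semantic-link (sempy) library and a lakehouse attached to it; the model name and output file names are placeholders:

```python
# Sketch only: dump the schema of a semantic model from a Fabric notebook
# using semantic-link (sempy). Assumes a lakehouse is attached to the notebook.
import sempy.fabric as fabric

DATASET = "Sales model"  # placeholder: name of the dev semantic model

# Pull the metadata that matters: tables, columns (with data types),
# measures, and relationships. Each call returns a pandas DataFrame.
tables = fabric.list_tables(dataset=DATASET)
columns = fabric.list_columns(dataset=DATASET)
measures = fabric.list_measures(dataset=DATASET)
relationships = fabric.list_relationships(dataset=DATASET)

# Write the schema to the attached lakehouse so the prod-side script (or a
# colleague) can recreate the same tables, columns, and data types there.
out = "/lakehouse/default/Files/schema"
tables.to_csv(f"{out}_tables.csv", index=False)
columns.to_csv(f"{out}_columns.csv", index=False)
measures.to_csv(f"{out}_measures.csv", index=False)
relationships.to_csv(f"{out}_relationships.csv", index=False)
```

The prod-side script then reads these files and recreates matching tables and columns in the production lakehouse.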
Hi @ramonsuarez
Just following up to check whether your query got resolved. If not, we will follow up with more details and try to help.
Thanks
Not really. We found a way to work, but it is manual and prone to error, especially for the Power BI report part.
I read here that using a Direct Lake semantic model with deployment pipelines is not supported.
Did you try using Git integration instead of a deployment pipeline? I think/hope that Git integration works with the lakehouse, the default semantic model, and the report.
However, if you are not using the default semantic model but instead creating a new Direct Lake semantic model from the lakehouse, that does not seem to be supported by Git integration.
I've created Delta tables in a lakehouse, moved the data through the medallion layers, and after modelling it I created a dataset based on the default semantic model, plus a Power BI report. I want to deploy the report and model to a production workspace.
PS: great videos, Andy. Thanks a lot for all your hard work.
I would try using Power BI Desktop to connect to the existing semantic model in the Fabric workspace and create a report that I publish to another workspace. This report will have a live connection to the semantic model in the Fabric workspace.
I think the semantic model itself needs to be in the same workspace as the lakehouse if you want to use Direct Lake. But I think a report with a live connection to that semantic model can be published to another workspace. I would try that at least.
I guess you could also create a new report directly in another workspace (create the report in the Power BI service) and choose your existing semantic model in the Fabric workspace as the source.
If you want to use import mode, you can connect to the lakehouse (SQL analytics endpoint) from Power BI Desktop and create an import mode semantic model that you can publish to another workspace.
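As a side note, the SQL analytics endpoint is an ordinary SQL (TDS) endpoint, so you can sanity-check which tables and columns an import mode model would see from any SQL client, not only from Power BI Desktop. A rough sketch in Python with pyodbc, where the server and database names are placeholders taken from the endpoint's SQL connection string in the Fabric portal:

```python
# Sketch only: query the lakehouse SQL analytics endpoint like any other
# SQL Server-style endpoint. Server/database names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"  # placeholder
    "Database=<your-lakehouse>;"                                  # placeholder
    "Authentication=ActiveDirectoryInteractive;"  # interactive Entra ID sign-in
    "Encrypt=yes;"
)

# List the tables and columns the endpoint exposes -- the same objects an
# import mode semantic model built in Power BI Desktop would connect to.
cursor = conn.cursor()
cursor.execute(
    "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE "
    "FROM INFORMATION_SCHEMA.COLUMNS "
    "ORDER BY TABLE_NAME, ORDINAL_POSITION"
)
for table_name, column_name, data_type in cursor.fetchall():
    print(table_name, column_name, data_type)
conn.close()
```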
I tried to create the semantic model in Power BI Desktop, both on my own and following Microsoft support's instructions: it does not work. As things stand, there is no way to deploy any model that links to a model in Fabric. The screenshot in my post is the message from support saying that it cannot be done.
I don't want to make two reports and update them separately, especially because I don't have access to the production workspace, but I will set up a test workspace and try your ideas. Thanks!
Yes, creating a Direct Lake semantic model won't work in Power BI Desktop.
However, I think you can connect to a Direct Lake semantic model from Power BI Desktop. In Power BI Desktop, use Get Data -> Power BI semantic models, or the OneLake data hub, to find and connect to the semantic model that already exists in your Fabric workspace.
This would give you the ability to create a report in Power BI Desktop with a live connection to the semantic model in the Fabric workspace, and publish that report to another workspace.
Or you can just open a workspace in the Power BI service, click New -> Power BI report, and select your existing semantic model in the Fabric workspace as the data source.
I don't think you need to create two reports.
I am suggesting creating the report in another workspace and keeping only the semantic model in the Fabric workspace.
I just tested this on my side and it worked successfully:
1. Create a lakehouse in workspace A (Fabric trial license). This automatically creates a default semantic model. You can create relationships and measures in the SQL analytics endpoint -> Model view.
2. Create a new report in another workspace (Pro license). Select the semantic model from workspace A as the data source for the new report.
It also worked fine to use Power BI Desktop to connect to the Fabric semantic model, then create the report and publish it to the Pro license workspace.
Thanks a lot. Great idea.
How do you deal in this case with the change from anonymized to real data? Wouldn't using that dataset risk overwriting the real data with every refresh?