Hi Guys,
I am running into the following issue after copying a semantic model from one workspace to another using TE3:
Underlying Error: {"code":"ASOperationExceptionError","pbi.error":{"code":"ASOperationExceptionError","parameters":{},"details":[{"code":"ModelingServiceError_Reason","detail":{"type":1,"value":"We cannot process the request as we encountered a transient issue when trying to determine user permissions defined in OneLake. Please wait a few minutes and try again.\r\n{\"RootActivityId\":\"F8E98186-BD57-4C34-AF5A-7A1E622709A4\"}\r\n"}},{"code":"ModelingServiceError_Location","detail":{"type":1,"value":"ModelingEngineHost"}},{"code":"ModelingServiceError_ExceptionType","detail":{"type":1,"value":"ModelingASOperationException"}},{"code":"ModelingServiceError_Message","detail":{"type":1,"value":"We cannot process the request as we encountered a transient issue when trying to determine user permissions defined in OneLake. Please wait a few minutes and try again.\r\n{\"RootActivityId\":\"F8E98186-BD57-4C34-AF5A-7A1E622709A4\"}\r\n"}},{"code":"ModelingServiceError_UserErrorCategory","detail":{"type":1,"value":"Unknown"}},{"code":"ModelingServiceError_AdditionalErrorCode","detail":{"type":1,"value":"PFE_UNIVERSAL_SECURITY_TRANSIENT_ERROR"}},{"code":"ModelingServiceError_AdditionalErrorCode","detail":{"type":1,"value":"PFE_EXTRA_INFORMATION"}}]}}
Activity ID: 214813cc-6569-460c-95b6-53b86b70206c
Correlation ID: cff3a314-f5ab-6958-f962-e9c0d1a12c3f
Request ID: 908782dc-96b5-8672-8e62-cb8b82eb9dde
Status code: 500
Time: Tue Nov 11 2025 16:37:22 GMT+0100 (Central European Standard Time)
Service version: 13.0.27046.36
Client version: 2511.1.26629-train
Cluster URI: https://wabi-us-central-b-primary-redirect.analysis.windows.net/
How can I move a semantic model using only Microsoft tools?
Does anybody know this error?
Best,
Jacek
Admin: please move this topic to the Developer forum section.
Problem solved.
The issue was that my workspaces were in different regions. To move the model I used Tabular Editor 3; after deploying it, I had to change the shared connection string to point to the lakehouse already created in the target workspace.
Best,
Jacek
Thanks for the update. It will definitely help others in the community as well. Please stay connected and continue participating in future discussions.
The error you're seeing ("We cannot process the request as we encountered a transient issue when trying to determine user permissions defined in OneLake", with code PFE_UNIVERSAL_SECURITY_TRANSIENT_ERROR) typically means the Microsoft Fabric service hit a temporary problem while checking the OneLake permissions required by your semantic model or its associated lakehouse/shortcut. This is a short-lived platform or security-resolution issue, so simply retrying the operation after several minutes usually resolves it.
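Since the service itself flags the failure as transient, wrapping the operation in a retry loop with backoff is often enough when you script the deployment. A minimal sketch in Python, assuming a hypothetical deploy_model callable standing in for whatever deployment call you use (XMLA, REST, or TE3 command line):

import time

def with_retries(operation, attempts=5, initial_delay=60):
    # Retry a callable that raises on failure; the delay doubles after
    # each attempt (60s, 120s, ...), matching the service's advice to
    # "wait a few minutes and try again".
    delay = initial_delay
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except Exception as exc:  # narrow to the transient error type if you can
            if attempt == attempts or "TRANSIENT" not in str(exc).upper():
                raise
            print(f"Attempt {attempt} failed ({exc}); retrying in {delay}s")
            time.sleep(delay)
            delay *= 2

# usage, with deploy_model as your own deployment function:
# with_retries(lambda: deploy_model("<target-workspace>"))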
Deployment Pipelines: Use Fabric deployment pipelines when moving semantic models between workspaces in the same tenant, provided both workspaces are on Premium or Fabric capacity. This is the official, supported method for promoting semantic model definitions between environments (data is populated by a refresh after deployment).
Power BI REST APIs: For CI/CD scenarios, export the report and its model from the source workspace with the Export API, then bring it into the target workspace with the Import API (see the sketch after this list). Note that PBIX export is subject to size limits and is not available for Direct Lake or large-format models, so for those prefer deployment pipelines or XMLA.
SSMS/XMLA Endpoints: You can use SQL Server Management Studio (SSMS) to connect to the XMLA endpoint of the target workspace and restore the model using TMSL (backup/restore requires a backup storage account attached to the workspace or capacity). This is recommended for exact copies within the same tenant when models are too large for PBIX export/import.
ALM Toolkit or PBIP Structure: For small models or metadata-only moves, you can use ALM Toolkit or PBIP deployments, though these won't copy lakehouse or import-mode data. Avoid direct .pbix publishing for large Direct Lake models.
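To make the REST option concrete, here is a minimal sketch of the export/import round trip, assuming an import-mode model, an AAD access token with the Power BI scopes, and placeholder workspace/report IDs (the file name is illustrative):

import requests

TOKEN = "<aad-access-token>"   # acquire via MSAL / azure-identity; placeholder here
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://api.powerbi.com/v1.0/myorg"

# 1) Export the report and its model from the source workspace as a .pbix
src_group, report_id = "<source-workspace-id>", "<report-id>"
pbix = requests.get(f"{BASE}/groups/{src_group}/reports/{report_id}/Export",
                    headers=HEADERS)
pbix.raise_for_status()

# 2) Import the .pbix into the target workspace
tgt_group = "<target-workspace-id>"
resp = requests.post(
    f"{BASE}/groups/{tgt_group}/imports",
    headers=HEADERS,
    params={"datasetDisplayName": "SalesModel.pbix",
            "nameConflict": "CreateOrOverwrite"},
    files={"file": ("SalesModel.pbix", pbix.content)},
)
resp.raise_for_status()
print(resp.json())  # returns an import id; poll GET /imports/{id} until it succeeds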
Confirm both source and target workspaces are in the same region/capacity and that necessary permissions on OneLake items are granted and propagated for the service principal or user.
If using TE3 or third-party tooling, transient errors are more likely; using Microsoft-native methods (pipelines, REST APIs, SSMS) makes failures less frequent.
For repeated errors, review OneLake security settings, check for capacity assignment issues, and make sure no recent platform outages are impacting the region.
Retry first, and if the problem persists, use deployment pipelines or REST API export/import for the most robust and supported migration path.
Message highlights: PFE_UNIVERSAL_SECURITY_TRANSIENT_ERROR, OneLake permissions, Status code 500, ModelingEngineHost. This typically points to a short-lived platform/security-resolution issue when the service tries to evaluate OneLake permissions for the item your semantic model depends on (Lakehouse/Warehouse/Shortcut). Microsoft classes it as transient, so a retry often succeeds, but you can harden your setup:
Below are four Microsoft-native approaches. Pick the one that best matches your environment (Premium/PPU vs Pro, need for CI/CD, cross-workspace/tenant, etc.).
Best for: moving between Dev → Test → Prod workspaces inside the same tenant/capacity with rules to switch connections/parameters.
Docs: Get started with deployment pipelines.
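If you want to script the promotion, deployment pipelines also expose a Deploy All REST operation. A hedged sketch, with the pipeline ID, stage order, and token as placeholders:

import requests

TOKEN = "<aad-access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}",
           "Content-Type": "application/json"}
pipeline_id = "<deployment-pipeline-id>"

# Deploy everything from the source stage (0 = Dev) to the next stage
body = {
    "sourceStageOrder": 0,
    "options": {
        "allowCreateArtifact": True,     # create items missing in the target stage
        "allowOverwriteArtifact": True,  # overwrite items that already exist
    },
}
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll",
    headers=HEADERS, json=body)
resp.raise_for_status()
print(resp.json())  # returns an operation you can poll for completion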
Best for: source control and repeatable promotion; works with Azure DevOps or GitHub (both Microsoft). Uses the Power BI Project (.pbip) format with TMDL for the semantic model.
Why this is robust: it moves metadata (model + report) as text, supports code review/PRs, and plays nicely with pipelines.
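For orientation, a saved PBIP project looks roughly like this (names are illustrative; the exact files depend on your Power BI Desktop version and whether the TMDL format is enabled):

SalesModel.pbip
SalesModel.Report/
    definition.pbir
    report.json
SalesModel.SemanticModel/
    definition.pbism
    definition/              <- TMDL files for the model
        model.tmdl
        tables/
            DimDate.tmdl
            FactSales.tmdl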
Best for: exact copy of a semantic model between workspaces/capacities in the same tenant, including large models.
TMSL restore example (run in an XMLA query window in SSMS against the target workspace):
{
    "restore": {
        "database": "SalesModel-Prod",
        "file": "https://<storage>.dfs.core.windows.net/power-bi-backup/SalesModel.abf",
        "allowOverwrite": true,
        "readWriteMode": "readWrite"
    }
}
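The matching TMSL backup command, run first against the source workspace (backup/restore requires an Azure storage account attached to the workspace or capacity; the file name is illustrative):
{
    "backup": {
        "database": "SalesModel",
        "file": "SalesModel.abf",
        "allowOverwrite": true,
        "applyCompression": true
    }
}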
Best for: scripting promotions where models already exist, or when reports must be pointed at a model in another workspace.
Note: There isn’t a single REST call that “clones a dataset” end-to-end; for creating new models via automation, combine PBIP/Git or XMLA (Create/Restore). The enhanced Refresh API can manage table/partition refreshes post-move.
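A hedged sketch of that enhanced refresh call, with workspace/dataset IDs, table and partition names, and the token as placeholders:

import requests

TOKEN = "<aad-access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}",
           "Content-Type": "application/json"}
group_id, dataset_id = "<target-workspace-id>", "<dataset-id>"

# Enhanced refresh: full-refresh only the tables touched by the move
body = {
    "type": "full",
    "commitMode": "transactional",
    "objects": [
        {"table": "DimDate"},
        {"table": "FactSales", "partition": "FactSales_Full"},
    ],
}
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
    f"/datasets/{dataset_id}/refreshes",
    headers=HEADERS, json=body)
resp.raise_for_status()  # 202 Accepted; poll the refreshes endpoint for status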
Thank you very much. Let me follow those steps and try!
OK, so the issue is that the region is not the same for the second capacity.
Next question: how do I build a semantic model from scratch in a different capacity with a different region, rather than copying it?
Best,
Jacek
Run in SSMS → XMLA query window against the target workspace. Adjust names, data sources, and partitions for your environment.
{ "createOrReplace": { "object": { "database": "SalesModel_Target" }, "database": { "name": "SalesModel_Target", "compatibilityLevel": 1600, "model": { "culture": "en-US", "dataSources": [ { "name": "Lakehouse_DS", "type": "structured", "connectionDetails": { "protocol": "tds", "address": { "server": "<your-warehouse-or-lakehouse-endpoint>", "database": "<db-name>" } }, "credential": { "AuthenticationKind": "OAuth2" } } ], "tables": [ { "name": "DimDate", "partitions": [ { "name": "DimDate", "source": { "type": "m", "expression": "let Source = ... in Source" } } ] }, { "name": "FactSales", "partitions": [ { "name": "FactSales_Full", "source": { "type": "m", "expression": "let Source = ... in Source" } } ] } ], "roles": [ { "name": "Readers", "modelPermission": "read", "members": [ { "memberName": "user@contoso.com" } ] } ] } } } }
After creation, issue a refresh (full or by table/partition) or trigger a dataset refresh from the Service.
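For example, a full refresh of the new database via TMSL in the same XMLA query window (the database name matches the script above):
{
    "refresh": {
        "type": "full",
        "objects": [
            {
                "database": "SalesModel_Target"
            }
        ]
    }
}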
Hi,
I am using Direct Lake over OneLake, so I cannot publish anything; it is just a remote semantic model.
And I am not using Git integration. Also, how would this approach switch the underlying capacities?
What else can I do?
Best,
Jacek