We are seeing an error when we try to connect to the Power BI semantic model created in the Fabric Lakehouse from Power BI Desktop via the OneLake data hub (Power BI semantic models):
"A payload exceeded the allowed size. Reduce the number of warehouses or artifacts within the workspace and try again. The exception was raised by the IDbConnection interface."
We are using Direct Lake mode (or DirectQuery mode) for the semantic model. When we try to create even a single visual, filter, or slicer in the report, we get this error:
"We cannot process the request because we encountered a transient issue requiring fallback to DirectQuery mode. Fallback to DirectQuery is disabled in this semantic model. If this issue persists, consider enabling fallback to DirectQuery mode and try again."
We are using the Small semantic model storage format on a Fabric capacity. Please let us know the limits of this Fabric capacity in terms of the number of Lakehouses, semantic models, and other artifacts supported. Please also let us know the maximum semantic model size supported by the Small semantic model storage format, and any other limitations that could be causing these performance issues.
Does the error message you get mention how many megabytes of memory are being used, or what the current memory limit in megabytes is?
Perhaps this blog post series contains some useful information also:
https://blog.crossjoin.co.uk/2024/04/28/power-bi-semantic-model-memory-errors-part-1-model-size/
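Power BI memory-governance errors often (though not always) spell out both figures, e.g. "consumed memory N MB, memory limit N MB", and those two numbers tell you how far over the limit the model is. As a small illustration, here is a Python sketch that pulls both numbers out of such a message; the exact wording is an assumption and varies between versions, so adjust the regex to match the text you actually see:

```python
import re

def parse_memory_error(message: str):
    """Extract (consumed_mb, limit_mb) from a resource-governing error
    message, or return None if those figures are absent.
    The wording in the pattern is an assumption; adapt it to the
    exact error text you receive."""
    m = re.search(
        r"consumed memory\s+(\d+)\s*MB.*?memory limit\s+(\d+)\s*MB",
        message,
        re.IGNORECASE | re.DOTALL,
    )
    if not m:
        return None
    return int(m.group(1)), int(m.group(2))

# Hypothetical example message, for illustration only:
msg = ("Resource Governing: This operation was canceled because there "
       "wasn't enough memory. More details: consumed memory 2905 MB, "
       "memory limit 3045 MB.")
print(parse_memory_error(msg))  # (2905, 3045)
```

If the consumed figure is close to the limit, the blog series linked above discusses which parts of the model contribute most to that footprint.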
Hi @mcrreddyt
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it may help others.
Otherwise, please respond with more details and we will try to help.
Thank you.
Hi @mcrreddyt
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet. If you have, please share it with the community, as it may help others. If you have any questions about the current thread, please let us know and we will do our best to help. If you have a question about a different issue, we request that you open a new thread.
Thank you.
Hi @mcrreddyt
Thanks for using Microsoft Fabric Community.
The error message indicates that Power BI Desktop is struggling to handle the size or complexity of the Power BI semantic model you're trying to connect to in the Fabric Lakehouse via the OneLake data hub.
This error typically occurs when the payload being processed exceeds the allowed limit. Try reducing the number of warehouses or artifacts within the workspace; this could involve simplifying the semantic model by removing unused tables, columns, or measures.
If modifying the model directly isn't feasible, check whether you have permission to enable fallback to DirectQuery mode in the Fabric portal. This allows Power BI to bypass the Direct Lake path and query the Lakehouse directly, albeit with potential performance trade-offs.
Note: Enabling DirectQuery fallback might lead to slower report loading and interaction times. Evaluate whether that performance penalty is acceptable compared to the current error.
For additional information, please refer to: Fallback
For semantic model limitations, please refer to: Known issues and limitations
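If you prefer to script the fallback setting rather than change it in the portal UI, Direct Lake fallback is controlled by the model-level `directLakeBehavior` property (values `automatic`, `directLakeOnly`, `directQueryOnly`), which can be edited through the XMLA endpoint with a tool such as Tabular Editor or SSMS. The fragment below is only a sketch of where the property sits in a model definition; the database name and compatibility level are hypothetical placeholders, so verify the exact shape against your own model:

```json
{
  "name": "MySemanticModel",
  "compatibilityLevel": 1604,
  "model": {
    "directLakeBehavior": "automatic"
  }
}
```

Setting it to `automatic` permits fallback to DirectQuery; `directLakeOnly` (the behavior implied by the error above) disables fallback.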
I hope this information helps.
Thank you.
Thanks for sharing the details. We are using the Small semantic model storage format on a Fabric capacity of F512 capacity units. Please let us know the limits of this capacity in terms of the number of Lakehouses, semantic models, and other artifacts that can be created. Please also let us know the maximum semantic model size supported by the Small semantic model storage format, and any other limitations that could be causing these performance issues.
I think the Fallback link above has some information about the limits.
See also: Semantic model SKU limitation
What is the size (number of rows) of the lakehouse tables you are attempting to query in Direct Lake mode?
And how many columns from the lakehouse table are involved in your queries (i.e., in your visuals)?
This blog post explains about paging and memory consumption when using Direct Lake.
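Since the questions above concern row and column counts: as a very rough back-of-envelope (this is not Fabric's actual accounting — Direct Lake pages compressed column segments into memory on demand, and compression ratios vary enormously with cardinality), you can estimate the uncompressed size of the columns a visual touches. The row count and bytes-per-value figures below are hypothetical, for illustration only:

```python
def estimate_column_mb(rows: int, bytes_per_value: int) -> float:
    """Uncompressed size of one column in MB (1 MB = 1024 * 1024 bytes).
    bytes_per_value is a guess, e.g. 8 for an int64/float64 value."""
    return rows * bytes_per_value / (1024 * 1024)

# Hypothetical example: a 500M-row fact table, with a visual touching
# 3 numeric columns at ~8 bytes per value (before compression):
total = sum(estimate_column_mb(500_000_000, 8) for _ in range(3))
print(f"{total:,.0f} MB")  # roughly 11,444 MB uncompressed
```

Comparing such an estimate (scaled down by a plausible compression ratio) against the capacity's memory limit can indicate whether the queries are simply too wide or too tall for the SKU.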
I am not sure whether the "Small semantic model storage format" or "Large semantic model storage format" matters for Direct Lake semantic models.
I think it may only matter when the data is stored within the semantic model itself (e.g., an Import mode semantic model).
Ref. Solved: Storage format for workspace with Fabric direct la... - Microsoft Fabric Community
However, this thread could be taken to imply that the "Small semantic model storage format" also affects Direct Lake queries: Solved: Change semantic model size from small to large - Microsoft Fabric Community
I must admit I am unsure about the role of Small vs. Large semantic model storage format when it comes to Direct Lake. I hope someone can clarify.