I have a dataset that gets its data from Dataverse. Recently, all scheduled refreshes have been failing, along with manual refreshes. I have changed the setting to Large semantic model storage format, and I have also confirmed through a PowerShell query that the dataset is on Premium capacity. I am unable to understand how to increase the processing memory for my account. Any help will be appreciated.
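For reference, the Premium-capacity check mentioned above can also be done against the Power BI REST API (`GET /v1.0/myorg/groups`), which reports `isOnDedicatedCapacity` per workspace. A minimal sketch, assuming you already have an Azure AD access token (the `token` parameter here is a placeholder; acquiring it is out of scope):

```python
import json
import urllib.request

PBI_API = "https://api.powerbi.com/v1.0/myorg"

def is_on_premium(workspace: dict) -> bool:
    # Workspaces hosted on Premium/dedicated capacity report
    # isOnDedicatedCapacity = true in the Groups API response.
    return bool(workspace.get("isOnDedicatedCapacity"))

def list_workspaces(token: str) -> list[dict]:
    # Fetch the workspaces visible to the caller; each item may carry
    # isOnDedicatedCapacity and capacityId fields.
    req = urllib.request.Request(
        f"{PBI_API}/groups",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])
```

You would then filter `list_workspaces(token)` for your workspace and call `is_on_premium` on it, which mirrors what the PowerShell check does.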
Solved! Go to Solution.
Hi @Swiya , for increased performance, could you check these, please?
Hello @Swiya
May I ask whether you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems resolve them faster.
Thank you.
Hi @v-karpurapud , I have tried all the solutions mentioned by you and @Akash_Varuna apart from upgrading the account; the rest didn't work for me. I have raised it with my IT team and am waiting for them to upgrade my account.
Thanks for all the help
Hello @Swiya
Thank you for reaching out. It seems your dataset refresh is failing due to memory constraints, even with Large semantic model storage enabled and the dataset hosted on a Premium capacity.
I appreciate @Akash_Varuna's insights.
To troubleshoot this issue: since you have already enabled Large semantic model storage and confirmed Premium capacity, I recommend monitoring memory usage first (for example, with the Premium Capacity Metrics app) to determine whether a capacity upgrade is actually necessary.
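As a rough sanity check before requesting an upgrade, you can compare your model size against the per-dataset memory limit of your SKU. The figures below are the documented per-dataset limits for large semantic models (P1 = 25 GB up to P5 = 400 GB), and the 2x refresh factor is a common rule of thumb (a full refresh can hold a second copy of the model in memory); treat both as assumptions to verify against current Microsoft documentation:

```python
# Per-SKU max memory per dataset (GB) for large semantic model format.
# Verify against current Microsoft docs, as these limits can change.
SKU_MAX_DATASET_GB = {"P1": 25, "P2": 50, "P3": 100, "P4": 200, "P5": 400}

def refresh_fits(model_size_gb: float, sku: str, refresh_factor: float = 2.0) -> bool:
    # A full refresh can need roughly refresh_factor times the model size
    # in memory, since the old copy stays live while the new one processes.
    limit = SKU_MAX_DATASET_GB[sku]
    return model_size_gb * refresh_factor <= limit
```

For example, a 15 GB model would not reliably refresh on a P1 under this rule of thumb, which would point toward a larger SKU or an incremental-refresh policy rather than full refreshes.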
If my response has resolved your query, please mark it as the Accepted Solution to assist others. Additionally, a 'Kudos' would be appreciated if you found my response helpful.
Thank you!