
Dataflow refresh fails with cache size limit in premium per user capacity

Hi,

I have a dataflow which imports and refreshes successfully in my shared Premium capacity workspace. However, when I try to import the same definition into another workspace on a PPU license, I get the following error.

I can't find the actual cache size anywhere, or how to increase it. Any help on this would be much appreciated.

 

Error: Encountered user gateway exception: ' We're sorry an error occurred during evaluation. [DM_ErrorDetailNameCode_UnderlyingErrorCode]=-2147467259 [DM_ErrorDetailNameCode_UnderlyingErrorMessage]= We're sorry an error occurred during evaluation. [DM_ErrorDetailNameCode_UnderlyingHResult]=-2147467259 [InnerType]=ErrorException The evaluation reached the allowed cache entry size limit. Try increasing the allowed cache size. [GatewayPipelineErrorCode]=DM_GWPipeline_Gateway_MashupDataAccessError [ErrorShortName]=GatewayClientErrorResponseException[GatewayId=728138]/MashupDataAccessException[ErrorCode=-2147467259 HResult=-2147467259]/Wrapped(InternalMashupException)[ErrorCode=-2147467259 HResult=-2147467259]/Wrapped(ErrorException)[HResult=-2146233088] [ExceptionErrorShortName]=GatewayClientErrorResponseException[GatewayId=728138]'. RootActivityId = d2608fc1-7a2b-4e97-94f8-0dfb4cc49656.Param1 = We're sorry an error occurred during evaluation. [DM_ErrorDetailNameCode_UnderlyingErrorCode]=-2147467259 [DM_ErrorDetailNameCode_UnderlyingErrorMessage]= We're sorry an error occurred during evaluation. [DM_ErrorDetailNameCode_UnderlyingHResult]=-2147467259 [InnerType]=ErrorException The evaluation reached the allowed cache entry size limit. Try increasing the allowed cache size. [GatewayPipelineErrorCode]=DM_GWPipeline_Gateway_MashupDataAccessError [ErrorShortName]=GatewayClientErrorResponseException[GatewayId=728138]/MashupDataAccessException[ErrorCode=-2147467259 HResult=-2147467259]/Wrapped(InternalMashupException)[ErrorCode=-2147467259 HResult=-2147467259]/Wrapped(ErrorException)[HResult=-2146233088] [ExceptionErrorShortName]=GatewayClientErrorResponseException[GatewayId=728138] Request ID: 411e203f-dbb6-637c-0d14-6b1066e41527.

Status: Investigating
Comments
Anonymous
Not applicable

Hi @kamsmial

Is your dataflow Gen 1 or Gen 2? PPU is designed to support enterprise workloads, including Power BI items, with size limits equivalent to a P3 SKU (the same as an F256). F256 has a model size limit of 100 GB, so unless your PPU usage exceeds that limit there should be no memory shortage. Can you create a new PPU workspace and retest the operation?

https://learn.microsoft.com/en-us/power-bi/developer/embedded/embedded-capacity#sku-computing-power

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-per-user-faq#considerations-an...

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-what-is#semantic-model-sku-lim...
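
If you need to confirm which capacity and SKU the workspace is actually assigned to, you can also check it through the Power BI REST API. This is only a minimal sketch, assuming you already have an Azure AD access token with the appropriate Power BI scopes; the token and workspace ID below are placeholders, not values from this thread.

import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder: acquire via MSAL or similar
WORKSPACE_ID = "<workspace-guid>"     # placeholder: the PPU workspace being tested
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Find the workspace and see whether it sits on a dedicated/PPU capacity
groups = requests.get("https://api.powerbi.com/v1.0/myorg/groups", headers=headers).json()["value"]
ws = next(g for g in groups if g["id"] == WORKSPACE_ID)
print("isOnDedicatedCapacity:", ws.get("isOnDedicatedCapacity"))
print("capacityId:", ws.get("capacityId"))

# List the capacities you can see, with their SKUs
caps = requests.get("https://api.powerbi.com/v1.0/myorg/capacities", headers=headers).json()["value"]
for cap in caps:
    print(cap.get("id"), cap.get("displayName"), cap.get("sku"))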

 


Best Regards,
Community Support Team _ Ailsa Tao

kamsmial
Regular Visitor

Thanks @Anonymous 

I'm not sure which SKU has been assigned, as it's managed by the admin in my organization. However, the problem apparently went away once the Enhanced compute engine was turned On for the dataflow. The default setting is Optimized, which never kicked in because this dataflow doesn't link to any other dataflows.

(screenshot: dataflow settings with Enhanced compute engine set to On)
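
For anyone else hitting this: once the Enhanced compute engine setting has been changed, you can re-trigger the dataflow refresh programmatically to retest instead of waiting for the schedule. A minimal sketch using the Dataflows Refresh REST endpoint, assuming an access token with dataflow refresh permissions; the IDs below are placeholders.

import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder: token with dataflow refresh permissions
WORKSPACE_ID = "<workspace-guid>"     # placeholder
DATAFLOW_ID = "<dataflow-guid>"       # placeholder
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Request a refresh of the dataflow; the service acknowledges the request
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/dataflows/{DATAFLOW_ID}/refreshes",
    headers=headers,
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()
print("Refresh requested:", resp.status_code)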