Hi,
I need some help, please.
I get this error message:
Data source error: {"error":{"code":"DM_GWPipeline_Gateway_CompressedDataSizeForPacketExceededError","pbi.error":{"code":"DM_GWPipeline_Gateway_CompressedDataSizeForPacketExceededError","parameters":{},"details":[],"exceptionCulprit":1}}}
Cluster URI: WABI-WEST-EUROPE-redirect.analysis.windows.net
Activity ID: e11213d3-dda1-44c5-b555-88b48663828b
Request ID: 333b39d1-06af-4147-92f7-47e3193ab95b
Time: 2023-08-22 21:21:56Z
I use two data sources: a SharePoint folder and Redshift (with a select statement to limit the data pulled). The SharePoint folder contains 2,100 files, each a 10 MB uncompressed text file; the PBIX file is 76 MB.
None of this seems like it should be a problem. The refresh stopped working when I changed the Redshift source to select statements (through the advanced options) with Import mode to limit the data loaded. Refresh in Desktop has no issues.
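For reference, this is roughly how I limit the Redshift load in Power Query M. It is only a sketch: the cluster endpoint, database, and table names here are made up, and Value.NativeQuery assumes a connector version that supports native queries.

let
    // Hypothetical cluster endpoint and database name; replace with your own
    Source = AmazonRedshift.Database("example-cluster.eu-west-1.redshift.amazonaws.com:5439", "dev"),
    // Push a native SELECT down to Redshift so only the needed rows and columns cross the gateway
    Limited = Value.NativeQuery(
        Source,
        "select order_id, order_date, amount from sales.orders where order_date >= '2023-01-01'"
    )
in
    Limited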
Any advice would be helpful.
Thanks
Rene
It seems the issue is not related to Redshift. The SharePoint files read is creating the problem. I changed it to a SharePoint contents read and this resolved the issue. Now I need to add incremental refresh and then I should be all set.
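For anyone hitting the same error, here is a minimal sketch of the switch in Power Query M. The site URL, library, and folder names are made up; adjust them to your tenant:

let
    // Hypothetical site URL; replace with your own
    SiteUrl = "https://contoso.sharepoint.com/sites/Finance",
    // SharePoint.Files(SiteUrl, [ApiVersion = 15]) enumerates every file on the
    // site in one flat list, which gets heavy with thousands of files.
    // SharePoint.Contents returns the folder tree instead, so you can drill
    // into just the folder you need before reading any file contents.
    Root = SharePoint.Contents(SiteUrl, [ApiVersion = 15]),
    Library = Root{[Name = "Shared Documents"]}[Content],
    TargetFolder = Library{[Name = "Exports"]}[Content]
in
    TargetFolder

For the incremental refresh part, the usual pattern is to filter on the reserved RangeStart/RangeEnd DateTime parameters (defined under Manage Parameters), assuming the folder listing exposes a Date modified column:

Filtered = Table.SelectRows(TargetFolder, each [Date modified] >= RangeStart and [Date modified] < RangeEnd)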
Thanks, I will check on this tomorrow. Appreciate the fast response.
Rene
Hi @rturnheim
I would make sure that you have the latest version of the Redshift ODBC driver installed.
Also ensure that you have the latest version of the Power BI gateway installed.
Then I would go into the advanced settings for Redshift and make sure you have not set the values too high.
In my client settings I have set the option to Single Row Mode, which is still fast.
Hi
I extended my testing and created separate PBIX files for the Redshift data versus SharePoint. It now seems the error comes from SharePoint and the number of files there. The error stays with the SharePoint data source, not Redshift. I will now reduce the files in each folder and go from there. Thanks for your help. Rene
Thanks for letting us know you are getting closer to the resolution!
Still waiting on support for the changes. Will keep you in the loop.
Thanks