Need Help: Manage storage shows the dataset size as 37 MB and the overall size as 10 GB, but the online refresh fails with the following error:
"Data source error: {"error":{"code":"DM_GWPipeline_Gateway_CompressedDataSizeForPacketExceededError","pbi.error":{"code":"DM_GWPipeline_Gateway_CompressedDataSizeForPacketExceededError","parameters":{},"details":[]}}} Table: AfterSales File Size."
Note that the "AfterSales File Size" table is loaded from a folder and was working fine before.
We had the same issue; we had pulled all the files in a folder into Power Query to obtain the latest file date (for display in PBI).
- The Compressed Data Size error did not occur in the desktop version when we refreshed there.
- The Compressed Data Size error did not occur in Power Query when we refreshed there.
- The Compressed Data Size error did occur when a refresh was processed on the Power BI service (Premium).
As was suggested above, we simply used Remove Columns to drop all columns not needed for the latest-file-date output. We also used a descending sort to get the latest date and then used Keep Top Rows (= 1) to return it.
The Remove Columns step, it seems, reduced the row size enough to eliminate the Compressed Data Size error when the refresh was processed by the Power BI (Fabric, or whatever the name is today) service.
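For anyone who wants to replicate this, here is a minimal sketch of that pattern in Power Query (M). The folder path and column choice are assumptions, not the original poster's query; the key idea is that dropping every column you don't need (especially the binary Content column) is what keeps the row size small.

```
let
    // Hypothetical folder path - replace with your own source
    Source = Folder.Files("C:\Data\AfterSales"),
    // Remove Columns: keep only what the latest-file-date output needs.
    // Dropping the binary Content column is what shrinks the row size.
    DateOnly = Table.SelectColumns(Source, {"Date modified"}),
    // DESC sort so the newest file comes first
    Sorted = Table.Sort(DateOnly, {{"Date modified", Order.Descending}}),
    // Keep Top Rows (= 1) to return just the latest date
    LatestFileDate = Table.FirstN(Sorted, 1)
in
    LatestFileDate
```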
I was able to resolve the same error by shrinking my text columns, since Power BI supports a maximum of 32,766 characters in a column of the text data type.
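In case it helps anyone, a minimal sketch of that idea in Power Query (M). The table and column name are made up for illustration; the 32,766 figure is the text limit mentioned above.

```
let
    // Hypothetical table with an oversized text column called "Notes"
    Source = #table(
        type table [Id = Int64.Type, Notes = text],
        {{1, Text.Repeat("x", 50000)}}),
    // Trim each value to Power BI's 32,766-character limit for text columns
    Truncated = Table.TransformColumns(
        Source,
        {{"Notes", each Text.Start(_, 32766), type text}})
in
    Truncated
```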
I've got the same error for a data source that was working fine before.
Hello @Maysam and @tryniti84
Was there any solution for it? I am encountering the same issue.
Thank you
I solved the problem. In my case:
- I had imported a complete folder in Power Query. I did this so I could read, from one file, the creation date of a specific file in the folder.
- When I imported the complete folder, I started to see the data size exceeded error.
To solve the problem, I filtered the query on the list of files included in the folder and, thanks to the filter, removed the other files that were not needed. With that, the data size was correct and there were no problems anymore.
To clarify: by "folder" I mean that I was importing the complete folder with all the files inside.
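For reference, a minimal sketch of that filter in Power Query (M); the folder path and file names below are placeholders, not the actual ones.

```
let
    // Hypothetical folder path - replace with your own source
    Source = Folder.Files("C:\Data\Reports"),
    // Keep only the files actually needed; everything else in the folder is dropped
    NeededFiles = Table.SelectRows(
        Source,
        each List.Contains({"Sales_2023.xlsx", "Sales_2024.xlsx"}, [Name])),
    // Also drop the binary Content column if only the file metadata is needed
    Metadata = Table.RemoveColumns(NeededFiles, {"Content"})
in
    Metadata
```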
I have the same error about data size, but the dataset is only 20 MB.
Data source error: {"error":{"code":"DM_GWPipeline_Gateway_CompressedDataSizeForPacketExceededError","pbi.error":{"code":"DM_GWPipeline_Gateway_CompressedDataSizeForPacketExceededError","parameters":{},"details":[]}}} Table: Attachments.