I have a Power BI file published to the service from Power BI Desktop. The data in the model is retrieved from Azure Blob Storage. I can refresh the model in Power BI Desktop with no issues. However, when I try to refresh it after publishing the model to PowerBI.com I get the error "An error occurred while processing the data in the dataset".
I have four files, all in .csv format, in one container; they are combined in the query using Combine Binaries. Two additional files in Excel format sit in a separate container and are accessed individually and loaded as tables.
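For reference, this is roughly the pattern my query follows to combine the CSV blobs. The account name `mystorageaccount` and container name `csv-container` are placeholders, and the delimiter/encoding options are just what I'd expect for standard UTF-8 CSVs:

```
let
    // Connect to the storage account (placeholder name)
    Source = AzureStorage.Blobs("mystorageaccount"),
    // Navigate into the container holding the .csv files
    Container = Source{[Name = "csv-container"]}[Data],
    // Keep only the CSV blobs
    CsvFiles = Table.SelectRows(Container, each Text.EndsWith([Name], ".csv")),
    // Parse each blob's binary content as CSV (65001 = UTF-8)
    Parsed = Table.AddColumn(CsvFiles, "Data",
        each Csv.Document([Content], [Delimiter = ",", Encoding = 65001])),
    // Append all parsed tables into one
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```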
I have updated the credentials on PowerBI.com to include my Azure Blob Storage key.
Are there any differences or requirements when refreshing from blob storage in the PowerBI.com service? For example, do I need to save them with different encoding or is Excel format not supported?
To add on to my own topic, the issue seems to be with data volume. I have five files in the blob storage container that add up to about 580 MB. I can refresh the report if I include only three of them (between 300 and 400 MB). As soon as I add the fourth file, the refresh fails. I have tried different sets of files and have refreshed from each file individually, so I don't believe the problem is with any one file. So far it seems that, as long as I keep the amount of data under roughly 500 MB, the refresh succeeds.
Is there a size limitation on the amount of data that can be refreshed from blob storage?
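In case it helps anyone reproduce this, a query along these lines should report the total uncompressed size of the blobs being refreshed (again, account and container names are placeholders; this assumes the blob listing exposes an `Attributes` record with a `Size` field in bytes, as the folder connector does):

```
let
    Source = AzureStorage.Blobs("mystorageaccount"),
    Container = Source{[Name = "csv-container"]}[Data],
    // Each blob's size in bytes is assumed to be in Attributes[Size]
    WithSize = Table.AddColumn(Container, "SizeMB",
        each [Attributes][Size] / (1024 * 1024)),
    // Total size of all blobs in the container, in MB
    TotalMB = List.Sum(WithSize[SizeMB])
in
    TotalMB
```

In my case this total is around 580 MB, and refreshes start failing once the included files pass roughly 500 MB.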