Hi,
Current situation: we can upload .pbix files that are larger than 1 GB and download them manually.
Target: automate the download of datasets larger than 1 GB in .pbix format.
Issues: the 1 GB limitation, and backups not being in .pbix format.
Details:
We have datasets that are 2-3 GB in size. Since our nightly jobs import data into the database and then refresh the datasets, we would like to add a step that backs up the datasets before they are refreshed. Using the Power BI API we hit the 1 GB limitation.
At the moment there is the option to back up to Azure Storage, which works, but we don't get a .pbix file, and we would need to use Tabular Editor for any changes. Do you have any information on whether there is a way to automate backups of datasets larger than 1 GB while retaining the .pbix file format?
Another question: if we go with incremental refresh and large models, there is no download option. What would be your preferred way to back up datasets that use incremental refresh?
Thanks in advance!
Best regards,
Nemanja
Hi. I'm not very familiar with backup best practices, but I can say two things.
You can export your .pbix file with the REST API. For files smaller than 1 GB it's straightforward, but for larger files you need storage. You can read about the request here: https://learn.microsoft.com/en-us/rest/api/power-bi/reports/export-report-in-group
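If it helps, here is a rough Python sketch of that call, assuming you already have an Azure AD access token with access to the workspace; the workspace ID, report ID, and token values below are placeholders you would fill in yourself (for example from MSAL or a pipeline variable):

# Minimal sketch of the Export Report In Group request; placeholders only.
import requests

WORKSPACE_ID = "<workspace-guid>"     # placeholder
REPORT_ID = "<report-guid>"           # placeholder
ACCESS_TOKEN = "<aad-access-token>"   # placeholder, e.g. acquired via MSAL

# GET .../groups/{groupId}/reports/{reportId}/Export returns the .pbix binary
url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/reports/{REPORT_ID}/Export"
)
response = requests.get(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=600,
)
response.raise_for_status()

# Write the response body straight to disk as the backup file
with open("dataset_backup.pbix", "wb") as f:
    f.write(response.content)

You could run this as an extra step in your nightly job before the refresh kicks off.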
Regarding incremental refresh, I think this backup practice might work for you. It's different from just downloading a file. I hope it helps: https://www.youtube.com/watch?v=RbMFQNthlGM
Regards,
Happy to help!