Hi,
I have a semantic model of about 40 interrelated tables fed from various sources of live data. I would like to save copies of it at a point in time. (The intention is then to load the exports into another model and see trends over time.) I'm aware that a full data warehouse solution would be better for this, but I'm not set up for that. I just want to be able to dump 40 raw tables of data out.

I can use "Get Data" in Excel and connect to my semantic model, but this works only one table at a time, doesn't include tables hidden in report view, and creates a live connection to each table that I'd rather not have. Is there a good option with a quick process to export all my tables?
Thanks.
You could use the Power BI REST API to automate this. Register an Azure AD app with Power BI API permissions, then use the Execute Queries API or the XMLA endpoint to extract your tables. You can automate the process with Power Automate (or Python).
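A minimal sketch of the REST API route in Python, using the Execute Queries endpoint to pull each table as a DAX `EVALUATE` query. The dataset ID, table names, and access token are placeholders you'd supply from your own tenant and app registration (and the caller needs Build permission on the dataset):

```python
# Sketch: export whole tables from a published semantic model via the
# Power BI "Execute Queries" REST API. Dataset ID, table names, and the
# bearer token below are placeholders -- adjust for your environment.
import json
import urllib.request

API_URL = "https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries"

def build_query_payload(table_name: str) -> dict:
    """Build the request body that evaluates an entire table as DAX."""
    return {
        "queries": [{"query": f"EVALUATE '{table_name}'"}],
        "serializerSettings": {"includeNulls": True},
    }

def export_table(dataset_id: str, table_name: str, token: str) -> list:
    """POST the DAX query and return the result rows (list of dicts)."""
    req = urllib.request.Request(
        API_URL.format(dataset_id=dataset_id),
        data=json.dumps(build_query_payload(table_name)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape: results -> tables -> rows
    return body["results"][0]["tables"][0]["rows"]

# Usage (hypothetical IDs): loop over your ~40 tables and dump each
# snapshot to a dated JSON file, e.g.:
#   for t in ["Sales", "Customers"]:
#       rows = export_table("your-dataset-id", t, access_token)
#       with open(f"{t}_2024-01-31.json", "w") as f:
#           json.dump(rows, f)
```

Note the Execute Queries API has per-call row and result-size limits, so very large tables may need the XMLA endpoint instead.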
You might be able to use Power Automate to set up exports of the data. It would take some work to build a flow for each individual table, but once they were set up you could schedule them to run at regular intervals. Depending on the amount of data in each table, you may need to upgrade your Power Automate license, but there are Power Automate connectors that can get data from a published Power BI semantic model.