March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Hello Community,
I don't know if it is possible, but here's what I am trying to do:
I have built many distinct Power BI reports so far, and I wish to cherry-pick tables from these reports to build a recapitulative report that reuses all of the existing table manipulations (merges and conditional columns in Power Query, calculated columns, etc.), so I don't have to repeat all the previous work and can save time.
Many of these tables are the results of merges of data coming from various data sources (SQL queries, Excel files, txt files, etc.).
I have found a link that shows how to extract and generate an Excel file for every table contained in a Power BI Desktop report, using DAX Studio: Exporting Data from Power BI Desktop to Excel and CSV – Part 1: Copy & Paste and DAX Studio Methods .... This is great stuff, as you can use all these files to create a new data model from specific selected tables.
The issue is that these Excel files are only a "snapshot" of the last refresh of the .pbix dataset, so the underlying data of these tables (for which I set a regular refresh schedule through a Power Automate flow in the Power BI web service) will not stay up to date over time, because it came from a one-time connection between Power BI Desktop and DAX Studio.
What I would need is a way to get the latest data from these extracted tables without manually downloading and extracting these dozens of tables every day, which is inefficient and which I don't have time for.
Is there any way to dynamically extract these tables from the refreshed Power BI Service dataset, so the data stays continually up to date?
Thanks,
Solved! Go to Solution.
Hi @Anonymous
One way you could do this is to store all the individual tables using Dataflows and then connect the different Power BI reports to those dataflows. You could then cherry-pick the tables needed for each dataset.
Hi @Anonymous ,
I couldn't find a way to make DAX Studio export data automatically. Maybe you can try the following approaches:
1. Use R or Python scripts to automatically export data.
2. Use DirectQuery to connect multiple datasets.
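For option 1, here is a minimal Python sketch that pulls a table from a published dataset via the Power BI REST API's Execute Queries endpoint and writes it to CSV. The dataset ID, table name, AAD access token, and the exact shape of the JSON response are assumptions for illustration; this is not the only way to script the export, just one possible approach:

```python
import csv
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def build_dax_query(table_name: str) -> str:
    """Build a DAX query that evaluates an entire table by name."""
    return f"EVALUATE '{table_name}'"

def export_table(dataset_id: str, table_name: str, token: str, out_path: str) -> None:
    """POST the DAX query to the Execute Queries endpoint and save rows to CSV.

    Requires a valid Azure AD access token with dataset read permissions
    (how you obtain it is outside this sketch).
    """
    body = json.dumps({"queries": [{"query": build_dax_query(table_name)}]}).encode()
    req = urllib.request.Request(
        f"{API}/datasets/{dataset_id}/executeQueries",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # Assumed response shape: results -> tables -> rows (list of dicts)
    rows = payload["results"][0]["tables"][0]["rows"]
    if rows:
        with open(out_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)

# Example usage (placeholders, do not run as-is):
# export_table("YOUR-DATASET-ID", "Sales", "YOUR-AAD-TOKEN", "Sales.csv")
```

A script like this could be scheduled (Task Scheduler, cron, etc.) so the extracts stay in sync with the dataset's refresh schedule.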
If the problem is still not resolved, please provide detailed error information or the result you expect. Looking forward to your reply.
Best Regards,
Winniz
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi,
Thanks for your feedback, but the R/Python solution does not meet our company's security standards, as a Public privacy level is required.
As for the DirectQuery solution, there are too many limitations as of now, and again it doesn't meet our company's needs in terms of data management.
As proposed in the previous reply by GilbertQ, we are exploring the Dataflows solution, which seems to better meet our company's requirements.
Have a nice day