Hello Community,
I don't know if this is possible, but here's what I am trying to do:
I have built many distinct Power BI reports so far, and I want to cherry-pick tables from these reports to build a recap report that reuses all of the existing table manipulations (merges and conditional columns in Power Query, calculated columns, etc.), so I don't have to repeat all the previous work and can save time.
Many of these tables are the result of merging data from various sources (SQL queries, Excel files, txt files, etc.).
I have found a link that shows how to extract and generate an Excel file for every table contained in a Power BI Desktop report, using DAX Studio: Exporting Data from Power BI Desktop to Excel and CSV – Part 1: Copy & Paste and DAX Studio Methods (biinsight.com). This is great stuff, as you can use all these files to create a new data model from specific selected tables.
The issue is that these Excel files are only a "snapshot" of the last refresh of the .pbix dataset. Even though the underlying tables are refreshed on a regular schedule (which I set up through a Power Automate flow in the Power BI Service), the extracted files will not stay up to date over time, because they come from a one-off connection between Power BI Desktop and DAX Studio.
What I need is a way to get the latest data for these extracted tables without manually downloading and re-extracting dozens of tables every day, which is inefficient and which I don't have time for.
Is there any way to dynamically extract these tables from the refreshed Power BI Service dataset, so the data stays continually up to date?
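To make it clearer what I mean by "dynamically extract", below is roughly the kind of script I am imagining. I have no idea whether the Power BI REST API's executeQueries endpoint is actually the right tool for my scenario, and the dataset ID, table name and access token in the sketch are just placeholders, not real values.

# Rough sketch of what I am hoping is possible: pull a table straight from the
# published (already refreshed) dataset instead of re-exporting it from Desktop.
# DATASET_ID, ACCESS_TOKEN and TABLE_NAME below are placeholders.
import csv
import requests

DATASET_ID = "<dataset-id-from-the-service>"   # placeholder
ACCESS_TOKEN = "<azure-ad-bearer-token>"       # placeholder, however it gets acquired
TABLE_NAME = "MyMergedTable"                   # placeholder table name

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
body = {
    # Same kind of DAX query DAX Studio runs when exporting a whole table
    "queries": [{"query": f"EVALUATE '{TABLE_NAME}'"}],
    "serializerSettings": {"includeNulls": True},
}

response = requests.post(
    url,
    json=body,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()

# The response contains the query result as a list of row dictionaries
rows = response.json()["results"][0]["tables"][0]["rows"]

# Dump the rows to a CSV so it can feed the recap report, like the manual export did
with open(f"{TABLE_NAME}.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)

If something along these lines (or any other approach, e.g. connecting the new report directly to the published dataset) can keep those dozens of tables current without daily manual exports, that would solve my problem.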
Thanks,