I had a requirement (a user story) to compute some metrics (e.g., the count of records with foo, bar, and foobar present) on the 1st and the middle of every month.
I found that I could use a legacy Power BI dataflow to schedule execution of a SQL statement that produced a single record, with each of those count-based metrics in a column, and write it as an entry in an Azure Data Lake Storage Gen2 container. Then I was able to use the Power BI data source connector for Azure Data Lake Storage Gen2 to pull in the full set of those single-record files and create a trend-line chart for all those metrics as the months ticked by.
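For context, the scheduled query was conceptually along these lines. This is only an illustrative sketch: the table name (dbo.Records) and column names (foo, bar, foobar) are made up, not the actual schema.

-- Hypothetical metrics query: one row per run, with a snapshot date
-- and each count-based metric in its own column.
-- dbo.Records, foo, bar, foobar are placeholder names for illustration.
SELECT
    CAST(GETDATE() AS date)                        AS SnapshotDate,
    COUNT(CASE WHEN foo    IS NOT NULL THEN 1 END) AS FooCount,
    COUNT(CASE WHEN bar    IS NOT NULL THEN 1 END) AS BarCount,
    COUNT(CASE WHEN foobar IS NOT NULL THEN 1 END) AS FoobarCount
FROM dbo.Records;

Each scheduled run produced one such row, and accumulating those rows over time is what gave me the trend data.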
Now, with the Data Factory Dataflow Gen2 solution, dataflows no longer rely on an Azure Data Lake Storage Gen2 container, so this maneuver to get at all the scheduled metric computations as a data set isn't going to work.
Any insights as to how I should accomplish this metrics computation, and display the set over time, in the current Power BI experience?
Thread title: Dataflow or datamart or dataset solution for creating a table of point-in-time metrics