I needed a way to compute some metrics [e.g. the count of records with foo, bar, and foobar present] on the 1st and the middle of every month.
I found that I could use a legacy Power BI dataflow to schedule execution of a SQL statement that produced a single record, with each of those count-based metrics in a column, and dropped it into an Azure Data Lake Storage Gen2 container entry. Then I was able to use the Power BI data source connector for Azure Data Lake Storage Gen2 containers to pull in the set of all those individual one-record files and create a trend-line chart for all those metrics as the months ticked by.
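For context, the Power BI Desktop side of that setup was roughly along these lines (a minimal Power Query M sketch; the storage account, container, and metric column names are placeholders for illustration, not the real ones):

```
let
    // List the files the scheduled dataflow runs dropped into the ADLS Gen2 container (placeholder URL).
    Source = AzureStorage.DataLake("https://mystorageacct.dfs.core.windows.net/metrics"),
    // Parse each single-record CSV file into a one-row table.
    Parsed = Table.AddColumn(Source, "Data",
        each Table.PromoteHeaders(Csv.Document([Content]), [PromoteAllScalars = true])),
    // Keep the file timestamp so each record can be placed on the time axis.
    Kept = Table.SelectColumns(Parsed, {"Date modified", "Data"}),
    // Placeholder metric column names; the result is one row per scheduled run, ready for a trend-line chart.
    Expanded = Table.ExpandTableColumn(Kept, "Data", {"FooCount", "BarCount", "FoobarCount"})
in
    Expanded
```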
Now, with the Data Factory Dataflow Gen2 solution, dataflows no longer rely on an Azure Data Lake Storage Gen2 container, so this maneuver for getting at all of the scheduled metric computations as one data set isn't going to work.
Any insights as to how I should accomplish this metrics computation, and display the resulting set over time, in the current Power BI experience?
Hi @myusrn
As far as I am aware, you can use Dataflow Gen2 to get data from an Azure storage account:
Set up your Azure Blob Storage connection - Microsoft Fabric | Microsoft Learn
Hi @GilbertQ, thanks for the follow-up response.
In this case I'm not looking to have the Data Factory > Dataflow Gen2 process get data from an Azure storage account.
I'm looking for it to get data from a MySQL database and write the results into an Azure storage account, configured either as a standard-issue storage account or with Data Lake Storage Gen2 functionality, i.e. a hierarchical namespace, enabled.
Then, in the Power BI Desktop app, I want to configure a data source that pulls the data directly from that Azure storage account populated by the Data Factory > Dataflow Gen2 process, rather than from the Dataflow Gen2 source itself.
The reason being that using the Data Factory > Dataflow Gen2 source as the Power BI data source, as was the case with the Power BI > Dataflow (Gen1) source, pulls back only the last record [set] written and not the cumulative record set from all timed executions of the dataflow.
Again, the reason for this is that my dataflow's MySQL query produces a single-record output where every column is a point-in-time calculated metric that I cannot compute at a later time down the road.
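To illustrate, the dataflow query is shaped roughly like this (a minimal Power Query M sketch assuming the MySQL connector's native query option; the server, database, table, and column names are placeholders):

```
let
    // Placeholder table and column names; COUNT(col) counts non-null values,
    // so every column is a point-in-time metric for the moment the refresh runs.
    MetricsQuery = "SELECT CURRENT_DATE() AS SnapshotDate, "
        & "COUNT(foo) AS FooCount, COUNT(bar) AS BarCount, COUNT(foobar) AS FoobarCount "
        & "FROM records",
    // Placeholder server and database names.
    Source = MySQL.Database("myserver.example.com", "mydb", [Query = MetricsQuery])
in
    Source
```

Every scheduled refresh emits exactly one such row, so only a destination that accumulates those rows preserves the history.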
Displaying the combination of all these point-in-time metric output rows is what allows me to produce a trending data chart.
Make sense?