myusrn
Regular Visitor

Dataflow, datamart, or dataset solution for creating a table of point-in-time metrics

I needed a way to compute some metrics [ e.g. counts of records with foo, bar and foobar present ] on the 1st and in the middle of every month.

I found that I could use a legacy Power BI dataflow to schedule execution of a SQL statement that produced a single record, with each of those count-based metrics in its own column, and stuffed it into an Azure Data Lake Storage Gen2 container entry. Then I was able to use the Power BI data source connector support for Azure Data Lake Storage Gen2 containers to pull in the set of all those individual one-record files and create a trend-line chart for all those metrics as the months ticked by.
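To make that concrete, here is a minimal Power Query sketch of that combine step, assuming the dataflow writes one-record csv files; the storage account, container and column names are placeholders:

let
    // Point at the ADLS Gen2 container the legacy dataflow wrote its snapshot files into
    // ("mystorageaccount", "metrics" and "snapshot_date" are placeholder names).
    Source   = AzureStorage.DataLake("https://mystorageaccount.dfs.core.windows.net/metrics"),
    // Parse each one-record csv snapshot produced by a scheduled run.
    Parsed   = Table.AddColumn(Source, "Csv", each Table.PromoteHeaders(Csv.Document([Content]))),
    // Stack all the one-record snapshots into a single table to drive the trend-line chart.
    Combined = Table.Combine(Parsed[Csv]),
    Typed    = Table.TransformColumnTypes(Combined, {{"snapshot_date", type date}})
in
    Typed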

Now that the Data Factory Dataflow Gen2 solution no longer relies on an Azure Data Lake Storage Gen2 container, this maneuver for getting at all the scheduled metric computations as a data set isn't going to work.

Any insights as to how I should accomplish this metrics computation, and display the set over time, in the current Power BI experience?

2 REPLIES
GilbertQ
Super User

Hi @myusrn 

 

As far as I am aware, you can use Dataflow Gen2 to get data from an Azure storage account:

 

Set up your Azure Blob Storage connection - Microsoft Fabric | Microsoft Learn





myusrn
Regular Visitor

Hi @GilbertQ, thanks for the follow-up response.

In this case I'm not looking to have the Data Factory > Dataflow Gen2 process get data from an Azure storage account.

I'm looking for it to get data from a MySQL database and write the results into an Azure storage account, configured either as a standard-issue storage account or for Data Lake Storage Gen2 functionality, i.e. with a hierarchical namespace.

Then, in the Power BI Desktop app, I want to configure a data source that pulls the data directly from that Azure storage account populated by the Data Factory > Dataflow Gen2 process, rather than from the Dataflow Gen2 source itself (roughly as sketched below).
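For the standard-issue (blob) storage account case, what I have in mind on the Power BI Desktop side is the same combine pattern through the blob connector; a rough sketch, assuming the snapshot files land in a container named "metrics" (account and container names are placeholders):

let
    // Connect to the storage account and navigate to the container that the
    // Data Factory > Dataflow Gen2 process writes into (placeholder names).
    Source    = AzureStorage.Blobs("mystorageaccount"),
    Container = Source{[Name = "metrics"]}[Data],
    // Combine the one-record snapshot files, as before.
    Parsed    = Table.AddColumn(Container, "Csv", each Table.PromoteHeaders(Csv.Document([Content]))),
    Combined  = Table.Combine(Parsed[Csv])
in
    Combined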

The reason is that using the Data Factory > Dataflow Gen2 source as the Power BI data source, as was the case with the Power BI > dataflow <gen1> source, pulls back only the last record [ set ] written and not the cumulative record set from all timed executions of the dataflow.

Again, the reason for this is that my dataflow's MySQL query produces a single-record output where every column is a point-in-time calculated metric that I cannot compute at a later time down the road.

Displaying the combination of all these point-in-time calculated metric output rows is what allows me to produce a trending data chart.
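For reference, the shape of that scheduled query is roughly the following, assuming the MySQL connector's native-query option; the server, database, table and column names are placeholders, and foo/bar/foobar stand in for the real columns:

let
    // One scheduled refresh = one output row whose columns are point-in-time counts.
    // Server, database, table and column names are placeholders.
    Source = MySQL.Database(
        "myserver.example.com",
        "mydb",
        [Query =
            "SELECT CURRENT_DATE AS snapshot_date, "
            & "COUNT(foo) AS foo_count, "
            & "COUNT(bar) AS bar_count, "
            & "COUNT(foobar) AS foobar_count "
            & "FROM records"]
    )
in
    Source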

Make sense?
