Hello,
For my Power BI reports I make extensive use of Dataflows. Those Dataflows mostly fall into one of two categories:
Case A: Dataflows that extract and transform data that is sent to email mailboxes. Power Automate copies those (daily) xlsx/csv reports to different folders (mostly SharePoint folders, but also folders on our internal servers). Dataflows then combine all the files stored within those folders, transform the data, and make the result available to the PBI reports.
Case B: Dataflows that connect to web sources once per day, query data from the website, transform it, and make it available to the PBI reports.
For both of these use cases I am wondering if I could use Azure Data Lake Storage Gen2 to improve things. However, I have no experience with ADLS, so I'm not sure whether it will do what I have in mind:
Case A: Instead of using Dataflows that currently query (increasingly large!) folders, I wonder if I could set up new dataflows that incrementally add the data from the daily xlsx/csv reports to ADLS. ADLS would then hold all the data from the reports received via mail. This seems better than querying large folders and combining all the data from all the files every day to refresh the data in the PBI reports.
Case B: For these web sources I currently lack any historical data: the dataflow refreshes daily and gets the current data from the website, but the data it retrieved yesterday is lost. I would like to store this historical data. The idea would be to set up a dataflow that connects to those sites and saves the data (including a datestamp) to ADLS.
Never having used ADLS, my question is: can I accomplish these goals with ADLS? If so, could you point me to articles/documentation describing the fundamentals needed to set this up? Or is there perhaps a solution that is much better suited to these goals?
Thanks for any help!
Hi @chris__1
I think you could do this using Azure Data Factory, which would allow you to store the data in ADLS.
This should help get you started: Load data into Azure Data Lake Storage Gen2 - Azure Data Factory | Microsoft Learn
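For the Case B scenario (keeping a datestamped daily snapshot), a common pattern is to land each day's data under a date-partitioned path in ADLS Gen2 so nothing is ever overwritten. Here is a minimal sketch using the `azure-storage-file-datalake` Python SDK; the account URL, credential, container name ("raw"), and `source` label are all placeholders you would replace with your own:

```python
from datetime import date


def snapshot_path(source: str, day: date) -> str:
    """Build a date-partitioned path so each day's snapshot is kept, never overwritten."""
    return f"{source}/{day:%Y/%m/%d}/snapshot.csv"


def upload_snapshot(data: bytes, source: str, day: date) -> None:
    """Upload one day's raw bytes to ADLS Gen2 (account/container are placeholders)."""
    # Imported here so the path helper above can be used without the SDK installed.
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://<your-account>.dfs.core.windows.net",  # placeholder
        credential="<your-account-key>",  # placeholder; prefer Azure AD credentials
    )
    fs = service.get_file_system_client("raw")  # landing container (assumed name)
    fs.get_file_client(snapshot_path(source, day)).upload_data(data, overwrite=True)
```

With data partitioned this way, a dataflow (or an ADF pipeline) can later read the whole `web-source/` folder hierarchy to rebuild the full history, which is exactly the historical view the dataflow alone cannot keep today.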