Hello,
for my Power BI reports I make extensive use of dataflows. These dataflows mostly fall into one of two categories:
Case A: Dataflows that extract and transform data sent to email mailboxes. Power Automate copies those (daily) xlsx/csv reports into different folders (mostly SharePoint folders, but also folders on our internal servers). Dataflows then combine all the files stored in those folders, transform the data, and make the result available to the PBI reports.
Case B: Dataflows that connect to web sources once per day, query data from the website, transform it, and make it available to the PBI reports.
For both of these use cases I am wondering whether I could use Azure Data Lake Storage Gen2 to improve things. However, I don't have any experience with ADLS, so I'm not sure it will do what I have in mind:
Case A: Instead of using dataflows that currently query (increasingly large!) folders, I wonder if I could set up new dataflows that incrementally append the data from the daily xlsx/csv reports to ADLS. ADLS would then hold all the data from the reports received via mail. This seems better than querying large folders and re-combining the data from every file each day to refresh the PBI reports.
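One common way to organize incremental daily loads like this in ADLS Gen2 is a date-partitioned folder layout, so each day's report lands in its own folder and downstream queries can filter by partition instead of re-combining every file. A minimal sketch of such a path scheme (the `raw/mail-reports` root, the `sales-inbox` source name, and the `year=/month=/day=` layout are all hypothetical choices, not anything prescribed by ADLS):

```python
from datetime import date
from pathlib import PurePosixPath

def partition_path(container_root: str, source: str, report_date: date) -> str:
    """Build a date-partitioned ADLS Gen2 file path for one daily report.

    Hypothetical layout: <root>/<source>/year=YYYY/month=MM/day=DD/report.csv
    With this layout, a dataflow or engine can prune partitions by date
    rather than scanning one ever-growing flat folder.
    """
    return str(PurePosixPath(
        container_root,
        source,
        f"year={report_date.year:04d}",
        f"month={report_date.month:02d}",
        f"day={report_date.day:02d}",
        "report.csv",
    ))

# Example: the mailbox report received on 2024-03-05
print(partition_path("raw/mail-reports", "sales-inbox", date(2024, 3, 5)))
# raw/mail-reports/sales-inbox/year=2024/month=03/day=05/report.csv
```

The actual upload step would then target this path, whether it is performed by Power Automate, Azure Data Factory, or an SDK client.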
Case B: For the web sources I currently lack any historical data: the dataflow refreshes daily and gets the current data from the website, but the data it fetched yesterday is lost. I would like to store this historical data. The idea would be to set up a dataflow that connects to those sites and saves the data (including a datestamp) to ADLS.
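The datestamping idea in Case B can be sketched independently of any particular storage: stamp every row of the day's extract with a snapshot date before writing it out, so snapshots can sit side by side in ADLS and be filtered or compared later in Power BI. A minimal illustration (the field names and the shape of the fetched rows are invented for the example; the fetch itself is out of scope):

```python
import csv
import io
from datetime import date

def snapshot_csv(rows: list[dict], snapshot_date: date) -> str:
    """Serialize one day's web extract to CSV text, stamping every row
    with the snapshot date so historical snapshots remain distinguishable
    after they are appended to the same store."""
    if not rows:
        return ""
    fieldnames = ["snapshot_date", *rows[0].keys()]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for row in rows:
        writer.writerow({"snapshot_date": snapshot_date.isoformat(), **row})
    return buf.getvalue()

# Example with made-up fields standing in for the web data:
print(snapshot_csv([{"item": "widget", "price": 101.5}], date(2024, 3, 5)))
```

Writing the resulting text to a date-partitioned ADLS path (one file per day) would then give exactly the history the current daily refresh throws away.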
Never having used ADLS, my question is: can I accomplish these goals with ADLS? If so, could you point me to articles/documentation describing the fundamentals needed to set this up? Or is there a solution that is much better suited to these goals?
Thanks for any help!
Hi @chris__1
I think you could do this using Azure Data Factory, which would allow you to store the data in ADLS.
This should help get you started: Load data into Azure Data Lake Storage Gen2 - Azure Data Factory | Microsoft Learn