In our Data Lake (Bronze layer), we have a hierarchical folder structure organized by year, month, and day. Every day, a new folder is added for that date, potentially containing several new JSON files. The goal is to handle this dynamic structure and implement a method that loads files step by step into a Silver-layer table. The selection of the file to be loaded should be controlled by a trigger in a pipeline (for example, using a ForEach loop):
Year
|-- Month
    |-- Day
        |-- file1.json
        |-- file2.json
        |-- ...
Introducing Parameters Filled by the Trigger:
Is there any best practice for loading from an evolving hierarchical folder structure with Dataflow Gen2? Is it even possible?
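To make the intent concrete, here is a minimal sketch of the path logic a trigger could drive. It assumes (hypothetically) that the pipeline trigger supplies the run date and that a `daily_folder_path` helper maps it onto the Year/Month/Day hierarchy described above; the `bronze` base path and function name are illustrative, not part of any Fabric API.

```python
from datetime import date

def daily_folder_path(base: str, d: date) -> str:
    """Build the Bronze-layer folder path for a given day,
    following the Year/Month/Day hierarchy from the question."""
    return f"{base}/{d.year:04d}/{d.month:02d}/{d.day:02d}"

# Example: the folder a trigger firing on 2024-03-15 would target.
print(daily_folder_path("bronze", date(2024, 3, 15)))  # bronze/2024/03/15
```

In a pipeline, the same string could be assembled from Year/Month/Day parameters and passed into the activity (or dataflow) that enumerates the day's JSON files.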
Hi @KKO ,
Did you get a chance to check the previous response on this thread?
We haven't heard back from you and wanted to check whether your query was answered.
If not, please reply and we will share more details and try to help.
Thanks
Hello @KKO ,
We haven't heard back from you on the last response and wanted to check whether you have found a resolution yet.
If you have, please do share it with the community, as it can be helpful to others.
Otherwise, please reply and we will share more details and try to help.
Thanks
I'm not sure I follow. The stated requirements describe a pipeline-centric mechanism that even reads the folder structure, but this was posted in the Dataflows forum under the title "Dynamic Loading in Dataflow Gen2". What's the logic that you want to implement in Dataflow Gen2?
You could use a metadata-driven approach: read, from "somewhere", which folder or set of files to process, then apply a set of filters to the queries you create in Power Query that leverage those values, thereby reaching the set of files or folders you want your Dataflow Gen2 to use.
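The metadata-driven filtering described above can be sketched as plain code. This is an illustration of the idea, not Power Query M: the control value (here a folder prefix) stands in for whatever is read from "somewhere", and `select_files` is a hypothetical helper, not a library function.

```python
def select_files(all_files: list[str], folder_prefix: str) -> list[str]:
    """Metadata-driven selection: keep only the JSON files whose
    path starts with the folder prefix read from the control source."""
    return [f for f in all_files
            if f.startswith(folder_prefix) and f.endswith(".json")]

# A small listing of Bronze-layer paths, filtered to one day's folder.
files = [
    "2024/03/14/file1.json",
    "2024/03/15/file1.json",
    "2024/03/15/file2.json",
    "2024/03/15/notes.txt",
]
print(select_files(files, "2024/03/15"))
# ['2024/03/15/file1.json', '2024/03/15/file2.json']
```

In Power Query the equivalent step would be a filter on the folder-path column of the file listing, with the prefix supplied as a query parameter.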
Hi @KKO ,
Thanks for using Fabric Community.
Please feel free to refer to these resources for some insights on your query:
1. Azure Data Factory Get Metadata Example (mssqltips.com)
2. Get File Names from Source Folder Dynamically in Azure Data Factory (youtube.com)
Hope this provides some insights. Please do let me know in case of further queries.