Hello everyone,
I'm working on a request where files (.csv) are placed in Azure container folders. These folders are created dynamically in ADF whenever a new test case is implemented for a received file. Each time, a new folder is created and a CSV file is dropped into it, so I need to build a Power BI report on the CSV files present in those container folders. I searched the forum for a way to connect to a container folder dynamically and read the files dynamically, but I did not find anything. Can anyone please help me with this scenario?
Thanks,
AB
Hey @Anonymous ,
I guess you mean Azure Blob Storage (= container folder).
To access the data you will need the key from your storage account.
After connecting to the data you can figure out a pattern to update your data on every refresh run.
Maybe you have a naming pattern? That would be helpful in your case.
Or you grab all CSVs inside your storage account. You can do this by adjusting the Power Query (M) code a bit.
Since I don't know your exact case, it's a bit challenging to figure out your target.
Regards
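The "grab all CSVs" approach could be sketched in M roughly like this — a sketch only, where the storage account URL and the container name `data` are placeholder assumptions, not your actual names:

```
// Sketch only: "mystorageaccount" and "data" are hypothetical names.
let
    // List the containers in the account (authenticate with the account key)
    Source = AzureStorage.Blobs("https://mystorageaccount.blob.core.windows.net"),
    // Navigate into one container; its Data column lists every blob inside
    Container = Source{[Name = "data"]}[Data],
    // Keep only CSV blobs, wherever they sit in the folder hierarchy
    CsvBlobs = Table.SelectRows(Container, each [Extension] = ".csv"),
    // Parse each blob's content and combine everything into one table
    Parsed = Table.AddColumn(CsvBlobs, "Csv", each Csv.Document([Content])),
    Combined = Table.Combine(Parsed[Csv])
in
    Combined
```

Because blob listings are flat (the folder path is just part of each blob's `Name`), new folders created by ADF are picked up automatically on the next refresh — no per-folder connection is needed. Combining assumes the CSVs share a common schema.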
This is not the solution I was looking for, @sergej_og. I'm still searching for an answer.
Hey @Anonymous ,
as I wrote above, "Since I don't know your exact case, it's a bit challenging to figure out your target".
Try to describe your goal a bit more precisely.
Maybe you can share some screenshots.
Regards
Here is my requirement:
We have different folders for different clients.
Example: Client A has a folder called A. Every time the ADF pipeline triggers, a new folder with a unique name like 123abcd is created, containing three subfolders: Success, Fail, Summary.
The same structure applies to Client B, and so on.
Now my requirement is: when the ADF pipeline for Client A has completed, it needs to send the folder path as an input connection string to Power BI, which should then read the Success and Fail files from Client A's folders.
Client B has different ADF pipelines; likewise, whenever its pipeline completes, it has to send the Client B folder path as an input connection string to Power BI, which reads the files from Client B's subfolders.
A similar approach applies for the remaining clients.
Is this something doable in Power BI?
For now, I have copied the files to OneDrive and am developing the report from that location; once it is complete, I want to change the source to the storage account folders.
Please let me know.
Regards
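One way to approximate the pattern described above is a text parameter in Power Query that holds the current folder path. ADF cannot push a connection string into a report directly, but after a pipeline completes it can update a dataset parameter through the Power BI REST API ("Update Parameters") and then trigger a dataset refresh. A sketch under those assumptions — `FolderPath` is a hypothetical parameter name, and the storage account and container names are placeholders:

```
// Sketch only: FolderPath is a report parameter, e.g. "A/123abcd/",
// which ADF could set via the Power BI "Update Parameters" REST API
// before triggering a refresh. Account and container names are hypothetical.
let
    Source = AzureStorage.Blobs("https://mystorageaccount.blob.core.windows.net"),
    Container = Source{[Name = "clients"]}[Data],
    // Keep only blobs under the Success and Fail subfolders of the given path
    InScope = Table.SelectRows(
        Container,
        each Text.StartsWith([Name], FolderPath & "Success/")
          or Text.StartsWith([Name], FolderPath & "Fail/")
    ),
    Csvs = Table.SelectRows(InScope, each [Extension] = ".csv"),
    // Parse and combine the matching CSV files
    Parsed = Table.AddColumn(Csvs, "Csv", each Csv.Document([Content])),
    Combined = Table.Combine(Parsed[Csv])
in
    Combined
```

The same query works for every client, since only the parameter value changes between refreshes; a refresh per client (or a client column derived from the blob `Name`) would be needed if all clients must appear in one report.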