
Anonymous
Not applicable

How to connect dynamically to Azure container folders and the files inside them?

Hello everyone,

 

I'm working on a request where files (.csv) are placed in Azure container folders. These folders are created dynamically in ADF whenever a new test case is implemented for a received file: each time, a new folder is created and a CSV file is dropped into it. I need to build a Power BI report on the CSV files in those container folders. I searched the forum for a way to connect to the container folders dynamically and read the files dynamically, but I didn't find anything. Can anyone please help me with this scenario?

 

Thanks,

AB 

1 ACCEPTED SOLUTION
sergej_og
Super User

Hey @Anonymous ,
I guess you mean Azure Blob Storage (a container is essentially a folder).
To access the data you will need the access key from your storage account.
After connecting to the data you can work out a pattern to update it on every refresh run.
Maybe you have a naming pattern? That would help in your case.
Or you can grab all CSVs inside your storage account by adjusting the Power Query (PQ) code a bit.

Since I don't know your exact case, it's a bit challenging to figure out your target.

Regards
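A minimal sketch of the "grab all CSVs" adjustment mentioned above, assuming a placeholder storage URL (replace with your own account and container) and comma-delimited, UTF-8 CSVs with a header row:

```m
let
    // placeholder URL: point this at your own storage account / container
    Source = AzureStorage.Blobs("https://youraccount.blob.core.windows.net/yourcontainer"),
    // keep only .csv blobs, wherever they sit in the folder hierarchy
    CsvOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".csv")),
    // parse each blob into a table and promote its first row to headers
    Parsed = Table.AddColumn(CsvOnly, "Csv",
        each Table.PromoteHeaders(Csv.Document([Content], [Delimiter = ",", Encoding = 65001]))),
    // stack all parsed files into one table
    Combined = Table.Combine(Parsed[Csv])
in
    Combined
```

New folders and files dropped into the container are picked up automatically on the next refresh, because the query lists whatever blobs exist at refresh time.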


4 REPLIES

Anonymous
Not applicable

This is not the solution I was looking for, @sergej_og . I'm still searching for an answer.

Hey @Anonymous ,
as I wrote above, "since I don't know your exact case, it's a bit challenging to figure out your target".
Try to describe your goal a bit more precisely.
Maybe you can share some screenshots.

Regards

Anonymous
Not applicable

Here is my requirement:

We have different folders for different clients.

Example: Client A has a folder called A. Every time the ADF pipeline triggers, a new folder with a unique name like 123abcd is created there, containing three subfolders: Success, Fail, and Summary.

The same structure applies to Client B, and so on.

My requirement is that when the ADF pipeline for Client A completes, it should pass the folder path as an input connection string to Power BI, which then reads the Success and Fail files from Client A's folders.

Client B has its own ADF pipelines; likewise, whenever those pipelines complete, they should pass Client B's information as an input connection string to Power BI, which reads the files from Client B's subfolders.

A similar approach applies to the remaining clients.

Is this doable in Power BI?

For now, I have copied the files to OneDrive and am developing the report from that location; once it is complete, I want to change the source to the storage account folders.

Please let me know.

 

Regards
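Power BI cannot receive a connection string pushed to it at refresh time, but a dataset parameter can play that role: ADF can update the parameter through the Power BI REST API (Datasets - Update Parameters In Group) and then trigger a refresh. A sketch under those assumptions, with a hypothetical parameter named ClientFolderPath (e.g. "A/123abcd") and a placeholder storage URL:

```m
let
    // placeholder URL; ClientFolderPath is a hypothetical report parameter
    // that ADF would set (e.g. to "A/123abcd") before triggering a refresh
    Source = AzureStorage.Blobs("https://youraccount.blob.core.windows.net/yourcontainer"),
    // keep only blobs under the run folder the pipeline just wrote
    RunFiles = Table.SelectRows(Source, each Text.Contains([Folder Path], ClientFolderPath)),
    // split by outcome subfolder
    SuccessFiles = Table.SelectRows(RunFiles, each Text.Contains([Folder Path], "/Success/")),
    FailFiles = Table.SelectRows(RunFiles, each Text.Contains([Folder Path], "/Fail/")),
    // parse and combine the Success CSVs (the same pattern applies to FailFiles)
    Parsed = Table.AddColumn(SuccessFiles, "Csv",
        each Table.PromoteHeaders(Csv.Document([Content]))),
    Combined = Table.Combine(Parsed[Csv])
in
    Combined
```

This is a sketch, not a definitive implementation: the exact [Folder Path] values include the full blob URL prefix, so test the Text.Contains filters against your real paths.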
