jaryszek
Post Prodigy

Performance and branch handling using Azure Blob Storage

Hello,

I am wondering whether to switch from Azure Blob Storage to Azure Data Lake Storage in order to get folder management capabilities.

I have GitHub files which I will push into Azure Blob Storage (using Java code or GitHub Actions), but I need to keep the Git branch structure.
After that I will need to connect to specific folders and files (CSV ones) in Power BI.

What Power Query should I write in order to get the best performance if I use Azure Data Lake Storage?
Should I connect directly to a specific folder = branch name and list all files using a dynamic parameter?

Or should I connect straight to the file using the connector, with parameters for branch name, folder name, and file names, as described here?
https://www.sqlbi.com/blog/marco/2020/05/29/optimizing-access-to-azure-data-lake-storage-adls-gen-2-... 

Which approach maintains the best performance?

Best
Jacek

1 ACCEPTED SOLUTION
V-yubandi-msft
Community Support

Hi @jaryszek ,

Thank you for engaging with the Microsoft Fabric Community. Switching from Azure Blob Storage to Azure Data Lake Storage Gen2 is a solid decision, especially if you want better folder management and need to preserve Git branch structures for downstream use in Power BI.

 

If your scenario involves dynamically retrieving files from branch-specific folders, you can use Power Query parameters along with the AzureStorage.DataLake function. This lets you list the files within a specific folder dynamically, without loading unnecessary data. This method is helpful when users need to explore or select different branches interactively.
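
For example, a minimal M sketch of that pattern (the storage account and container names are placeholders, and BranchName would normally be a Power Query parameter rather than hardcoded):

let
    // BranchName would be a Power Query parameter; hardcoded here for the sketch
    BranchName = "main",
    // Placeholder storage account and container names
    FolderUrl = "https://yourstorageaccount.dfs.core.windows.net/yourcontainer/" & BranchName,
    // AzureStorage.DataLake returns a table listing the files under the folder
    Files = AzureStorage.DataLake(FolderUrl),
    // Keep only the CSV files
    CsvFiles = Table.SelectRows(Files, each Text.EndsWith([Name], ".csv")),
    // Parse each file's binary content as CSV (65001 = UTF-8)
    Parsed = Table.TransformColumns(CsvFiles, {{"Content", each Csv.Document(_, [Encoding = 65001])}})
in
    Parsed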

Reference: Azure Data Lake Storage Gen2 - Power Query | Microsoft Learn

 

However, if you already know the exact file paths, the best practice for performance is to connect directly to those files using the ADLS Gen2 connector with parameters. This avoids scanning folder contents, significantly reduces metadata overhead, and improves data refresh speed, especially in large file systems.
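
For example, a minimal sketch of a direct-file query (the account and container names are again placeholders; BranchName, FolderName, and FileName would be Power Query parameters):

let
    // These would be Power Query parameters; hardcoded here for the sketch
    BranchName = "main",
    FolderName = "data",
    FileName = "table1.csv",
    FileUrl = "https://yourstorageaccount.dfs.core.windows.net/yourcontainer/"
        & BranchName & "/" & FolderName & "/" & FileName,
    // AzureStorage.DataLakeContents fetches one file directly, with no folder scan
    Source = Csv.Document(AzureStorage.DataLakeContents(FileUrl), [Encoding = 65001])
in
    Source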

 

— Yugandhar
Community Support Team.


2 REPLIES

jaryszek

thanks!

One more question: can I make parameters, use one base path to Azure Data Lake Storage, and append & "table1.csv" at the end as a dynamic variable within each query?
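
For reference, that pattern would look like this in M (BasePath as a hypothetical shared text parameter; the URL is a placeholder):

let
    // BasePath would be a text parameter shared by all queries
    BasePath = "https://yourstorageaccount.dfs.core.windows.net/yourcontainer/main/",
    // Each query appends its own file name to the shared base path
    Source = Csv.Document(AzureStorage.DataLakeContents(BasePath & "table1.csv"))
in
    Source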

Best,
Jacek
