
MattKarriker2
Helper II

Scheduling refresh using Azure Data Lake Storage Gen2 connector

Good day forum
I am working on a project to pull data directly from our data lake into Power BI. Using PBI Desktop I am able to make the required connections and retrieve the desired data. When I publish this model to the PBI Service and try to configure the data source connections, I receive the error below. The credentials are not incorrect; I am using an Account Key for access.

[screenshot of the error: MattKarriker2_0-1670536922451.png]

 

I found an article on the web mentioning limitations. The limitation quoted below is what I believe is causing the issue, as I am entering a URL containing subfolders in Power BI Desktop. The reason I am using subfolders is that our data lake is very large. Connecting at the container level and using a filter on Folder Path is very slow and most of the time does not finish before timing out.


Limitations
Currently, in Power Query Online, the Azure Data Lake Storage Gen2 connector only supports paths with container, and not subfolder or file. For example, https://<accountname>.dfs.core.windows.net/<container> will work, while https://<accountname>.dfs.core.windows.net/<container>/<filename> or https://<accountname>.dfs.core.windows.net/<container>/<subfolder> will fail.

 

Does anyone have a way of connecting to the data lake at the container level and being able to navigate the file structure without having to use a filter on Folder Path?
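
For reference, here is roughly the source step I am using in Desktop; the account, container, and folder names are placeholders, not our real ones:

    let
        // Connecting straight to a subfolder works in Power BI Desktop,
        // but per the limitation quoted above, the Service connector
        // only accepts container-level URLs
        Source = AzureStorage.DataLake("https://myaccount.dfs.core.windows.net/mycontainer/myfolder")
    in
        Source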

3 REPLIES
GilbertQ
Super User

Hi @MattKarriker2 

 

You should be able to connect at the container level and then, in the first step in Power Query, filter down to the folder level. That should be really quick.
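
In M, that would look something like the sketch below (account, container, and folder names are placeholders):

    let
        // Connect at the container root - the only path depth the
        // Service connector currently supports
        Source = AzureStorage.DataLake("https://myaccount.dfs.core.windows.net/mycontainer"),
        // First applied step: filter the flat file list down to the folder you need
        Filtered = Table.SelectRows(
            Source,
            each Text.StartsWith([#"Folder Path"], "https://myaccount.dfs.core.windows.net/mycontainer/myfolder/")
        )
    in
        Filtered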






@GilbertQ 
That is what I am trying now. I have the source set to the container and am applying a filter to the Folder Path field. The filter is a "begins with", since I need files from subfolders under the main folder.
The issue, I believe, is that we are querying every file in the container, and there are a lot. In PBI Desktop, connecting directly to the folder was very quick. I need a way to do the same in the Service so we can set up a refresh.
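One thing that might be worth testing, sketched below with placeholder names: the connector accepts an optional options record, and passing HierarchicalNavigation = true should return a navigable table of folders rather than a flat list of every file in the container. The navigation step assumes the resulting table exposes Name and Content columns, so treat this as a sketch rather than a confirmed fix:

    let
        // Ask the connector for a folder hierarchy instead of a flat
        // listing of every file in the container
        Source = AzureStorage.DataLake(
            "https://myaccount.dfs.core.windows.net/mycontainer",
            [HierarchicalNavigation = true]
        ),
        // Drill into the folder of interest by selecting its row
        // (assumes Name/Content columns on the navigation table)
        MyFolder = Source{[Name = "myfolder"]}[Content]
    in
        MyFolder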

Hi @MattKarriker2 

 

What happens if you try to use a SAS token?

 

I have successfully used the container name (storage name) in Power BI, and the data refreshed quickly.





