Good day forum
I am working on a project to pull data directly from our data lake into Power BI. Using Power BI Desktop I am able to make the required connections and retrieve the desired data. When I publish this model to the Power BI Service and try to configure the data source connections, I receive the error below. The credentials are not incorrect; I am using Account Key for access.
I found an article on the web mentioning limitations. The limitation below is what I feel is causing the issue, as I am entering a URL containing subfolders in Power BI Desktop. The reason I am using subfolders is that our data lake is very large. Connecting at the container level and using a filter on Folder Path is very slow and most of the time does not finish before timing out.
Limitations
Currently, in Power Query Online, the Azure Data Lake Storage Gen2 connector only supports paths with container, and not subfolder or file. For example, https://<accountname>.dfs.core.windows.net/<container> will work, while https://<accountname>.dfs.core.windows.net/<container>/<filename> or https://<accountname>.dfs.core.windows.net/<container>/<subfolder> will fail.
Does anyone have a way of connecting to the data lake at the container level and being able to navigate the file structure without having to use a filter on Folder Path?
You should be able to connect at the container level, and then in the first step in Power Query filter it down to the folder level. That should be really quick?
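For what it's worth, that suggestion can be sketched in Power Query M; the account, container, and subfolder names below are placeholders, not values from this thread:

```
let
    // Connect at the container level, which Power Query Online supports
    Source = AzureStorage.DataLake("https://<accountname>.dfs.core.windows.net/<container>"),
    // Filter to the target folder as the very first step, so later steps
    // operate on fewer rows
    FilteredFolder = Table.SelectRows(
        Source,
        each Text.StartsWith(
            [Folder Path],
            "https://<accountname>.dfs.core.windows.net/<container>/<subfolder>/"
        )
    )
in
    FilteredFolder
```

Note the filter still runs against the full file listing of the container, so it only helps if that listing itself is fast enough.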
@GilbertQ
That is what I am trying now. I have the source set to the container and am then applying a filter on the Folder Path field. The filter being applied is a "begins with", since I need files from subfolders under the main folder.
The issue, I feel, is that we are querying every file in the container, and there are a lot. In Power BI Desktop, connecting directly to the folder was very quick. I need a way to do the same in the service so we can set up a refresh.
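One option that may help here (an assumption on my part, not something confirmed in this thread) is the connector's HierarchicalNavigation option, which returns top-level folders as rows you can drill into instead of enumerating every file in the container up front. A minimal sketch, with placeholder names:

```
let
    // With HierarchicalNavigation = true, the connector lists folders as
    // navigable rows rather than returning a flat list of every file
    Source = AzureStorage.DataLake(
        "https://<accountname>.dfs.core.windows.net/<container>",
        [HierarchicalNavigation = true]
    ),
    // Drill into one subfolder by name (placeholder; adjust to your lake)
    SubfolderFiles = Source{[Name = "<subfolder>"]}[Content]
in
    SubfolderFiles
```

Whether this avoids the timeout depends on how the service evaluates the navigation, so it is worth testing against your container before relying on it for scheduled refresh.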
What happens if you try and use the SAS Token?
I have used the container name (storage name) in Power BI and successfully refreshed the data quickly.