Invalid Credentials Error in Power BI Service Despite Successful Connection in Power BI Desktop (ADLS)

I have a Power BI report that pulls data from Azure Data Lake Storage (ADLS). I can load and refresh the data smoothly using Power BI Desktop, but I encounter an issue when publishing the report to the Power BI web service. I'm unable to schedule a data refresh because the credentials are being flagged as invalid.

 

I've taken a few steps to troubleshoot the issue. I've confirmed that the credentials used in both Power BI Desktop and the web service are identical, and I've ensured that the privacy level setting in the web service matches that of Desktop.

I even tried deleting and re-uploading the report in case the problem stemmed from a token caching issue, but unfortunately that didn't resolve it.

 

The authentication method I'm using is OAuth. As I don't have admin rights for the ADLS data source, I can't generate a Shared Access Signature (SAS) token, and the process to obtain permissions can be time-consuming within my company, so I'd prefer to explore alternative solutions.

 

I'm wondering if anyone else has encountered a similar problem or knows what might be causing it and how to resolve it. I had a similar issue before that came down to the data being on-premises, but that is not the case here.

 

The error reads: "

Failed to update data source credentials: The credentials provided for the AzureDataLakeStorage source are invalid. (Source at https://datasource.dfs.core.windows.net/path/to/folder)"
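For reference, the query in Desktop is shaped roughly like the sketch below. This is a simplified reconstruction rather than the actual query, and the account, container, and folder names are placeholders.

let
    // ADLS Gen2 connector pointed below the container level, matching the shape of
    // the path in the error message above (placeholder URL).
    Source = AzureStorage.DataLake("https://accountname.dfs.core.windows.net/container/path/to/folder"),
    // Keep only the parquet files and combine them into a single table.
    ParquetFiles = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),
    Combined = Table.Combine(List.Transform(ParquetFiles[Content], each Parquet.Document(_)))
in
    Combined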

 

Any assistance would be greatly appreciated. 

12 REPLIES
amien
Helper V

This topic is from May and it's still not fixed?

amien
Helper V

I have the same problem: invalid credentials, no matter which authentication kind I choose. Is it safe to conclude that the storage account is reachable from the Power BI service?

 

@v-nokumar Is there a page where we can track this incident? It's very inconvenient.

saumya_sri2311
New Member

Hi,
Have there been any updates on this issue or a fix?

 

Thanks.

Hi @saumya_sri2311 

 

This issue has not been resolved at this time.

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Unfortunately, I'm still getting the same error on the service.

Hi @saumya_sri2311 

 

I went to check this again.

 

Please try the following checks:

 

Are Power Query Online and Azure Storage in the same region? If so, please refer to:

Azure Data Lake Storage Gen2 - Power Query | Microsoft Learn

 

You need to use either an On-premises data gateway or a Virtual Network (VNet) data gateway to access Azure Storage from Power Query Online in the same region.

 

Are Power Query Online and Azure Storage in different regions? If so, you need to add the IP addresses of the Power Query Online region to the Azure Storage firewall under the 'Networking' settings.

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

I've noticed that the documentation you shared says the following about the ADLS Gen2 connector:

 

Limitations

Subfolder or file not supported in Power Query Online

Currently, in Power Query Online, the Azure Data Lake Storage Gen2 connector only supports paths with container, and not subfolder or file. For example, https://<accountname>.dfs.core.windows.net/<container> will work, while https://<accountname>.dfs.core.windows.net/<container>/<filename> or https://<accountname>.dfs.core.windows.net/<container>/<subfolder> will fail.

 

Does this mean that if I have multiple parquet files stored within a subfolder of the container, that might explain the refresh error? If so, is there a way around this?

Hi @stevenmcginnis 

 

If feasible, consider reconfiguring the datastore by moving the necessary files to the root of the container.
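As a rough sketch of what that looks like once the files sit at the container root: the connection path then stops at the container, which is the form Power Query Online supports, and the rest of the query stays the same. The account and container names below are placeholders.

let
    // Connection path scoped to the container only (placeholder URL);
    // Power Query Online supports this form.
    Source = AzureStorage.DataLake("https://accountname.dfs.core.windows.net/container"),
    // Keep only the parquet files and combine them into a single table.
    ParquetFiles = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),
    Combined = Table.Combine(List.Transform(ParquetFiles[Content], each Parquet.Document(_)))
in
    Combined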

 

For more complex scenarios or when working with large amounts of data across multiple subfolders, consider using Azure Data Factory.

 

Regards,

Nono Chen 

v-nuoc-msft
Community Support

Hi @stevenmcginnis 

 

This issue has been confirmed as a known issue internally.

 

Please be patient while the fix is being worked on.

 

If there is any news, I will update it here.

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

This is not a Fabric-capacity-specific problem, right? It also happens with Premium capacity.

Hi @v-nuoc-msft,

 

Thanks for your response. I'm relieved to know it's not just a problem on my end! How long do you estimate it'll take to resolve? Do these types of issues typically take hours, days, or weeks? I'm just trying to figure out my next steps.

 

Thanks,

Steven

Hi @stevenmcginnis 

 

The incident has now been mitigated, but there is no exact time frame for a fix.

 

If there is an update, I'll let you all know here.

 

Regards,

Nono Chen
