Access token has expired
We're pulling data into a Dataflow Gen2 from an on-prem SQL Server via a VNet connection. One table in particular is important and also relatively large: about 52 million rows and 97 columns of various data types (text, decimal, boolean, datetime, and integer). Every time I try to pull it in, this error occurs:
Error Code: 999999, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: Downstream service call to url 'https://api.powerbi.com/powerbi/globalservice/v201606/clusterdetails' failed with status code 403. Details: Reason = DataSource.Error;Error = [code=TokenExpired,message=Access token has expired, resubmit with a new access token];ErrorCode = TokenExpired;
If I reduce the number of columns to 20, the refresh succeeds. As I understand it, credentials can expire after about 30 minutes during a dataflow refresh. It seems that if an on-prem table is large enough, Fabric will attempt to bring it in, but whatever it is doing takes too long and the creds expire. Is there a workaround, perhaps some kind of override or another way?
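For context, the column-reduction workaround generalizes to pulling the wide table in column batches and rejoining on a key afterwards, so each individual query finishes well before the credentials expire. This is just a sketch of the batching logic; the table name, key column, and column names below are placeholders, not our real schema:

```python
# Hypothetical sketch: split a very wide table into column batches so each
# extraction query stays small enough to complete before credentials expire.
# dbo.BigTable, Id, and colN are placeholder names, not the real schema.

def column_batches(columns, batch_size=20, key="Id"):
    """Yield SELECT statements, each covering the key plus one batch of columns."""
    for start in range(0, len(columns), batch_size):
        batch = columns[start:start + batch_size]
        cols = ", ".join([key] + batch)
        yield f"SELECT {cols} FROM dbo.BigTable"

# Example: 97 non-key columns split into batches of at most 20 columns each.
cols = [f"col{i}" for i in range(1, 98)]
queries = list(column_batches(cols))
print(len(queries))  # 5 batches for 97 columns
```

The per-batch results would then be merged back together on the key column in the destination.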
Solved! Go to Solution.
Hey Jim,
As you've mentioned, if you are using a gateway, there are some limitations and considerations when it comes to online sources that have expiring tokens. You can read more at the link below:
Data Factory limitations overview - Microsoft Fabric | Microsoft Learn
There isn't a workaround yet, but the team is actively working to address this. It's definitely a good idea to raise a support ticket, as that sort of signal helps us gauge how many users are impacted by this limitation and how severe the impact is.
Hi @miguel,
We tried ingesting the table in its entirety with both the VNet and the on-prem gateway, with unsuccessful outcomes. With the on-prem gateway, a ticket was filed and the recommendation was to create a multi-node cluster. While we don't rule that out, the VNet is preferred because it requires less maintenance. We don't use OAuth with the VNet connection, which is why we wonder why the creds expire; we know expiration happens, but we were hoping that with basic auth they would last longer.
Definitely continue with the support team to inquire more about the refresh of your credentials. If you are effectively using username/password (aka basic auth), then there isn't anything to refresh in terms of tokens, because basic authentication never needs any sort of token refresh; it simply reuses the same username/password.
However, if you are using a destination such as a Lakehouse, or a staging mechanism, then the tokens for those could potentially expire while the refresh is happening.
@miguel Thanks much for the feedback. We are running with staging disabled, but we are using a lakehouse as the destination. So, just to make sure I understand correctly: the error I am experiencing could be coming from the token used to connect to the lakehouse, and it's a built-in limitation?
It could be. It really depends on how long your refresh is running. The gateway logs could give you more information to confirm whether that's the case and whether the token simply expired, which would explain why you can't connect to the Lakehouse. It could also be something else, but it's a good idea to have the support team help you review those logs.
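If it helps while reviewing, a quick way to scan exported log text for the token-expiry signature might look like the sketch below. The log lines here are made up for illustration; real gateway log formats differ, so treat this only as a pattern-matching example:

```python
# Hypothetical helper: scan gateway log lines for token-expiry errors.
# The sample log format below is invented; real gateway logs look different.

def find_token_errors(lines):
    """Return (line_number, text) pairs for lines mentioning TokenExpired."""
    return [(n, line.strip()) for n, line in enumerate(lines, 1)
            if "TokenExpired" in line]

sample = [
    "2024-12-12 06:27:01 INFO  refresh started",
    "2024-12-12 06:58:44 ERROR ErrorCode = TokenExpired; resubmit with a new access token",
    "2024-12-12 06:58:45 INFO  refresh failed",
]
hits = find_token_errors(sample)
print(hits)  # one hit, on sample line 2
```

Comparing the timestamp of the first TokenExpired hit against the refresh start time would show whether the failure lines up with the roughly 30-minute token lifetime mentioned above.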
Hi @ebjim
Thanks for using Fabric Community.
Apologies for the issue you have been facing. To understand the issue better, we would need the logs. Please go ahead and raise a support ticket to reach our support team: Link
After creating a support ticket, please share the ticket number so we can track it for more information.
Thanks.