Hello
I have a larger dataflow setup with the following architecture:
Workspace 1: (Premium Capacity, new workspace type)
A 1:1 copy of a SQL source: 10 dataflows, each with one entity (for example, one dataflow per year of data).
No transformation happens; it is up and running without errors.
Workspace 2: (Premium Capacity, new workspace type)
I linked each of the 10 dataflows into a single dataflow in this workspace, so this dataflow has 10 linked entities. After that, I created a new query in this flow that appends the 10 entities.
It is about 50 GB of uncompressed data. When I try to refresh this dataflow, sometimes after 1 hour and sometimes after 5 hours I get the following error message:
Failed,Error: Access token has expired resubmit with a new access token.. RootActivityId = xxxxxxxxxxx .Param1 = Access token has expired resubmit with a new access token Request ID: yyyyyyyyyyy
Any idea?
Thank you in advance for any hints,
Klaus
Any solutions for this?
Do you now know the solution?
No Ruben, still waiting for a reply.
Hi @klabir ,
There are several points you may consider:
1. Please check whether the user has exceeded the resource limits of the Premium capacity.
2. Please check whether the user is an admin or member of the group (workspace).
3. Please check whether the Azure AD auth token has expired.
4. Please check whether the access token expired while the query was still refreshing.
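On point 3, Azure AD access tokens are JWTs that typically expire after about an hour, so a refresh that runs for several hours can outlive the token it started with. A minimal sketch for inspecting when a given token expires, using only the Python standard library (the helper names `jwt_expiry` and `seconds_until_expiry` are illustrative, not part of any Power BI API; the token is decoded without signature verification, which is fine for inspection but never for trusting its contents):

```python
import base64
import json
import time

def jwt_expiry(token: str) -> int:
    """Return the 'exp' claim (Unix seconds) of a JWT, without verifying it."""
    payload_b64 = token.split(".")[1]
    # JWT segments use base64url without '=' padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["exp"]

def seconds_until_expiry(token: str) -> float:
    """Seconds remaining before the token expires (negative if already expired)."""
    return jwt_expiry(token) - time.time()
```

If `seconds_until_expiry` on the token used for the refresh is smaller than the refresh duration you are seeing (1 to 5 hours), that would explain the "Access token has expired" failure.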
If you still have this issue with Power BI, you'd better create a support ticket at Power BI Support: scroll down and click "CREATE SUPPORT TICKET" to get further help.
Best Regards,
Amy
Community Support Team _ Amy
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
How do I check each of these?