AnthonyGenovese
Resolver III

Data source credentials missing or invalid AFTER data has loaded

We are trying to load some larger on-prem SQL Server tables into a Lakehouse via a Gen2 Dataflow. We receive the error below many minutes after the dataflow has run.

 

Error Code: Challenge Error, Error Details: Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again. (Request ID: 17e029f6-256d-4763-b9f1-0940f17c14de).

 

The thing is, we see the query run on-prem. The error appears to be thrown after roughly the amount of time we would expect the data to take to load into Fabric. So from our perspective, it looks like it connects, runs the query, returns the data, and then throws a credential error!
See screenshots below.

(Screenshot: AnthonyGenovese_0-1689885451742.png)

(Screenshot: AnthonyGenovese_1-1689885469726.png)

1 ACCEPTED SOLUTION
SidJay
Microsoft Employee

For Gateway-based refreshes, we have an existing limitation with token refresh that causes Gateway jobs running longer than an hour to fail. It's not quite as strict as the entire job needing to be under an hour, but if portions of the job take more than an hour, the limitation is hit.

 

As a temporary workaround, are you able to partition your refreshes so that the Gateway-based queries are split into jobs that each take less than an hour? You can then append (union) the partitions via a separate dataflow that runs in the cloud.

 

Thanks
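
To make the suggested workaround concrete, here is a minimal Power Query (M) sketch of the partition-and-append pattern. The server, database, table, and column names (onprem-sql01, SalesDW, dbo.FactSales, OrderDateKey), the partition boundaries, and the staged query names (FactSales_Q1 ... FactSales_Q4) are hypothetical placeholders, not details confirmed in this thread.

// Dataflow Gen2 "A" (runs through the on-prem Gateway): loads ONE partition
// of the large table. Duplicate this query per partition, adjusting the key
// range so that each Gateway job finishes well under an hour.
let
    Source    = Sql.Database("onprem-sql01", "SalesDW"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // The filter folds back to SQL Server, so only this partition's rows
    // cross the Gateway.
    Partition = Table.SelectRows(
        FactSales,
        each [OrderDateKey] >= 20230101 and [OrderDateKey] < 20230401
    )
in
    Partition

// Dataflow Gen2 "B" (cloud-only, no Gateway): references the staged partition
// queries and appends (unions) them into one table for the Lakehouse destination.
let
    Combined = Table.Combine({FactSales_Q1, FactSales_Q2, FactSales_Q3, FactSales_Q4})
in
    Combined

With this split, no single Gateway job has to stay alive for more than an hour; only the cloud-side append works against the full combined dataset.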


4 REPLIES
AnthonyGenovese
Resolver III

Still having this issue.

Hey! Have you created a support ticket for it? If yes, could you please share the ticket identifier?

AnthonyGenovese
Resolver III

I wonder if this is another issue related just to the dataset size?
