This is the error message:
<ccon>Expression.Error: The key didn't match any rows in the table.. Microsoft.Data.Mashup.ErrorCode = 10061. Key = [workspaceId = "f5989c2b-3d5d-4534-bfe9-e99709f997ee"]. Table = #table({"workspaceId", "workspaceName", "workspaceType"}, {}). </ccon>;<ccon>The key didn't match any rows in the table.</ccon>. The exception was raised by the IDbCommand interface. Table: contact_flow.
The dataflow connects to Amazon Athena through ODBC with basic authentication. The semantic model that connects to this dataflow cannot complete its scheduled refresh because of this error. What is a way to solve this?
Here's a screenshot of our dataflow's connection:
Hi @Brenz ,
Thanks for reaching out on the Microsoft Fabric community forum.
I hope you were able to resolve the issue.
Please consider marking the reply that helped you as the Accepted Solution, or giving it kudos, so that other users can benefit from it.
If you still need any help, feel free to reach out to us.
Hi @Brenz This looks like a key mismatch error in your ODBC Amazon Athena dataflow, which is preventing the scheduled refresh. Please try the following to resolve the issue:
Check Table & Schema: Ensure contact_flow exists and matches the expected schema.
Verify Authentication: Confirm credentials have proper access.
Test ODBC Connection: Ensure it works outside Power BI.
Check Gateway Setup: If using a gateway, confirm correct mapping.
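For the "Test ODBC Connection" step, a minimal sketch using pyodbc from the gateway machine may help. The keyword names follow the Simba Athena ODBC driver's conventions, and every value below (region, bucket, credentials) is a placeholder you would replace with your own, not something taken from this thread:

```python
try:
    import pyodbc  # requires the Athena ODBC driver installed to actually connect
except ImportError:
    pyodbc = None  # lets the connection-string part run without the driver

def athena_conn_str(region, s3_output, user, password):
    """Build a Simba Athena ODBC connection string.

    All argument values are placeholders (assumptions for illustration).
    """
    return (
        "Driver=Simba Athena ODBC Driver;"
        f"AwsRegion={region};"
        f"S3OutputLocation={s3_output};"
        "AuthenticationType=IAM Credentials;"
        f"UID={user};PWD={password};"
    )

conn_str = athena_conn_str("us-east-1", "s3://my-athena-results/", "my-key-id", "my-secret")

# Uncomment on a machine with the driver installed to test outside Power BI:
# with pyodbc.connect(conn_str, timeout=30) as conn:
#     print("Connection OK:", conn.cursor().execute("SELECT 1").fetchone())
```

If this succeeds outside Power BI but the dataflow still fails, the problem is more likely credentials/gateway configuration than the source itself.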
Hello @Akash_Varuna, thank you for your response. I checked the items above, and all are configured correctly. As additional context, whenever I go to the data source credential settings on the semantic model and sign in again, an on-demand refresh works. The credentials expire after an hour or so because of OAuth2; I believe this could be the reason.
Hi @Brenz Ok, this might be the reason for the error. The OAuth2 token is likely expiring, causing authentication failures during the scheduled refresh.
Could you try using service principal authentication or checking if your data source allows automatic token refresh?
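To illustrate the failure mode being discussed: if a refresh runs longer than the access token's lifetime and the gateway cannot renew the token mid-stream (the Gen1 behavior), the refresh fails partway through. A tiny timing sketch, assuming the roughly one-hour token lifetime mentioned in this thread:

```python
from datetime import timedelta

# Typical OAuth2 access-token lifetime (assumption based on the gateway
# behavior described in this thread; actual lifetime depends on tenant policy).
TOKEN_LIFETIME = timedelta(hours=1)

def refresh_outlives_token(refresh_duration, token_lifetime=TOKEN_LIFETIME):
    """True if a refresh starting with a fresh token would outlive it.

    Without mid-stream token renewal (Dataflows Gen1), such a refresh
    fails with an authentication error partway through.
    """
    return refresh_duration > token_lifetime

print(refresh_outlives_token(timedelta(minutes=45)))  # False: finishes in time
print(refresh_outlives_token(timedelta(minutes=90)))  # True: token expires mid-refresh
```

This is why re-signing-in makes an on-demand refresh work temporarily: it issues a fresh token, but any refresh longer than the token lifetime will still fail under Gen1.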
Hello Akash, thank you for your response. We will explore using service principal authentication. Additionally, our dataflow is currently set up in Gen1; if we migrate to Gen2, would that solve the problem? I'm asking based on documentation I found:
"When using OAuth2 credentials in Dataflows Gen1, the gateway doesn't support refreshing tokens automatically when access tokens expire. Tokens typically expire 1 hour after the refresh starts, but can expire in less than 1 hour, depending on the data source and the tenant policies. Dataflows Gen2, Semantic models, Data pipelines are able to refresh tokens mid-stream and should not be impacted due to this."
https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-onprem-faq#why-do-i-get-t...
Hi @Brenz Yes, migrating might solve the issue: Dataflows Gen2 supports mid-stream token refresh, unlike Gen1, where OAuth2 tokens expire after about an hour. Gen2 should therefore prevent token expiration during refresh.