jessimica1018
Helper II

Desktop connected to Dataflow Gen2: "Key didn't match any rows" error

Hey All! 
I keep getting an error I can't quite figure out and it's causing all sorts of issues. I have a Dataflow Gen2 connected to a SQL Server, and that dataflow is connected as an import (not a direct connection). Any applied step past the Navigation step shows "Expression.Error: The key didn't match any rows in the table."

(screenshots of the error attached)
This error isn't constant; I have been able to get it to load, but it's not predictable and the connection is incredibly slow, sometimes upwards of 30 minutes just to display the preview. 
The column count matches, the column names match, and there are no fully blank columns or rows.

Any ideas on what this might be and how to solve it?


Thank you!  

 

1 ACCEPTED SOLUTION
v-hashadapu
Community Support
Community Support

Hi @jessimica1018 , Thank you for reaching out to the Microsoft Community Forum.

 

Since disconnecting/reconnecting and moving the transformations to the Dataflow didn't fully work, let's try the steps below:

 

Verify the `Assignment` table by running this SQL query in SQL Server: SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'Assignment'. Then, in Dataflow Gen2, right-click the Navigation step and select "Refresh Preview", or use "Refresh Schema" to update the metadata. If the intermittent error is caused by concurrent changes to the source, schedule your Dataflow refresh during low-activity periods. If the table changes often, create a staging table in SQL Server with stable data and query that instead.
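For reference, here is that metadata check as a block with the schema column added (the Navigation step looks the table up by schema and name, so an unexpected schema or a renamed table is a common reason for the "key didn't match any rows" message):

```sql
-- Metadata check for the Assignment table, including its schema.
-- The Dataflow's Navigation step keys on [Schema = "...", Item = "Assignment"],
-- so a schema change or a renamed table surfaces as "the key didn't match any rows".
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Assignment'
ORDER BY TABLE_SCHEMA, ORDINAL_POSITION;
```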

 

For the slowness, push the `Text.ToLower` transformation down to SQL Server with a view, e.g. CREATE VIEW vw_Assignment AS SELECT LOWER(AM_Email) AS AM_Email, LOWER(Rec_Email) AS Rec_Email, LOWER(CM_Email) AS CM_Email plus the table's remaining columns listed explicitly FROM Assignment (don't add * next to the lowered columns, since the duplicate email column names would make CREATE VIEW fail), and connect your Dataflow to `vw_Assignment` instead. Further optimize by applying early filters to reduce data volume, enabling the Enhanced Compute Engine in the Dataflow settings, adding indexes to the `Assignment` table's email columns, and making sure your On-Premises Data Gateway runs on a high-performance machine. Then test the refresh to confirm the error is gone and load times improve.
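A sketch of that view, assuming the dbo schema and using placeholder names for the non-email columns (replace AssignmentID and StartDate with your actual columns), plus the optional indexes on the email columns:

```sql
-- Sketch of the suggested view. AssignmentID and StartDate are placeholders;
-- list the Assignment table's real remaining columns here. SELECT * alongside the
-- lowered columns would repeat the email column names, which CREATE VIEW rejects.
CREATE VIEW dbo.vw_Assignment
AS
SELECT
    LOWER(AM_Email)  AS AM_Email,
    LOWER(Rec_Email) AS Rec_Email,
    LOWER(CM_Email)  AS CM_Email,
    AssignmentID,    -- placeholder
    StartDate        -- placeholder
FROM dbo.Assignment;
GO

-- Optional: index the email columns if they are used in source-side joins or filters.
CREATE INDEX IX_Assignment_AM_Email  ON dbo.Assignment (AM_Email);
CREATE INDEX IX_Assignment_Rec_Email ON dbo.Assignment (Rec_Email);
CREATE INDEX IX_Assignment_CM_Email  ON dbo.Assignment (CM_Email);
```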

 

If none of the above work, the best next step is to report the issue to Microsoft Support, as it may be a bug or a backend issue. Provide them with all the troubleshooting steps you've already taken to help them understand the issue better and provide a solution.
 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.

 


10 REPLIES

That makes complete sense! So if I can't set this up purely in SQL because of the merging of a few tables and the steps/transformations I have the dataflow doing, and I also need the data refreshing as frequently as possible, what is the best way to make sure that while it's refreshing I don't lose info and cause the key error? Connecting Desktop to a lakehouse or warehouse would remove my ability to further filter that dataflow, correct? Same with creating a semantic model (I don't think that would solve it, it'd be the same issue). 

Hi @jessimica1018 , Thank you for reaching out to the Microsoft Community Forum.

 

First, modify your Dataflow to write its output to a Fabric Lakehouse table after merging tables and applying transformations like Text.ToLower. In Dataflow Gen2, set the destination to a Lakehouse, creating a stable table. This ensures Power BI queries a consistent dataset, eliminating key errors caused by SQL Server changes during refreshes. Next, enable incremental refresh in Dataflow Gen2 to support frequent updates without data loss. Use a date column in your Assignment table to track changes and set the policy to refresh only new or updated rows. Schedule refreshes as often as needed to keep the Lakehouse table up-to-date efficiently.
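If the Assignment table doesn't already have a suitable DateTime column for the incremental refresh policy, here is a minimal sketch of adding one (ModifiedDate is a placeholder name; existing rows just get the current timestamp, and keeping it accurate on updates would need a trigger or an application change):

```sql
-- Sketch: add a change-tracking column for the incremental-refresh policy to filter on.
-- ModifiedDate is a placeholder name; the default only stamps newly inserted rows.
ALTER TABLE dbo.Assignment
    ADD ModifiedDate DATETIME2 NOT NULL
        CONSTRAINT DF_Assignment_ModifiedDate DEFAULT SYSUTCDATETIME();

-- Index it so the incremental-refresh range filter folds to an efficient seek.
CREATE INDEX IX_Assignment_ModifiedDate ON dbo.Assignment (ModifiedDate);
```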

 

In Power BI Desktop, connect to the Lakehouse table using Direct Lake mode for faster queries. You can still filter this data in Power Query or with DAX in a semantic model. To optimize performance, apply early filters in your Dataflow before writing to the Lakehouse and enable the Enhanced Compute Engine in the Dataflow Gen2 settings. Test the setup to confirm the error is resolved and load times are reduced, and check the gateway logs in the Power BI Service under Manage Gateways for stability.

 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.

lbendlin
Super User

Text.ToLower is an extremely expensive transform.  Do you really need that?

Sadly yes, connecting UKG (email addresses are mixed upper and lower case; data consistency doesn't matter to some) to AD (all lower case) for RLS requires it. I have also tried removing the first few steps and still get the same error. 

Do these expensive transforms in the dataflow, not in the semantic model.

I moved those transformations to the dataflow. For now it is connecting without error, but as I said in the post, the error isn't constant. This isn't the only file doing it, so I am hesitant to believe the applied steps are the issue. I don't see how a few expensive transformations would cause the error "Expression.Error: The key didn't match any rows in the table." They're definitely part of the slowdown, but not the cause, and it's still extremely slow. 

I saw this when I first got the error and have tried disconnecting and reconnecting many times, no luck. 

If you have a Pro license you can open a Pro ticket at https://admin.powerplatform.microsoft.com/newsupportticket/powerbi
Otherwise you can raise an issue at https://community.fabric.microsoft.com/t5/Issues/idb-p/Issues .
