Repeated error message when refreshing a dataflow
Error message:
The dataflow loads a fact table from the Lakehouse, aggregates the values by user_id in two different ways, joins the two aggregates together, and should write the result back into a new table in the Lakehouse.
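For context, the transformation part of the query has roughly this shape (a minimal M sketch; FactTable, amount, total_amount and row_count are placeholder names, and the Lakehouse navigation steps are omitted):

```
let
    // Placeholder for the fact table loaded from the Lakehouse
    // (the navigation steps generated by the Lakehouse connector are omitted)
    Source = FactTable,

    // First aggregation per user_id: sum of a hypothetical "amount" column
    AggA = Table.Group(Source, {"user_id"},
        {{"total_amount", each List.Sum([amount]), type nullable number}}),

    // Second aggregation per user_id: row count
    AggB = Table.Group(Source, {"user_id"},
        {{"row_count", each Table.RowCount(_), Int64.Type}}),

    // Join the two aggregates on user_id and expand the result
    Joined = Table.NestedJoin(AggA, {"user_id"}, AggB, {"user_id"}, "B", JoinKind.LeftOuter),
    Result = Table.ExpandTableColumn(Joined, "B", {"row_count"})
in
    Result
```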
These are the solutions I have already tried (none of them helped):
removed "null" and "" rows in the logical columns (roughly as in the sketch after this list)
changed the logical columns to text
deleted and re-created the table in the Lakehouse
renamed the table to write it into a new table
tested manual and automatic schema settings (see screenshots)
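The null removal and type change from the list above looked roughly like this (a sketch; JoinedResult stands for the joined query, and has_other_markers / has_fji_only are the flag columns mentioned later in the thread):

```
let
    // Start from the joined result of the two aggregations
    Source = JoinedResult,

    // Remove rows where a flag column is null
    NoNullRows = Table.SelectRows(Source,
        each [has_other_markers] <> null and [has_fji_only] <> null),

    // Re-type the flag columns as text so the destination no longer sees type Logical
    FlagsAsText = Table.TransformColumnTypes(NoNullRows,
        {{"has_other_markers", type text}, {"has_fji_only", type text}})
in
    FlagsAsText
```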
Now it's Friday evening and my deadline is next Wednesday, so I hope someone in another timezone can send me a solution.
I've also contacted our ICT team to raise a support ticket with MS, as I don't have access IDs to create one 😕
Thanks for reading my post, and I hope the global brain can come up with a solution.
Solved! Go to Solution.
In the end, I had to replace all null and 0 values in all columns (not just logical) and remove a date column that had null values that couldn't be replaced or filtered.
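A minimal M sketch of that fix, assuming JoinedResult is the final query, with -1 and "signup_date" used purely as stand-ins for the actual replacement value and date column:

```
let
    Source = JoinedResult,
    ColumnList = Table.ColumnNames(Source),

    // Replace null in every column (the post does not say which replacement
    // value was used; -1 is only a stand-in)
    NoNulls = Table.ReplaceValue(Source, null, -1, Replacer.ReplaceValue, ColumnList),

    // Replace 0 the same way
    NoZeros = Table.ReplaceValue(NoNulls, 0, -1, Replacer.ReplaceValue, ColumnList),

    // Drop the date column whose nulls could not be replaced or filtered
    Result = Table.RemoveColumns(NoZeros, {"signup_date"})
in
    Result
```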
Hi @eneri,
Thank you for reaching out to the Microsoft Fabric Community Forum. I have identified a few workarounds that may help resolve the issue; please see the analysis and steps below.
Based on your description and the error details (specifically "Cannot convert the value null to type Logical", Error Code 10277), the issue is caused by a schema type mismatch between your transformed data and the Lakehouse table you're writing to.
When using aggregations and joins in Dataflow Gen2, certain columns, especially ones expected to be of type Logical (Boolean), may end up containing null, blank, or inconsistent values after transformation. Even if these columns are handled in earlier steps, Dataflow Gen2 validates column types strictly at write time to the Lakehouse. If any Logical column contains a null or a value of a different type (e.g., string), the write will fail.
This is a known limitation in how Dataflow Gen2 works with Lakehouse schema enforcement. Boolean fields are especially sensitive, as the engine expects them to contain only true, false, or valid nullable Booleans.
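To illustrate: if the flag columns are coerced so that they hold only true/false before the destination step, the write-time validation has nothing to reject. A minimal sketch, assuming JoinedResult is the query being written and using the flag column names from your screenshots:

```
let
    Source = JoinedResult,

    // Coalesce null flags to false so the Logical columns contain only true/false
    Coalesced = Table.TransformColumns(Source, {
        {"has_other_markers", each if _ = null then false else _, type logical},
        {"has_fji_only", each if _ = null then false else _, type logical}
    })
in
    Coalesced
```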
Open the dataflow and navigate to the data destination step: edit your dataflow and go to the final step where the data is written to the Lakehouse.
Turn off "Use automatic settings": at the top of the destination settings, switch off "Use automatic settings" so you can configure the column mappings manually.
Check the columns that cause issues: identify any columns inferred as type Logical (e.g., has_other_markers, has_fji_only, fl_started_tf). In the column mapping table, change the "Destination type" of these columns to Text or Whole Number, depending on your storage preference. Text offers the most flexibility and avoids Boolean conversion issues; alternatively, if you expect only 1/0 or true/false values, map to Whole Number (see the sketch after these steps).
Let the dataflow create or overwrite the table: if the destination table already exists in the Lakehouse, either delete it so the dataflow can create a new table with the updated types, or confirm that the column types in the existing Lakehouse table match the updated schema (e.g., Logical → Text).
Save and rerun: after updating the column mappings, save the dataflow and rerun it. It should now complete without the type conversion error.
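If you go the Whole Number route, the equivalent conversion can also be done inside the query before the destination step (a sketch; fl_started_tf is one of the flag columns named above, and JoinedResult is a placeholder for the final query):

```
let
    Source = JoinedResult,

    // Map true to 1 and false/null to 0 so the column can be stored as a whole number
    AsWholeNumber = Table.TransformColumns(Source,
        {{"fl_started_tf", each if _ = true then 1 else 0, Int64.Type}})
in
    AsWholeNumber
```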
For more background on Lakehouse tables, see: What is a lakehouse? - Microsoft Fabric | Microsoft Learn
If this post helps, please give us 'Kudos' and consider accepting it as a solution to help other members find it more quickly.
Thank you for using Microsoft Community Forum.
Thanks @v-kpoloju-msft for the feedback.
However, I have already done all of these steps, as listed above:
changed the logical columns to text
deleted and re-created the table in the Lakehouse
renamed the table to write it into a new table
tested manual and automatic schema settings (see screenshots)
Today I additionally tried rebuilding the query as a new, simplified flow.
But the new flow shows the same error at an intermediate step (it does not even attempt to write the data into the Lakehouse).
I submitted a support ticket now ...
Hi @eneri,
Thank you for the detailed follow-up and for sharing all the steps you've already tried. Given that the failure occurs before writing to the Lakehouse and even simplified flows hit the same mid-step error, the issue likely goes beyond configuration and may relate to an underlying problem with the dataflow engine or schema inference.
Since you have already submitted a support ticket, I recommend continuing with that route so the support team can check the backend logs and telemetry for deeper insights.
In the meantime, if you have not already, consider enabling detailed logging or checking the Diagnostics tab (if available in your workspace) for any additional error context.
For reference, a Microsoft support ticket can be created using the link below: https://learn.microsoft.com/en-us/power-bi/support/create-support-ticket
Please let us know if you receive any resolution or workaround from the support team, as it would be valuable for others who might encounter similar issues.
If this post helps, please give us 'Kudos' and consider accepting it as a solution to help other members find it more quickly.
Thank you for using Microsoft Community Forum.