DysfunctionalDF
New Member

My dataflows return empty columns when published to a lakehouse in Fabric.

I have multiple dataflows that are working perfectly fine and display the intended data. I published these dataflows to a lakehouse, which also worked fine, and built some reports on them. Now, after a while (approximately 3 months) without changing anything in the dataflows, the columns in the tables in my lakehouse are empty, disrupting the reports. I have no idea why they are suddenly empty or how I could fix it.

Here are some things I already tried to resolve the issue:

 

- Deleted the tables from the lakehouse and disconnected the destination in the dataflow so I could republish it.

- Republished the dataflow to the lakehouse under a new name so that it wouldn't accidentally refer to the old, corrupted table

- Made a completely new dataflow, copy-pasted my query steps into it, and published this new dataflow (with & without Git)

- Rebuilt the dataflow from scratch (same name as the old dataflow)

 

Our current workaround uses notebooks to directly import the data from the original CSV in the lakehouse. However, I would prefer to use dataflows.
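
For context, a simplified sketch of the kind of notebook load we fall back on (the path and table name below are placeholders, not our real ones):

# Read the original CSV from the lakehouse Files area and overwrite the
# managed Delta table that the reports point at.
# "Files/raw/source.csv" and "my_table" are placeholder names.
df = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("Files/raw/source.csv")
)

df.write.mode("overwrite").format("delta").saveAsTable("my_table")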

 

My main questions are:

 

- Has anyone faced a similar problem, or does anyone know how to solve this issue?

- Have there been any changes in how Dataflows write Parquet files to the lakehouse?

- Are there any Dataflow Gen2 limitations (without CI/CD) that we missed?

1 ACCEPTED SOLUTION
v-sgandrathi
Community Support

Hi @DysfunctionalDF,

 

Thank you for your follow-up, and we truly appreciate the steps you've already taken to troubleshoot the issue.

Given that all standard remediation efforts (schema remapping, table recreation, and dataflow rebuilding) have not resolved the problem, this may indicate a deeper backend issue, potentially linked to metadata corruption or a persistence issue in how Dataflow Gen2 writes to the Lakehouse.

As a temporary measure, continue using your notebook-based workaround but consider exporting the functional notebook logic into a Data Pipeline that can replicate the flow in a more governed manner while we investigate the root issue.

Alternatively, we recommend raising a support ticket through the Microsoft Fabric support portal so the engineering team can investigate the issue more closely with your tenant and environment details.

 

To raise a support ticket for Fabric and Power BI, kindly follow the steps outlined in the following guide:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

If this post helps, please give us Kudos and consider marking it as the accepted solution to help other members find it more easily.
Thank you for being a part of the Microsoft Fabric Community Forum!


10 REPLIES

Hi @DysfunctionalDF,

 

Could you please confirm if the issue has been resolved after raising a support case? If a solution has been found, it would be greatly appreciated if you could share your insights with the community. This would be helpful for other members who may encounter similar issues.

Thank you for your understanding and assistance.

Hi @DysfunctionalDF,

 

We are following up once again regarding your query. Could you please confirm if the issue has been resolved through the support ticket with Microsoft?

If the issue has been resolved, we kindly request you to share the resolution or key insights here to help others in the community. If we don’t hear back, we’ll go ahead and close this thread.

Should you need further assistance in the future, we encourage you to reach out via the Microsoft Fabric Community Forum and create a new thread. We’ll be happy to help.

 

Thank you for your understanding and participation.

If you have a Pro license, you can open a Pro ticket at https://admin.powerplatform.microsoft.com/newsupportticket/powerbi
Otherwise, you can raise an issue at https://community.fabric.microsoft.com/t5/Issues/idb-p/Issues.

v-sgandrathi
Community Support

Hi @DysfunctionalDF, thank you for bringing up this detailed case.

The solutions provided by the Super Users correctly address the issue you are encountering. Thank you for the responses.

Schema mismatches or loss of schema mapping between Dataflows and the Lakehouse destination can cause columns to appear empty, even without changes to the dataflows themselves.
It is recommended to remove spaces from field names, explicitly set column data types, refresh the destination schema mapping, and, if needed, recreate the affected tables or the Lakehouse.

Kindly try these suggestions, as they address known and frequently reported issues with Fabric Dataflow behavior.
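
Before recreating any tables, a quick check from a notebook can also confirm whether the Delta table itself really contains NULLs in those columns. A rough sketch (the table name is illustrative):

from pyspark.sql import functions as F

# Illustrative table name; replace it with the affected lakehouse table.
df = spark.read.table("my_table")

print(df.count())  # rows written by the dataflow

# Count NULLs per column to see which columns actually came through empty.
df.select([
    F.count(F.when(F.col(c).isNull(), c)).alias(c)
    for c in df.columns
]).show()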

 

Thanks for reaching out! If this answer was helpful, please consider marking it as the accepted solution and giving it Kudos; it helps the community!

 

Regards,
Sahasra.

Hi @DysfunctionalDF,

  

We wanted to follow up since we haven't heard back from you regarding our last response. We hope your issue has been resolved.

If my answer resolved your query, please mark it as "Accept Answer" and give Kudos if it was helpful.

If you need any further assistance, feel free to reach out.

Thank you for being a valued member of the Microsoft Fabric Community Forum!

Hi @DysfunctionalDF,

 

Since we haven't heard back from you yet, I'd like to confirm whether you've successfully resolved this issue or whether you need further help.
If you've already resolved the issue, you can mark the helpful reply as a solution so others know the question has been answered; this also helps other people in the community. Thank you again for your cooperation!
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue to help you.

Hi everyone, 

 

Thank you for all the responses. I have tried all of your suggestions, but unfortunately my issue remains unresolved. Like I said before, I am currently working with a workaround using notebooks, but I would still like to make the dataflow work. If there are any more suggestions, I would love to hear them.

Kind regards. 

CGIT
New Member

We have been fighting this exact issue for a week and think we just figured it out. Try removing the spaces in your field names. It worked for us (even though the spaces in the field names were OK up until now).
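
For anyone using the notebook workaround in the meantime, stripping spaces from the column names before the write looks roughly like this (the DataFrame and table name are placeholders):

# Replace spaces in column names with underscores before writing to the lakehouse.
# "df" is the DataFrame read from the CSV; "my_table" is a placeholder table name.
sanitized = df.toDF(*[c.replace(" ", "_") for c in df.columns])
sanitized.write.mode("overwrite").format("delta").saveAsTable("my_table")

In the dataflow itself, the equivalent is simply renaming the columns in the Power Query editor so that they contain no spaces.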

nilendraFabric
Community Champion

Hello @DysfunctionalDF 

 

This issue has been reported many times in this forum.

 

A frequent cause is a mismatch or loss of schema mapping between the dataflow and the Lakehouse destination. Even if the dataflow preview shows data correctly, the mapping to the Lakehouse table may become invalid, especially after schema changes or updates to Fabric. Sometimes columns are missing entirely, or all values show as NULL.

Recommendations
• If you have not already, try explicitly setting all column data types and refreshing the destination schema mapping in your dataflow (a notebook-side sketch of the same idea follows this list).
• If the problem persists, delete the affected table(s) from the Lakehouse and republish the dataflow.
• Consider creating a new Lakehouse as a destination if corruption is suspected.
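
If you fall back to the notebook route while testing these fixes, the same idea of declaring types explicitly (rather than relying on inference) looks roughly like this; the column names, types, and path are purely illustrative:

from pyspark.sql.types import StructType, StructField, StringType, DateType, DoubleType

# Purely illustrative schema; use the real column names and types from your CSV.
schema = StructType([
    StructField("order_id", StringType(), True),
    StructField("order_date", DateType(), True),
    StructField("amount", DoubleType(), True),
])

# Reading with an explicit schema avoids type drift between refreshes.
df = spark.read.option("header", True).schema(schema).csv("Files/raw/source.csv")
df.write.mode("overwrite").format("delta").saveAsTable("my_table")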
