Gmeza
Frequent Visitor

Dataflow Gen2 Dynamic Schema Not Updating

I have a Dataflow Gen2 item that does light transformation on a CSV file and loads it into a Lakehouse using the Dynamic schema on publish option. Over time, additional columns have been added to this file, and the dataflow would appropriately add them to my Lakehouse table. But for the last few weeks, my dataflow has stopped recognizing new columns added to the data source. Essentially, my schema is no longer "dynamic" when the dataflow runs, causing a loss of data columns. What can I do to get it back to working like before?

lisadeijlen
Regular Visitor

We are having exactly the same problem. I tried it with several dataflows, but in none of them do new columns show up in the data destination mapping anymore. A few weeks ago this was still working.

Did you happen to get any information from Microsoft about this?

 

Gmeza
Frequent Visitor

Thanks for the reply, Nono. I confirmed that "Dynamic schema on publish" was selected and ran a manual refresh of the dataflow. Neither action added the new column upon refresh.

 

Deleting the table and rerunning the dataflow works, but what is the point of the dynamic schema option if I still have to do that manually? It seems like the feature is not working as intended.

Are you able to consistently reproduce this behavior on a completely new Dataflow? If yes, could you share the repro steps so I can try it on my own? If you could use a sample public data source (such as OData NorthWind) that would ease things as well.

 

Regardless, I'd recommend raising a support ticket and reaching out to the support team so you can get direct help from our engineers to troubleshoot the issue. Below is the link to the support channel:

https://support.fabric.microsoft.com/support/

Gmeza
Frequent Visitor

After creating a few new dataflows and updating existing ones with new columns, I can confirm that this behavior was erratic and not easily reproduced. Sometimes, the new columns would be added upon refresh, and sometimes, they wouldn't. Manually dropping the table and reapplying the data destination settings would fix the issue, but this leads to an unreliable experience.

 

What I did find this morning is that when adding a new column without specifying the column type (i.e., column type "Any"), the new column would not show up as part of the new schema. Maybe the root cause is that the "Any" column type is not accepted by the Lakehouse. I suggest trying to change the column type to something supported by the Lakehouse. I'll update this thread if I run into this again and can actually test this.
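To illustrate the workaround above, an explicit type can be assigned in the dataflow's Power Query M script with Table.TransformColumnTypes so no column is left as "Any" when it reaches the Lakehouse destination. This is only a sketch; the file path, step names, and the column name "NewColumn" are hypothetical placeholders, not from the original thread:

```powerquery-m
// Hypothetical sketch: give a newly added CSV column an explicit type
// instead of leaving it as "Any", since the Lakehouse destination needs
// a supported type for each mapped column.
let
    // Placeholder path - replace with your actual source
    Source = Csv.Document(File.Contents("C:\data\sample.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // "NewColumn" stands in for whatever column was recently added to the file
    Typed = Table.TransformColumnTypes(Promoted, {{"NewColumn", type text}})
in
    Typed
```

In the Power Query editor this corresponds to right-clicking the new column and choosing a data type, which adds an equivalent Table.TransformColumnTypes step.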

v-nuoc-msft
Community Support

Hi @Gmeza 

 

Here are some steps you can try:

1. Ensure that the "Dynamic schema on publish" option is still enabled in your dataflow settings. Sometimes settings can be reset or changed inadvertently.

2. Try refreshing your dataflow manually to see if it picks up the new columns. An automatic refresh might not capture schema changes immediately.

3. If the above steps do not work, you may need to delete the existing table in the Lakehouse and let the dataflow recreate it.

4. Ensure that the column mapping in your dataflow destination is updated to reflect the new columns.

 

Dataflows Gen2 data destinations and managed settings | Microsoft Fabric Blog | Microsoft Fabric

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

