Hey Community,
I have this automated pipeline:
The Dataflow performs some transformations and loads the data into the Warehouse, and from the Warehouse I build my report. The pipeline ran successfully; however, the new columns I added in my Dataflow are not reflected in my Warehouse. Why is this happening, and how can I make sure the data is reflected? I'm ingesting a lot of data and want to see it reach the Warehouse and the dashboard.
Thanks!
Hello, I had the same issue a few days ago.
I have a Dataflow Gen2 that reads many tables from a Lakehouse, applies some transformations, and finally writes the modified data to a Warehouse, overwriting the destination tables.
Although the execution logs indicate that data was read and written without errors, the tables in the Warehouse don't contain any new or updated data, as if no overwrite happened. The only exception is the CALENDAR table, which is generated inside the Dataflow Gen2 using only M code (no source from the Lakehouse).
In the Lakehouse, the new and updated records are available. Checking the Dataflow steps today, I can see those records there as well.
I would like to know why the Dataflow didn't update my Warehouse.
Hello, I have noticed the same.
My dataflow runs successfully, but sometimes the new rows (or just some of them) are not reflected in the warehouse table. First I copy tables to a lakehouse, then I do some transformations in the dataflow and push the result to the warehouse. When I see that new rows are not reflected, I check whether the data is in the lakehouse, and it is in fact there, but for some reason it gets lost on the way to the warehouse. I have also checked whether the rows are visible in the dataflow (to make sure the transformations are not excluding them somehow), and they are there too.
Is there something I can do to make sure the data is correctly updated?
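One way to narrow down where the rows go missing is to compare row counts between the Lakehouse (through its SQL analytics endpoint) and the Warehouse right after a refresh. A minimal sketch, assuming a Lakehouse named MyLakehouse and a Warehouse named MyWarehouse in the same workspace, each with a dbo.Sales table (all names here are placeholders):

    -- Run in the Warehouse's SQL query editor; three-part names
    -- let you query other items in the same workspace.
    SELECT 'lakehouse' AS source, COUNT(*) AS row_count
    FROM MyLakehouse.dbo.Sales      -- source table, via the SQL analytics endpoint
    UNION ALL
    SELECT 'warehouse' AS source, COUNT(*) AS row_count
    FROM MyWarehouse.dbo.Sales;     -- dataflow destination table

If the counts differ right after a successful refresh, the gap is in the dataflow's destination step rather than in the source data.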
Do you mean new columns or new rows?
When using a warehouse table as the destination for your Dataflow Gen2, I think you need to set up a mapping of the columns from the Dataflow Gen2 to the columns in the warehouse table. This mapping is done when you set up the table destination in the Dataflow Gen2.
I don't think this mapping changes dynamically when you add more columns in your dataflow.
I think you need to add the new columns to the table in the warehouse first, and then edit the column mapping in the Dataflow Gen2 so that the columns in the dataflow match the columns in your warehouse table.
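A minimal sketch of that first step, assuming the destination table is dbo.Sales and the new dataflow column is Region (both placeholders), and assuming your Warehouse supports adding nullable columns with ALTER TABLE:

    -- Add the new column to the Warehouse destination table first,
    -- so it exists when you re-edit the mapping in the dataflow.
    ALTER TABLE dbo.Sales
        ADD Region VARCHAR(100) NULL;   -- match the type the dataflow produces

After the column exists in the Warehouse, reopen the destination settings in the Dataflow Gen2 and map the new dataflow column to it.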
When using a Lakehouse, I have experienced problems when adding columns to a lakehouse table (the problem wasn't actually in the Lakehouse itself, but in its SQL analytics endpoint and semantic model; the table had disappeared there).
My solution was to create a new table from the Dataflow Gen2 which included the new columns. You may need to do the same thing in the Warehouse if you want to add columns to a table, but I haven't tested that; maybe you won't need to.
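Either way, you can check whether the new columns actually made it into the Warehouse table, independently of what the dataflow run reports. A minimal check, assuming the table is dbo.Sales (a placeholder name):

    -- List the columns the Warehouse currently sees for the table.
    SELECT COLUMN_NAME, DATA_TYPE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = 'dbo'
      AND TABLE_NAME = 'Sales'
    ORDER BY ORDINAL_POSITION;

If the new columns are missing here, the issue is the table schema and column mapping, not the data refresh itself.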
Hello @supri03,
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it can be helpful to others.
Otherwise, please reply with more details and we will try to help.