frithjof_v
Resident Rockstar

Dataflow Gen2 - Dynamic schema

Hi,

 

The documentation for Dynamic schema in Dataflow Gen2 says:

 

"When the dataflow is refreshed, your table is dropped and recreated. Your dataflow refresh fails if you have any relationships or measures added to your table."

 

However, when I tested Dataflow Gen2 Dynamic schema on a Lakehouse table - which is part of a Direct Lake semantic model - the refresh did not fail, even though I have relationships and measures added to that table. So it works very nicely, even more conveniently than the description in the documentation 😀
This way, I am able to add and remove columns from the table by using Dataflow Gen2, and the table still works fine in the Direct Lake semantic model 😀
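
For anyone who wants to verify the schema change themselves, here is a minimal sketch from a Fabric notebook (PySpark). The lakehouse and table names are just placeholders for my setup:

# "spark" is the built-in Spark session in a Fabric notebook;
# "My_Lakehouse" and "my_table" are placeholder names.
df = spark.read.table("My_Lakehouse.my_table")
df.printSchema()      # run before and after the dataflow refresh and compare
print(df.columns)     # added/removed columns should appear (or disappear) here

Comparing the two outputs shows which columns were added or removed by the Dynamic schema refresh.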

 

I am curious, is the documentation not updated, or am I missing something?

 

Thank you 😀

 

 

I am also wondering if the word "flow" here should be removed. I don't understand why the word "flow" is there:
"Dynamic schema: When choosing dynamic schema, you allow for schema changes in the data destination when you republish the dataflow. Because you aren't using managed mapping, you still need to update the column mapping in the dataflow destination flow when you make any changes to your query. When the dataflow is refreshed, your table is dropped and recreated. Your dataflow refresh fails if you have any relationships or measures added to your table."

1 ACCEPTED SOLUTION

Hi @frithjof_v 
The internal team has received the feedback. They are going to make changes in the document soon, and it will be revised accordingly. 
Thank you for the valuable feedback.


5 REPLIES
JFTxJ
Advocate III

I can confirm this behavior as well using a Custom Semantic Model built on the SQL Endpoint of my Lakehouse.

 

I wonder if we would get the error if the relationships are created in the Default Semantic Model instead of a Custom Semantic Model?

If I remember correctly, I tried both the default and the custom semantic model. 

 

I also tried creating another lakehouse in another workspace, where all the tables were just shortcuts pointing to the original lakehouse.

 

These also didn't fail after I ran the Dataflow Gen2 and changed the schema of the table in the original lakehouse.

 

Which is great 😃 I just feel it contradicts the information in the documentation, so I'm wondering whether it's a supported feature or not.

v-nikhilan-msft
Community Support

Hi @frithjof_v 
Thanks for using Fabric Community and bringing this to our notice.
At this time, we are reaching out to the internal team to get some help on this. We will update you once we hear back from them.
Thanks 

Hi @frithjof_v 
The internal team has received the feedback. They are going to make changes in the document soon, and it will be revised accordingly. 
Thank you for the valuable feedback.

@v-nikhilan-msft 

 

I did some further testing. It seems that while the relationships on the tables still work, the foreign key and unique constraints on the tables are missing after refreshing a Dataflow Gen2 with Dynamic schema:

 

https://community.fabric.microsoft.com/t5/General-Discussion/SQL-Analytics-Endpoint-Table-constraint...
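
Here is a rough sketch of how the constraints could be checked on the Lakehouse SQL analytics endpoint from Python (pyodbc). The server, database, and table names are placeholders, and it assumes the endpoint exposes INFORMATION_SCHEMA.TABLE_CONSTRAINTS:

import pyodbc

# Placeholder connection details for the Lakehouse SQL analytics endpoint
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)

cursor = conn.cursor()
# List the constraints on the table before and after the Dataflow Gen2 refresh
cursor.execute(
    "SELECT TABLE_NAME, CONSTRAINT_NAME, CONSTRAINT_TYPE "
    "FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS "
    "WHERE TABLE_NAME = ?",
    "my_table",  # placeholder table name
)
for row in cursor.fetchall():
    print(row.TABLE_NAME, row.CONSTRAINT_NAME, row.CONSTRAINT_TYPE)

If the foreign key and unique constraints really are dropped when the table is recreated, this query should return fewer (or no) rows after the refresh.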
