JoeCrozier
Helper II

Dealing with "record" columns in a Dataflow gen2

I have a dataflow that pulls from a Snowflake source:

[Screenshot: query preview with several Record-type columns]

As you can see in the screenshot, a few of the columns are of "Record" type. Fabric doesn't seem to like these formats and sometimes removes them automatically by adding a step that looks like this:

 

Table.RemoveColumns(
    #"From Value",
    Table.ColumnsOfType(
        #"From Value",
        {type table, type record, type list, type nullable binary, type binary, type function}
    )
)


Makes sense; as best as I can understand it, the data needs to be flat, almost CSV-like.
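I assume the fix is something like Table.ExpandRecordColumn instead of Table.RemoveColumns. A rough sketch with made-up column and field names (the real names will differ):

let
    // "CONTACT_IDENTIFIERS" and its fields are placeholders for illustration
    Source = Table.FromRecords({
        [snapshot_date = #date(2024, 1, 1), CONTACT_IDENTIFIERS = [id = 1, type = "email"]]
    }),
    // turn the record column into ordinary flat columns
    Expanded = Table.ExpandRecordColumn(
        Source,
        "CONTACT_IDENTIFIERS",
        {"id", "type"},
        {"contact_id", "contact_type"}
    )
in
    Expanded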

So the way I understand it, I need to expand those records out, maybe into their own table. So I copied that query and selected just two columns:

[Screenshot: copied query with just two columns selected, one of them a Record column]


When I tell Fabric to expand that contact identifier column and grab every column inside of it:

[Screenshot: expand dialog for the contact identifier column with all fields selected]


it doesn't "really" expand everything. Here's what I mean; here's what's visible immediately after that step:

[Screenshot: result after expanding, still showing a nested Record column]

 

That step "is not supported by fast copy," and the result still won't save into a Lakehouse. What do I do?

1 ACCEPTED SOLUTION

Filter out all null values, then expand the record column again.


3 REPLIES
lbendlin
Super User

Looks like this might be a record inside a record. See if you can grab "value" instead.
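Roughly like this, treating "value" as the field that holds the inner record (adjust to whatever your expand step actually produced):

let
    // toy data: the first expand left a column that is still a record,
    // with the real data sitting in its "value" field
    Source = Table.FromRecords({
        [contact = [value = [id = 1, email = "a@b.com"]]]
    }),
    // grab the inner "value" record instead of the wrapper record
    Inner = Table.TransformColumns(Source, {{"contact", each [value]}})
in
    Inner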

Silly question, how?

Filter out all null values, then expand the record column again.
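Something like this, with placeholder step and field names (the column holding the record may be called "value" or something else after your first expand):

let
    // toy data standing in for the query after the first expand:
    // a "value" column that is a record on some rows and null on others
    Source = Table.FromRecords({
        [snapshot_date = #date(2024, 1, 1), value = [id = 1, email = "a@b.com"]],
        [snapshot_date = #date(2024, 1, 2), value = null]
    }),
    // filter out all null values first...
    FilteredRows = Table.SelectRows(Source, each [value] <> null),
    // ...then expand the record column again into flat columns
    Expanded = Table.ExpandRecordColumn(FilteredRows, "value", {"id", "email"})
in
    Expanded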

