J-Dixon
Regular Visitor

Issue with Dataflow Gen2 -> Lakehouse - numeric fields missing

Hi,

 

I'm trying out the new Fabric features for my org. I'm having issues with the following flow of data:

 

Synapse dedicated SQL pool -> Fabric Dataflow Gen2 -> Fabric Lakehouse

 

I set up 10 queries in my dataflow, and I'm finding that some columns are just not loading into the Lakehouse version of the table. The only pattern I can see is that the missing columns are all either whole number or decimal number types.

 

I noticed at first because my surrogate keys were mostly missing, then discovered that some decimal fields were gone too.

 

I've tried rebuilding the lakehouse and dataflow several times, deleting tables from the lakehouse and re-running the dataflow (the fact that you can't add new columns in the dataflow and have them reflected in the lakehouse is frustrating), recreating the whole thing in a new workspace, and checking the mappings in the data destination settings in the dataflow.

 

Has anyone else noticed something similar, or is there a surefire way to get all my columns into my lakehouse tables?
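For reference, here's a quick way to pin down exactly which columns went missing. The column names below are hypothetical stand-ins; in a Fabric notebook the actual list would come from the table's schema (e.g. `spark.read.table("my_table").columns`):

```python
# Expected columns from the Synapse source vs. what actually landed in the
# Lakehouse table. Replace these hypothetical lists with your real ones;
# in a Fabric notebook, `actual` would come from the table's schema.
expected = ["customer_sk", "customer_name", "credit_limit", "created_date"]
actual = ["customer_name", "created_date"]  # what the Lakehouse table shows

# Preserve source order so the pattern (all numeric types?) is easy to spot.
missing = [col for col in expected if col not in actual]
print(missing)  # → ['customer_sk', 'credit_limit']
```

Cross-checking the `missing` list against the source data types makes the "only whole number and decimal columns" pattern easy to confirm.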

7 REPLIES
WomanToBlame
Advocate I

@kudz - thank you for the reply. Yes, I do see that. All looks good until I click Publish. The resulting table does not include the field.

Do you have a space in the column name? Just curious.

Hi @kudz ,

 

Sorry for the delay in replying. No space in the column name, but what I didn't realize is that after I published the second time, the table refresh had failed. After publishing successfully, the integer and decimal columns are there. So I can confirm that, for whatever reason, you need to review the settings/mappings for those data types before publishing.

 

Thank you for your help!

WomanToBlame
Advocate I

Also having the same issue with Dataflow Gen2, with ADLS Gen2 as the source and a Lakehouse as the destination. It also seems to affect decimal fields: they appear in the dataflow configuration, but after publishing they are not in the lakehouse.

After selecting the lakehouse, do you see the destination settings as in the image below? My type for Amount was retained in the lakehouse.

[Image: kudz_0-1695842915743.png]

 

kudz
Frequent Visitor

@J-Dixon I fixed this by clicking the settings cog below the query properties (if the dataflow was started from the Lakehouse), or you can add a data destination:

[Image: kudz_0-1693250595238.png]

[Image: kudz_1-1693250656634.png]

After that, the only issues I had were with column names containing spaces.

 

 

kudz
Frequent Visitor

I am having the same issue with Fabric Dataflow Gen2; however, a pipeline Copy data activity brings in all the columns.
