Hi,
I'm trying out the new Fabric features for my org, and I'm having issues with the following flow of data:
Synapse dedicated pool -> Fabric Dataflow Gen2 -> Fabric lakehouse
I set up 10 queries in my dataflow and I'm finding that some columns are simply not loading into the lakehouse version of the table. The only pattern I can see is that the missing columns are all either whole number or decimal number types.
I noticed it at first because my surrogate keys were mostly missing, then discovered that some decimal fields were gone too.
I've tried rebuilding the lakehouse and dataflow several times, deleting tables from the lakehouse and re-running the dataflow (the fact that you can't add new columns in the dataflow and have them reflected in the lakehouse is frustrating), recreating the whole thing in a new workspace, and checking the mappings in the data destination settings in the dataflow.
Has anyone else noticed something similar, or found a surefire way to get all my columns into my lakehouse tables?
@kudz - thank you for the reply. Yes, I do see that. All looks good until I click Publish. The resulting table does not include the field.
Do you have a space in the column name? Just curious.
Hi @kudz ,
Sorry for the delay in replying. There's no space in the column name, but what I didn't realize is that my second publish had actually failed. After publishing successfully, the integer and decimal columns are there. So confirming that, for whatever reason, you need to review the settings/mappings for those data types before publishing.
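For anyone else hitting this: it may also help to set the numeric types explicitly in the query itself before configuring the destination, so the mapping has an unambiguous type to work with. A minimal Power Query M sketch (the `Source` step and the column names `SurrogateKey` and `Amount` are hypothetical placeholders, not from the thread):

```powerquery-m
let
    // Source = ... your Synapse dedicated pool / ADLS Gen2 connection step ...
    // Force explicit types on the columns that were going missing:
    Typed = Table.TransformColumnTypes(
        Source,
        {
            {"SurrogateKey", Int64.Type},   // whole number
            {"Amount", type number}         // decimal number
        }
    )
in
    Typed
```

This is only a workaround sketch under the assumption that the destination mapping is dropping columns whose types it can't resolve; the confirmed fix in this thread was reviewing the settings/mappings and re-publishing.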
Thank you for your help!
I'm also having the same issue with Dataflow Gen2, with ADLS Gen2 as the source and a lakehouse as the destination. It also seems to affect decimal fields: they appear in the dataflow configuration, but after publishing they are not in the lakehouse.
After selecting the lakehouse, do you see the destination settings as in the image below? The type for my Amount column was retained in the lakehouse.
@J-Dixon I fixed this by clicking the settings cog below the query properties (if the dataflow was started from the lakehouse); otherwise you can add a destination there.
After that, the only issues I had were with column names containing spaces.
I am having the same issue with Fabric Dataflow Gen2; however, a pipeline copy data activity brings in all the columns.