Hi,
I'm trying out the new Fabric features for my org. I'm having issues with the following flow of data:
Synapse dedicated pool -> fabric dataflow gen 2 -> fabric lakehouse
I set up 10 queries in my dataflow and I'm finding that some columns are just not loading into the lakehouse version of the table. The only pattern I can see is that the missing columns are all either whole number or decimal number types.
I noticed at first because my surrogate keys were mostly missing, then discovered that some decimal fields were gone too.
I've tried rebuilding the lakehouse and dataflow several times, deleting tables from the lakehouse and re-running the dataflow (the fact that you can't add new columns in the dataflow and have them reflected in the lakehouse is frustrating), recreating the whole thing in a new workspace, and checking the mappings in the data destination settings in the dataflow.
Anyone else noticed something similar, or is there any surefire way to get all my columns into my lakehouse tables?
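One thing worth trying (a sketch in Power Query M; the server, table, and column names here are hypothetical, not from the original post): explicitly cast the whole-number and decimal columns as the final step of each query, so the Lakehouse destination mapping sees unambiguous types rather than inferring them.

```powerquery-m
let
    // Source steps roughly as the Synapse connector generates them
    // (connection details are illustrative only)
    Source = Sql.Database("myserver.sql.azuresynapse.net", "DedicatedPool"),
    DimCustomer = Source{[Schema = "dbo", Item = "DimCustomer"]}[Data],
    // Explicitly type the numeric columns so the destination mapping
    // cannot drop them as untyped/any columns
    Typed = Table.TransformColumnTypes(
        DimCustomer,
        {
            {"CustomerKey", Int64.Type},   // whole number (surrogate key)
            {"CreditLimit", type number}   // decimal number
        }
    )
in
    Typed
```

After adding the typing step, re-check the column mappings in the data destination settings before publishing.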
@kudz - thank you for the reply. Yes, I do see that. All looks good until I click Publish. The resulting table does not include the field.
Do you have a space in the column name? Just curious.
Hi @kudz ,
Sorry for the delay in replying. No space in the column name, but what I didn't realize is that after I published the second time, the table load had failed. After a successful publish, the integer and decimal columns are there. So, confirming that for whatever reason you need to review the settings/mappings for those data types before publishing.
Thank you for your help!
Also having the same issue with Dataflow Gen2 with ADLS Gen2 as the source and Data Lakehouse as the destination. Also seems to be with decimal fields. They appear in the dataflow configuration, but after publishing they are not in the lakehouse.
After selecting the lakehouse, do you see the destination settings as in the image below? The type for my amount column was retained in the lakehouse.
@J-Dixon I fixed this by clicking the settings cog below the query properties (if the dataflow was started from the Lakehouse), or you can add a data destination there.
After that, the only issues I had were with column names containing spaces.
I am having the same issue with Fabric Dataflow Gen2; however, a pipeline data copy brings in all the columns.