BeyzaKzlky
Frequent Visitor

Cannot Load All Table Columns to Lakehouse After Transformation in Dataflow

Hi all,

I'm fairly new to Fabric.

I'm loading data from an on-prem source into a Lakehouse with Dataflow Gen2, using a dedicated dataflow for the ingestion.

I then created a second dataflow that reads from the first one and applies some transformations. At the end of these transformations I combine more than two tables.

I want to publish this combined table to the Lakehouse to use in Power BI reports, but after publishing, many columns are missing from the destination table.

I couldn't find a solution. Is there any way to fix this?

Thanks

4 REPLIES
v-cboorla-msft
Community Support

Hi @BeyzaKzlky 

 

Thanks for using Microsoft Fabric Community.

Apologies for the issue you are facing.

Here are the possible reasons for missing columns in your published Dataflow Gen2 table:

Incomplete Schema Inference:

  • Check for Unsupported Data Types: Dataflow might skip columns with unsupported data types. Ensure all columns have compatible types (e.g., string, number, date/time).
  • Enable Explicit Schema Definition: If automatic inference is unreliable, explicitly define the schema in the source dataflow or sink settings.
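
As an illustration of explicitly defining the schema, you can add a final step in the dataflow's Power Query (M) script that pins every output column to a type, so nothing is left to inference at publish time. This is a hedged sketch; the step, table, and column names below are placeholders, not your actual query:

```m
// Hypothetical final step: explicitly set the type of every output column
// so Dataflow Gen2 does not have to infer the schema when publishing.
#"Typed Output" = Table.TransformColumnTypes(
    #"Combined Tables",
    {
        {"CustomerId", Int64.Type},
        {"CustomerName", type text},
        {"OrderDate", type date},
        {"Amount", type number}
    }
)
```

Any column that is missing from this list (or that ends up with an unsupported type) is a candidate for being dropped at the destination, so it doubles as a checklist of what should arrive in the Lakehouse.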

Transformation Issues:

  • Review Transformation Steps: Examine each step for filtering, projections, or aggregations that might inadvertently remove columns.
  • Verify Joins: Ensure all necessary columns are included in the output of joins.
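
A common way columns silently go missing after a merge is the expand step: `Table.ExpandTableColumn` only keeps the columns you explicitly list. A minimal sketch, with placeholder query and column names:

```m
// Merge two queries; the join itself keeps the nested table intact, but...
#"Merged" = Table.NestedJoin(
    Customers, {"CustomerId"},
    Orders, {"CustomerId"},
    "Orders", JoinKind.LeftOuter
),
// ...this expand step drops any Orders column not listed here.
// List every column you need in the destination table.
#"Expanded" = Table.ExpandTableColumn(
    #"Merged", "Orders",
    {"OrderId", "OrderDate", "Amount"},
    {"OrderId", "OrderDate", "Amount"}
)
```

If your combined table is built from several merges, check the column list in each expand step against what you expect to see in the Lakehouse.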

Output Sink Configuration:

  • Inspect Output Settings: Double-check the sink configuration to guarantee that all desired columns are mapped to the destination table.
  • Confirm Column Mapping: Verify that columns are correctly mapped in the output settings, especially after schema changes.

I hope this information helps. Please do let us know if you have any further questions.

Thanks for your message, but the points you mentioned didn't work.

I think it's a Fabric bug, because I also tried a Warehouse in Fabric. I couldn't get data from the on-prem server via a data pipeline, so I tried Dataflow Gen2 again, but this time it didn't load all the tables.

 

Thanks

Have a nice day.

Hi @BeyzaKzlky 

 

Apologies for the delay in response.

This might require a deeper investigation from our engineering team, who can guide you better.

Please go ahead and raise a support ticket to reach our support team:

https://support.fabric.microsoft.com/support
Please share the ticket number here so we can keep an eye on it.

Hope this is helpful. Please let me know in case of further queries.

Hi @BeyzaKzlky 


We haven't heard back from you on the last response and were just checking to see whether you have had a chance to raise a support ticket. If you share more details here, we will try to help.


Thanks
