
PraveenVeli
Frequent Visitor

Unable to identify these objects as tables. To keep these objects in the lakehouse, move them to Files

Hi,

I'm loading data into a lakehouse using a data pipeline from an on-premises SQL Server to a new schema table in the lakehouse. I'm encountering two issues here.

  1. The table is not loading into the new schema, even though I specified the correct destination.

     PraveenVeli_2-1733763078401.png

  2. When trying to access the table, I get the following error. I refreshed, but that didn't work either.

     PraveenVeli_1-1733762640285.png

Surprisingly, the table is available through the SQL endpoint, though under the dbo schema. I'm not sure what I'm doing wrong here.

PraveenVeli_3-1733763138230.png
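For background on why the table can show up under dbo at the SQL endpoint while the lakehouse flags it as unidentified: a schema-enabled lakehouse stores each table under `Tables/<schema>/<table>` in OneLake, while a non-schema lakehouse uses `Tables/<table>` (surfaced under dbo by the SQL analytics endpoint). A minimal sketch of that layout difference, with hypothetical table names:

```python
def parse_tables_path(path: str):
    """Split a OneLake Tables/... path into (schema, table).

    In a schema-enabled lakehouse, tables live under Tables/<schema>/<table>;
    without schemas, they live directly under Tables/<table> (exposed as dbo
    by the SQL analytics endpoint).
    """
    parts = [p for p in path.strip("/").split("/") if p]
    if not parts or parts[0] != "Tables":
        raise ValueError(f"not a Tables path: {path!r}")
    if len(parts) == 2:            # Tables/<table> -> default dbo schema
        return ("dbo", parts[1])
    if len(parts) == 3:            # Tables/<schema>/<table>
        return (parts[1], parts[2])
    raise ValueError(f"unexpected depth: {path!r}")

# Hypothetical examples:
print(parse_tables_path("Tables/Sales"))          # ('dbo', 'Sales')
print(parse_tables_path("Tables/staging/Sales"))  # ('staging', 'Sales')
```

If the pipeline writes Delta files at the wrong depth for the lakehouse's mode, the files can still be readable as a Delta table (hence visible at the SQL endpoint) while the lakehouse UI can't place them and flags them as unidentified.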

 


6 REPLIES
Anonymous
Not applicable

Hi @PraveenVeli ,

 

Thanks for the reply from joseph_campbell.

 

Did the above suggestions help with your scenario? If so, please consider giving a Kudo or accepting the helpful suggestions to help others with similar requirements.

 

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

Anonymous
Not applicable

Hi @PraveenVeli ,

 

The error message indicates that Lakehouse does not recognize these objects as tables and therefore cannot load the data correctly.

 

I ran a test:

vhuijieymsft_0-1733817410567.png

 

I was able to reproduce your error when setting the pipeline destination to a normal lakehouse without schema preview enabled.

vhuijieymsft_1-1733817419030.png

vhuijieymsft_2-1733817419035.png

 

When I set the destination to a lakehouse with schema preview enabled and manually specified the table to load data into under the dbo schema, the pipeline ran successfully, the table displayed normally, and no error was reported.

vhuijieymsft_3-1733817427399.png

vhuijieymsft_4-1733817427401.png

 

The pipeline runs successfully and the table loads properly even when the schema name is specified manually.

vhuijieymsft_5-1733817437084.png

vhuijieymsft_6-1733817437086.png

 

There are two workarounds:

 

  1. As in the test screenshots I showed earlier, try loading the SQL Server tables into a lakehouse with schema preview enabled.
  2. Use Dataflow Gen2 to import the data from SQL Server into the lakehouse.

vhuijieymsft_7-1733817450909.png

vhuijieymsft_8-1733817450913.png

 

If you have any other questions please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 


Hey, I just submitted a post here recently and am running into the exact same issue.

 

https://community.fabric.microsoft.com/t5/Fabric-platform/Fabric-Pipeline-Creating-Schemas-with-Unid...

 

The lakehouse I have does have schema preview enabled. And I have an existing pipeline that can specify a schema and insert data into a table, as opposed to creating a schema based on the table name. I created a second pipeline to bring in data from a different SQL Server source and am running into this issue where I can't even load the data into the default schema, never mind a secondary or custom schema.

 

After the pipeline runs, it creates a new schema based on the table name and puts the Delta file in an Unidentified folder. I don't see a solution for this. Is this a bug? It seems like a common and very critical issue without a solution...
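The failure signature described above — a new schema named after the table itself, plus Delta files parked under an Unidentified folder — can be sketched as a quick check. The names and the rule of thumb here are illustrative, not an official diagnostic:

```python
def looks_misplaced(schema: str, table: str) -> bool:
    """Flag the misload pattern described in this thread: the pipeline
    creates a schema named after the table itself, or the files end up
    under an 'Unidentified' folder instead of a proper table."""
    return schema.lower() == table.lower() or schema == "Unidentified"

# Hypothetical examples of what a broken run produces:
print(looks_misplaced("Sales", "Sales"))  # True  (schema named after the table)
print(looks_misplaced("dbo", "Sales"))    # False (loaded where expected)
```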

I believe I figured out the issue over the weekend, but will be doing some additional testing tomorrow. Previously, I was connected to on-premises SQL Server using an on-premises data gateway in Fabric, version 3000.234, on one of our VMs. It was writing data to the Unidentified folder inside its own schema in Fabric and was unable to write the data correctly into the lakehouse. The Delta parquet files were still visible through the SQL endpoint, but the lakehouse looked jacked up like in the screenshot PraveenVeli posted above. We were able to update the gateway on our VM to version 3000.246 over the weekend, and it appears to have fixed the issue; I am now able to see the data in the lakehouse exactly as required.
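Since the fix came down to a gateway build difference (3000.234 vs 3000.246), comparing the dotted version strings numerically, rather than as plain strings, is the safe way to decide whether an update is needed. A small sketch, assuming the version string consists of purely numeric parts:

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted gateway version string like '3000.234' into a
    tuple of ints so it compares numerically, not lexicographically."""
    return tuple(int(part) for part in v.split("."))

installed = "3000.234"  # build that exhibited the Unidentified-folder behavior
fixed_in  = "3000.246"  # build that resolved it in this thread

needs_update = parse_version(installed) < parse_version(fixed_in)
print(needs_update)  # True
```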

Thank you @joseph_campbell. Updating the gateway did the trick and resolved the issue. Thanks a lot for your help on this.

Hi @Anonymous, thank you for getting back on this. The issue still persists for me. Surprisingly, the table never goes to the dbo schema for me, or to any other schema that I specify there — something very similar to this thread: https://community.fabric.microsoft.com/t5/Data-Pipeline/Copy-activity-into-Lakehouse-with-schemas-enabled/m-p/4324306. For my data load issue, I tried these two options:

PraveenVeli_0-1734361280272.png

PraveenVeli_1-1734361366306.png

And it still comes up as below

PraveenVeli_2-1734361760514.png

 

I'm not sure if the capacity region makes a difference. I'm using a trial capacity in the North Central US region.
