Hi,
I'm loading data from an on-premises SQL Server into a new schema table in a lakehouse using a data pipeline, and I'm encountering two issues here.
Surprisingly, the table is available through the SQL endpoint, though under the dbo schema. I'm not sure what I'm doing wrong here.
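In case it helps, this is roughly how I've been checking which schema the table actually registered under. It's only a minimal sketch from a Fabric notebook attached to the lakehouse, and the catalog layout may look different depending on how schemas are set up:

```python
# Minimal sketch, assuming a Fabric notebook attached to this lakehouse;
# it uses the built-in "spark" session, and no names below come from the pipeline.
# Prints every schema/table Spark can see, which shows whether the copy activity
# registered the table under dbo or somewhere else.
for db in spark.catalog.listDatabases():
    for tbl in spark.catalog.listTables(db.name):
        print(f"{db.name}.{tbl.name}")
```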
Hi @PraveenVeli,
Thanks for the reply from joseph_campbell.
Did the above suggestions help with your scenario? If so, please consider giving a Kudo to or accepting the helpful reply as the solution, to help others facing similar requirements.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution so other members can find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hi @PraveenVeli,
The error message indicates that the lakehouse does not recognize these objects as tables and therefore cannot load the data correctly.
I ran a test:
I was able to reproduce your error when setting the pipeline destination to a normal lakehouse without schema preview enabled.
When I set the destination to a lakehouse with schema preview enabled and manually specified the table to load into under the dbo schema, the pipeline ran successfully, the table displayed normally, and no error was reported.
The table also ran and loaded properly even when the schema name was specified manually.
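To double-check that result outside the pipeline UI, something like the following can be run from a notebook attached to that lakehouse. This is a minimal sketch; "my_table" is a placeholder, not a name from your pipeline:

```python
# Minimal sketch, assuming a notebook attached to the schema-enabled lakehouse;
# replace "my_table" with the table the copy activity created.
# DESCRIBE DETAIL only succeeds for a properly registered Delta table, so it is
# a quick way to confirm the load under the dbo schema worked.
spark.sql("DESCRIBE DETAIL dbo.my_table").show(truncate=False)
```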
There are two workarounds:
If you have any other questions, please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution so other members can find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hey, I just submitted a post here recently and am running into the exact same issue.
The lakehouse I have does have schema preview enabled, and I have an existing pipeline that can specify a schema and insert data into a table rather than creating a schema based on the table name. I created a second pipeline to bring in data from a different SQL Server source and am running into this issue, where I can't even load the data into the default schema, never mind a secondary or custom schema.
After the pipeline runs, it creates a new schema based on the table name and puts the delta files in an Unidentified folder. I don't see a solution for this. Is this a bug? It seems like a common and very critical issue without a solution...
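For anyone hitting the same thing, this is roughly how I've been spotting the misplaced files from a notebook. It's only a minimal sketch, and it assumes the affected lakehouse is set as the notebook's default lakehouse, so the relative path may need adjusting:

```python
# Minimal sketch, assuming the affected lakehouse is the notebook's default lakehouse;
# the relative "Tables" path is an assumption about where the pipeline wrote its output.
from notebookutils import mssparkutils

def list_recursive(path, depth=0):
    # Print every folder/file under the Tables area so you can see whether the
    # delta files landed under the expected schema or under "Unidentified".
    for item in mssparkutils.fs.ls(path):
        print("  " * depth + item.name)
        if item.isDir:
            list_recursive(item.path, depth + 1)

list_recursive("Tables")
```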
I believe I figured out the issue over the weekend, but I will be doing some additional testing tomorrow. Previously, I was connecting to the on-premises SQL Server through an on-premises data gateway in Fabric, version 3000.234, running on one of our VMs. That setup was writing data to the Unidentified folder inside its own schema in Fabric and was unable to write the data correctly into the lakehouse. The delta parquet files were still visible through the SQL endpoint, but the lakehouse looked broken, like in the screenshot PraveenVeli posted above. We updated the gateway on that VM to version 3000.246 over the weekend, and it appears to have fixed the issue; I can now see the data in the lakehouse exactly as required.
Thank you @joseph_campbell. Updating the gateway did the trick and resolved the issue. Thanks a lot for your help on this.
Hi @Anonymous, thank you for getting back on this. The issue still persists for me. Surprisingly, the table never lands in the dbo schema for me, or in any other schema I specify there, which is very similar to this thread: https://community.fabric.microsoft.com/t5/Data-Pipeline/Copy-activity-into-Lakehouse-with-schemas-enabled/m-p/4324306. Coming to my data load issue, I tried these two options:
And it still comes up as below:
Not sure if the capacity region makes a difference. I'm using a trial version with the region set to North Central US.