DALI007
Helper II

Schema recovery problem in lakehouse

Hello,

I use the Copy activity of a data pipeline to copy a lakehouse table (in my case, it is called Numer_jalon_oc). See the attached image named Table_source existe dans LAKEHOUSE.
When retrieving the table, I got an error message. It says the table is not present, but as can be seen in the attached image called Erreur_import_mapping, the table data is present in the lakehouse.
I can't find a solution to this problem. Could you please help me solve it?
Erreur_import_mapping.png, Table_source existe dans LAKEHOUSE.png
Thank you in advance for your help.


1 ACCEPTED SOLUTION

To rule out the source as the cause of the problem:

You can simulate a source with entirely static data by defining Additional columns in the Copy activity, effectively creating a "dummy" source with any columns and values you want.

If the data is still not copied, that confirms the problem lies on the Oracle side.

You can also go to the Oracle database and check the table's structure to see if anything is really odd.

Keep me posted.

View solution in original post

7 REPLIES
nilendraFabric
Super User

Hello @DALI007 

 

There are a few potential solutions:
1. Refresh the SQL endpoint:
• Go to the Lakehouse page
• Click "…" next to the SQL endpoint
• Select "Refresh"
2. Run table maintenance:
• Right-click the table in Lakehouse explorer
• Select "Maintenance"
• Check all options and click "Run now"

 

3. Use Spark to read and rewrite the table, which recreates its metadata:
# Read the existing lakehouse table and rewrite it in place
df = spark.read.table("Numer_jalon_oc")
df.write.mode("overwrite").saveAsTable("Numer_jalon_oc")

Limitations
Some current limitations of lakehouse schemas include:
• Not supported for shared lakehouses
• Non-Delta managed tables don’t show schema info
• External Spark tables not supported
• Public APIs not available for schema-enabled lakehouses

Please accept this solution and give kudos if this helps 

 

 

Hello @nilendraFabric,

I have performed the following steps:
1. Refresh the SQL endpoint:
• Go to the Lakehouse page
• Click "…" next to the SQL endpoint
• Select "Refresh"
2. Run table maintenance:
• Right-click the table in Lakehouse explorer
• Select "Maintenance"
• Check all options and click "Run now"

The error message persists even though I can see the table in my lakehouse (see attached images). Do you have another solution, please?

Affichage_table_lakehouse.png, Table dans LakeHouse.png, message persiste.png
Thank you
 

Hello @DALI007 

 

Can you run this command successfully in the SQL endpoint?

 

SELECT * FROM dbo.Numer_jalon_oc
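If you also want to confirm what metadata the SQL analytics endpoint has actually synced, you can query the standard metadata views. A sketch, assuming the table lives in the default dbo schema (INFORMATION_SCHEMA is standard T-SQL and available in the Fabric SQL endpoint):

```sql
-- Check whether the endpoint knows about the table at all
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_NAME = 'Numer_jalon_oc';

-- Inspect the column metadata the endpoint exposes
SELECT COLUMN_NAME, DATA_TYPE, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Numer_jalon_oc';
```

If the table is missing here but visible in the Lakehouse explorer, the endpoint's metadata sync is likely lagging behind the lakehouse.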

 

 

Hello @nilendraFabric,

I can execute this query in the SQL endpoint. You will find the result attached.
dbo.nombre_jalon_oc.png

Thank you

Thanks. Please share the target and mapping as well.

The target is a table in an Oracle database (photo attached). But the mapping does not work: I can't import the mapping source (the table in the lakehouse). It's always the same error message.
Thank you

Cible oracle.png

To rule out the source as the cause of the problem:

You can simulate a source with entirely static data by defining Additional columns in the Copy activity, effectively creating a "dummy" source with any columns and values you want.

If the data is still not copied, that confirms the problem lies on the Oracle side.

You can also go to the Oracle database and check the table's structure to see if anything is really odd.

Keep me posted.
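The Additional columns trick above can be sketched in the Copy activity's JSON definition. A minimal, hypothetical fragment: the source type and column names here are illustrative assumptions, not taken from the actual pipeline (additionalColumns is the property the Copy activity uses for static name/value pairs):

```json
{
  "source": {
    "type": "LakehouseTableSource",
    "additionalColumns": [
      { "name": "dummy_id", "value": "1" },
      { "name": "dummy_label", "value": "static test value" }
    ]
  }
}
```

If even these static columns fail to reach the Oracle target, the problem is on the Oracle sink side rather than in the lakehouse source.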
