
nicolasbejar
Microsoft Employee

Can't set up Copy Job to read partitioned Delta table

I'm unable to set up a Copy Job to read a partitioned Delta table located in an ADLS Gen2 account. This is my current setup:

[Screenshot of the Copy Job source configuration: nicolasbejar_0-1763680431209.png]

 

v-karpurapud
Community Support

Hi @nicolasbejar 

I wanted to check whether you've had a chance to review the information provided. Has your issue been resolved? If not, please share more details so we can assist you further.

Thank You.

Ugk161610
Resolver II

Hi @nicolasbejar 

 

You're not doing anything wrong — the issue is simply that the Copy Job UI doesn’t currently expose a “Delta” option at all. Even if your folder clearly contains _delta_log and the partition folders, the Copy Job still only treats ADLS Gen2 sources as “files” (Parquet, CSV, JSON, etc.). Because of that, it will always try to read the Parquet files directly and skip the Delta transaction log, which is why the load fails.

 

So to answer your questions:

  1. Pointing to the root is correct, and the presence of _delta_log means the table is valid — the Copy Job just doesn’t understand it natively.

  2. You don’t need another connection type. ADLS Gen2 is correct. The limitation is in the Copy Job reader, not the connection.

  3. It’s expected that you don’t see “Delta” in the format list — the UI hasn’t added Delta support yet for ADLS sources. That’s why the “Delta” option doesn’t appear.
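Point 1 above can be sanity-checked in plain Python. This is an illustrative stdlib-only sketch (not the Copy Job's internal logic): a folder is a valid Delta table root exactly when it contains a `_delta_log` directory.

```python
# Illustrative check, not the Copy Job's own validation logic.
import os

def is_delta_table_root(path: str) -> bool:
    """True if *path* looks like the root of a Delta table,
    i.e. it contains a _delta_log directory."""
    return os.path.isdir(os.path.join(path, "_delta_log"))
```

A partition subfolder such as `year=2024` would fail this check, which is one more reason the source path should be the table root rather than a partition directory.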

At the moment, the only reliable way to copy from a Delta Lake table stored in ADLS is to use Dataflows Gen2 or a notebook (PySpark or pandas) instead of the Copy Job wizard. Those fully understand the Delta transaction log, partitions, and schema.
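For the notebook route, a hedged sketch of what a Fabric PySpark cell might look like. The container, account, and table path below are placeholders, and the Spark calls are shown commented because they need a live Spark session and storage access:

```python
# Sketch of the notebook workaround; names are placeholders.

def abfss_url(container: str, account: str, table_root: str) -> str:
    """Build the abfss:// URL for the Delta table *root* (the folder
    that holds _delta_log), not an individual partition folder."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{table_root}"

path = abfss_url("data", "mystorageacct", "sales/delta_table")

# In a Fabric PySpark notebook, the Delta reader resolves the
# transaction log, schema, and partitions automatically:
# df = spark.read.format("delta").load(path)
# df.write.mode("overwrite").saveAsTable("sales_copy")
```

The important part is that the path targets the table root; the Delta reader, unlike the Copy Job's Parquet reader, consults `_delta_log` and handles the partition folders itself.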

 

In short:
Your setup is right — the Copy Job just doesn’t support Delta formats from ADLS yet. Once Microsoft updates the Copy Job to recognize Delta automatically, the same path you’re using should work without any changes.

 

Hope this helps.

 

– Gopi Krishna

v-karpurapud
Community Support

Hi @nicolasbejar 

Thank you for reaching out to the Microsoft Fabric community forum.

 

The problem occurs because the Copy Activity source is set to ADLS Gen2 with the file format as Parquet, but the path actually leads to a Delta Lake table with a _delta_log directory and partitioned folders. Reading a Delta table as Parquet skips the transaction log, which prevents correct schema reading, partition detection, and version control, leading to load failures.

 

To resolve this, update the source to point to the Delta table's root folder (the directory that contains _delta_log) rather than an individual partition path such as year=…/month=…, and change the file format from Parquet to Delta (Delta Lake). When configured correctly, the Delta reader handles partition pruning automatically, with no need for wildcards or recursive searches.
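To illustrate why the root path is sufficient, here is a small stdlib-only sketch of hive-style partition pruning. It mimics, in miniature, what a Delta-aware reader does with `year=…` style folders; the folder names and single-level layout are simplifying assumptions:

```python
# Toy model of partition pruning over a hive-style layout; a real
# Delta reader also consults _delta_log, which this sketch skips.
import os

def prune_partitions(root: str, key: str, value: str) -> list:
    """Return the partition folders under *root* whose name is
    exactly 'key=value'; all other folders are skipped entirely."""
    wanted = f"{key}={value}"
    return sorted(
        os.path.join(root, d)
        for d in os.listdir(root)
        if os.path.isdir(os.path.join(root, d)) and d == wanted
    )
```

Because the reader selects matching partition folders itself, pointing the source at a single `year=…` subfolder only bypasses this mechanism and the transaction log along with it.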

 

Also, check that the pipeline’s Managed Identity (either Fabric workspace MI or ADF MI) has the required permissions, such as Storage Blob Data Contributor RBAC and suitable read/traverse ACLs on the container and table root. If the storage account uses a Private Endpoint or VNet, make sure the pipeline runtime can access the DFS endpoint. After these changes, both Preview and full Copy should work with the partitioned Delta table.
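For the networking part of that checklist, a minimal stdlib sketch can confirm that the runtime resolves the storage account's DFS endpoint at all (the account name below is a placeholder; RBAC and ACL grants themselves must still be done in the Azure portal or CLI):

```python
# Hedged connectivity sketch: DNS resolution of the DFS endpoint.
# A failure here usually points at Private Endpoint / VNet DNS issues.
import socket

def dfs_endpoint(account: str) -> str:
    """Hostname of the ADLS Gen2 (DFS) endpoint for *account*."""
    return f"{account}.dfs.core.windows.net"

def can_resolve(host: str) -> bool:
    """True if the current runtime can resolve *host* via DNS."""
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False
```

Running `can_resolve(dfs_endpoint("<account>"))` from the pipeline's environment distinguishes a name-resolution problem from a permissions problem before touching RBAC or ACLs.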

 

I hope this information is helpful. If you have any further questions, please let us know so we can assist you further.

 

Regards,

Microsoft Fabric Community Support Team.
 

Hi !

 

Thank you for engaging 

 

I'm still unable to read it.
1. My original path was already pointing to the folder that contains both the _delta_log and the partitioned subfolders containing the Parquet data.
2. Should the connection change from ADLS Gen2 to something else?
3. I can't see a format that says Delta.


Hi @nicolasbejar 

I want to assure you that there is nothing wrong with the way you originally configured your source. Pointing to the folder that contains _delta_log is exactly what you're supposed to do for a Delta table, and based on what you described, you were already following the correct Delta Lake conventions. My earlier explanation focused on how Delta tables are normally read (from the table root, through the Delta transaction log), which is why I highlighted those details.

The key insight, as @Ugk161610 pointed out, is that the limitation isn't with your setup at all, but with the current capabilities of the Copy Job UI. Even if the path and structure are correct, the Copy Job simply doesn't recognize Delta Lake tables in ADLS right now, which is why the "Delta" format isn't appearing for you and why the load continues to fail. Their clarification complements my explanation by making it clear that the issue lies with the tool rather than anything you misconfigured.

So while your approach is right, the Copy Job currently can’t process ADLS-backed Delta tables. Once Delta support is added to the Copy Job, your configuration should work exactly as expected.

I would also like to thank @Ugk161610 for their active participation and for sharing solutions within the community forum.

Regards,

Microsoft Fabric Community Support Team.


