Hi beautiful people,
I'm working on an application to update a Lakehouse Delta table in Microsoft Fabric. My app needs to take source system credentials and write them to the Lakehouse.
I've already:
Could you please advise on any other considerations for using the Microsoft Fabric APIs to update Lakehouse Delta tables? Am I missing anything crucial? Thanks in advance 🙂
Hi @RonaldBalza-943 ,
Thanks for posting in Microsoft Fabric Community.
When building an application to update a Lakehouse Delta table using a Service Principal, it’s important to note that Microsoft Fabric does not currently support direct insert, update, or delete operations on Delta tables through REST APIs.
A supported method is to first upload data files such as CSV or Parquet into the Lakehouse Files section. After that, the Load to Tables API can be used to ingest the data into a Delta table. This is an asynchronous operation, and the load status can be tracked via the operation ID returned.
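The upload-then-load flow above can be sketched as a small helper that builds the Load to Tables request. This is a minimal sketch: the endpoint shape follows the pattern of the Fabric REST APIs, and the workspace ID, lakehouse ID, table name, and file path are placeholders, not values from this thread.

```python
import json

FABRIC_BASE = "https://api.fabric.microsoft.com/v1"

def build_load_request(workspace_id: str, lakehouse_id: str,
                       table_name: str, relative_path: str):
    """Build the URL and JSON body for loading a file from the
    Lakehouse Files section into a Delta table."""
    url = (f"{FABRIC_BASE}/workspaces/{workspace_id}"
           f"/lakehouses/{lakehouse_id}/tables/{table_name}/load")
    body = {
        "relativePath": relative_path,   # path under the Files section
        "pathType": "File",
        "mode": "Overwrite",             # or "Append"
        "formatOptions": {"format": "Csv", "header": True, "delimiter": ","},
    }
    return url, body

# POSTing this with a bearer token is expected to return 202 Accepted;
# the operation ID in the response headers can then be polled for status.
url, body = build_load_request("ws-123", "lh-456", "sales",
                               "Files/staging/sales.csv")
print(url)
print(json.dumps(body))
```

Because the load is asynchronous, the application should poll the returned operation until it reports success before assuming the Delta table has been updated.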
Another approach is to trigger a Spark notebook using the Notebook Public API. The notebook can perform necessary transformations using PySpark and write the results to the Delta table using standard Spark operations such as merge, overwrite, or append modes.
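Triggering that notebook from the application goes through the run-on-demand job endpoint. The request builder below is a sketch under the assumption that the job-instances endpoint accepts `jobType=RunNotebook` with an `executionData.parameters` body; the IDs and the `target_table` parameter are hypothetical placeholders.

```python
import json
from typing import Optional

FABRIC_BASE = "https://api.fabric.microsoft.com/v1"

def build_run_notebook_request(workspace_id: str, notebook_id: str,
                               parameters: Optional[dict] = None):
    """Build the URL and body to trigger a notebook run on demand."""
    url = (f"{FABRIC_BASE}/workspaces/{workspace_id}"
           f"/items/{notebook_id}/jobs/instances?jobType=RunNotebook")
    # Parameters are passed through to the notebook's parameter cell.
    body = {"executionData": {"parameters": parameters or {}}}
    return url, body

url, body = build_run_notebook_request(
    "ws-123", "nb-789",
    {"target_table": {"value": "sales", "type": "string"}})
print(url)
print(json.dumps(body))
```

Inside the notebook itself, the Delta merge/overwrite/append is ordinary PySpark, so the application only needs to poll the job instance until it completes.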
Ensure that the Service Principal has the Contributor role at the workspace level and sufficient permissions on the Lakehouse and its schemas.
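Before either API call, the Service Principal authenticates with a client-credentials token against the Fabric scope. A minimal sketch of building that token request, assuming the standard Microsoft identity platform v2.0 endpoint and the `.default` application scope for the Fabric API; the tenant, client ID, and secret are placeholders:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the token endpoint URL and form body for the
    OAuth 2.0 client-credentials flow."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    form = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # App-level scope covering the Fabric REST API surface.
        "scope": "https://api.fabric.microsoft.com/.default",
    })
    return url, form

url, form = build_token_request("tenant-guid", "app-guid", "secret-value")
print(url)
```

The `access_token` from the response then goes into the `Authorization: Bearer` header of the Lakehouse and notebook calls. Store the client secret in a secure store rather than in application code.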
For a broader overview of managing Lakehouse assets via API, refer to Managing Lakehouse using Fabric APIs.
The following threads may be helpful:
Hope this helps. Please reach out for further assistance.
If this post helps, please consider giving it kudos and accepting it as the solution so other members can find it more quickly.
Thank you.
Thanks very much v. Appreciate your insights!