BW_RFA
Helper I

LivyHttpRequestFailure error when updating Delta Table.

I am updating a Delta table using a Spark SQL MERGE statement.

 

The data I am using to merge can contain duplicates on the ID, so I am splitting the data into deduplicated portions and running the merge statement multiple times in sequence. The merge runs fine for the majority of the data over the first two or three passes, but when there are only one or two records left to merge I get a failure error, shown below.

 

(Screenshot BW_RFA_0-1712053064449.png: the LivyHttpRequestFailure error message)

 

Advice on the error, and on the process I am using to solve this problem, is welcome.

 

Below is what I am trying to do.

(Screenshot BW_RFA_2-1712053641337.png: the notebook code for the approach described above)
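Since the actual notebook code is only visible in the screenshot, the sketch below is an approximation of the sequential-merge pattern described above; the table name, column names (id, updated_at) and staging path are placeholders, not the real ones.

```python
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Source rows can repeat the same ID, so number the duplicates and
# merge one de-duplicated "pass" at a time (placeholder path/columns).
src = spark.read.format("delta").load("Files/staging/updates")

w = Window.partitionBy("id").orderBy("updated_at")
src = src.withColumn("pass_no", F.row_number().over(w))

max_pass = src.agg(F.max("pass_no")).first()[0]

for p in range(1, max_pass + 1):
    # Each pass contains at most one row per ID, so the MERGE is unambiguous.
    src.filter(F.col("pass_no") == p).drop("pass_no") \
       .createOrReplaceTempView("updates_batch")

    spark.sql("""
        MERGE INTO target_table AS t
        USING updates_batch AS s
        ON t.id = s.id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)
```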

1) Is the error due to capacity/memory?

2) Should I combine the updates into one row before merging with the existing table, instead of running sequential merges?

 


2 REPLIES
v-cboorla-msft
Community Support

Hi @BW_RFA 

 

Thanks for using Microsoft Fabric Community.

Apologies for the inconvenience.

 

The LivyHttpRequestFailure error you are encountering while running a Spark SQL MERGE statement indicates that there was an issue with the Livy service while processing your request. The HTTP status code 500 suggests that the server encountered an internal error during request processing. This error can occur for various reasons, such as network connectivity issues or resource constraints.

 

Wait and Retry: Sometimes, the issue might be temporary. Consider waiting and trying the operation again later. If the problem persists, proceed to the next steps.

Check Resource Utilization: Verify that your Fabric workspace has sufficient resources available to run your job; insufficient resources can lead to errors. You can check resource utilization by navigating to the "Monitoring hub" section of your workspace. Also ensure that the assigned Apache Spark pool has enough capacity to handle the data you are processing; an under-provisioned pool can result in errors.

Combine Updates into One Row: Instead of performing sequential merges for individual records, consider combining the updates into a single row before merging with the existing table. This approach can reduce the overhead of multiple merge operations and potentially improve performance.
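
As a rough illustration of the last suggestion, here is a minimal PySpark sketch that collapses the updates to one row per key (keeping the most recent row, which is one possible way to combine them) before running a single MERGE. The DataFrame src and the names id, updated_at and target_table are assumptions for illustration only.

```python
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Collapse the incoming updates to a single row per ID (here: keep the
# most recent row by a hypothetical updated_at column), then merge once.
w = Window.partitionBy("id").orderBy(F.col("updated_at").desc())

deduped = (
    src.withColumn("rn", F.row_number().over(w))
       .filter(F.col("rn") == 1)
       .drop("rn")
)
deduped.createOrReplaceTempView("updates_deduped")

spark.sql("""
    MERGE INTO target_table AS t
    USING updates_deduped AS s
    ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```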

 

You can refer to this thread, which might help: One Large Text File Parsing 200MB Error 500.

 

I hope this information helps. Please do let us know if you have any further questions.

 

Thanks.

Thanks, I have merged the duplicate rows and have not faced the issue again. Performance is much better.

Thanks
