Hi
I am setting up a delta load from a REST API, where I start by using a lookup to get the timestamp of the latest record in the destination and then use this value as a parameter in a copy activity.
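The parameter is taken from the lookup's first-row output, roughly like this (activity and column names simplified):

@activity('Lookup latest timestamp').output.firstRow.MaxTimestamp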
This works fine, unless there are no newer records in the API, in which case I get an empty response, which causes the copy activity to fail.
I then tried a lookup to get the number of rows from the API and use the result in a condition, but now the lookup fails because of the empty response.
Any input here on how to handle this empty response from the API?
Thanks in advance!
You can configure the lookup to take an alternative path when it fails, such as exiting the pipeline or bypassing the copy activity. This will mark the overall pipeline run as successful, although the run details will still show the failed lookup.
You can add an activity on the failure path; this way, even if the copy activity fails, the pipeline will still complete successfully.
Example:
I have added an activity to store the error code on failure.
@string(activity('Copy data1').Error.errorCode)
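If you also want to capture the error text, the same pattern works for the message (using the same activity name as above):

@string(activity('Copy data1').Error.message)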
The blog below explains this concept in detail:
Note: ADF and data pipelines in Fabric are similar concepts.
@ThomasPyndt - Sounds like you may be limited by the API. Does the API allow you to use between conditions? It sounds like what you really want is new/updated records between the max date in the destination and the date/time the pipeline kicks off. Is that correct?
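If it does support that, the copy activity's relative URL or query could be built from both bounds, along these lines (the query parameter names, activity name, and column name are made up, since we don't know your API):

@concat('?updated_after=', activity('Lookup latest timestamp').output.firstRow.MaxTimestamp, '&updated_before=', pipeline().TriggerTime)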
The only way I've found to handle situations like that, when the source API is not the greatest, is to fail gracefully and try to catch the changes on the next run. I worked with an API where an entire page would just drop out of the results and return empty. We would fail, retry the same query and page, and it would return data.
I would check the Azure Data Factory support forums as well. I did a quick scan, and most of what I see falls into the same fail-and-retry-gracefully pattern.