
D_PBI
Post Partisan

Azure Data Factory - help needed to ingest data from an API into a SQL Server instance

Hi,
I need to ingest 3rd-party data, exposed via an API, into my Azure SQL Server instance (ASSi) using Azure Data Factory (ADF).
I'm a report developer, so this area is unfamiliar to me, although I have previously integrated external, on-premises SQL Server data into our ASSi using ADF, so I do have some exposure to the tool (just not API connections).

The 3rd-party data belongs to a company named 'iLevel' (in case this is relevant). iLevel have provided some API documentation which is aimed at an experienced data engineer who understands API connections. The documentation has only a few sections covering the connection details. I'll list them below:
1) It says to download a 'Postman Collection' and mentions nothing more. I've never heard of a Postman collection and I don't know why it's needed; from my limited reading online, I don't understand its purpose, certainly not in my scenario.

2) It has the title 'Access to the API' and then lists four URLs which are the endpoints (I don't know which to use and will need to ask iLevel, but I guess any will do for testing purposes).

3) Authentication and Authorization
a) Generate a 'client id' and 'client secret' by logging into iLevel and generating these values with a few button clicks. I've successfully generated both of these values.
b) Obtain an Access Token - it'll be easier to screenshot the instructions for this (I've blanked part of the URL for confidentiality).

D_PBI_1-1769602929348.png
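For what it's worth, step 3b usually describes a standard OAuth2 client-credentials token request. The sketch below shows that typical flow in Python; the token URL, field names, and response shape are assumptions for illustration only, and the blanked-out iLevel URL in the screenshot remains the authority.

```python
# Sketch of a typical OAuth2 client-credentials token request.
# ASSUMPTIONS: the token URL, form field names, and response shape are
# placeholders -- iLevel's documentation is the authority here.
import json
import urllib.parse

TOKEN_URL = "https://<blanked-ilevel-host>/token"  # taken from the vendor docs

def build_token_request_body(client_id: str, client_secret: str) -> str:
    """Form-encoded body for a client_credentials grant."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })

def extract_access_token(response_text: str) -> str:
    """Pull the token out of a typical JSON token response."""
    return json.loads(response_text)["access_token"]

body = build_token_request_body("my-client-id", "my-client-secret")
# This body would be POSTed to TOKEN_URL with the header:
#   Content-Type: application/x-www-form-urlencoded
# A typical (simulated) token response looks like:
sample_response = '{"access_token": "abc123", "token_type": "Bearer", "expires_in": 3600}'
token = extract_access_token(sample_response)
```

In ADF terms, this whole request is what the Web activity performs: the URL, method (POST), Content-Type header, and body fields above map directly onto the Web activity's Settings pane.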


These are all the instructions on connecting to the 3rd-party data. Unfortunately for me, my lack of experience in this area means these instructions don't help me. I don't believe I'm any closer to connecting to the 3rd-party source data.

Taking the above instructions into consideration, but choosing to trial-and-error in ADF, a tool I'm a little more familiar with, I've performed the following steps:
1) Created a Linked Service.
I understand the iLevel solution is in the cloud, and therefore 'AutoResolveIntegrationRuntime' has been selected as the 'Connect via integration runtime' value. For the 'Base URL' I've entered one of the four endpoint URLs listed in the documentation (again, I will need to confirm which endpoint to use).

 

D_PBI_3-1769603443029.png

The 'Test Connection' returns a successful result, but I think it means nothing: if I place 'xxx' at the end of the Base URL and test the connection, it still returns success, when I know the URL with the 'xxx' suffix isn't legitimate.

2) Created an ADF Pipeline containing 'Web activity' and 'Set variable' objects.
The only configuration on the Web activity is in the 'Settings' pane, which has:

D_PBI_4-1769609402491.png

 

The 'Body' property has the following (the client id and client secret, taken from the iLevel solution, are included in the body but blanked out):

D_PBI_5-1769609501227.png


If the Web activity is successful, then the Pipeline's next object (the Set variable) should assign the access token to a variable - as I understand it, this is what the Web activity returns:

D_PBI_6-1769610077194.png

The 'Value' property has:

D_PBI_7-1769610414037.png
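For comparison, the 'Value' of a Set variable that captures a token from a preceding Web activity is typically an ADF dynamic expression along these lines (assuming the Web activity is named 'Web1' and the vendor returns the token in an `access_token` field - both are assumptions, as the actual names depend on your pipeline and on iLevel's response):

```
@activity('Web1').output.access_token
```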

 

This is as far as I've got in my efforts on this integration task, because the Web activity object fails when executed. The error message states it is to do with an invalid 'client id' or 'client secret' - see below:

D_PBI_8-1769610598674.png

 

You may direct me to focus on the incorrect client id or client secret; however, I don't have any confidence that I understand how to configure ADF to obtain an access token, and I may be missing something if I see no need for the Postman collection.

What is a Postman collection, and do I need it for what I'm trying to achieve? If yes, can anyone provide training material that suits my needs?
Have I configured ADF correctly and it is indeed an issue with the client id or client secret, or is the error message received just a byproduct of an incorrect ADF configuration?

Your help will be most appreciated. Many thanks.

3 REPLIES
D_PBI
Post Partisan

@Anonymous Thanks for your reply. What you describe isn't a dissimilar path to what I'm expecting is required. The problem I face is that unless I get every configuration correct, it won't work - just like the norm in the techie world. I don't understand APIs and their required configuration well enough to complete my aim.
Are you aware of any material that will explain my need and guide me on what to do? I've reviewed several articles/videos and each has its own uniqueness which isn't suited to my end-to-end scenario, unless I am misunderstanding the subject, which is entirely possible.

Hi @D_PBI 

Thank you for the confirmation. Since your issue appears to be specific to Azure Data Factory, I would recommend posting it in the Azure Data Factory community forum. That forum is more focused on ADF-related topics, and you're likely to receive more targeted guidance from experts there.

Anonymous
Not applicable

Hi @D_PBI ,

Thank you for reaching out to the Microsoft Fabric Community Forum. 

 

When working with an API source in Azure Data Factory, the approach differs from loading data directly from a database, as ADF must first authenticate before accessing any data. This is why the vendor referenced using a client ID and client secret: most APIs require you to generate an access token via an authentication endpoint. The Postman collection provided serves as a reference, outlining the necessary requests, headers, and authentication flow to replicate within ADF.

To configure this, begin by creating a pipeline and adding a Web activity to obtain the access token from the vendor’s token URL, typically using a POST request with the required client credentials. Once you receive the token, store it in a pipeline variable using a Set Variable activity. Next, use a Copy Data activity with the REST connector to call the actual API endpoint, passing the token dynamically in the Authorization header as a Bearer token. The resulting data can then be written directly into your Azure SQL table by setting up the sink and mapping the fields as needed.
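The Bearer-token step above can be sketched as follows. In ADF it would be a dynamic expression on the Copy activity's additional headers, but the shape of the header an OAuth2-protected API expects is the same; the token value here is a stand-in.

```python
# Sketch of the Authorization header the data request must carry once a
# token has been obtained. In ADF this header is usually built with a
# dynamic expression such as:
#   @concat('Bearer ', variables('accessToken'))
# Shown in Python purely to illustrate the shape; the token is a stand-in.
def bearer_header(access_token: str) -> dict:
    """Authorization header for an OAuth2 Bearer-token API call."""
    return {"Authorization": f"Bearer {access_token}"}

headers = bearer_header("abc123")
# These headers accompany the GET/POST against the vendor's data endpoint.
```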

Additionally, verify whether the API response is paginated or contains nested JSON, as this may require further configuration to ensure complete data ingestion. With these steps, you establish a secure and efficient method for loading API data into Azure SQL using Azure Data Factory.
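To make the nested-JSON point concrete, here is a small sketch of flattening a nested API payload into flat rows suitable for a SQL sink. The field names ("data", "attributes", "nextPage") are invented for illustration; iLevel's actual payload will differ, and in ADF this mapping is done in the Copy activity's mapping/pagination settings rather than in code.

```python
# Sketch: flattening a nested JSON API response into flat rows before
# writing to SQL. ASSUMPTION: the payload shape ("data", "attributes",
# "nextPage") is invented for illustration.
import json

sample_page = json.loads("""
{
  "data": [
    {"id": 1, "attributes": {"name": "Fund A", "value": 100.0}},
    {"id": 2, "attributes": {"name": "Fund B", "value": 250.5}}
  ],
  "nextPage": null
}
""")

def flatten(page: dict) -> list[dict]:
    """Promote each item's nested attributes to top-level columns."""
    rows = []
    for item in page["data"]:
        row = {"id": item["id"]}
        row.update(item["attributes"])  # nested fields become columns
        rows.append(row)
    return rows

rows = flatten(sample_page)
# A non-null "nextPage" would mean another request is needed -- in ADF,
# that is what the REST connector's pagination rules handle.
```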

 

Thank you.

 
