I'm developing a Fabric pipeline where I have to bring in data from a table and run a ForEach loop that makes an API call for every record, dynamically building the body of each call. The endpoint I'm accessing has a rate limit of 200 calls per hour, so I'm looking to bring in 200 records per pipeline run (something like SELECT * FROM &lt;table&gt; LIMIT 200) and run the pipeline multiple times depending on the total number of records. Since I'm not executing the ForEach loop sequentially, I can't use a counter variable in the loop and stop processing once it reaches 200. So my question is: how do I bring in a specific number of records from the table?
I explored the Lookup and Copy activities, but neither of them lets me add a custom query.
Thanks in advance for any suggestion
Hi @jwinchell40, thanks for your suggestions. My source is a Lakehouse table, and I ended up using a notebook to copy a set number of records to another table, which I'm reading in the pipeline with a Lookup. Not the best solution, but I got it to work this way.
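The staging step described above can be sketched in plain Python (the actual notebook would do the same with Spark, e.g. `spark.table(...).limit(200)` written out to a staging table). The record shape and the `processed_ids` bookkeeping here are illustrative assumptions, not part of the original solution:

```python
def next_batch(records, processed_ids, batch_size=200):
    """Return up to batch_size records that have not been sent yet.

    Mirrors the notebook step: pick the next slice of rows and stage
    them in a table the pipeline's Lookup activity can read.
    processed_ids is a hypothetical set of already-handled keys, used
    so later runs don't re-send the same records.
    """
    return [r for r in records if r["id"] not in processed_ids][:batch_size]


# Example: 5 source rows, 2 already processed, batch size 2.
rows = [{"id": i} for i in range(5)]
batch = next_batch(rows, processed_ids={0, 1}, batch_size=2)
print([r["id"] for r in batch])  # -> [2, 3]
```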
@nkailasamurthy - Is your source table a Lakehouse table? You can use the Lookup activity to limit the number of records returned when using a SQL data source. We do this today to retrieve a list of table names that we need to loop through to sync data from a different source, and we often limit that query to the top 5 rows when testing in Dev. However, when connected directly to the Lakehouse, I don't think you can apply a limit. What I haven't tried is setting up a SQL data connection to the Lakehouse to see whether it behaves like a SQL endpoint.
If it is a Lakehouse table, you could also look at calling a notebook from the pipeline. That would allow you to limit the number of rows, retrieve data from the API, and write the results wherever they're needed. I'm currently doing this to sync data from a 3rd-party API.
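The notebook approach above (limit the rows, call the API per record, write the results) can be sketched roughly as follows. The `send` callable, the field names in the body, and the cap wiring are illustrative assumptions rather than Fabric APIs:

```python
import json


def build_body(record):
    # Dynamically build one request body from a table row, as the original
    # post describes. The field names here are hypothetical.
    return json.dumps({"id": record["id"], "name": record["name"]})


def call_api_for_batch(records, send, max_calls=200):
    """Post at most max_calls records via `send` (any callable that takes
    a request body and returns a response). Capping the loop keeps a
    single run inside the 200-calls/hour rate limit."""
    responses = []
    for record in records[:max_calls]:
        responses.append(send(build_body(record)))
    return responses


# Example with a stubbed-out API call (echoes the body back):
rows = [{"id": i, "name": f"row{i}"} for i in range(300)]
out = call_api_for_batch(rows, send=lambda body: body, max_calls=200)
print(len(out))  # -> 200
```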
Jeremy
@nkailasamurthy - Glad to hear you found a solution. I took the same approach you did when I first ran into this issue, except I loaded the data into a DW table and used that in my Lookup. Hopefully the Lookup activity will eventually allow queries on Lakehouse tables.
Hi @nkailasamurthy ,
Lookup activity: the Lookup activity in Microsoft Fabric can return up to 5,000 rows, but it does not let you add a custom query directly to limit the number of rows.
ForEach activity: you can specify whether items in a batch are processed sequentially or in parallel, and set the maximum number of items processed concurrently via the Batch count setting.
For more details, you can refer to below document:
Lookup activity - Microsoft Fabric | Microsoft Learn
ForEach activity - Microsoft Fabric | Microsoft Learn
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.