I am trying to validate images loaded into a Lakehouse using an API call. The API is hosted on a VM server, and I will be receiving about 40,000 rows per day. When I call the API in batches of 50 rows, my notebook fails around the 1,000th row (the 25th batch) with the error above. I am using an F4 license; what could be the issue? I have tried batch sizes of 10, 20, and 50, and even tried using RDDs and mapPartitions, but I cannot process more than 1,000 rows; every time it fails with one error or another. What am I missing here?
@Lakssh Can you share how you are extracting the data from the REST API? Spark does not have a native REST API connector, so the Python requests library is usually used to fetch data from the API.
If you are calling the API from a Python UDF, that is not the most efficient approach and can cause errors. I would recommend splitting out the extraction logic, implementing it with Python requests, a Data Factory pipeline, or a Dataflow Gen2, and then running the transformation process afterwards. A sketch of the driver-side requests approach is shown below.
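For reference, here is a minimal sketch of that driver-side approach, assuming the endpoint accepts a JSON list of rows; the URL, payload shape, and the `df` DataFrame (your Lakehouse table) are placeholders:

```python
import requests

API_URL = "https://<vm-server>/validate"    # placeholder for the API hosted on the VM
BATCH_SIZE = 50

def validate_batch(rows, session):
    """Send one batch of rows to the API and return the parsed JSON response."""
    resp = session.post(API_URL, json={"rows": rows}, timeout=60)
    resp.raise_for_status()                  # surface HTTP errors instead of failing silently
    return resp.json()

# Collect the rows to the driver and loop over fixed-size batches.
# ~40,000 small rows per day is usually manageable driver-side.
rows = df.collect()                          # df = the Lakehouse table loaded earlier
results = []
with requests.Session() as session:          # reuse one connection across all batches
    for start in range(0, len(rows), BATCH_SIZE):
        batch = [r.asDict() for r in rows[start:start + BATCH_SIZE]]
        results.append(validate_batch(batch, session))
```

Keeping the HTTP calls on the driver like this also makes it easier to see exactly which batch fails and why, which a UDF tends to hide behind executor errors.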
Hi @Lakssh
It seems like this might be an issue related to query limits or data volume limits.
You could check the relevant API documentation to see if there are any query limits. For example, some APIs might limit the number of read and write requests per minute. Additionally, some APIs might restrict the maximum amount of data that can be queried per request.
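If the API does enforce a per-minute limit, a small retry-with-backoff wrapper is one way to stay under it. This is only a sketch, assuming the server returns HTTP 429 (and optionally a Retry-After header) when it throttles requests:

```python
import time
import requests

def call_with_backoff(session, url, payload, max_retries=5):
    """POST with exponential backoff whenever the API signals rate limiting."""
    for attempt in range(max_retries):
        resp = session.post(url, json=payload, timeout=60)
        if resp.status_code == 429:          # rate limited: wait, then retry
            wait = int(resp.headers.get("Retry-After", 2 ** attempt))
            time.sleep(wait)
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError(f"Gave up after {max_retries} rate-limited attempts")
```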
It might also be related to the computing power of the F4 SKU. You could try temporarily upgrading the Fabric capacity to F8 to see if it yields better results.