goldenarm253
Frequent Visitor

Fabric Lakehouse on an F4 license: Spark API rate limit, while Warehouse has no issues

Introduction:

I am new to the Fabric Community and have three years of experience with Power BI. I have passed the PL-300 exam and have a good understanding of the Power BI Pro license.

 

Fabric Subscription:

Currently using the F4 Fabric license.

 

Challenge:

We recently migrated a workspace to Fabric and are facing some challenges. We are importing Salesforce tables into Dataflow Gen2, and from there, copying them into a Lakehouse.

 

  • Dataflow Gen2 to Lakehouse

    • Encountering a Spark API rate limit, even though the user table has fewer than 500 records.
  • Dataflow Gen2 to Warehouse

    • No issues using this feature, but we are unable to use Spark to query the data.

Any suggestions?

1 ACCEPTED SOLUTION
audreygerred
Super User

Hello! The rate limit isn't based solely on the size of the data. The F4 SKU has limits on the number of concurrent Spark jobs and API calls that can be made, so that may have something to do with your issue. When those limits are exceeded, you may get errors indicating the limit has been reached. F4 enforces concurrency limits and queue sizes, so only a certain number of jobs can run at the same time. You can hit rate limits because of frequent API calls (many calls in a short window can trigger the rate limit regardless of data size), concurrent jobs, complex operations, and burst usage. You can look into spreading the calls out over a longer period of time or upgrading to a higher SKU. Also, here is a link where you can open a support ticket if needed: Microsoft Fabric Support and Status | Microsoft Fabric





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!





