gregahren
Helper I

Spark Connector for Data Warehouse - Fails with FabricSparkTDSHttpFailure 429

Hi everyone,

 

We are trying to use the Spark connector for Fabric Data Warehouse: https://learn.microsoft.com/en-us/fabric/data-engineering/spark-data-warehouse-connector

 

We have some T-SQL queries that we execute in parallel. Previously those queries were written in Spark SQL; now we would like to run them as T-SQL through the connector.

 

But after a number of queries have run, I get this error: `An error occurred while calling o55682.synapsesql.
: com.microsoft.spark.fabric.tds.error.FabricSparkTDSHttpFailure: Artifact ID inquiry attempt failed with error code 429. Request Id - .`

Is there a limit on how many queries we can run in a session, and how can we work around it?

 

Thanks for your help

1 ACCEPTED SOLUTION

Thanks for sharing the details @gregahren

 

Which F SKU are you on?

 

Here are my thoughts:

 

The SQL analytics endpoint (used for T-SQL queries) in Fabric Data Warehouse imposes throttling rules (smoothing policies) that can produce HTTP 429 errors when too many requests are submitted concurrently over a short period. This mechanism is intended to manage resource saturation during peak usage.

 

 


• While it appears your workload exceeds an unofficial threshold (around 50 requests per 50 seconds), this limit isn't formally documented as a "50 req/50 sec" rule. Instead, it is part of the broader "compute capacity smoothing and throttling" policies, which govern background operations and expect users to manage concurrency when using T-SQL via the connector:


https://learn.microsoft.com/bs-latn-ba/fabric/data-warehouse/compute-capacity-smoothing-throttling
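If the throttling can't be avoided entirely, a common client-side mitigation is to retry throttled calls with exponential backoff. A minimal sketch in plain Python (`with_retries` and the string match on "429" are my own illustration, based on the `FabricSparkTDSHttpFailure ... error code 429` message in the question, not an official connector API):

```python
import random
import time

def with_retries(fn, *args, max_attempts=5, base_delay=2.0):
    """Call fn(*args), retrying with exponential backoff when the error
    message contains '429' (how the throttling failure surfaces above)."""
    for attempt in range(max_attempts):
        try:
            return fn(*args)
        except Exception as exc:
            # Give up on the last attempt, or on non-throttling errors.
            if attempt == max_attempts - 1 or "429" not in str(exc):
                raise
            # 2 s, 4 s, 8 s, ... plus a little jitter to spread threads out.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

Each worker thread would then call `with_retries(run_tsql, i)` instead of `run_tsql(i)` directly, so a burst of 429s delays the thread rather than failing the whole job.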

 

Hope this is helpful.

 

If it is, please accept the answer.

 

 


5 REPLIES
nilendraFabric
Super User

Hello @gregahren 

 

Could you please share more details about the use case?

Yes of course.

 

We run some queries in parallel threads when processing our data. This works with Spark SQL, but with T-SQL it throws 429 errors. I put together an example you can run in a notebook to reproduce the issue:

 

import com.microsoft.spark.fabric
from com.microsoft.spark.fabric.Constants import Constants
from concurrent.futures import ThreadPoolExecutor

WORKSPACE_ID = "your_workspace_id"
LAKEHOUSE_NAME = "lakehouse_name"

def run_spark_multiple_threads(target_function, args_list, number_of_threads=8):
    # Fan the work out over a thread pool and collect all results.
    with ThreadPoolExecutor(number_of_threads) as pool:
        futures = [pool.submit(target_function, *args) for args in args_list]
        results = [future.result() for future in futures]
    return results

def run_tsql(i):
    # Reads through the SQL analytics endpoint via the connector.
    df = (
        spark.read.option(Constants.WorkspaceId, WORKSPACE_ID)
        .option(Constants.DatabaseName, LAKEHOUSE_NAME)
        .synapsesql("SELECT 1 AS Temp")
    )
    print(i, df.count())

def run_sparksql(i):
    # Plain Spark SQL, which does not hit the endpoint.
    df = spark.sql("SELECT 1 AS Temp")
    print(i, df.count())

run_spark_multiple_threads(run_sparksql, [(i,) for i in range(100)], number_of_threads=8)
print("done with spark sql")

run_spark_multiple_threads(run_tsql, [(i,) for i in range(100)], number_of_threads=8)
print("done with tsql")

 

 

Here you will see that Spark SQL finishes all 100 queries normally, while T-SQL stops working after roughly 50 queries. I believe there is a rate limit of about 50 requests per 50 seconds, but it isn't mentioned in the limitations of the Spark connector for Microsoft Fabric Data Warehouse.
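One way to stay under that apparent threshold is to rate-limit submissions on the client side before they reach the connector. A minimal sketch in plain Python (the 50-calls-per-50-seconds numbers are just the behaviour observed above, not a documented limit, and `run_tsql_limited` is a hypothetical wrapper around the `run_tsql` from the example):

```python
import threading
import time
from collections import deque

class SlidingWindowLimiter:
    """Allow at most `max_calls` calls per `window_seconds`, across threads."""

    def __init__(self, max_calls=50, window_seconds=50.0):
        self.max_calls = max_calls
        self.window = window_seconds
        self._stamps = deque()          # monotonic timestamps of recent calls
        self._lock = threading.Lock()

    def acquire(self):
        """Block until a slot is free, then record the call."""
        while True:
            with self._lock:
                now = time.monotonic()
                # Evict timestamps that have aged out of the window.
                while self._stamps and now - self._stamps[0] >= self.window:
                    self._stamps.popleft()
                if len(self._stamps) < self.max_calls:
                    self._stamps.append(now)
                    return
                wait = self.window - (now - self._stamps[0])
            time.sleep(wait)

# Shared by all worker threads; each T-SQL call takes a slot first.
limiter = SlidingWindowLimiter(max_calls=50, window_seconds=50.0)

def run_tsql_limited(i):
    limiter.acquire()   # blocks if 50 calls were made in the last 50 s
    run_tsql(i)         # the run_tsql from the example above
```

Passing `run_tsql_limited` instead of `run_tsql` to `run_spark_multiple_threads` keeps the pool at 8 threads but throttles how fast requests actually leave the notebook.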

 

If you need more info, please let me know.


Thank you for your answer, I guess we will just have to take those limits into account.

We are on F4 capacity currently.

 

Thanks for your quick answer again!

@gregahren Glad it was helpful. Could you please accept the solution, as this will help the community find the right information quickly.
