
"We couldn't load your SQL endpoint" Batch was cancelled error

I can't load any lakehouse or warehouse SQL endpoint in Fabric. It waits about a minute, not loading anything, and then shows the error message below. I've had this happen for the past week with no solution. Querying from SSMS is very slow and queries time out as well. 

 

[Screenshot: "We couldn't load your SQL endpoint" error dialog]

 

Status: Needs Info

Hi @souldish 

Have you made any changes in the meantime? If you connect directly to the data source, can you load the data? If you recreate a lakehouse using that data source and then load the data from the lakehouse, will you still have this problem?

 

Best Regards,
Community Support Team _ Ailsa Tao

 

Comments
v-yetao1-msft
Community Support
Status changed to: Needs Info


souldish
Frequent Visitor

The issue has now gotten worse. Any time I create a warehouse or a lakehouse, the SQL endpoint does not generate, and I get an error that it can't load the endpoint. When querying from SSMS, I get an error that the server does not exist and/or doesn't allow remote connections. I am now completely locked out of using any SQL endpoint for my tenant. I even tried a trial Fabric capacity, separate from the premium capacity, and had the same issue occurring: the SQL endpoint would not load.

 

I have done a lot of deletes on Fabric items and workspaces over the past week, and it feels like they aren't being properly deleted or disposed of. I saw another thread where they mentioned SQL processes are not being killed correctly, and they ran a query to find those executions. But now that I am locked out of using any SQL endpoint, I can't even determine if that's the issue.
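(For anyone who can still reach an endpoint: the kind of check that thread described can be sketched with the standard DMVs, which Fabric SQL endpoints expose. This is a hedged example, not the exact query from that thread; the 60-second threshold is an arbitrary illustration, and KILL requires sufficient permissions.)

```sql
-- Sketch: find requests that have been running unusually long,
-- which may indicate sessions that were never cleaned up.
-- The 60-second threshold is an arbitrary example value.
SELECT r.session_id,
       s.login_name,
       r.status,
       r.command,
       r.start_time,
       r.total_elapsed_time / 1000 AS elapsed_seconds
FROM sys.dm_exec_requests AS r
JOIN sys.dm_exec_sessions AS s
  ON s.session_id = r.session_id
WHERE r.total_elapsed_time > 60 * 1000
ORDER BY r.total_elapsed_time DESC;

-- If a session is confirmed stuck, end it with KILL,
-- substituting a session_id from the result above:
-- KILL 53;
```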

 

When I checked the Fabric capacity analysis report, all of the Fabric items had long running duration times. 

 

I'm more concerned about the effect this has on the premium capacity, especially with the Fabric capacity changes on Oct 1. I hope that this doesn't get billed to anything while still in preview.

 

EDIT: MS Support was able to fix the issue. However, when I re-created my lakehouse and added back all the shortcuts I had, I noticed the slowness again. Then I ran a complex query, and it gave me a 'This query was rejected due to current capacity constraints' error. Finally, I tried accessing the lakehouse again and it gave me the same batch error. So, back to talking with MS Support again.

Yalem
Employee

I have a similar issue. In my case, I'm trying to check whether there is any performance penalty for using shortcuts. After creating all my shortcuts, I ran a very complex query, and it ran successfully. Then I pushed the data into OneLake and ran the same query; this time it gave me the error "This query was rejected due to current capacity constraints." I added more capacity, up to 512 CU, and got the same result every time. The same query works fine against the shortcuts, so it definitely looks like a bug, because the shortcuts were able to execute it with 128 CU. Any help is appreciated.

 

Thanks!

JE_test
Frequent Visitor

I get the same error message when running a stored procedure in my Warehouse. It's reading data from one of my more complicated and bigger views. No other table or view gives me this error.

"This query was rejected due to current capacity constraints."

I get this error when running the stored procedure from a pipeline.

How can I see what the capacity constraints are? And how can I modify them?
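As far as I can tell there is no per-query setting to change: the rejection happens when the capacity's compute is saturated, so the practical levers are scaling the capacity SKU, rescheduling the pipeline, or reducing concurrent load (the Fabric Capacity Metrics app shows utilization). To see what is consuming compute at the moment the procedure fails, a query along these lines against the warehouse DMVs may help (a sketch; column availability can vary by endpoint):

```sql
-- Sketch: list active requests, heaviest first, to see what is
-- consuming capacity when the rejection occurs.
SELECT session_id,
       status,              -- e.g. 'running' or 'suspended'
       command,
       start_time,
       total_elapsed_time   -- milliseconds
FROM sys.dm_exec_requests
WHERE status IN ('running', 'suspended')
ORDER BY total_elapsed_time DESC;
```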

agneum
Helper I

Stick to Azure SQL. Don't keep your data as random files that need to be parsed every time you need to read the table.