<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How many sessions I could run in F2 Capacity? in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354705#M5953</link>
<description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;,&lt;BR /&gt;Thanks for reaching out and sharing the details about the issue you're facing with running Spark jobs on the F2 capacity. After looking into it, I believe the problem is a combination of &lt;STRONG&gt;resource contention&lt;/STRONG&gt; and &lt;STRONG&gt;persistent Spark sessions&lt;/STRONG&gt; that continue running in the background even after the query finishes in your notebook.&lt;/P&gt;
&lt;P&gt;When you run a query in the notebook, it may appear to complete successfully, but the Spark session can remain active in the background, holding onto resources. This can cause the F2 capacity limits to be reached.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Possible solutions:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;1. After running each query, make sure to call &lt;STRONG&gt;spark.stop()&lt;/STRONG&gt; in your notebook to explicitly terminate the Spark session. This will release the resources and allow other jobs to run without hitting the capacity limit.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Example:&lt;/STRONG&gt;&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;df = spark.sql("SELECT * FROM TestLakehouse.us_population_county_area LIMIT 1000")
display(df)
spark.stop()&lt;/LI-CODE&gt;
&lt;P&gt;&lt;STRONG&gt;2. Bursting:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Enabling bursting allows you to use up to 20 Spark VCores instead of the base 4, which can help if you’re running multiple lightweight queries concurrently. While bursting gives you more headroom, it’s still important to manage sessions actively.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;3. Optimize your Spark pool configuration:&lt;/STRONG&gt;&lt;/P&gt;
&lt;DIV class=""&gt;
&lt;DIV class="" dir="ltr"&gt;
&lt;P&gt;Review the Spark pool settings to make sure you’re using the right node size and max nodes. Enabling dynamic allocation could help manage resources more efficiently, scaling up or down based on the workload.&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vveshwaramsft_0-1736321662399.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1223327i436E92CAF8BAC6D1/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vveshwaramsft_0-1736321662399.png" alt="vveshwaramsft_0-1736321662399.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;I’d suggest starting by enabling bursting and managing your sessions more carefully with spark.stop() after each query. If that doesn’t fully resolve the issue, an upgrade to a larger capacity like F4 might be necessary, especially if your jobs are more resource-intensive.&lt;BR /&gt;&lt;BR /&gt;Refer to the links below for more detail:&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-engineering/spark-job-concurrency-and-queueing" target="_self"&gt;Concurrency limits and Bursting in Microsoft Fabric&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-warehouse/burstable-capacity" target="_self"&gt;Burstable capacities&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;Let me know if you need any help with these changes or if you have any other questions. I'm happy to assist further!&lt;BR /&gt;&lt;BR /&gt;If this helps, please accept it as the solution to help others benefit; kudos would be appreciated.&lt;/P&gt;
&lt;P&gt;Best regards,&lt;BR /&gt;Vinay.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Wed, 08 Jan 2025 07:55:16 GMT</pubDate>
    <dc:creator>v-veshwara-msft</dc:creator>
    <dc:date>2025-01-08T07:55:16Z</dc:date>
    <item>
      <title>How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4353003#M5929</link>
      <description>&lt;P&gt;I was exploring Fabric capacity and I am currently on F2, but I was only able to run 1 Spark job; as soon as I started another job it wouldn't let me and gave a "too many requests" error. I wanted to know how many jobs I can run concurrently on an F2, and is there any documentation which states Spark job limitations?&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jan 2025 08:07:57 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4353003#M5929</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-07T08:07:57Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4353468#M5938</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;- thanks for posting.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Yes, there are limits on the number of sessions that can be created based on the capacity SKU, due to the compute resources for the capacity. Concurrency limits are based on the number of Spark VCores for the capacity - each capacity unit (CU) provides 2 Spark VCores, so the F2 capacity has 2 capacity units and therefore 4 Spark VCores. You can run multiple sessions, but the exact number will depend on the capacity resources available and the requirements of each session. If the capacity is temporarily fully utilized, additional sessions will be queued until capacity resources are available; if the capacity is fully utilized for a prolonged period of time, sessions may be throttled or rejected. This can be monitored using the Capacity Monitoring Report.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Here are some links to documentation where you can read more.&lt;/P&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-engineering/spark-job-concurrency-and-queueing" target="_blank"&gt;Concurrency limits and queueing in Apache Spark for Fabric - Microsoft Fabric | Microsoft Learn&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-install?tabs=1st" target="_blank"&gt;Install the Microsoft Fabric capacity metrics app - Microsoft Fabric | Microsoft Learn&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Please let me know if there are any other questions I can answer.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jan 2025 13:18:02 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4353468#M5938</guid>
      <dc:creator>jennratten</dc:creator>
      <dc:date>2025-01-07T13:18:02Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4353476#M5939</link>
      <description>&lt;P&gt;So I have been running a basic query against a delta table that is already created; below is the query I am running in the notebook:&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;df = spark.sql("SELECT * FROM TestLakehouse.us_population_county_area LIMIT 1000")
display(df)&lt;/LI-CODE&gt;
&lt;P&gt;For the first table it runs and succeeds, but it keeps on running in the monitor window, and when I run the same query for another table it throws the capacity error I posted previously. I don't understand why, if the query has completed in the notebook, it keeps on running in the monitor, and why I have to manually cancel it before firing another query in order to avoid the F2 capacity request issue. Again, these are just basic queries, nothing fancy.&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jan 2025 13:22:09 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4353476#M5939</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-07T13:22:09Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354639#M5950</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It is strange that the query keeps running after it finishes. Ideally, you shouldn't have to do this, but try adding the code below at the end of your notebook to see if it terminates the session.&lt;/P&gt;&lt;LI-CODE lang="python"&gt;spark.stop()&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;You can also try using a high concurrency session for your notebooks if the queries are split across different notebooks, so that you do not hit the capacity exceeded issue.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-engineering/configure-high-concurrency-session-notebooks-in-pipelines" target="_blank"&gt;https://learn.microsoft.com/en-us/fabric/data-engineering/configure-high-concurrency-session-notebooks-in-pipelines&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 07:17:59 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354639#M5950</guid>
      <dc:creator>govindarajan_d</dc:creator>
      <dc:date>2025-01-08T07:17:59Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354705#M5953</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;,&lt;BR /&gt;Thanks for reaching out and sharing the details about the issue you're facing with running Spark jobs on the F2 capacity. After looking into it, I believe the problem is a combination of &lt;STRONG&gt;resource contention&lt;/STRONG&gt; and &lt;STRONG&gt;persistent Spark sessions&lt;/STRONG&gt; that continue running in the background even after the query finishes in your notebook.&lt;/P&gt;
&lt;P&gt;When you run a query in the notebook, it may appear to complete successfully, but the Spark session can remain active in the background, holding onto resources. This can cause the F2 capacity limits to be reached.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Possible solutions:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;1. After running each query, make sure to call &lt;STRONG&gt;spark.stop()&lt;/STRONG&gt; in your notebook to explicitly terminate the Spark session. This will release the resources and allow other jobs to run without hitting the capacity limit.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Example:&lt;/STRONG&gt;&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;df = spark.sql("SELECT * FROM TestLakehouse.us_population_county_area LIMIT 1000")
display(df)
spark.stop()&lt;/LI-CODE&gt;
&lt;P&gt;&lt;STRONG&gt;2. Bursting:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Enabling bursting allows you to use up to 20 Spark VCores instead of the base 4, which can help if you’re running multiple lightweight queries concurrently. While bursting gives you more headroom, it’s still important to manage sessions actively.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;3. Optimize your Spark pool configuration:&lt;/STRONG&gt;&lt;/P&gt;
&lt;DIV class=""&gt;
&lt;DIV class="" dir="ltr"&gt;
&lt;P&gt;Review the Spark pool settings to make sure you’re using the right node size and max nodes. Enabling dynamic allocation could help manage resources more efficiently, scaling up or down based on the workload.&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vveshwaramsft_0-1736321662399.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1223327i436E92CAF8BAC6D1/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vveshwaramsft_0-1736321662399.png" alt="vveshwaramsft_0-1736321662399.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;I’d suggest starting by enabling bursting and managing your sessions more carefully with spark.stop() after each query. If that doesn’t fully resolve the issue, an upgrade to a larger capacity like F4 might be necessary, especially if your jobs are more resource-intensive.&lt;BR /&gt;&lt;BR /&gt;Refer to the links below for more detail:&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-engineering/spark-job-concurrency-and-queueing" target="_self"&gt;Concurrency limits and Bursting in Microsoft Fabric&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-warehouse/burstable-capacity" target="_self"&gt;Burstable capacities&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;Let me know if you need any help with these changes or if you have any other questions. I'm happy to assist further!&lt;BR /&gt;&lt;BR /&gt;If this helps, please accept it as the solution to help others benefit; kudos would be appreciated.&lt;/P&gt;
&lt;P&gt;Best regards,&lt;BR /&gt;Vinay.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 07:55:16 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354705#M5953</guid>
      <dc:creator>v-veshwara-msft</dc:creator>
      <dc:date>2025-01-08T07:55:16Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354933#M5956</link>
      <description>&lt;P&gt;okay let me try it&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 10:25:47 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354933#M5956</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-08T10:25:47Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354949#M5957</link>
      <description>&lt;P&gt;When I try to get into the notebook and try to run the same query I get below error, and nothing is running in job monitor as I checked before running any queries.&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;JSON { "type": "close", "timeStamp": 108670.20000001788, "code": 1000, "reason": "{\"reason\":\"Session error or stopped.\",\"state\":\"session-completed\"}", "wasClean": true, "target": { "url": "wss://6dc6322c24a042a2828f8e2aa68f9b82.pbidedicated.windows.net/webapi/capacities/6DC6322C-24A0-42A2-828F-8E2AA68F9B82/workloads/Notebook/Data/Direct/api/workspaces/cf8788ae-d2a9-4176-8900-5299acf0cce7/artifacts/0465d3f7-5fa7-4b86-a091-1395eb545a70/jupyterApi/versions/1/api/kernels/abd3f4ed-1da3-4124-a965-ae10fc877312/channels?token=dummy_token&amp;amp;session_id=c9350f12-d2b1-470a-ab6c-a33f1394c6aa", "readyState": 3, "protocolsProfile": [ 7, 3975 ] }, "currentTarget": { "url": "wss://6dc6322c24a042a2828f8e2aa68f9b82.pbidedicated.windows.net/webapi/capacities/6DC6322C-24A0-42A2-828F-8E2AA68F9B82/workloads/Notebook/Data/Direct/api/workspaces/cf8788ae-d2a9-4176-8900-5299acf0cce7/artifacts/0465d3f7-5fa7-4b86-a091-1395eb545a70/jupyterApi/versions/1/api/kernels/abd3f4ed-1da3-4124-a965-ae10fc877312/channels?token=dummy_token&amp;amp;session_id=c9350f12-d2b1-470a-ab6c-a33f1394c6aa", "readyState": 3, "protocolsProfile": [ 7, 3975 ] }, "isTrusted": true } Additional info: InstanceId: e529904c-127d-4fe7-bcfd-99e39c992504&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 10:31:37 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4354949#M5957</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-08T10:31:37Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355108#M5961</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Thanks for sharing the error details. Based on the information, it seems that the session might have timed out, been forcibly stopped, or encountered an issue while maintaining resources.&lt;/P&gt;
&lt;H3&gt;Recommended Actions:&lt;/H3&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Restart the Spark Session&lt;/STRONG&gt;:&lt;BR /&gt;You can try restarting the session in your notebook by going to &lt;STRONG&gt;Connect &amp;gt; New standard or High Concurrency session&lt;/STRONG&gt;. This will create a new session and re-establish the connection.&lt;BR /&gt;Alternatively, you can stop all operations, close the notebook, and open it again to start with a fresh session.&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;If the issue persists, you can try enabling &lt;STRONG&gt;Bursting&lt;/STRONG&gt; and &lt;STRONG&gt;Auto-scaling&lt;/STRONG&gt; as suggested in the previous response.&lt;BR /&gt;&lt;BR /&gt;Hope these help. Please reach out if you face any issues.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;If this helps, please accept it as the solution to help others benefit; kudos would be appreciated.&lt;/P&gt;
&lt;P&gt;Best regards,&lt;BR /&gt;Vinay.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 12:03:49 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355108#M5961</guid>
      <dc:creator>v-veshwara-msft</dc:creator>
      <dc:date>2025-01-08T12:03:49Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355193#M5962</link>
      <description>&lt;P&gt;I tried creating new sessions and also tried auto scaling and high concurrency, but none of them is working and I am getting the same error as before.&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 12:34:24 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355193#M5962</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-08T12:34:24Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355255#M5964</link>
      <description>&lt;P&gt;I found what was going wrong: whenever I terminate the job from the monitor and try to create another session, it won't let me do it, so I have to restart the capacity. If I actually terminate the Spark session from the notebook instead, I can easily restart it whenever needed. Also, closing the notebook and reopening it will not fix the issue until the session has timed out or actually stopped.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The funny thing is that on the F2 SKU, if you have not terminated the session for one notebook and try to run a job in another notebook, it won't let you and gives a too many requests error; I think that is due to the F2 size. The cost seems very high for an F2 SKU when I can actually run only a single session in one notebook, but it seems that is the way to go about it as of now.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for all the help&amp;nbsp;&lt;span class="lia-unicode-emoji" title=":grinning_face:"&gt;😀&lt;/span&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 13:02:18 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355255#M5964</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-08T13:02:18Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355273#M5965</link>
      <description>&lt;P&gt;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;- Are you specifically running the two queries in two different notebooks? Have you tried putting them in the same notebook and then running the notebook so that both queries are executed? The query you posted is very basic - you shouldn't need to change the capacity settings or scale up to a higher capacity. Since you are just querying the first 1000 rows of data from a lakehouse table, you don't really need to use Spark - you can just use Python. See the snip below. Can you please reply with screenshots showing the language and language version being used? Also, please let us know what the Spark settings are - these appear in the workspace settings under Data Engineering. Thanks!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="jennratten_0-1736341525415.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1223495i26D2F3AEC3C83CAC/image-size/medium?v=v2&amp;amp;px=400" role="button" title="jennratten_0-1736341525415.png" alt="jennratten_0-1736341525415.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 13:10:30 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355273#M5965</guid>
      <dc:creator>jennratten</dc:creator>
      <dc:date>2025-01-08T13:10:30Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355282#M5966</link>
      <description>&lt;P&gt;No, I am talking about running 2 queries in different notebooks, not the same notebook. Yes, I can very well run them in the same notebook, but I was trying to see what happens if I run two queries in different notebooks when one notebook's query session is already active. I have not changed the capacity settings, and all of this I am doing on F2.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Well, I found another post describing a similar issue:&lt;BR /&gt;&lt;A href="https://community.fabric.microsoft.com/t5/Data-Engineering/Spark-Sessions-in-MS-Fabric-Fail-to-Connect/m-p/4354970#M5958" target="_blank"&gt;Re: Spark Sessions in MS Fabric Fail to Connect - Microsoft Fabric Community&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 13:13:31 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355282#M5966</guid>
      <dc:creator>AnmolGan81</dc:creator>
      <dc:date>2025-01-08T13:13:31Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355801#M5973</link>
      <description>&lt;P&gt;I'm glad it worked out for you.&lt;/P&gt;</description>
      <pubDate>Wed, 08 Jan 2025 20:21:58 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4355801#M5973</guid>
      <dc:creator>jennratten</dc:creator>
      <dc:date>2025-01-08T20:21:58Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4356380#M5983</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/431517"&gt;@AnmolGan81&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Thanks for the detailed update and for sharing what you’ve observed, it’s great that you’ve figured out what’s happening with the sessions.&lt;/P&gt;
&lt;H3&gt;The issue seems to be:&lt;/H3&gt;
&lt;P&gt;On an &lt;STRONG&gt;F2 SKU&lt;/STRONG&gt;, the limited resources (4 Spark vCores) mean you can only run one Spark session at a time, regardless of how many notebooks you’re using. So, if there’s an active session in one notebook, trying to start another session in a different notebook results in the &lt;STRONG&gt;"too many requests"&lt;/STRONG&gt; error.&lt;/P&gt;
&lt;P&gt;You’re also right that just closing the notebook or stopping the job from the Monitor doesn’t fully terminate the session unless it times out or you explicitly stop it.&lt;/P&gt;
&lt;H3&gt;A Few Suggestions:&lt;/H3&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Ensure the Session is Stopped:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;- Adding &lt;STRONG&gt;spark.stop()&lt;/STRONG&gt; to the end of your notebook is the reliable way to free up resources.&lt;BR /&gt;- Restarting the capacity works, but that’s more of a workaround than a long-term fix.&lt;/P&gt;&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Enable Bursting (with Limitations)&lt;/STRONG&gt;:&lt;/P&gt;
&lt;P&gt;- Bursting temporarily increases the available Spark vCores for your capacity (e.g., an F2 SKU can scale up to 20 Spark vCores during bursts).&lt;BR /&gt;- This allows for better concurrency, meaning you may be able to run multiple notebooks simultaneously during a burst period, provided the combined workload does not exceed the burst limit.&lt;BR /&gt;- However, bursting is not a permanent solution and can only support short-term spikes in usage. If both jobs are resource-intensive, you might still run into resource contention even with bursting enabled.&lt;BR /&gt;- Bursting depends on resource availability; it only helps with concurrency and doesn't guarantee success for all parallel workloads.&lt;/P&gt;&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;&lt;STRONG&gt;Upgrade Your Capacity&lt;/STRONG&gt;:&lt;/P&gt;
&lt;P&gt;- If running multiple sessions in parallel is essential, upgrading to an &lt;STRONG&gt;F4 SKU&lt;/STRONG&gt; or higher would provide more Spark vCores and better concurrency support.&lt;/P&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;Let me know if you need help managing sessions, enabling bursting, or exploring capacity options. Happy to assist further!&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;If this helps, please accept it as the solution to help others benefit; kudos would be appreciated.&lt;/P&gt;
&lt;P&gt;Best regards,&lt;BR /&gt;Vinay.&lt;/P&gt;</description>
      <pubDate>Thu, 09 Jan 2025 05:40:00 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4356380#M5983</guid>
      <dc:creator>v-veshwara-msft</dc:creator>
      <dc:date>2025-01-09T05:40:00Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4364779#M6102</link>
      <description>&lt;P&gt;silence, that is what has found you retorting at the empty void of information. it used to be about helping others first before resorting to gameplay to escape fun. I'm new to all of this, we'll have to discuss it further in our private messages, friendly reminder to use the F6 first before the L.&lt;/P&gt;</description>
      <pubDate>Wed, 15 Jan 2025 09:39:09 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4364779#M6102</guid>
      <dc:creator>emihle_mr</dc:creator>
      <dc:date>2025-01-15T09:39:09Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4364788#M6103</link>
      <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-center" image-alt="doctors for life international v speaker of the national assembly and others [2006] ZACC 11" style="width: 180px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1226325i3335BAC844421034/image-size/large?v=v2&amp;amp;px=999" role="button" title="1a24657215f15b045f411a3abd1665232b5bbe79_180.jpg" alt="doctors for life international v speaker of the national assembly and others [2006] ZACC 11" /&gt;&lt;span class="lia-inline-image-caption" onclick="event.preventDefault();"&gt;doctors for life international v speaker of the national assembly and others [2006] ZACC 11&lt;/span&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 15 Jan 2025 09:41:39 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4364788#M6103</guid>
      <dc:creator>emihle_mr</dc:creator>
      <dc:date>2025-01-15T09:41:39Z</dc:date>
    </item>
    <item>
      <title>Re: How many sessions I could run in F2 Capacity?</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4710990#M9716</link>
      <description>&lt;P&gt;I can't remember where I read it, but I recall someone (or the docs) saying that&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;mssparkutils.session.stop()&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;is preferred to&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;spark.stop()&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Do you happen to know whether this is true? And if yes, why, that is, what could possibly be the drawback of the latter vs the former?&lt;/P&gt;</description>
      <pubDate>Wed, 28 May 2025 21:55:15 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/How-many-sessions-I-could-run-in-F2-Capacity/m-p/4710990#M9716</guid>
      <dc:creator>Element115</dc:creator>
      <dc:date>2025-05-28T21:55:15Z</dc:date>
    </item>
  </channel>
</rss>

