How does this work?
I tried two notebooks in the same workspace:
the second notebook always falls back to a standard session and fails. Do I need to use the below
for the second notebook?
Hi @tan_thiamhuat ,
Thanks for reaching out to the Microsoft Fabric Community.
Although both notebooks are in the same workspace and lakehouse, the error indicates they are not sharing the same Spark session, which is why the temporary view created in one notebook is not visible in the other.
To resolve this, write the data to a persistent lakehouse table instead of a temporary view, then read it back from the other notebook:
# Notebook 1: persist the DataFrame as a lakehouse table
df.write.mode("overwrite").saveAsTable("people_test")
# Notebook 2: read the table back, regardless of which Spark session it runs in
df_shared = spark.sql("SELECT * FROM people_test")
df_shared.show()
Also, thank you @suparnababu8 and @burakkaragoz for actively participating in the community forum and for the solutions you have been sharing.
Hope this helps. Please reach out for further assistance.
If this post helps, please consider accepting it as the solution to help other members find it more quickly; a Kudos would also be appreciated.
Thank you.
df.write.mode("overwrite").saveAsTable("people_test")
If it is written to a table, then any non-high-concurrency notebook can read from that table, can't it?
Hi @tan_thiamhuat ,
Yes, writing to a persistent table with saveAsTable makes the data accessible to any notebook connected to the same lakehouse and Spark environment, including both high concurrency and standard sessions. This is useful for sharing data broadly across sessions.
However, temporary views are limited to the same Spark session, so to share them between notebooks, those notebooks must join the same high concurrency session. If you need quick, session-scoped sharing, use temporary views with shared sessions; for wider access across different sessions, persistent tables are the best option.
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Any thoughts on why my two notebooks do not work in that high concurrency session?
If you want to run multiple notebooks from the same workspace in one session, you have to attach both notebooks to the high concurrency session, but there are specific conditions that must be met. The notebooks need to be in the same workspace, share the same default lakehouse, and use the same Spark compute configuration and library packages.
So, by enabling high concurrency mode, you can attach multiple notebooks to an existing Spark session, allowing for greater session utilization and faster execution.
Read more about high concurrency mode here - Configure high concurrency mode for notebooks - Microsoft Fabric | Microsoft Learn
Thank you!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
It does not work for me; both notebooks are in the same Lakehouse and workspace.
Hi @tan_thiamhuat ,
Good question. Even though high concurrency is enabled in the workspace settings, each notebook still needs to explicitly join an available high concurrency session if you want them to share the same Spark app.
So yes, for your second notebook you should manually select the available session (such as HC_Notebook_1_6782); otherwise it defaults to a standard session, which is why you're seeing it fail.
Also make sure both notebooks are attached to the same default lakehouse and use the same Spark compute configuration, since a high concurrency session can only be shared when those match.
Let me know if you want help setting up a shared session programmatically.
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.