When the Notebook Environment is set to workspace settings with version 1.3 (Spark 3.5), attempting to access the default lakehouse table results in the following error:
"OneSecurity error: Unable to build a fully qualified table name. Required configuration is missing: Workspace of the lakehouse: LH_Projects_Bronze is missing. Please verify and ensure the configuration is correctly set."
However, when I switch to an environment with version 1.2 (Spark 3.4), it works without any issues. What could I be missing?
It had been running fine until last Thursday, when it started throwing this error. When I downgraded the version, it worked again.
Hello @rxt
I think Spark 3.5 requires explicit workspace identification when accessing lakehouse tables, whereas Spark 3.4 allowed implicit resolution through environment defaults.

There is no direct documentation to support this, but the Microsoft Fabric documentation for the Spark connector also emphasizes the need to specify the workspace ID and the lakehouse or warehouse item ID when accessing data across workspaces:
https://learn.microsoft.com/en-us/fabric/data-engineering/spark-data-warehouse-connector
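As a rough sketch of the pattern that documentation describes, a cross-workspace read with the Fabric Spark connector passes the workspace ID explicitly; the workspace ID and the `warehouse.schema.table` name below are placeholders you would replace with your own values, and this only runs inside a Fabric Spark notebook session:

```
import com.microsoft.spark.fabric
from com.microsoft.spark.fabric.Constants import Constants

# Explicitly identify the workspace that owns the item being read
df = (spark.read
      .option(Constants.WorkspaceId, "<workspace-id>")
      .synapsesql("<warehouse_name>.<schema>.<table>"))
```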
Hi @rxt ,
Thank you for reaching out to the Microsoft Fabric Community Forum.
You're correct that the issue began after upgrading the environment to runtime 1.3 (Spark 3.5). As @nilendraFabric highlighted (thanks for the insight), Spark 3.5 enforces stricter resolution rules and requires explicit identification of the workspace and lakehouse when accessing tables, unlike runtime 1.2 (Spark 3.4), which allowed implicit context.
To resolve the error, please ensure you're using fully qualified references to your lakehouse tables or explicitly set the workspace and lakehouse context in your notebook. This change is in line with platform enhancements for access control and workspace scoping.
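One way to set that context explicitly, as a minimal sketch, is to pin the default lakehouse and its workspace at the top of the notebook with the `%%configure` magic; the `id` and `workspaceId` values below are placeholders for your own GUIDs:

```
%%configure -f
{
    "defaultLakehouse": {
        "name": "LH_Projects_Bronze",
        "id": "<lakehouse-id>",
        "workspaceId": "<workspace-id>"
    }
}
```

Alternatively, qualify the table with the lakehouse name when reading it, for example `spark.read.table("LH_Projects_Bronze.my_table")`, where `my_table` stands in for your actual table name.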
Downgrading to runtime 1.2 (Spark 3.4) is a valid temporary workaround, but we recommend adapting to runtime 1.3 for long-term compatibility.
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

Thank you.
Hi @rxt ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.
Hi @rxt ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.