Hi,
I have been trying to access the lakehouse tables in one workspace from a notebook that is in another workspace.
I have added that lakehouse to the notebook in which I am trying to do it.
The block of code below shows how I am trying to do it.
I need to use spark.sql() in PySpark, as that is the requirement I have.
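Roughly, the cell looks like this (it is the same query that appears in the traceback below):

# Cross-workspace read attempted with the schema-qualified table name
df = spark.sql("SELECT * FROM dataverse_foleyrelease_cds2_workspace_unqb07819951104ef119f85000d3a106.mainaccount LIMIT 1000")
display(df)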
When I execute the cell that contains the SELECT query, I get the following error:
AnalysisException                         Traceback (most recent call last)
Cell In[26], line 1
----> 1 df = spark.sql("SELECT * FROM dataverse_foleyrelease_cds2_workspace_unqb07819951104ef119f85000d3a106.mainaccount LIMIT 1000")
      2 display(df)

File /opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py:1440, in SparkSession.sql(self, sqlQuery, args, **kwargs)
   1438 try:
   1439     litArgs = {k: _to_java_column(lit(v)) for k, v in (args or {}).items()}
-> 1440     return DataFrame(self._jsparkSession.sql(sqlQuery, litArgs), self)
   1441 finally:
   1442     if len(kwargs) > 0:

File ~/cluster-env/trident_env/lib/python3.10/site-packages/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)
   1316 command = proto.CALL_COMMAND_NAME +\
   1317     self.command_header +\
   1318     args_command +\
   1319     proto.END_COMMAND_PART
   1321 answer = self.gateway_client.send_command(command)
-> 1322 return_value = get_return_value(
   1323     answer, self.gateway_client, self.target_id, self.name)
   1325 for temp_arg in temp_args:
   1326     if hasattr(temp_arg, "_detach"):

File /opt/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py:175, in capture_sql_exception.<locals>.deco(*a, **kw)
   171 converted = convert_exception(e.java_exception)
   172 if not isinstance(converted, UnknownException):
   173     # Hide where the exception came from that shows a non-Pythonic
   174     # JVM exception message.
--> 175     raise converted from None
   176 else:
   177     raise

AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `dataverse_foleyrelease_cds2_workspace_unqb07819951104ef119f85000d3a106`.`mainaccount` cannot be found. Verify the spelling and correctness of the schema and catalog. If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog. To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.; line 1 pos 14;
'GlobalLimit 1000
+- 'LocalLimit 1000
   +- 'Project [*]
      +- 'UnresolvedRelation [dataverse_foleyrelease_cds2_workspace_unqb07819951104ef119f85000d3a106, mainaccount], [], false
Can anyone assist me in solving this issue?
Hi @Shanthan118 ,
Thanks to charlyS for the reply.
From the error message, it looks like you are experiencing a table-not-found error. Please double-check that the table you are trying to read exists.
I tested this by creating a workspace called FabricTest1 and creating a new notebook in it that references the ProductsTest table from the daisyTest2 lakehouse in the daisyTest1 workspace. Loading the data with Spark works fine, as the sketch below shows.
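A minimal sketch of that test, assuming the daisyTest2 lakehouse has been added to the notebook's lakehouse list (the exact cell from my test is not reproduced here):

# Notebook lives in workspace FabricTest1; daisyTest2 is a lakehouse in workspace daisyTest1
# that has been added to the notebook. Lakehouse and table names are from the test above.
df = spark.sql("SELECT * FROM daisyTest2.ProductsTest LIMIT 1000")
display(df)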
If you have any other questions, please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
Hello,
It looks like you are trying to access a table from a workspace that is not the same workspace as the default lakehouse attached to the notebook.
For example, in my case I'm querying a table from ConceptLkh, but my notebook is pinned to AmazingZoneLkh, and that is not possible for now.
If you want to access data from another workspace, as of today you can:
- Add a shortcut to the second workspace in the default lakehouse
- Query the data using the ABFS path instead of the table name, as in the sketch below (based on my example)
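A minimal sketch of the ABFS-path approach, assuming the target is a Delta table in a lakehouse in another workspace (the workspace, lakehouse, and table names in the path are placeholders; substitute your own):

# Read a lakehouse table from another workspace via its OneLake ABFS path
# instead of a Spark SQL table name. All names in the path are placeholders.
path = "abfss://<WorkspaceName>@onelake.dfs.fabric.microsoft.com/<LakehouseName>.Lakehouse/Tables/<TableName>"
df = spark.read.format("delta").load(path)
display(df)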