Receiving the error in the title when trying to create a delta table programmatically via a notebook. I've created a new Lakehouse, created a new PL, and tried debugging with a very simple, manually created dataframe. I've also tried in two different tenants, EUS and CUS.
The problem has persisted for about a week.
Any ideas?
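For reference, the kind of notebook cell that hits this error looks roughly like the following. This is a minimal sketch, not the actual code: the dataframe contents and the table name "test_table" are placeholders, and it assumes the new Lakehouse is attached to the notebook.

# Minimal repro sketch (table name and data are placeholders)
from pyspark.sql import Row

# Very simple manually created dataframe
df = spark.createDataFrame([
    Row(id=1, name="alpha"),
    Row(id=2, name="beta"),
])

# Write it as a managed delta table in the attached Lakehouse;
# this is the step where the error is raised
df.write.format("delta").mode("overwrite").saveAsTable("test_table")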
I'm receiving the same error when I try to load a Lakehouse table in my notebook.
I have tried removing the Lakehouse and adding it again.
---------------------------------------------------------------------------
AnalysisException Traceback (most recent call last)
Cell In[8], line 1
----> 1 df = spark.sql("SELECT * FROM LakehouseName LIMIT 1000")
2 display(df)
File /opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py:1034, in SparkSession.sql(self, sqlQuery, **kwargs)
1032 sqlQuery = formatter.format(sqlQuery, **kwargs)
1033 try:
-> 1034 return DataFrame(self._jsparkSession.sql(sqlQuery), self)
1035 finally:
1036 if len(kwargs) > 0:
File ~/cluster-env/trident_env/lib/python3.10/site-packages/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
1315 command = proto.CALL_COMMAND_NAME +\
1316 self.command_header +\
1317 args_command +\
1318 proto.END_COMMAND_PART
1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
1322 answer, self.gateway_client, self.target_id, self.name)
1324 for temp_arg in temp_args:
1325 temp_arg._detach()
File /opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py:196, in capture_sql_exception.<locals>.deco(*a, **kw)
192 converted = convert_exception(e.java_exception)
193 if not isinstance(converted, UnknownException):
194 # Hide where the exception came from that shows a non-Pythonic
195 # JVM exception message.
--> 196 raise converted from None
197 else:
198 raise
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Create database for LakehouseName is not permitted using Apache Spark in Microsoft Fabric.)
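The message suggests Spark is treating "LakehouseName" as a database it would need to create, rather than as a table. For comparison, a query that targets a specific table inside the Lakehouse would look roughly like this. This is a sketch under assumptions: "TableName" is a hypothetical table name, and the Lakehouse is assumed to be attached to the notebook.

# Sketch: query a specific table rather than the Lakehouse itself
# ("TableName" is a placeholder for an actual table in the Lakehouse)
df = spark.sql("SELECT * FROM LakehouseName.TableName LIMIT 1000")
display(df)

# Equivalent without Spark SQL, using the table read API
df = spark.read.table("LakehouseName.TableName")
display(df.limit(1000))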
Hi @WCrayger,
Can you provide the specific error message and service version? Your question belongs to Synapse, so we will help you transfer it to the Synapse forum.
Best Regards,
Community Support Team _ Ailsa Tao