<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: 403 forbidden error on querying lakehouse table using pyspark in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4173462#M4255</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/803334"&gt;@soumya_cg&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The issue seems to be partially fixed. I can read data from a schema-enabled lakehouse now, but I need to pin it as the default lakehouse of my notebook. Otherwise it returns a "&lt;SPAN&gt;[REQUIRES_SINGLE_PART_NAMESPACE] spark_catalog requires a single-part namespace&lt;/SPAN&gt;" exception.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vjingzhanmsft_0-1727252388051.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1173014i1F446BDEBA37DCE7/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vjingzhanmsft_0-1727252388051.png" alt="vjingzhanmsft_0-1727252388051.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;I will keep tracking this issue and update if there is any further progress.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Best Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jing&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 25 Sep 2024 08:30:40 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2024-09-25T08:30:40Z</dc:date>
    <item>
      <title>403 forbidden error on querying lakehouse table using pyspark</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4139965#M3963</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;Need some help. I'm trying to run the query below using PySpark inside my Fabric lakehouse:&lt;/P&gt;&lt;P&gt;----&amp;gt; 1 df = spark.sql("SELECT * FROM Stream_Lakehouse.dbo.FACT_KPIs_LH LIMIT 1000")&lt;BR /&gt;2 display(df)&lt;/P&gt;&lt;P&gt;Can someone help? I have truncated the full error log due to space constraints here.&lt;/P&gt;&lt;P&gt;I'm getting the error below:&lt;/P&gt;&lt;DIV&gt;Py4JJavaError&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;Traceback (most recent call last)&lt;/DIV&gt;&lt;DIV&gt;Cell In[47], line 1&lt;/DIV&gt;&lt;DIV&gt;----&amp;gt; 1 df = spark.sql("SELECT * FROM Stream_Lakehouse.dbo.FACT_KPIs_LH LIMIT 1000")&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; 2 display(df)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py:1440, in SparkSession.sql(self, sqlQuery, args, **kwargs)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1438 try:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1439&amp;nbsp; &amp;nbsp; &amp;nbsp;litArgs = {k: _to_java_column(lit(v)) for k, v in (args or {}).items()}&lt;/DIV&gt;&lt;DIV&gt;-&amp;gt; 1440&amp;nbsp; &amp;nbsp; &amp;nbsp;return DataFrame(self._jsparkSession.sql(sqlQuery, litArgs), self)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1441 finally:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1442&amp;nbsp; &amp;nbsp; &amp;nbsp;if len(kwargs) &amp;gt; 0:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File ~/cluster-env/trident_env/lib/python3.10/site-packages/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1316 command = proto.CALL_COMMAND_NAME +\&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1317&amp;nbsp; &amp;nbsp; &amp;nbsp;self.command_header +\&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1318&amp;nbsp; &amp;nbsp; 
&amp;nbsp;args_command +\&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1319&amp;nbsp; &amp;nbsp; &amp;nbsp;proto.END_COMMAND_PART&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1321 answer = self.gateway_client.send_command(command)&lt;/DIV&gt;&lt;DIV&gt;-&amp;gt; 1322 return_value = get_return_value(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1323&amp;nbsp; &amp;nbsp; &amp;nbsp;answer, self.gateway_client, self.target_id, self.name)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1325 for temp_arg in temp_args:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1326&amp;nbsp; &amp;nbsp; &amp;nbsp;if hasattr(temp_arg, "_detach"):&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /opt/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py:169, in capture_sql_exception.&amp;lt;locals&amp;gt;.deco(*a, **kw)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 167 def deco(*a: Any, **kw: Any) -&amp;gt; Any:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 168&amp;nbsp; &amp;nbsp; &amp;nbsp;try:&lt;/DIV&gt;&lt;DIV&gt;--&amp;gt; 169&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;return f(*a, **kw)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 170&amp;nbsp; &amp;nbsp; &amp;nbsp;except Py4JJavaError as e:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 171&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;converted = convert_exception(e.java_exception)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File ~/cluster-env/trident_env/lib/python3.10/site-packages/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 325 if answer[1] == REFERENCE_TYPE:&lt;/DIV&gt;&lt;DIV&gt;--&amp;gt; 326&amp;nbsp; &amp;nbsp; &amp;nbsp;raise Py4JJavaError(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 327&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;"An error occurred while calling 
{0}{1}{2}.\n".&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 328&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;format(target_id, ".", name), value)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 329 else:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 330&amp;nbsp; &amp;nbsp; &amp;nbsp;raise Py4JError(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 331&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;"An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 332&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;format(target_id, ".", name, value))&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Py4JJavaError: An error occurred while calling o322.sql.&lt;/DIV&gt;&lt;DIV&gt;: java.lang.RuntimeException: Request failed: HTTP/1.1 403 Forbidden&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.Helpers$.executeRequest(Helpers.scala:154)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformClient.newGetRequest(PbiPlatformClient.scala:51)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformClient.newGetRequest$(PbiPlatformClient.scala:47)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformInternalApiClient.newGetRequest(PbiPlatformClient.scala:175)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformInternalApiClient.getAllWorkspaces(PbiPlatformClient.scala:199)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.InstrumentedPbiPlatformClient.$anonfun$getAllWorkspaces$1(PbiPlatformClient.scala:164)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.Helpers$.timed(Helpers.scala:29)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.microsoft.fabric.platform.InstrumentedPbiPlatformClient.getAllWorkspaces(PbiPlatformClient.scala:164)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformCachingClient.$anonfun$workspaceCache$1(PbiPlatformClient.scala:117)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.google.common.base.Suppliers$ExpiringMemoizingSupplier.get(Suppliers.java:192)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformCachingClient.getWorkspace(PbiPlatformClient.scala:146)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformCachingClient.getArtifacts(PbiPlatformClient.scala:136)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.platform.PbiPlatformCachingClient.$anonfun$artifactCache$1(PbiPlatformClient.scala:130)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1908)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.microsoft.fabric.platform.PbiPlatformCachingClient.getArtifact(PbiPlatformClient.scala:151)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.SchemaPathResolver.getArtifactRoot(pathResolvers.scala:127)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.SchemaPathResolver.getSchemaRoot(pathResolvers.scala:144)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.DefaultSchemaMetadataManager.listSchemas(DefaultSchemaMetadataManager.scala:218)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.DefaultSchemaMetadataManager.$anonfun$defaultSchemaPathResolver$1(DefaultSchemaMetadataManager.scala:30)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.NamespaceResolver.$anonfun$decodedSchemaNameCache$1(pathResolvers.scala:46)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1908)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.microsoft.fabric.spark.metadata.Helpers$.forceLoadIfRequiredInCachedMap(Helpers.scala:61)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.NamespaceResolver.inferNamespace(pathResolvers.scala:87)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.NamespaceResolver.$anonfun$toNamespace$1(pathResolvers.scala:79)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.base/java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1705)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.NamespaceResolver.toNamespace(pathResolvers.scala:79)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.DefaultSchemaMetadataManager.getSchema(DefaultSchemaMetadataManager.scala:73)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.MetadataManager.getSchema(MetadataManager.scala:192)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.InstrumentedMetadataManager.super$getSchema(MetadataManager.scala:321)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.InstrumentedMetadataManager.$anonfun$getSchema$1(MetadataManager.scala:321)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.Helpers$.timed(Helpers.scala:29)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.InstrumentedMetadataManager.getSchema(MetadataManager.scala:321)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.getDatabase(OnelakeExternalCatalog.scala:78)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.catalog.OnelakeExternalCatalog.databaseExists(OnelakeExternalCatalog.scala:84)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
com.microsoft.fabric.spark.catalog.InstrumentedExternalCatalog.$anonfun$databaseExists$1(OnelakeExternalCatalog.scala:417)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.metadata.Helpers$.timed(Helpers.scala:29)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.microsoft.fabric.spark.catalog.InstrumentedExternalCatalog.databaseExists(OnelakeExternalCatalog.scala:417)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:169)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:142)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:54)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$1(HiveSessionStateBuilder.scala:69)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:140)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 09 Sep 2024 06:29:38 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4139965#M3963</guid>
      <dc:creator>soumya_cg</dc:creator>
      <dc:date>2024-09-09T06:29:38Z</dc:date>
    </item>
    <item>
      <title>Re: 403 forbidden error on querying lakehouse table using pyspark</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4141170#M3980</link>
      <description>&lt;P&gt;Are you using schema enabled Lakehouse?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.fabric.microsoft.com/t5/Data-Engineering/403-Forbidden-when-reading-the-tables-of-the-Lakehouse-with-sql/m-p/4136891#M3941" target="_blank"&gt;https://community.fabric.microsoft.com/t5/Data-Engineering/403-Forbidden-when-reading-the-tables-of-the-Lakehouse-with-sql/m-p/4136891#M3941&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 09 Sep 2024 18:28:15 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4141170#M3980</guid>
      <dc:creator>frithjof_v</dc:creator>
      <dc:date>2024-09-09T18:28:15Z</dc:date>
    </item>
    <item>
      <title>Re: 403 forbidden error on querying lakehouse table using pyspark</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4141920#M3989</link>
      <description>&lt;P&gt;Yes, I'm using it.&lt;/P&gt;</description>
      <pubDate>Tue, 10 Sep 2024 04:53:25 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4141920#M3989</guid>
      <dc:creator>soumya_cg</dc:creator>
      <dc:date>2024-09-10T04:53:25Z</dc:date>
    </item>
    <item>
      <title>Re: 403 forbidden error on querying lakehouse table using pyspark</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4142073#M3991</link>
      <description>&lt;P&gt;Okay, see if the link in my previous comment helps you.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In general, I would recommend not using the schema-enabled Lakehouse as long as it's only a preview feature.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would just use a normal Lakehouse without schemas.&lt;/P&gt;</description>
      <pubDate>Tue, 10 Sep 2024 06:23:42 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4142073#M3991</guid>
      <dc:creator>frithjof_v</dc:creator>
      <dc:date>2024-09-10T06:23:42Z</dc:date>
    </item>
    <item>
      <title>Re: 403 forbidden error on querying lakehouse table using pyspark</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4142430#M3997</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/803334"&gt;@soumya_cg&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;This is a known issue, as&amp;nbsp;&lt;SPAN&gt;frithjof_v has linked.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;The engineers are working on a fix now, but we haven't received a definite ETA yet. I will keep you updated once there is any progress. We appreciate your understanding and patience.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Best Regards,&lt;BR /&gt;Jing&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 10 Sep 2024 09:15:02 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4142430#M3997</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-09-10T09:15:02Z</dc:date>
    </item>
    <item>
      <title>Re: 403 forbidden error on querying lakehouse table using pyspark</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4173462#M4255</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/803334"&gt;@soumya_cg&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The issue seems to be partially fixed. I can read data from a schema-enabled lakehouse now, but I need to pin it as the default lakehouse of my notebook. Otherwise it returns a "&lt;SPAN&gt;[REQUIRES_SINGLE_PART_NAMESPACE] spark_catalog requires a single-part namespace&lt;/SPAN&gt;" exception.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="vjingzhanmsft_0-1727252388051.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1173014i1F446BDEBA37DCE7/image-size/medium?v=v2&amp;amp;px=400" role="button" title="vjingzhanmsft_0-1727252388051.png" alt="vjingzhanmsft_0-1727252388051.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;I will keep tracking this issue and update if there is any further progress.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Best Regards,&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Jing&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 25 Sep 2024 08:30:40 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/403-forbidden-error-on-querying-lakehouse-table-using-pyspark/m-p/4173462#M4255</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-09-25T08:30:40Z</dc:date>
    </item>
  </channel>
</rss>

