<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Write to Fabric OneLake from a Synapse Spark notebook in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4715969#M9802</link>
    <description>&lt;P data-start="0" data-end="63"&gt;Hi &lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/680346"&gt;@Krumelur&lt;/a&gt;,&lt;/P&gt;
&lt;P data-start="0" data-end="63"&gt;Thank you for reaching out to the community with your question.&lt;/P&gt;
&lt;P data-start="0" data-end="63"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="65" data-end="480"&gt;At this time, writing directly from a standalone Azure Synapse Spark notebook to Microsoft Fabric OneLake (Lakehouse) using the abfss:// endpoint is not supported. While OneLake uses a similar URI format to ADLS Gen2, its authentication model is different, and tokens issued through Synapse-linked services are not recognized by Fabric. This is why the approach that works for ADLS Gen2 does not apply to OneLake.&lt;/P&gt;
&lt;P data-start="482" data-end="974"&gt;As a supported and reliable workaround, we recommend first writing your data from Synapse Spark to a staging location in Azure Data Lake Storage Gen2. From there, you can use a Synapse pipeline with a Copy Activity to move the data into your Fabric Lakehouse. This pipeline should use a Lakehouse Linked Service configured with a service principal and certificate for secure access. This method ensures compatibility and follows Microsoft’s best practices for integrating Synapse with Fabric.&lt;/P&gt;
&lt;P data-start="482" data-end="974"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;We truly appreciate your engagement in the forum and encourage you to continue sharing your experiences and questions. Your contributions help strengthen the community for everyone.&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Hope my suggestion gives you good idea, if you have any more questions, please feel free to ask we are here to help you.&lt;BR /&gt;If this post&amp;nbsp;&lt;STRONG&gt;helps&lt;/STRONG&gt;, then please consider&amp;nbsp;&lt;STRONG&gt;Accept it as the solution&lt;/STRONG&gt;&amp;nbsp;to help the other members find it more quickly.&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Regards,&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Sahasra&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Community Support Team.&lt;/P&gt;</description>
    <pubDate>Mon, 02 Jun 2025 05:16:35 GMT</pubDate>
    <dc:creator>v-sgandrathi</dc:creator>
    <dc:date>2025-06-02T05:16:35Z</dc:date>
    <item>
      <title>Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4714594#M9788</link>
      <description>&lt;P&gt;I'm looking for ways to access a Fabric Lakehouse from a Synapse workspace (the standalone Synapse).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I can successfully use a Copy Activity + Lakehouse Linkedservice, and service principal + certificate for auth, as described &lt;A href="https://learn.microsoft.com/en-us/azure/data-factory/connector-microsoft-fabric-lakehouse?tabs=synapse-analytics" target="_self"&gt;here&lt;/A&gt; to write data from my Synapse workspace into a Fabric Lakehouse.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now I would to use a Spark notebook to achieve the same. I am already authenticating to a Gen2 storage account using code like this:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;spark.conf.set(f"spark.storage.synapse.{base_storage_url}.linkedServiceName", linked_service)

sc._jsc.hadoopConfiguration().set(f"fs.azure.account.oauth.provider.type.{base_storage_url}", "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider")&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;base_storage_url is in the format of containername@storagename.dfs.core.windows.net&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I was hoping this would also work with Fabric's OneLake as it also exposes an abfss:// endpoint, but no luck.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is it possible?&lt;/P&gt;</description>
      <pubDate>Fri, 30 May 2025 17:40:14 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4714594#M9788</guid>
      <dc:creator>Krumelur</dc:creator>
      <dc:date>2025-05-30T17:40:14Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4715969#M9802</link>
      <description>&lt;P data-start="0" data-end="63"&gt;Hi &lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/680346"&gt;@Krumelur&lt;/a&gt;,&lt;/P&gt;
&lt;P data-start="0" data-end="63"&gt;Thank you for reaching out to the community with your question.&lt;/P&gt;
&lt;P data-start="0" data-end="63"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="65" data-end="480"&gt;At this time, writing directly from a standalone Azure Synapse Spark notebook to Microsoft Fabric OneLake (Lakehouse) using the abfss:// endpoint is not supported. While OneLake uses a similar URI format to ADLS Gen2, its authentication model is different, and tokens issued through Synapse-linked services are not recognized by Fabric. This is why the approach that works for ADLS Gen2 does not apply to OneLake.&lt;/P&gt;
&lt;P data-start="482" data-end="974"&gt;As a supported and reliable workaround, we recommend first writing your data from Synapse Spark to a staging location in Azure Data Lake Storage Gen2. From there, you can use a Synapse pipeline with a Copy Activity to move the data into your Fabric Lakehouse. This pipeline should use a Lakehouse Linked Service configured with a service principal and certificate for secure access. This method ensures compatibility and follows Microsoft’s best practices for integrating Synapse with Fabric.&lt;/P&gt;
&lt;P data-start="482" data-end="974"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;We truly appreciate your engagement in the forum and encourage you to continue sharing your experiences and questions. Your contributions help strengthen the community for everyone.&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Hope my suggestion gives you good idea, if you have any more questions, please feel free to ask we are here to help you.&lt;BR /&gt;If this post&amp;nbsp;&lt;STRONG&gt;helps&lt;/STRONG&gt;, then please consider&amp;nbsp;&lt;STRONG&gt;Accept it as the solution&lt;/STRONG&gt;&amp;nbsp;to help the other members find it more quickly.&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Regards,&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Sahasra&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Community Support Team.&lt;/P&gt;</description>
      <pubDate>Mon, 02 Jun 2025 05:16:35 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4715969#M9802</guid>
      <dc:creator>v-sgandrathi</dc:creator>
      <dc:date>2025-06-02T05:16:35Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4716188#M9807</link>
      <description>&lt;P&gt;Thanks for getting back to me. I'm not fully convinced yet &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Copying data after writing it is not an option.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;With some custom code I was able to obtain an access token from OneLake using SNI. I then passed the token to Spark. I'm now stuck with OneLake reporting a 404, stating the path I want to access would not exist, while it does.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Notice that the approach below is not using linked services to obtain an access token.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;* Is there a way to make this work or am I hitting a wall?&lt;/P&gt;&lt;P&gt;* When will Spark be supported in Synapse?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;# Goal of this code: Acquire an access token using SNI for a client ID and a cert stored in a KeyVault.&lt;BR /&gt;# The Synapse workspace managed ID has access to the KV and can read the cert.&lt;BR /&gt;&lt;BR /&gt;from cryptography.hazmat.primitives.serialization import pkcs12, Encoding, PrivateFormat, NoEncryption&lt;BR /&gt;from cryptography.hazmat.primitives import hashes&lt;BR /&gt;import base64&lt;BR /&gt;import msal&lt;BR /&gt;from pyspark.sql import SparkSession&lt;BR /&gt;from pyspark.sql.types import StructType, StructField, StringType, IntegerType&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;# Load PFX from Key Vault&lt;BR /&gt;# Tested: Client has access to workspace (is admin) and using the client and cert successfully works&lt;BR /&gt;# when doing a copy job in Synapse.&lt;BR /&gt;cert_pfx_base64 = mssparkutils.credentials.getSecretWithLS("My_KeyVault_LinkedService", "MyCertificateName")&lt;BR /&gt;cert_pfx_bytes = base64.b64decode(cert_pfx_base64)&lt;BR /&gt;&lt;BR /&gt;# Extract components&lt;BR /&gt;private_key, certificate, _ = pkcs12.load_key_and_certificates(cert_pfx_bytes, b"")&lt;BR /&gt;&lt;BR /&gt;private_key_pem = 
private_key.private_bytes(&lt;BR /&gt;  encoding=Encoding.PEM,&lt;BR /&gt;  format=PrivateFormat.PKCS8,&lt;BR /&gt;  encryption_algorithm=NoEncryption()&lt;BR /&gt;).decode()&lt;BR /&gt;&lt;BR /&gt;cert_pem = certificate.public_bytes(Encoding.PEM).decode()&lt;BR /&gt;thumbprint = certificate.fingerprint(hashes.SHA1()).hex()&lt;BR /&gt;&lt;BR /&gt;# MSAL setup&lt;BR /&gt;tenant_id = "TENANT_ID_WHERE_FABRIC_WORKSPACE_LIVES"&lt;BR /&gt;client_id = "CLIENT_ID_USING_SNI"&lt;BR /&gt;&lt;BR /&gt;app = msal.ConfidentialClientApplication(&lt;BR /&gt;  client_id=client_id,&lt;BR /&gt;  authority=f"https://login.microsoftonline.com/{tenant_id}",&lt;BR /&gt;  client_credential={&lt;BR /&gt;    "private_key": private_key_pem,&lt;BR /&gt;    "thumbprint": thumbprint,&lt;BR /&gt;    "public_certificate": cert_pem # &amp;lt;- this is what triggers SN/I&lt;BR /&gt;  }&lt;BR /&gt;)&lt;BR /&gt;&lt;BR /&gt;# Acquire token&lt;BR /&gt;result = app.acquire_token_for_client(scopes=["https://storage.azure.com/.default"])&lt;BR /&gt;access_token = result["access_token"]&lt;BR /&gt;# Checking the token: it's valid.&lt;BR /&gt;print(access_token[:50])&lt;BR /&gt;&lt;BR /&gt;# PASS TOKEN TO SPARK&lt;BR /&gt;&lt;BR /&gt;# The URL is copied from Fabric UI's properties of the folder in the Lakehouse&lt;BR /&gt;full_url = "abfss://4f.....@msit-onelake.dfs.fabric.microsoft.com/f....../Files"&lt;BR /&gt;base_url = "4f.......@msit-onelake.dfs.fabric.microsoft.com"&lt;BR /&gt;&lt;BR /&gt;spark.conf.set(f"fs.azure.account.auth.type.{base_url}", "OAuth")&lt;BR /&gt;spark.conf.set(f"fs.azure.account.oauth.provider.type.{base_url}", "org.apache.hadoop.fs.azurebfs.oauth2.AccessTokenProvider")&lt;BR /&gt;spark.conf.set(f"fs.azure.account.oauth2.access.token.{base_url}", access_token)&lt;BR /&gt;&lt;BR /&gt;# CREATE TEST DATA&lt;BR /&gt;&lt;BR /&gt;schema = StructType([&lt;BR /&gt;  StructField("id", IntegerType(), False),&lt;BR /&gt;  StructField("name", StringType(), False),&lt;BR /&gt;  
StructField("value", IntegerType(), False)&lt;BR /&gt;])&lt;BR /&gt;&lt;BR /&gt;data = [&lt;BR /&gt;  (1, "Alpha", 100),&lt;BR /&gt;  (2, "Beta", 200),&lt;BR /&gt;  (3, "Gamma", 300)&lt;BR /&gt;]&lt;BR /&gt;&lt;BR /&gt;df = spark.createDataFrame(data, schema)&lt;BR /&gt;&lt;BR /&gt;# SAVE TEST DATA&lt;BR /&gt;&lt;BR /&gt;# Returns a 404 as if the path did not exist, but that is not true. I'm also getting this if I do not pass a token at all... &lt;span class="lia-unicode-emoji" title=":disappointed_face:"&gt;😞&lt;/span&gt;&lt;BR /&gt;df.write.format("delta").mode("overwrite").save(full_url)&lt;BR /&gt;&lt;BR /&gt;# Error&lt;BR /&gt;# An error occurred while calling o4263.save.&lt;BR /&gt;# : java.io.FileNotFoundException: Operation failed: "NotFound", 404, HEAD, https://msit-onelake.dfs.fabric.microsoft.com/4......../?upn=false&amp;amp;action=getAccessControl&amp;amp;timeout=90&lt;BR /&gt;# at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.checkException(AzureBlobFileSystem.java:1436)&lt;BR /&gt;# at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.mkdirs(AzureBlobFileSystem.java:609)&lt;BR /&gt;# at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2388)&lt;BR /&gt;# at org.apache.spark.sql.delta.DeltaLog.createLogDirectory(DeltaLog.scala:467)&lt;BR /&gt;# at org.apache.spark.sql.delta.commands.WriteIntoDelta.write(WriteIntoDelta.scala:265)&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Jun 2025 06:42:42 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4716188#M9807</guid>
      <dc:creator>Krumelur</dc:creator>
      <dc:date>2025-06-02T06:42:42Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4716631#M9823</link>
      <description>&lt;P data-start="80" data-end="491"&gt;Hi &lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/680346"&gt;@Krumelur&lt;/a&gt;,&lt;/P&gt;
&lt;P data-start="80" data-end="491"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="80" data-end="491"&gt;Although an access token can be successfully acquired using a certificate and service principal (via MSAL and SNI), writing data from Synapse Spark to Microsoft Fabric OneLake using the &lt;SPAN&gt;&amp;nbsp;abfss://&amp;nbsp;&lt;/SPAN&gt;protocol is currently not supported. The 404 error encountered in this scenario stems from the inability of the ABFS driver within Synapse to resolve OneLake’s internal namespace, despite successful authentication.&lt;/P&gt;
&lt;P data-start="493" data-end="899"&gt;Microsoft Fabric’s OneLake implements a virtualized filesystem abstraction that is compatible only with Microsoft-native tools such as Fabric Spark notebooks, Dataflows, and Pipelines. External Spark engines, including Synapse Spark, are not capable of interpreting OneLake paths correctly. Consequently, token-based configurations fail at the filesystem resolution stage rather than during authentication.&lt;/P&gt;
&lt;P data-start="901" data-end="1219"&gt;As of June 2025, Microsoft has not introduced support for direct data writes from Synapse Spark to OneLake. The recommended approach is to first write the data to Azure Data Lake Storage Gen2 (ADLS Gen2), and subsequently transfer it to the Lakehouse using either a Synapse pipeline Copy Activity or a Fabric Dataflow.&lt;/P&gt;
&lt;P data-start="901" data-end="1219"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Hope my suggestion gives you good idea, if you have any more questions, please feel free to ask we are here to help you.&lt;BR /&gt;If this post&amp;nbsp;&lt;STRONG&gt;helps&lt;/STRONG&gt;, then please consider&amp;nbsp;&lt;STRONG&gt;Accept it as the solution&lt;/STRONG&gt;&amp;nbsp;to help the other members find it more quickly.&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;Thank you.&lt;/P&gt;
&lt;P data-start="976" data-end="1157" data-is-last-node="" data-is-only-node=""&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Jun 2025 09:48:47 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4716631#M9823</guid>
      <dc:creator>v-sgandrathi</dc:creator>
      <dc:date>2025-06-02T09:48:47Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4751705#M10586</link>
      <description>&lt;P&gt;I'm also searching for a potential way to read and write fabric from Synapse. Is there a workable answer to this?&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/882995"&gt;@v-sgandrathi&lt;/a&gt;&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/680346"&gt;@Krumelur&lt;/a&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 03 Jul 2025 06:02:39 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4751705#M10586</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2025-07-03T06:02:39Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4751793#M10589</link>
      <description>&lt;P&gt;Hi&amp;nbsp;@Anonymous&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you for reaching out. To assist you more effectively, could you please create a new thread and include a detailed description of the issue you are experiencing? This will allow our team to review your case thoroughly and provide support that is specific to your needs.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We appreciate your cooperation and look forward to assisting you further in the new thread.&lt;BR /&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Thu, 03 Jul 2025 07:05:47 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4751793#M10589</guid>
      <dc:creator>v-sgandrathi</dc:creator>
      <dc:date>2025-07-03T07:05:47Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4752157#M10600</link>
      <description>&lt;P&gt;No, it's currently not possible.&lt;/P&gt;</description>
      <pubDate>Thu, 03 Jul 2025 11:15:50 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/4752157#M10600</guid>
      <dc:creator>Krumelur</dc:creator>
      <dc:date>2025-07-03T11:15:50Z</dc:date>
    </item>
    <item>
      <title>Re: Write to Fabric OneLake from a Synapse Spark notebook</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/5040513#M15129</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/680346"&gt;@Krumelur&lt;/a&gt;&amp;nbsp;,&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Accessing &lt;STRONG&gt;Fabric OneLake from a standalone Synapse Spark notebook&lt;/STRONG&gt; is something many teams are exploring as they transition workloads toward Fabric. While OneLake exposes familiar endpoints, cross-service authentication and access patterns can introduce limitations that aren’t always obvious upfront.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The good news is this &lt;STRONG&gt;is achievable with the right integration approach&lt;/STRONG&gt;. We’ve helped teams enable smoother Synapse-to-Fabric data interactions and migrations using patterns and tooling that reduce friction when working across environments.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you’d like to see how this can be handled in practice, happy to share a &lt;STRONG&gt;quick demo&lt;/STRONG&gt; — feel free to reach out at &lt;STRONG&gt;&lt;A target="_blank" rel="noopener"&gt;mrunal@intellifysolutions.com&lt;/A&gt;&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 16 Feb 2026 16:14:00 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Write-to-Fabric-OneLake-from-a-Synapse-Spark-notebook/m-p/5040513#M15129</guid>
      <dc:creator>Mrunaal</dc:creator>
      <dc:date>2026-02-16T16:14:00Z</dc:date>
    </item>
  </channel>
</rss>

