<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Running a Fabric Notebook as Spark Job. in Data Engineering</title>
    <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3716408#M1768</link>
    <description>&lt;P&gt;Hello Himanshu,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the info. I am aware of this, and it is my backup option (and will probably be the solution to the problem). However, I still wanted to understand why I cannot make the notebook work as a Spark Job. I have a lot of display() functions in there and don't know whether this could be an issue. No&amp;nbsp;&lt;SPAN&gt;mssparkutils APIs are called.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Thanks &amp;amp; BR &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 22 Feb 2024 08:20:08 GMT</pubDate>
    <dc:creator>Dinosauris</dc:creator>
    <dc:date>2024-02-22T08:20:08Z</dc:date>
    <item>
      <title>Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713623#M1763</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am interested in converting a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Fabric notebook&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;that I’ve created, containing multiple transformation steps, into a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Spark job&lt;/STRONG&gt;. However, I’m struggling to find comprehensive documentation on the additional code I need to include (such as building a Spark session) and how to create a reference file if necessary (along with the required code). Additionally, I’m unsure about the modifications needed in my notebook to enable downloading it as a&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;.py&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;file and running it as a Spark job.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The motivation behind this transition is that I want the transformation code to execute every&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;2 hours&lt;/STRONG&gt;, and from what I’ve read, using Spark jobs may offer better performance for this use case.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would greatly appreciate it if someone could provide an example of how to achieve this.&lt;/P&gt;&lt;P&gt;Thank you in advance! &lt;span class="lia-unicode-emoji" title=":smiling_face_with_smiling_eyes:"&gt;😊&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 21 Feb 2024 08:46:19 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713623#M1763</guid>
      <dc:creator>Dinosauris</dc:creator>
      <dc:date>2024-02-21T08:46:19Z</dc:date>
    </item>
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713643#M1764</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/693721"&gt;@Dinosauris&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Thanks for using Fabric Community.&amp;nbsp;&lt;BR /&gt;Please refer to these documents:&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-engineering/create-spark-job-definition" target="_blank"&gt;https://learn.microsoft.com/en-us/fabric/data-engineering/create-spark-job-definition&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://www.red-gate.com/simple-talk/databases/sql-server/bi-sql-server/using-spark-jobs-for-multiple-lakehouse-maintenance-in-microsoft-fabric/" target="_blank"&gt;https://www.red-gate.com/simple-talk/databases/sql-server/bi-sql-server/using-spark-jobs-for-multiple-lakehouse-maintenance-in-microsoft-fabric/&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Hope this helps. Please let me know if you have any further questions.&lt;/P&gt;</description>
      <pubDate>Wed, 21 Feb 2024 08:51:49 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713643#M1764</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-02-21T08:51:49Z</dc:date>
    </item>
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713698#M1765</link>
      <description>&lt;P&gt;Thanks for the quick reply!&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In the end I did two things. I added the following code at the beginning of my notebook to start the Spark session:&lt;/P&gt;&lt;P&gt;from pyspark.sql import SparkSession&lt;/P&gt;&lt;P&gt;spark = SparkSession.builder \&lt;BR /&gt;.appName("MySparkJob") \&lt;BR /&gt;.getOrCreate()&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And the last code line stops the Spark session:&lt;/P&gt;&lt;P&gt;spark.stop()&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, I get this error message when trying to run the Spark job:&amp;nbsp;&lt;SPAN&gt;Execution is not supported for Spark Job Definitions that do not have content.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 21 Feb 2024 09:11:54 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713698#M1765</guid>
      <dc:creator>Dinosauris</dc:creator>
      <dc:date>2024-02-21T09:11:54Z</dc:date>
    </item>
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713736#M1766</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/693721"&gt;@Dinosauris&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Apologies for the issue you have been facing.&lt;BR /&gt;You can also run your notebooks using the Notebook activity in pipelines. Pipelines have more powerful features, and some mssparkutils APIs are not supported in Spark Job Definitions.&lt;BR /&gt;For more information, please refer to these links:&lt;BR /&gt;&lt;A href="https://learn.microsoft.com/en-us/fabric/data-factory/notebook-activity" target="_blank" rel="noopener"&gt;Notebook activity - Microsoft Fabric | Microsoft Learn&lt;/A&gt;&lt;BR /&gt;&lt;A href="https://www.youtube.com/watch?v=sFVNmTZ5h4Y" target="_blank"&gt;Scheduling Notebooks in Microsoft Fabric + Reading JSON from Dynamic File Paths (youtube.com)&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Hope this helps. Please let me know if you have any further questions.&lt;/P&gt;</description>
      <pubDate>Wed, 21 Feb 2024 09:23:23 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3713736#M1766</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-02-21T09:23:23Z</dc:date>
    </item>
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3715065#M1767</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/693721"&gt;@Dinosauris&lt;/a&gt;,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Just to provide a bit more clarity here. Since the transformation is already in the notebook, you can add it to a pipeline (as&amp;nbsp;@Anonymous&amp;nbsp;called out above) or schedule it from the notebook pane itself. The snapshot below should help.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="HimanshuSmsft_0-1708539205151.png" style="width: 400px;"&gt;&lt;img src="https://community.fabric.microsoft.com/t5/image/serverpage/image-id/1046855i4EEAFE31B38A5E18/image-size/medium?v=v2&amp;amp;px=400" role="button" title="HimanshuSmsft_0-1708539205151.png" alt="HimanshuSmsft_0-1708539205151.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks&amp;nbsp;&lt;BR /&gt;Himanshu&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 21 Feb 2024 18:14:03 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3715065#M1767</guid>
      <dc:creator>HimanshuS-msft</dc:creator>
      <dc:date>2024-02-21T18:14:03Z</dc:date>
    </item>
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3716408#M1768</link>
      <description>&lt;P&gt;Hello Himanshu,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the info. I am aware of this, and it is my backup option (and will probably be the solution to the problem). However, I still wanted to understand why I cannot make the notebook work as a Spark Job. I have a lot of display() functions in there and don't know whether this could be an issue. No&amp;nbsp;&lt;SPAN&gt;mssparkutils APIs are called.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Thanks &amp;amp; BR &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 22 Feb 2024 08:20:08 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3716408#M1768</guid>
      <dc:creator>Dinosauris</dc:creator>
      <dc:date>2024-02-22T08:20:08Z</dc:date>
    </item>
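Dinosauris suspects the display() calls above. display() is defined only in the notebook runtime, so a downloaded .py run as a Spark Job Definition would fail on it with a NameError. A minimal defensive sketch (the helper name show_df is hypothetical, not a Fabric API) that degrades gracefully outside the notebook:

```python
def show_df(df, n=10):
    """Render df with the notebook-only display() when it exists,
    otherwise fall back to the DataFrame's own show().

    `display` is assumed to be the Fabric notebook built-in; referencing it
    outside the notebook runtime raises NameError, which we catch here.
    """
    try:
        display(df)  # notebook-only built-in (assumption: Fabric runtime)
    except NameError:
        df.show(n)   # works anywhere a Spark DataFrame is available
```

Replacing bare display(df) calls with such a wrapper (or plain df.show()) would make the same code safe in both the notebook and a Spark Job Definition script.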
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3728134#M1769</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.fabric.microsoft.com/t5/user/viewprofilepage/user-id/693721"&gt;@Dinosauris&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;Apologies for the delay in response.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Please go ahead and raise a support ticket to reach our support team:&amp;nbsp;&lt;A href="https://support.fabric.microsoft.com/en-IN/support/" target="_blank" rel="noopener nofollow noreferrer"&gt;Link&amp;nbsp;&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;After creating the support ticket, please share the ticket number here, as it will help us track the issue and gather more information.&lt;/P&gt;
&lt;P&gt;Thanks.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Feb 2024 15:56:55 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3728134#M1769</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-02-27T15:56:55Z</dc:date>
    </item>
    <item>
      <title>Re: Running a Fabric Notebook as Spark Job.</title>
      <link>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3750584#M1770</link>
      <description>&lt;P&gt;Hello!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In the meantime we decided to run the Fabric notebooks via pipelines for this use case.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you anyway for the help! &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 08 Mar 2024 06:28:35 GMT</pubDate>
      <guid>https://community.fabric.microsoft.com/t5/Data-Engineering/Running-a-Fabric-Notebook-as-Spark-Job/m-p/3750584#M1770</guid>
      <dc:creator>Dinosauris</dc:creator>
      <dc:date>2024-03-08T06:28:35Z</dc:date>
    </item>
  </channel>
</rss>

