With Fabric notebooks, mssparkutils.notebook.runMultiple() can run them in parallel.
How can we run Spark jobs defined by a Spark job definition in a similar parallel way?
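For reference, this is the notebook-parallel pattern we rely on today (a minimal sketch; the notebook names are placeholders):

```python
# In a Fabric notebook, mssparkutils is available without an explicit import.
# Run two notebooks concurrently; "NotebookA" and "NotebookB" are placeholder names.
results = mssparkutils.notebook.runMultiple(["NotebookA", "NotebookB"])
print(results)  # status/exit value per notebook
```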
Hi @yongshao ,
Thanks for using Fabric Community.
As mentioned in the docs: "Notebook utilities aren't applicable for Apache Spark job definitions (SJD)."
Unfortunately, Spark jobs defined by a Spark job definition cannot be run in parallel this way.
At present, the only ways to run a Spark job definition are:
1. Run a Spark job definition manually by selecting Run from the Spark job definition item in the job list.
2. Schedule a Spark job definition by setting up a schedule plan on the Settings tab. Select Settings on the toolbar, then select Schedule.
Docs to refer to:
Run an Apache Spark job definition - Microsoft Fabric | Microsoft Learn
Microsoft Spark Utilities (MSSparkUtils) for Fabric - Microsoft Fabric | Microsoft Learn
We use customer feedback like yours to prioritize future features. The more users who request this capability, the higher it moves on our list.
We would appreciate it if you could share this feedback on Microsoft Fabric Ideas, where it will be open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.
I hope this information helps. If you have any further queries please do let us know.
Thanks
Hi @yongshao ,
We haven't heard from you since the last response and wanted to check whether your query was answered.
Otherwise, please respond with more details and we will try to help.
Thanks
Regarding Spark job definitions: parallel runs would boost ETL performance and scalability. We hope such a feature will be supported in a future Fabric release.
We tried to connect to the lakehouse SQL endpoint from a notebook. We have no problem using Spark (DataFrame and SQL) with the lakehouse, but we want to create views through the lakehouse SQL endpoint.
The views can be created manually from the SQL endpoint query console, but we want to automate view creation/update from a Fabric notebook.
Any help is appreciated.
Hi @yongshao ,
Please refer this thread - Solved: Re: Creating views from spark notebook into SQL an... - Microsoft Fabric Community
In case you have further queries, please raise a new thread, as this is a different question. After raising the query, please tag me and share the URL here so I can look into it closely.
Thank you
Thank you for the answer; we'll stick with the notebook approach.
Another question: from a notebook, using pyodbc to connect to a lakehouse's SQL endpoint, can a view be created or updated?
I tried it; the SELECT query ran fine, but the CREATE VIEW query ran for a long time without returning any result.
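For context, this is roughly what I tried (a minimal sketch; the connection string, authentication option, and table/view names are placeholders, and the autocommit setting is an assumption I am still verifying, since DDL held in an open transaction could explain the long-running CREATE VIEW):

```python
import pyodbc

# Placeholders: copy the SQL endpoint connection string from the Fabric portal
# and substitute your own lakehouse, table, and view names.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<sql-endpoint-from-portal>;"
    "Database=<lakehouse-name>;"
    "Authentication=ActiveDirectoryInteractive;"  # one auth option; adjust to your setup
    "Encrypt=yes;"
)

# autocommit=True so CREATE VIEW is not held open in an uncommitted transaction.
conn = pyodbc.connect(conn_str, autocommit=True)
cur = conn.cursor()

# The SELECT works as expected.
cur.execute("SELECT TOP 10 * FROM dbo.my_table")
print(cur.fetchall())

# This is the statement that ran for a long time without completing.
cur.execute("CREATE VIEW dbo.my_view AS SELECT col1, col2 FROM dbo.my_table")

conn.close()
```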
Hi @yongshao ,
Glad to know that you got some insights. Please continue using Fabric Community for your further queries.
Coming to your other question, I would like to understand why you are using pyodbc to connect to the lakehouse.
It would be great if you could raise a new thread, as it is a different question. After raising the query, please tag me and share the URL here so I can look into it closely.