I have a scheduled notebook that runs daily and is connected to a Lakehouse; it reads input parquet files and writes the output as tables. These tables feed a Power BI report and semantic model. My daily scheduled script is now failing because the Lakehouse pinned to the notebook keeps getting disconnected, and I have to re-add the Lakehouse manually every day and re-run the script. Is there a workaround for this issue?
If you clone the notebook and then pin the lakehouse to the cloned notebook, does the same happen?
Hi @anusha_2023
Glad your query got resolved.
Please continue to use the Fabric Community for help with any future queries.
Hello @anusha_2023
Thanks for using the Fabric community.
I tried to repro the issue. As I understand it, the problem is that the notebook's scheduled runs are failing.
I set up a notebook to read a file and write its content to a table,
and then scheduled it to run every 10 minutes. The runs completed without any failure.
I am sure I am missing something here. What is the error you are getting?
Thanks
Himanshu
I tried with sample data by pinning the lakehouse to a new notebook and loading the data as a table; those scheduled runs are successful.
The issue is with my existing daily scheduled notebooks.
This is the error:
Py4JJavaError: An error occurred while calling o6337.parquet. : java.io.FileNotFoundException: Operation failed: "Not Found", 404, PUT, http://onelake.dfs.fabric.microsoft.com/b04579d1-31c1-4194-a912-9f15db327234/6aa4d65e-dc1f-4212-8c1f... ArtifactNotFound, "Artifact '6aa4d65e-dc1f-4212-8c1f-1cafc20c20d5' is not found in workspace 'b04579d1-31c1-4194-a912-9f15db327234'."
Caused by: Operation failed: "Not Found", 404, PUT, http://onelake.dfs.fabric.microsoft.com/b04579d1-31c1-4194-a912-9f15db327234/6aa4d65e-dc1f-4212-8c1f... ArtifactNotFound, "Artifact '6aa4d65e-dc1f-4212-8c1f-1cafc20c20d5' is not found in workspace 'b04579d1-31c1-4194-a912-9f15db327234'." at org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.completeExecute(AbfsRestOperation.java:231) at org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.lambda$execute$0(AbfsRestOperation.java:191)
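The ArtifactNotFound error suggests the lakehouse GUID stored with the scheduled notebook no longer resolves in the workspace. As a quick sanity check, you can pull both GUIDs out of the error text and compare the artifact GUID against the ID of the lakehouse you currently have attached (visible in its URL). A minimal sketch, using the GUIDs quoted in the error above:

```python
import re

# Error text copied from the failing run (URL portion omitted).
error_text = (
    "Artifact '6aa4d65e-dc1f-4212-8c1f-1cafc20c20d5' is not found in "
    "workspace 'b04579d1-31c1-4194-a912-9f15db327234'."
)

# Generic GUID pattern: 8-4-4-4-12 hex digits.
GUID = r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"

# The lakehouse GUID the schedule still points at, and the workspace GUID.
artifact_id = re.search(rf"Artifact '({GUID})'", error_text).group(1)
workspace_id = re.search(rf"workspace '({GUID})'", error_text).group(1)

print(artifact_id)
print(workspace_id)
```

If the artifact GUID here differs from the ID of the lakehouse you see in the workspace, the scheduled run is still bound to a stale lakehouse reference.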
As a workaround, if I use the absolute path I am able to write the output as a parquet file. But saveAsTable does not work with an absolute path, as shown below.
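One approach that may avoid the pinned-lakehouse dependency altogether is to address the lakehouse by its full OneLake URI and write the output as Delta files under the lakehouse's Tables/ folder with .save(), rather than saveAsTable (which resolves names against the default lakehouse). This is a sketch only; MyWorkspace, MyLakehouse, and daily_output are placeholder names, and the Spark write itself is shown in comments:

```python
def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build an absolute OneLake ABFS path to a lakehouse table folder."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )

target = onelake_table_path("MyWorkspace", "MyLakehouse", "daily_output")

# In the notebook, the write would then use the absolute path, e.g.:
#   df.write.format("delta").mode("overwrite").save(target)
# Delta folders written under Tables/ are discovered by the lakehouse,
# so the result should still surface as a table without saveAsTable.
print(target)
```

Because the path is fully qualified, the write does not depend on which lakehouse (if any) is pinned to the notebook at schedule time.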
If you clone the notebook and then pin the lakehouse to the cloned notebook, does the same happen?
If I make another copy and schedule the new notebook, it works the first time. Thanks for the tip. But on the second run the schedule shows as successful even though the script did not actually run. In the Monitor hub, the first run succeeded at 2:23 AM (first screenshot), and the subsequent runs show as succeeded, but check the settings details in the second screenshot below.
The notebook is not really running the script. Does that mean I have run out of capacity, or what else could be the reason?
How did you determine that the notebook did not execute? Is the data not getting written to the table?
If you go to Recent runs, you can open each run's status individually and click on item snapshots to see how the notebook ran. Please check that and see whether any individual cell is causing a problem.
Thanks for the reply. Yes, I have checked it the same way; see the screenshot below. The session is not getting started. If I run the notebook manually it works, but when it runs on a schedule the session is not picked up. Please check the screenshot below.