
JoeCrozier
Helper II

Library Install issue in notebook

I'm having an issue today where I just can't seem to install libraries into environments for a Fabric notebook.

 

Let me clarify: I CAN install things if I do it this way:

JoeCrozier_0-1729710824041.png

Installing that way works just fine.
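For context, an inline install like the one in the screenshot is session-scoped, so it disappears when the session ends. A minimal sketch of what that kind of cell does (PyCap is the example package; the helper name is made up):

```python
# In a Fabric notebook cell, the session-scoped install is simply:
#   %pip install PyCap
# A plain-Python equivalent, run against the current kernel's interpreter:
import subprocess
import sys

def session_install(package: str) -> None:
    """Install a package into the current session's interpreter only."""
    subprocess.check_call([sys.executable, "-m", "pip", "install", package])

# session_install("PyCap")  # illustrative; requires network access
```

Because the install targets only the running interpreter, nothing persists into the environment, which is exactly the limitation described next.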

The problem is that I want the libraries to be available across sessions, so I'm trying to install them into the environment here:

JoeCrozier_1-1729710890166.png

When I add it there and publish, it just takes forever and then eventually fails. The failure is very nondescript. It just says this:

JoeCrozier_2-1729710955648.png


When I ask for more details, it gives me a log that I pasted at the bottom of this message (it's long, and I didn't want to clog everything). Scanning through the log, it mostly reads as if everything worked fine, but it didn't.

I've also tried uploading packages that I've previously uploaded to other environments, and they still fail. I'm not sure what I'm doing wrong.

 

Lastly, it's strange that PyCap (the module I want) isn't available in Fabric's PyPI dropdown, because version 2.6.0 IS on PyPI. Is there a reason for that?

JoeCrozier_0-1729711351775.png

 

 

Container: container_1729702657135_0001_01_000001 on vm-45c90282_14454
LogAggregationType: AGGREGATED
======================================================================
LogType:stdout
LogLastModifiedTime:Wed Oct 23 17:15:36 +0000 2024
LogLength:16231
LogContents:
24/10/23 17:07:07 INFO Library preparation stage started
24/10/23 17:07:07 INFO Workspace ID: 00bfa58a-8950-409f-9be9-37a6cf6fa9f3
24/10/23 17:07:07 INFO Artifact ID: c5c67736-394f-4587-9642-994a2bf76b89
24/10/23 17:07:10 INFO 
get_libraries_to_process took 2.49 s
24/10/23 17:07:10 INFO Data exfiltration protection (DEP) set to: false
24/10/23 17:07:10 INFO Running mkdir -p /usr/lib/library-manager/bin/lps/libraries/ && mkdir -p /usr/lib/library-manager/bin/lps/libraries/jars/ && mkdir -p /usr/lib/library-manager/bin/lps/libraries/py/ && mkdir -p /usr/lib/library-manager/bin/lps/wheels/ && mkdir -p /usr/lib/library-manager/bin/lps/libraries/R/tmp/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata//Spark34/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata//Spark34//metadata_python/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata//Spark34//metadata_scala/
24/10/23 17:07:10 INFO Ran mkdir -p /usr/lib/library-manager/bin/lps/libraries/ && mkdir -p /usr/lib/library-manager/bin/lps/libraries/jars/ && mkdir -p /usr/lib/library-manager/bin/lps/libraries/py/ && mkdir -p /usr/lib/library-manager/bin/lps/wheels/ && mkdir -p /usr/lib/library-manager/bin/lps/libraries/R/tmp/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata//Spark34/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata//Spark34//metadata_python/ && mkdir -p /usr/lib/library-manager/bin/lps/libraryMetadata//Spark34//metadata_scala/.  returncode: 0
24/10/23 17:07:10 INFO Check if python py files are present.
24/10/23 17:07:10 INFO No Python py files are present.
24/10/23 17:07:10 INFO Python parallel execution submitted
24/10/23 17:07:10 INFO Adding conda configurations
24/10/23 17:07:10 INFO Check if scala jars are present.
24/10/23 17:07:10 INFO Complete list of conda configurations : /usr/lib/miniforge3/bin/conda config --set auto_update_conda false /usr/lib/miniforge3/bin/conda config --set notify_outdated_conda false /usr/lib/miniforge3/bin/conda config --set default_threads 4
24/10/23 17:07:10 INFO Jar parallel execution submitted
24/10/23 17:07:10 INFO Running cd /usr/lib/library-manager/bin ; /bin/bash applyChanges.sh && /usr/lib/miniforge3/bin/conda config --set auto_update_conda false && /usr/lib/miniforge3/bin/conda config --set notify_outdated_conda false && /usr/lib/miniforge3/bin/conda config --set default_threads 4 && /usr/lib/miniforge3/bin/conda create --prefix /home/trusted-service-user/cluster-env/clonedenv --clone /home/trusted-service-user/cluster-env/trident_env --yes --offline && source /usr/lib/miniforge3/bin/activate /home/trusted-service-user/cluster-env/clonedenv
24/10/23 17:07:10 INFO Scala jars found.
24/10/23 17:07:10 INFO Check if R files are present.
24/10/23 17:07:10 INFO Library file to be copied: 'spark-snowflake_2.12-3.0.0.jar'
24/10/23 17:07:10 INFO R parallel execution submitted
24/10/23 17:07:10 INFO Running azcopy copy "REDACTED/3764e7d0-8fff-4c8c-bd13-06c688fed784/FileManagementSettingsBlobs/f51e4992-3430-4187-9206-1b8fe7a413829xUqhVF-4iC6uZ5rjWiPphwnVhouTsxQgj6tsY6nzBo=?skoid=d0a1b8ed-5bd8-4d2a-8a6d-d5fc7f5fb6cb&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2024-10-23T17%3A07%3A10Z&ske=2024-10-24T17%3A07%3A10Z&sks=b&skv=2023-08-03&sv=2023-08-03&se=2024-10-23T18%3A07%3A10Z&sr=b&sp=r&sig=REDACTED" "/usr/lib/library-manager/bin/lps/libraries/jars/spark-snowflake_2.12-3.0.0.jar"
24/10/23 17:07:10 INFO R libraries found.
24/10/23 17:07:10 INFO Library file to be copied: 'REDCapR_1.3.0.tar.gz'
24/10/23 17:07:10 INFO Running azcopy copy "REDACTED/3764e7d0-8fff-4c8c-bd13-06c688fed784/FileManagementSettingsBlobs/1b043696-ec42-4dc0-bf38-35b939b4ffc5iLmybgHTRg.Hk0pw6qBSZDX8c25mULfS.4Csb332lco=?skoid=d0a1b8ed-5bd8-4d2a-8a6d-d5fc7f5fb6cb&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2024-10-23T17%3A07%3A10Z&ske=2024-10-24T17%3A07%3A10Z&sks=b&skv=2023-08-03&sv=2023-08-03&se=2024-10-23T18%3A07%3A10Z&sr=b&sp=r&sig=REDACTED" "/usr/lib/library-manager/bin/lps/libraries/R/tmp/REDCapR_1.3.0.tar.gz"
24/10/23 17:07:15 INFO Ran azcopy copy "REDACTED/3764e7d0-8fff-4c8c-bd13-06c688fed784/FileManagementSettingsBlobs/f51e4992-3430-4187-9206-1b8fe7a413829xUqhVF-4iC6uZ5rjWiPphwnVhouTsxQgj6tsY6nzBo=?skoid=d0a1b8ed-5bd8-4d2a-8a6d-d5fc7f5fb6cb&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2024-10-23T17%3A07%3A10Z&ske=2024-10-24T17%3A07%3A10Z&sks=b&skv=2023-08-03&sv=2023-08-03&se=2024-10-23T18%3A07%3A10Z&sr=b&sp=r&sig=REDACTED" "/usr/lib/library-manager/bin/lps/libraries/jars/spark-snowflake_2.12-3.0.0.jar".  returncode: 0
stdout: INFO: Scanning...
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support

Job b341bc2d-a386-f74e-6211-04dfc60b7e39 has started
Log file is located at: /home/trusted-service-user/.azcopy/b341bc2d-a386-f74e-6211-04dfc60b7e39.log


100.0 %, 1 Done, 0 Failed, 0 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 2.181


Job b341bc2d-a386-f74e-6211-04dfc60b7e39 summary
Elapsed Time (Minutes): 0.0335
Number of File Transfers: 1
Number of Folder Property Transfers: 0
Number of Symlink Transfers: 0
Total Number of Transfers: 1
Number of File Transfers Completed: 1
Number of Folder Transfers Completed: 0
Number of File Transfers Failed: 0
Number of Folder Transfers Failed: 0
Number of File Transfers Skipped: 0
Number of Folder Transfers Skipped: 0
Total Number of Bytes Transferred: 547966
Final Job Status: Completed

 
24/10/23 17:07:15 INFO Ran azcopy copy "REDACTED/3764e7d0-8fff-4c8c-bd13-06c688fed784/FileManagementSettingsBlobs/1b043696-ec42-4dc0-bf38-35b939b4ffc5iLmybgHTRg.Hk0pw6qBSZDX8c25mULfS.4Csb332lco=?skoid=d0a1b8ed-5bd8-4d2a-8a6d-d5fc7f5fb6cb&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2024-10-23T17%3A07%3A10Z&ske=2024-10-24T17%3A07%3A10Z&sks=b&skv=2023-08-03&sv=2023-08-03&se=2024-10-23T18%3A07%3A10Z&sr=b&sp=r&sig=REDACTED" "/usr/lib/library-manager/bin/lps/libraries/R/tmp/REDCapR_1.3.0.tar.gz".  returncode: 0
stdout: INFO: Scanning...
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support

Job 4a37ef3a-745d-554d-7f87-93d04d6cd6db has started
Log file is located at: /home/trusted-service-user/.azcopy/4a37ef3a-745d-554d-7f87-93d04d6cd6db.log


100.0 %, 1 Done, 0 Failed, 0 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 4.3677


Job 4a37ef3a-745d-554d-7f87-93d04d6cd6db summary
Elapsed Time (Minutes): 0.0335
Number of File Transfers: 1
Number of Folder Property Transfers: 0
Number of Symlink Transfers: 0
Total Number of Transfers: 1
Number of File Transfers Completed: 1
Number of Folder Transfers Completed: 0
Number of File Transfers Failed: 0
Number of Folder Transfers Failed: 0
Number of File Transfers Skipped: 0
Number of Folder Transfers Skipped: 0
Total Number of Bytes Transferred: 1098765
Final Job Status: Completed

 
24/10/23 17:07:15 INFO Library file to be copied: 'snowflake-jdbc-3.9.2.jar'
24/10/23 17:07:15 INFO Running source /lib/vhd/synapse-trident-r.env && Rscript -e 'library(tools); write_PACKAGES("/usr/lib/library-manager/bin/lps/libraries/R/tmp/")'
24/10/23 17:07:15 INFO Running azcopy copy "REDACTED/3764e7d0-8fff-4c8c-bd13-06c688fed784/FileManagementSettingsBlobs/a7660112-ef11-4ec4-9ee8-7c9063e2b262BzFiP5aRy9ZfQcf6TqNw22lrDj7Q6us6cmOhScrgRRo=?skoid=d0a1b8ed-5bd8-4d2a-8a6d-d5fc7f5fb6cb&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2024-10-23T17%3A07%3A10Z&ske=2024-10-24T17%3A07%3A10Z&sks=b&skv=2023-08-03&sv=2023-08-03&se=2024-10-23T18%3A07%3A10Z&sr=b&sp=r&sig=REDACTED" "/usr/lib/library-manager/bin/lps/libraries/jars/snowflake-jdbc-3.9.2.jar"
24/10/23 17:07:17 INFO Ran azcopy copy "REDACTED/3764e7d0-8fff-4c8c-bd13-06c688fed784/FileManagementSettingsBlobs/a7660112-ef11-4ec4-9ee8-7c9063e2b262BzFiP5aRy9ZfQcf6TqNw22lrDj7Q6us6cmOhScrgRRo=?skoid=d0a1b8ed-5bd8-4d2a-8a6d-d5fc7f5fb6cb&sktid=975f013f-7f24-47e8-a7d3-abc4752bf346&skt=2024-10-23T17%3A07%3A10Z&ske=2024-10-24T17%3A07%3A10Z&sks=b&skv=2023-08-03&sv=2023-08-03&se=2024-10-23T18%3A07%3A10Z&sr=b&sp=r&sig=REDACTED" "/usr/lib/library-manager/bin/lps/libraries/jars/snowflake-jdbc-3.9.2.jar".  returncode: 0
stdout: INFO: Scanning...
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support

Job 89f46e08-be7a-b44c-7d33-4f1e4c6a3582 has started
Log file is located at: /home/trusted-service-user/.azcopy/89f46e08-be7a-b44c-7d33-4f1e4c6a3582.log


100.0 %, 1 Done, 0 Failed, 0 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 122.4233


Job 89f46e08-be7a-b44c-7d33-4f1e4c6a3582 summary
Elapsed Time (Minutes): 0.0334
Number of File Transfers: 1
Number of Folder Property Transfers: 0
Number of Symlink Transfers: 0
Total Number of Transfers: 1
Number of File Transfers Completed: 1
Number of Folder Transfers Completed: 0
Number of File Transfers Failed: 0
Number of Folder Transfers Failed: 0
Number of File Transfers Skipped: 0
Number of Folder Transfers Skipped: 0
Total Number of Bytes Transferred: 30625824
Final Job Status: Completed

 
24/10/23 17:07:17 INFO Spawning Scala packages metadata cooking process...
24/10/23 17:07:17 INFO Running timeout 60 /usr/lib/library-metadata-cooker/bin/execute-jar-metadata-cooker-trident.sh "/usr/lib/library-manager/bin/lps/libraryMetadata//Spark34//metadata_scala/" "/usr/lib/library-manager/bin/lps/libraries/jars/" || true
24/10/23 17:07:20 INFO Ran timeout 60 /usr/lib/library-metadata-cooker/bin/execute-jar-metadata-cooker-trident.sh "/usr/lib/library-manager/bin/lps/libraryMetadata//Spark34//metadata_scala/" "/usr/lib/library-manager/bin/lps/libraries/jars/" || true.  returncode: 0
stdout: Begin to run jar metadata cooker
LMC: Successfully processed Jar: /usr/lib/library-manager/bin/lps/libraries/jars/spark-snowflake_2.12-3.0.0.jar..
LMC: Successfully processed Jar: /usr/lib/library-manager/bin/lps/libraries/jars/snowflake-jdbc-3.9.2.jar..
LMC: Cooked 2 jars, using 2876ms
 
24/10/23 17:07:20 INFO Seconds scala LMC took: 3.1
24/10/23 17:07:20 INFO 
process_scala_jar_files took 9.71 s
24/10/23 17:07:20 INFO Closing down clientserver connection
24/10/23 17:07:32 INFO Ran source /lib/vhd/synapse-trident-r.env && Rscript -e 'library(tools); write_PACKAGES("/usr/lib/library-manager/bin/lps/libraries/R/tmp/")'.  returncode: 0
24/10/23 17:07:32 INFO Sanitizing package: REDCapR
24/10/23 17:07:32 INFO Package sanitized: REDCapR
24/10/23 17:07:32 INFO Running source /lib/vhd/synapse-trident-r.env && Rscript -e 'install.packages("REDCapR", contriburl="file://usr/lib/library-manager/bin/lps/libraries/R/tmp/",  quiet = TRUE)'
24/10/23 17:07:43 INFO Ran source /lib/vhd/synapse-trident-r.env && Rscript -e 'install.packages("REDCapR", contriburl="file://usr/lib/library-manager/bin/lps/libraries/R/tmp/",  quiet = TRUE)'.  returncode: 0
stderr: Warning: dependency ‘checkmate’ is not available
Updating HTML index of packages in '.Library'
Making 'packages.html' ... done
Warning message:
In install.packages("REDCapR", contriburl = "file://usr/lib/library-manager/bin/lps/libraries/R/tmp/",  :
  installation of package ‘REDCapR’ had non-zero exit status

24/10/23 17:07:43 INFO Running source /lib/vhd/synapse-trident-r.env && Rscript -e 'library("REDCapR")'
24/10/23 17:07:46 ERROR Error while running command source /lib/vhd/synapse-trident-r.env && Rscript -e 'library("REDCapR")': got exit code 1
stderr:Error in library("REDCapR") : there is no package called ‘REDCapR’
Execution halted

24/10/23 17:07:46 INFO Cleanup following folders and files from staging directory:
24/10/23 17:07:46 INFO Staging directory cleaned up successfully
24/10/23 17:07:46 INFO 
clean_up took 0.04 s
24/10/23 17:09:46 INFO Closing down clientserver connection
24/10/23 17:15:34 INFO Ran cd /usr/lib/library-manager/bin ; /bin/bash applyChanges.sh && /usr/lib/miniforge3/bin/conda config --set auto_update_conda false && /usr/lib/miniforge3/bin/conda config --set notify_outdated_conda false && /usr/lib/miniforge3/bin/conda config --set default_threads 4 && /usr/lib/miniforge3/bin/conda create --prefix /home/trusted-service-user/cluster-env/clonedenv --clone /home/trusted-service-user/cluster-env/trident_env --yes --offline && source /usr/lib/miniforge3/bin/activate /home/trusted-service-user/cluster-env/clonedenv.  returncode: 0
stdout: Warning: This Script only works on conda release 22.9.0  Other release is not guaranteed!
/usr/lib/library-manager/bin
patching file /usr/lib/miniforge3/lib/python3.10/site-packages/conda/misc.py
patching file /usr/lib/miniforge3/lib/python3.10/site-packages/conda/gateways/disk/create.py
Source:      /home/trusted-service-user/cluster-env/trident_env
Destination: /home/trusted-service-user/cluster-env/clonedenv
Packages: 519
Files: 11010
Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... 

done
#
# To activate this environment, use
#
#     $ conda activate /home/trusted-service-user/cluster-env/clonedenv
#
# To deactivate an active environment, use
#
#     $ conda deactivate

 
stderr: + echo 'Warning: This Script only works on conda release 22.9.0  Other release is not guaranteed!'
+ cd /usr/lib/miniforge3/bin
+ ./conda config --set allow_softlinks True
+ ./conda config --set always_softlink True
+ cd -
+ cp /usr/lib/miniforge3/lib/python3.10/site-packages/conda/misc.py /usr/lib/miniforge3/lib/python3.10/site-packages/conda/misc.backup.py
+ patch /usr/lib/miniforge3/lib/python3.10/site-packages/conda/misc.py misc.patch
+ cp /usr/lib/miniforge3/lib/python3.10/site-packages/conda/gateways/disk/create.py /usr/lib/miniforge3/lib/python3.10/site-packages/conda/gateways/disk/create.backup.py
+ patch /usr/lib/miniforge3/lib/python3.10/site-packages/conda/gateways/disk/create.py create.patch

24/10/23 17:15:34 INFO Capture common so files and remove them from clonedenv
24/10/23 17:15:36 INFO Closing down clientserver connection
24/10/23 17:15:36 INFO Waiting for parallel executions
Traceback (most recent call last):
  File "/mnt/var/hadoop/tmp/nm-secondary-local-dir/usercache/trusted-service-user/appcache/application_1729702657135_0001/container_1729702657135_0001_01_000001/library-preparation-stage.py", line 812, in <module>
    run_lps()
  File "/mnt/var/hadoop/tmp/nm-secondary-local-dir/usercache/trusted-service-user/appcache/application_1729702657135_0001/container_1729702657135_0001_01_000001/library-preparation-stage.py", line 119, in wrap
    ret = f(*args, **kwargs)
  File "/mnt/var/hadoop/tmp/nm-secondary-local-dir/usercache/trusted-service-user/appcache/application_1729702657135_0001/container_1729702657135_0001_01_000001/library-preparation-stage.py", line 763, in run_lps
    python_result = parallel_future_python.result()
  File "/home/trusted-service-user/cluster-env/trident_env/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/home/trusted-service-user/cluster-env/trident_env/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home/trusted-service-user/cluster-env/trident_env/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/mnt/var/hadoop/tmp/nm-secondary-local-dir/usercache/trusted-service-user/appcache/application_1729702657135_0001/container_1729702657135_0001_01_000001/library-preparation-stage.py", line 119, in wrap
    ret = f(*args, **kwargs)
  File "/mnt/var/hadoop/tmp/nm-secondary-local-dir/usercache/trusted-service-user/appcache/application_1729702657135_0001/container_1729702657135_0001_01_000001/library-preparation-stage.py", line 598, in process_all_python
    capture_so_files(so_files_list)
  File "/mnt/var/hadoop/tmp/nm-secondary-local-dir/usercache/trusted-service-user/appcache/application_1729702657135_0001/container_1729702657135_0001_01_000001/library-preparation-stage.py", line 197, in capture_so_files
    with open(so_files_output_file, 'w') as file:
FileNotFoundError: [Errno 2] No such file or directory: '/usr/lib/library-manager/bin/lps/so_files_list.txt'
24/10/23 17:15:36 INFO Closing down clientserver connection

End of LogType:stdout
***********************************************************************

 

 

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @JoeCrozier 

 

You can install it from the public PyPI repository; I just did this. The drop-down list only suggests some common options and does not include every module that exists on PyPI. Just type "PyCap" and it will detect the available versions automatically. It defaults to the latest version, but you can also pick another one.

vjingzhanmsft_0-1729735323605.png

 

If a module doesn't exist in the PyPI repository, it will tell you "Library is not found", as shown below.

vjingzhanmsft_1-1729735887224.png

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!


2 REPLIES
Anonymous
Not applicable

Updated:

 

When you choose to install custom libraries, you have to upload and install, at the same time, all the relevant dependency modules that don't already exist in the environment. Fabric will not detect and install dependencies for you.

vjingzhanmsft_2-1729736382985.png
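One way to see locally which direct dependencies you would also need to upload is to inspect a package's metadata on a machine where it is already installed. A rough sketch, assuming the package is pip-installed locally (the helper name is mine):

```python
import re
from importlib.metadata import PackageNotFoundError, distribution

def missing_dependencies(package: str) -> list[str]:
    """Direct dependencies of an installed package that are absent locally."""
    missing = []
    for req in distribution(package).requires or []:
        # Strip version specifiers/extras, e.g. "checkmate (>=2.0)" -> "checkmate"
        name = re.split(r"[\s;<>=!~\[(]", req, maxsplit=1)[0]
        try:
            distribution(name)
        except PackageNotFoundError:
            missing.append(name)
    return missing

# Every name this returns must be uploaded to the environment as well.
```

In the log above, this is exactly what bit REDCapR: its dependency 'checkmate' was not available in the environment.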

When viewing the logs, first look for the place where the exit status is non-zero (1). Then read the lines just above it; the specific cause of the error is usually nearby. In this log, REDCapR failed to install because its dependency 'checkmate' was not available.

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!

