I've created a .whl file containing a Python function and uploaded it as a custom library into my Fabric environment. I've successfully deployed the library, and I would now expect it to be available to import during a notebook session.
However, this is not the case. When I run pip list, the package is not shown, and I also cannot load the library through an import statement:
When I install the package from the .whl file in the environment resources, the package does work. However, I would expect to simply add my .whl file to the environment as a custom library and then be able to use the packages inside it. Did I miss something, or do I need to use the custom library in another way?
Thanks for your responses.
To answer @Srisakthi, yes the library is present:
And to @Anonymous, I've tried %pip list and my package isn't present.
Some additional info I can give:
- I've created the package myself, based on a small set of Python files. However, since the package does work when I install it through a notebook cell (along the lines of the cell shown below), I've concluded that the package itself isn't the issue.
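For reference, that workaround cell looks roughly like this; the path is illustrative and simply points at a location where the .whl is reachable from the session, such as an attached lakehouse Files folder:

%pip install /lakehouse/default/Files/idpdemo-0.0.1-py3-none-any.whl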
This is the structure and one of the functions in the package:
This is the code I've used to create the package:
from setuptools import find_packages, setup

setup(
    name='idpdemo',
    version='0.0.1',
    description='demo_description',
    author='Broeks',
    packages=find_packages(include=['idp']),
    install_requires=[],  # Add your dependencies here
    extras_require={
        'dev': ['pytest']  # Test dependencies go here
    }
)
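One thing worth noting about this setup (it may or may not be related to the problem): find_packages(include=['idp']) only matches the top-level idp package. If the code is split into subpackages, a pattern along these lines is needed for them to end up in the wheel; this is just a sketch, not necessarily the fix that was applied here:

from setuptools import find_packages, setup

setup(
    name='idpdemo',
    version='0.0.1',
    # 'idp.*' also picks up any subpackages, e.g. a hypothetical idp.utils
    packages=find_packages(include=['idp', 'idp.*']),
)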
And I've tried to import the function through the following code:
from idp import check_dataframe_for_duplicate_keys
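A quick way to check, independently of pip, whether the session can see the package at all (idp is the package name from this post):

import importlib.util
print(importlib.util.find_spec("idp"))  # None means the package is not visible to this session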
If you need additional information, let me know.
Hi @Broeks,
I'd like to suggest testing with other .whl format custom libraries to confirm whether this issue only occurs on your side.
Regards,
Xiaoxin Sheng
Yes, will do. I'm quite busy these days, so a response might take a few days. But thanks for the help so far!
Hi @Broeks
Did you manage to solve this problem? I'm at the point in a project where writing a custom library may be a better option, rather than executing a notebook from within another notebook.
Cheers
Jeff Jones
Hello Jeff,
Yes, I was able to fix the issue. My package was not correct.
Now I'm able to upload and publish a .whl file through the Fabric API.
https://learn.microsoft.com/en-us/rest/api/fabric/environment/spark-libraries
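For anyone following along, here is a minimal sketch of that flow using the staging-library upload and publish endpoints documented on the page above. The workspace and environment IDs, the token acquisition, and the wheel path are placeholders, and the exact request shapes should be checked against the documentation:

import requests

token = "<bearer token with Fabric API scope>"   # placeholder: obtain via Azure AD / MSAL
workspace_id = "<workspace-id>"                  # placeholder
environment_id = "<environment-id>"              # placeholder
base = (
    "https://api.fabric.microsoft.com/v1/workspaces/"
    f"{workspace_id}/environments/{environment_id}"
)
headers = {"Authorization": f"Bearer {token}"}

# Upload the wheel to the environment's staging libraries
with open("dist/idpdemo-0.0.1-py3-none-any.whl", "rb") as whl:
    upload = requests.post(f"{base}/staging/libraries", headers=headers, files={"file": whl})
upload.raise_for_status()

# Publish the environment so the staged library becomes active (long-running operation)
publish = requests.post(f"{base}/staging/publish", headers=headers)
publish.raise_for_status()
print(publish.status_code)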
Thanks, good to hear.
I might put some time in and see if I can migrate my code to a library then.
Cheers,
Jeff Jones
Yes. Do you have any idea how you are going to develop your Python/PySpark code?
I wanted to use the Spark job definition integration within VS Code to develop against the Fabric kernel. However, debugging doesn't work as expected.
So I'm wondering, what's your approach?
Hi @Broeks,
Can you please share some more detailed information about this issue and a sample .whl file? That would help us clarify your scenario and test to troubleshoot.
In addition, have you tried using %pip list instead of the native Jupyter pip command to list the installed libraries?
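For reference, in a notebook cell the %pip magic targets the session's own Python environment, whereas !pip spawns a separate shell and can report a different interpreter, so the magic is the more reliable check:

%pip list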
Regards,
Xiaoxin Sheng