I'm trying to use some custom libraries in a notebook by uploading JAR files to my notebook environment in the "Custom libraries" section. I've verified that the JARs are attached with the "spark.sparkContext.listJars.foreach(println)" command, but the namespaces still aren't recognized by the import statements in my notebook.
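For reference, this is roughly what I'm running; the class name below is just a stand-in for one of the classes in my JAR:

```scala
// The attached JARs show up as expected in the session
spark.sparkContext.listJars.foreach(println)

// ...but importing the library's namespace still fails to resolve in the notebook
import com.example.customlib.MyHelper   // placeholder for a class from my custom JAR
```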
I've used this same set of libraries in a Synapse Analytics notebook by assigning them as workspace packages. That Synapse notebook and my Fabric notebook use the same Spark and Scala versions.
Does anyone have any pointers about why the custom library import might be failing?
Hi @cahenning ,
I get where you're coming from; this can be confusing. Even if the JAR files show up in sparkContext.listJars, Fabric notebooks can have a few quirks around actually making those classes available for import in the notebook session.
A couple of things I'd check:
- That the JAR sits in a published environment and that environment is the one actually attached to the notebook; uploading the file alone doesn't take effect until the environment is published.
- That the session was restarted after the environment change, since a running session keeps the classpath it started with.
- Whether the driver can actually see the class, because listJars mainly confirms the JAR was shipped with the session, not that the notebook's import resolution can find it (see the snippet after this list).
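A quick way to test that last point is to load one of your classes reflectively on the driver; the class name below is a placeholder, so swap in a real one from your JAR:

```scala
// listJars only proves the JAR was distributed with the session. Your import statement
// is resolved on the driver, so check whether the driver can load the class at runtime.
try {
  Class.forName("com.example.customlib.MyHelper")   // placeholder: use a class from your JAR
  println("Driver can load the class at runtime; if the import still fails, the session's compile-time classpath is what's missing the JAR")
} catch {
  case _: ClassNotFoundException =>
    println("Driver can't load the class at all - the JAR isn't on the driver classpath")
}
```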
If you've tried all that and it still fails, it may just be a limitation in how Fabric handles class loading for custom libraries. You could also try pulling the JAR in at the session level, for example via a %%configure-style magic if Fabric supports that for JARs (I've seen session-level config help in some Spark setups).
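Something along these lines is what I have in mind; treat it purely as a sketch, since I haven't verified that Fabric's %%configure accepts spark.jars, and the abfss path is just a placeholder for wherever your JAR actually lives in OneLake:

```
%%configure
{
    "conf": {
        "spark.jars": "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/libs/my-custom-lib.jar"
    }
}
```

Run it in the first cell so the setting is applied when the session starts.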
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.