I'm trying to use some custom libraries in a notebook by uploading JAR files to the "Custom libraries" section of my notebook environment. I've verified that the JARs are attached with the "spark.sparkContext.listJars.foreach(println)" command, but the namespaces still aren't recognized in the notebook's import statements.
I've used this same set of libraries in a Synapse Analytics notebook by assigning them as workspace packages. That Synapse notebook and my Fabric notebook use the same Spark and Scala versions.
Does anyone have any pointers about why the custom library import might be failing?
Solved! Go to Solution.
Hi @cahenning,
The import fails due to a known limitation in Fabric Notebooks — the Scala REPL doesn't recognize classes from dynamically added JARs at compile time.
As a workaround, you can use reflection to interact with the class or repackage your code as a fat JAR (including all dependencies) and try again. Direct import may not work unless Microsoft updates the REPL classloading behavior.
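For illustration, here is a minimal sketch of the reflection approach. It uses java.lang.StringBuilder as a stand-in class so the snippet is self-contained; in the notebook you would substitute the custom class name (e.g. "BingPlacesData.UdsReader.UdsReader") and its actual methods:

```scala
// Resolve the class by name at runtime instead of importing it at compile time.
// java.lang.StringBuilder is a placeholder for the custom class.
val cls = Class.forName("java.lang.StringBuilder")

// Invoke the no-arg constructor reflectively.
val instance = cls.getDeclaredConstructor().newInstance()

// Look up a method by name and parameter types, then call it.
val append = cls.getMethod("append", classOf[String])
append.invoke(instance, "hello")

println(instance.toString) // prints "hello"
```

The same pattern (Class.forName, getDeclaredConstructor, getMethod, invoke) applies to any class the runtime classloader can see, even when the REPL's compile-time import fails.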
Regards,
Vinay Pabbu
Hi @burakkaragoz,
Thank you for your thorough response.
I've tried adding the JARs with the command "spark.sparkContext.addJar("./env/UdsReader-1.1.jar")", but that doesn't make the import work either. I've restarted the notebook session a few times as well, and the class path is definitely correct.
Interestingly, the statement "Class.forName("BingPlacesData.UdsReader.UdsReader")" does not raise an error. I think I have all the dependencies installed, but it's hard to know for sure.
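One way to narrow this down is to check which classloaders can actually resolve the class. This is only a debugging sketch; java.util.ArrayList is a stand-in name here so the snippet runs anywhere, and you would substitute the custom class name in the notebook:

```scala
// Report whether a given classloader can resolve a class by name.
def visibleTo(cl: ClassLoader, name: String): Boolean =
  try { cl.loadClass(name); true }
  catch { case _: ClassNotFoundException => false }

// Placeholder class name; substitute e.g. "BingPlacesData.UdsReader.UdsReader".
val name = "java.util.ArrayList"

// The context classloader is typically what runtime-added JARs update,
// while the REPL compiles imports against its own classloader.
println(s"context classloader: ${visibleTo(Thread.currentThread.getContextClassLoader, name)}")
println(s"REPL classloader:    ${visibleTo(getClass.getClassLoader, name)}")
```

If the class is visible to one loader but not the other, that would be consistent with Class.forName succeeding while the import fails.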
Do you have any recommendations on how I can debug this further?
Hi @Anonymous , you're right! Thank you so much - using reflection does work.
Hi @cahenning ,
I get where you're coming from; this can be confusing. Even if you see the JAR files listed with sparkContext.listJars, Fabric Notebooks sometimes have quirks with actually making those classes available for import in the notebook session.
A couple of things I'd check: that the library matches your Spark and Scala versions, and that you've restarted the session after attaching the JARs. If you've tried all that, it may simply be a limitation in how Fabric handles class loading for custom libraries. You could also try loading the JAR via a %run or %dep magic command, if Fabric supports those (I've seen this help in some Spark setups).
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.