dbeavon3
Memorable Member

Another random Spark failure in Fabric on custom pool - No module named 'azure.core.exceptions'

I'm helping a coworker with his PySpark notebooks in Fabric.

 

This morning all his notebooks failed with an error:

No module named 'azure.core.exceptions'


Notebook execution failed at Notebook service with http status code - '200', please check the Run logs on Notebook, additional details - 'Error name - ModuleNotFoundError, Error value - No module named 'azure.core.exceptions'' :

 

 

He was importing these libraries, and the failure occurred internally when the import chain tried to load a Microsoft-specific module:

 

from darts.models import RandomForest, ExponentialSmoothing, StatsForecastAutoARIMA, TBATS
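As a stopgap, you can check the Azure dependency up front so the failure surfaces with a clearer message before the darts import pulls it in through pandas, fsspec_wrapper, and adlfs (per the traceback below). A minimal sketch, not Fabric-specific; `module_available` is a hypothetical helper:

```python
# Sketch: detect a missing or shadowed azure.core before importing darts,
# since the darts import chain reaches azure.core via pandas -> adlfs.
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be resolved on this interpreter's path."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # find_spec raises when a parent package of a dotted name is missing.
        return False

if not module_available("azure.core"):
    print("WARNING: azure.core is not importable in this session; "
          "imports that touch OneLake/adlfs will fail with ModuleNotFoundError")
```

This only diagnoses the problem earlier; it does not repair the environment.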

 

 

Here is the full stdout error from the driver ...

 

 

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[4], line 1
----> 1 import notebookutils
      3 # Personalize Session
      4 from notebookutils.common.initializer import initializeLHContext

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/__init__.py:5
      1 __version__ = "1.1.8"
      3 __all__ = ["cognitiveService", "data", "fs", "lakehouse", "notebook", "session", "runtime", "help", "warehouse", "workspace", "fabricClient", "credentials", "PBIClient", "udf"]
----> 5 from . import cognitiveService, data, fs, lakehouse, notebook, session, warehouse, workspace, fabricClient, credentials, PBIClient, udf
      7 # TODO: this line should be removed after update runner ipynb code
      8 from notebookutils.visualization import displayHTML

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/cognitiveService.py:1
----> 1 from .mssparkutils.handlers import CognitiveServcieHandler
      3 cs = CognitiveServcieHandler()
      5 def getEndpointAndKey(lsName):

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/mssparkutils/__init__.py:5
      2 from os.path import dirname, basename, isfile, join
      3 import glob
----> 5 from .handlers import RuntimeHandler
      6 from ..common.logger import deprecated, print_deprecated_message
      8 modules = glob.glob(join(dirname(__file__), "*.py"))

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/mssparkutils/handlers/__init__.py:1
----> 1 from .fsHandler import SynapseFSHandler
      2 from .notebookHandler import SynapseNotebookHandler
      3 from .runtimeHandler import RuntimeHandler

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/mssparkutils/handlers/fsHandler.py:2
      1 from six import string_types
----> 2 from notebookutils.visualization import displayHTML, display_mount_points
      3 from notebookutils.common.logger import log4jLogger
      5 from .baseHandler import SynapseBaseHandler

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/visualization/__init__.py:1
----> 1 from .display import display, display_mount_points
      2 from .displayHTML import displayHTML
      3 from .msInlinePlotlib import enable_msinline_backend as enableMatplotlib

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/visualization/display.py:17
     14 from notebookutils.ipythoninterpreter import is_ipython_enabled
     15 from notebookutils.ipython import runtime
---> 17 from notebookutils.visualization.display_jupyter import display_without_spark
     18 from notebookutils.visualization.constants import SYNAPSE_DISPLAY_WIDGET_TYPE_KEY, MAX_ROW_COUNT
     19 from notebookutils.visualization.utils import sparkContextHelper

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/visualization/display_jupyter.py:11
      8 from notebookutils.common.logger import log4jLogger
      9 from notebookutils.visualization.constants import SYNAPSE_DISPLAY_WIDGET_TYPE_KEY, \
     10     MAX_ROW_COUNT, MAX_CONTENT_LENGTH
---> 11 from notebookutils.visualization.dataWrangler import get_wrangler_display_entry_context, WRANGLER_ENTRY_CONTEXT_KEY
     13 # Map to the unified type set that will be consumed by client side
     14 # We use Spark data type .simpleString() as the unified type set
     15 # Some of the types are not common Pandas dtype, like boolean/uint64. But it may appear in some DataFrame generated
     16 # by 3rd party library like sempy.
     17 _pandas_type_mapping = {
     18     'datetime64[ns]': 'timestamp',
     19     'int8': 'tinyint',
   (...)
     31     'boolean': 'boolean'
     32 }

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/notebookutils/visualization/dataWrangler.py:3
      1 from notebookutils.common.logger import log4jLogger
      2 import sys
----> 3 import pandas as pd
      4 import pyspark
      6 WRANGLER_ENTRY_CONTEXT_KEY = "wranglerEntryContext"

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/pandas/__init__.py:236
    195 __doc__ = """
    196 pandas - a powerful data analysis and manipulation library for Python
    197 =====================================================================
   (...)
    232     conversion, moving window statistics, date shifting and lagging.
    233 """
    235 from fsspec.registry import register_implementation
--> 236 from fsspec_wrapper.trident.core import OnelakeFileSystem
    237 register_implementation('abfs', OnelakeFileSystem, clobber=True)
    238 register_implementation('abfss', OnelakeFileSystem, clobber=True)

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/fsspec_wrapper/__init__.py:1
----> 1 from .core import AzureBlobFileSystem
      2 from .trident.core import OnelakeFileSystem
      3 from .version import VERSION as __version__  # noqa

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/fsspec_wrapper/core.py:5
      2 import time
      3 from urllib.parse import urlsplit
----> 5 import adlfs
      6 from fsspec.utils import infer_storage_options
      8 from .utils import logger as synapseml_pandas_logger

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/adlfs/__init__.py:2
      1 from .gen1 import AzureDatalakeFileSystem
----> 2 from .spec import AzureBlobFile, AzureBlobFileSystem
      4 __all__ = ["AzureBlobFileSystem", "AzureBlobFile", "AzureDatalakeFileSystem"]
      6 try:

File ~/cluster-env/clonedenv/lib/python3.11/site-packages/adlfs/spec.py:20
     17 from glob import has_magic
     18 from typing import Optional, Tuple
---> 20 from azure.core.exceptions import (
     21     HttpResponseError,
     22     ResourceExistsError,
     23     ResourceNotFoundError,
     24 )
     25 from azure.storage.blob import (
     26     BlobBlock,
     27     BlobProperties,
   (...)
     30     generate_blob_sas,
     31 )
     32 from azure.storage.blob.aio import BlobPrefix

ModuleNotFoundError: No module named 'azure.core.exceptions'

 

 

 

I have too many cases open with the Fabric Spark team already, and I'm hoping to avoid opening another.  Has anyone seen this before?  Is it related to using custom pools?  Is it specific to our Azure region (North Central US)?  Why would this happen today without any warning?  What do other customers do when they encounter these issues?  Does everyone just rerun their workloads two or three times a day to work around these reliability problems in Fabric?  Is there any way for a customer to investigate the logs on the cluster nodes themselves?  I'm guessing something is wrong on the VM itself.
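One thing a customer can do for triage, from the notebook itself, is confirm which package versions the failing session actually resolved. A minimal sketch using only the standard library; the package names are taken from the traceback above:

```python
# Sketch: report the resolved versions of the packages in the failing
# import chain (names taken from the traceback). importlib.metadata is
# in the Python standard library (3.8+).
from importlib import metadata

def report_versions(packages):
    """Return 'name==version' (or 'name: NOT INSTALLED') for each package."""
    lines = []
    for pkg in packages:
        try:
            lines.append(f"{pkg}=={metadata.version(pkg)}")
        except metadata.PackageNotFoundError:
            lines.append(f"{pkg}: NOT INSTALLED")
    return lines

print("\n".join(report_versions(["azure-core", "adlfs", "fsspec", "pandas"])))
```

Comparing this output between a working and a failing session (or between the starter pool and the custom pool) would show whether a package silently changed or disappeared.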

 

 

 

1 ACCEPTED SOLUTION

Hi, 

I have been troubleshooting this issue today, as I experienced it while running my custom packages in my custom Fabric environment. I ended up creating a new environment and testing in a notebook to see whether the issue persisted; miraculously, this resolved it. All the built-in libraries were the same, and my custom packages also work in the new environment.


9 REPLIES
v-saisrao-msft
Community Support

Hi @dbeavon3,

 

I would like to confirm if the issue has been resolved on your end. If so, kindly mark the helpful reply and accept it as the solution. This will assist other community members in resolving similar issues more quickly.

 

Thank you

v-saisrao-msft
Community Support

Hi @dbeavon3,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems resolve them faster.

 

Thank you.

v-saisrao-msft
Community Support

Hi @dbeavon3,

Thank you for reaching out to the Microsoft Forum Community.

 

We greatly appreciate your efforts and are glad to hear that your issue was resolved by updating the version of the azure-core library in Fabric.

Please accept your response as the solution, so that other community members can resolve similar issues more efficiently.

 

Thank you.

Hi @v-saisrao-msft 

Are you a Python guy? I think you would agree that customers shouldn't be tinkering with the default version of azure-core in the Fabric runtime.

Do you know where we can find out about sudden breaking changes like this one that would impact the reliability of our scripts?

Hi @dbeavon3,

 

As a member of the Microsoft Fabric Community Support team, I can refer you to the following links, which may help address your query regarding sudden breaking changes that could affect the reliability of your scripts.


Troubleshoot Python function apps in Azure Functions | Microsoft Learn 
Issues · Azure/azure-functions-python-worker

 
Additionally, please review the Idea Forum, where you can contribute your own ideas and suggestions on any topic. These will be reviewed by Microsoft and may be implemented in future updates.

 

Thank you.

 

 


dbeavon3
Memorable Member

Sounds like the fix is to update the version of the azure-core library in Fabric.  Odd that customers would be doing this independently of Microsoft's direction.

I hope it is still a supported configuration.  We don't want to risk even more problems, and get even less support, for our customized environment.
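For reference, if a team did decide to pin azure-core in a custom Fabric environment, public libraries are typically declared in the environment's YAML. This is a hypothetical sketch only; the exact schema and a safe version number should be confirmed against Microsoft's documentation for the runtime in use:

```yaml
# Hypothetical sketch only -- confirm the schema and the version against
# the Fabric runtime documentation before applying.
dependencies:
  - pip:
      # Pin to the azure-core version validated for your runtime:
      - azure-core==<validated version>
```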

Did you find a solution to this? I have just hit the exact same error, involving the same package, notebookutils. Something must be happening with updates, since we are both getting the same error within the same week.
