peterott
New Member

Suppressing logging...

I have several Fabric notebooks that pull data from various APIs. I recently started logging to Log Analytics and quickly realized that the platform was generating millions of records per night. I would like to suppress all but my custom logs. How can this be achieved?

6 REPLIES
v-achippa
Community Support

Hi @peterott,

 

Thank you for reaching out to Microsoft Fabric Community.

 

Thank you @nielsvdc and @tayloramy for the prompt responses.

 

As we haven’t heard back from you, we wanted to kindly follow up to check whether the solutions provided by the other users worked for the issue. Let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

Apologies - I was on holiday in Italy for the past 10 days.  I will follow up later today, after I've had a chance to digest all of this.

 

Hi @peterott,

 

As we haven’t heard back from you, we wanted to kindly follow up to check whether you’ve had a chance to look into this. Is your issue resolved?

 

Thanks and regards,

Anjan Kumar Chippa

Hi @peterott,

 

We wanted to kindly follow up to check whether the solutions provided by the other users worked for the issue. Let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

tayloramy
Community Champion

Hi @peterott

 

My org saw the same thing. We decided not to use Log Analytics and instead use the admin REST APIs to capture audit data.

 

Specifically, we call the GetActivityEvents endpoint nightly to pull audit data into a SQL database.
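For reference, a minimal sketch of that nightly pull, assuming you already have an Azure AD access token with Power BI admin permissions. The GetActivityEvents endpoint URL and the `continuationUri`/`activityEventEntities` response fields are real; the `fetch_activity_events` helper and the `get_json` callable are illustrative names, not part of the API:

```python
from datetime import datetime, timezone

# Power BI admin REST endpoint for activity (audit) events.
ACTIVITY_EVENTS_URL = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def fetch_activity_events(get_json, start, end):
    """Collect all activity events in [start, end], following continuation pages.

    get_json: callable(url) -> parsed JSON dict, e.g. a requests wrapper that
    adds an "Authorization: Bearer <token>" header.
    """
    # The API expects the datetimes single-quoted inside the query string.
    url = (f"{ACTIVITY_EVENTS_URL}?startDateTime='{start.isoformat()}'"
           f"&endDateTime='{end.isoformat()}'")
    events = []
    while url:
        page = get_json(url)
        events.extend(page.get("activityEventEntities", []))
        url = page.get("continuationUri")  # absent/None once the last page is reached
    return events
```

Note that, per the API docs, the start and end times must fall within the same UTC day, so a nightly job typically pulls the previous day's window and appends it to the database.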

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

nielsvdc
Solution Sage

Hi @peterott

 

When you enable Log Analytics integration for Fabric notebooks, the Spark diagnostic emitter sends all categories by default:

  • Log (driver logs)
  • EventLog (Spark events)
  • Metrics (runtime metrics)

These include verbose platform logs for every Spark job, executor, and metric—not just your custom messages. That’s why ingestion explodes overnight.
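One way to cut that platform noise at the source, rather than in notebook code, is to narrow the emitter itself. Based on the Synapse-style diagnostic emitter properties that the Fabric integration uses, something like the following in the environment's Spark properties limits output to the Log category and, optionally, to a single logger name. The emitter name LA and the filter pattern are illustrative, and the placeholders must stay placeholders, so verify the exact property names against the current Fabric documentation:

```properties
# Emit only driver logs (drop EventLog and Metrics categories).
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.categories: "Log"
spark.synapse.diagnostic.emitter.LA.workspaceId: <log-analytics-workspace-id>
spark.synapse.diagnostic.emitter.LA.secret: <log-analytics-workspace-key>

# Optional: further restrict the Log category to one named logger.
spark.synapse.diagnostic.emitter.LA.filter.loggerName.match: "NotebookLogger"
```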

 

You might want to have a look at Python logging in a notebook - Microsoft Fabric | Microsoft Learn for how to use the Python logging library. You can also use the example configuration code below. I added it to a generic utils notebook and include it (%run) in every notebook, so that logging is always consistent.

 

import logging
import logging.config  # dictConfig lives in the logging.config submodule

logging_config = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {
            "format": "%(asctime)s.%(msecs)03d [%(levelname)-8s] %(message)s",
            "datefmt": "%Y-%m-%d %H:%M:%S",
        }
    },
    "handlers": {
        "stdout": {
            "class": "logging.StreamHandler",
            "formatter": "simple",
            "stream": "ext://sys.stdout",
        }
    },
    "loggers": {
        "NotebookLogger": {
            "level": "DEBUG",
            "handlers": ["stdout"],
            "propagate": False,
        },
    },
}

# Apply the configuration first, then fetch the named logger.
logging.config.dictConfig(config=logging_config)
logger = logging.getLogger("NotebookLogger")
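Once that configuration has run, route all custom messages through the named logger rather than print, so they stay distinguishable from platform output. A short usage sketch (the message contents are made up):

```python
import logging

# Assumes the "NotebookLogger" configuration above has already run;
# without it, messages fall back to Python's last-resort stderr handler.
logger = logging.getLogger("NotebookLogger")

logger.info("Starting nightly pull for %d APIs", 4)   # illustrative message
try:
    raise TimeoutError("API did not respond")          # simulated failure
except TimeoutError:
    # logger.exception logs at ERROR level and appends the traceback
    logger.exception("Pull failed; will retry on next run")
```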

 

Hope this helps. If so, please give a Kudos 👍 and mark as Accepted Solution ✔️.
