
DennesTorres
Impactful Individual

Notebook Logs

Hi,

 

We can link a workspace with Log Analytics; I did this with mine.

 

How can I, from a notebook, generate a log entry that is recorded in the Log Analytics workspace linked to the Fabric workspace?

 

Kind Regards,

 

Dennes

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @DennesTorres 
Thanks for using Fabric Community.
You can add a code cell in your notebook as follows:

// Test writing log4j messages from a Spark notebook (Scala)
// https://github.com/mspnp/spark-monitoring/issues/28
import org.apache.log4j.LogManager

// The prefix makes these messages easy to find when querying the logs
val LOGGER_PREFIX = "[Gluten]"
val logger = LogManager.getLogger(LOGGER_PREFIX + " " + "com.contoso.LoggerExample")

logger.info("Hello, info message")
logger.warn("Hello, warn message")
logger.error("Hello, error message")

 

[Screenshot: notebook cell output]

You can validate the log here:

[Screenshot: log entries in Log Analytics]
Hope this helps. Please let me know if you have any further queries.


9 REPLIES
vgiatti
Frequent Visitor

@DennesTorres is the real king as always.

The only real pain is that (as of today) the tables are ingested into Azure Log Analytics as "Custom Table (classic)", and those tables cannot be exported to an Azure Event Hub, which is the only practical real-time source for a Fabric Eventstream pipeline.

Hi,

I never managed to make the code explained here work, but support for a Log Analytics emitter was added to Fabric some time ago. I made a video about it: https://www.youtube.com/watch?v=F8w59GYDEBA&list=PLNbt9tnNIlQ5TB-itSbSdYd55-2F1iuMK&index=17&t=15s
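Roughly, the emitter is switched on through Spark properties (for example in an Environment item). Here is a minimal sketch, assuming the Synapse-style property names for the Log Analytics connector also apply in Fabric; the names are quoted from memory and may differ by runtime version, and the IDs/keys are placeholders, so please verify against the current docs and the video:

# Spark properties enabling the built-in Log Analytics emitter
# (property names assumed from the Synapse/Fabric docs; verify before use)
spark.synapse.logAnalytics.enabled: true
spark.synapse.logAnalytics.workspaceId: <log-analytics-workspace-id>
spark.synapse.logAnalytics.secret: <log-analytics-workspace-key>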

Kind Regards,

Dennes

Your solution is gold, but it still creates Custom Tables in Azure Log Analytics, and as of today that kind of table cannot be used to feed Azure Event Hubs. Therefore, it is not possible to stream those logs from Azure Log Analytics into a Fabric Lakehouse or Eventhouse.

A partial workaround is to set the log retention period in Azure Log Analytics to 2 years and query the logs with KQL from Power BI or directly in ALA.
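As an illustration of that workaround, here is a minimal Python sketch that runs a time-bounded KQL query against the Log Analytics workspace using the azure-monitor-query SDK. The custom table name and the workspace ID are placeholders (the actual table depends on how the logs are emitted), and bounding the timespan avoids re-reading the full history on every refresh:

from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Authenticate and create the Log Analytics query client
client = LogsQueryClient(DefaultAzureCredential())

# Hypothetical custom table name; replace it with the table your logs land in
query = """
SparkLoggingEvent_CL
| where Message has "[Gluten]"
| project TimeGenerated, Level, Message
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",  # placeholder
    query=query,
    timespan=timedelta(hours=1),  # only the last hour, not the full history
)

for table in response.tables:
    for row in table.rows:
        print(row)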

Up to this point everything is fine. The real pain is that the custom tables generated in Log Analytics cannot be exported to Azure Event Hubs, and consequently cannot feed a Fabric Eventstream pipeline.

See: Log Analytics workspace data export in Azure Monitor - Azure Monitor | Microsoft Learn

Because of this, if I want to consume the custom log data in Power BI I have to continuously refresh the semantic model to retrieve fresh data, and every time you click refresh it queries ALL of the historical logs in Azure Log Analytics.

 

Hi @Anonymous,

 

This example seems to be made for Databricks; I couldn't manage to make it work in Fabric.

The first problem is the import: this library doesn't exist, and pip install doesn't work for it either. Is there another way to make this import work?

I tried to work around the import in two different ways:

First attempt:

logger = spark.sparkContext._jvm.org.apache.log4j.LogManager.getLogger(LOGGER_PREFIX + " " + "com.contoso.LoggerExample")
 
Second Attempt: 
logger = spark.sparkContext._jvm.org.apache.log4j.LogManager.getRootLogger()
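For reference, the two attempts combine into a minimal PySpark sketch like the one below (the prefix and logger name are just the ones from the example above; whether the messages actually reach the linked Log Analytics workspace depends on the runtime's log4j configuration):

# Get a log4j logger through the JVM gateway of the notebook's Spark session
# (assumes the predefined `spark` session of a Fabric notebook)
LOGGER_PREFIX = "[Gluten]"
log4j = spark.sparkContext._jvm.org.apache.log4j
logger = log4j.LogManager.getLogger(LOGGER_PREFIX + " com.contoso.LoggerExample")

logger.info("Hello, info message from PySpark")
logger.warn("Hello, warn message from PySpark")
logger.error("Hello, error message from PySpark")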
 
The workspace is linked to Log Analytics. The table you showed in Log Analytics doesn't exist in mine. I tried many other tables related to Power BI or Spark, but none of them registered the log messages.

What else am I missing?

Thank you in advance!

Kind Regards,

Dennes



Hello, I am trying the same, did it work for you? Thanks!

 

Thank you, this seems great!

Anonymous
Not applicable

Hi @DennesTorres 
We haven't heard back from you on the last response and were just checking to see whether your query got resolved. If not, please reply with more details and we will try to help.
Thanks
