Hi,
We can link a Fabric workspace with a Log Analytics workspace, and I did so with mine.
How can I, from a notebook, generate a log entry that gets recorded in the Log Analytics workspace linked to the Fabric workspace?
Kind Regards,
Dennes
Hi @DennesTorres
Thanks for using Fabric Community.
You can add a Scala code cell in your notebook as follows:
// Test log4j logging from the notebook; an MDC sketch follows below
// https://github.com/mspnp/spark-monitoring/issues/28
import org.apache.log4j.LogManager

// The prefix makes these lines easy to spot in the collected logs
val LOGGER_PREFIX = "[Gluten]"
val logger = LogManager.getLogger(LOGGER_PREFIX + " com.contoso.LoggerExample")

// Emit one message per severity level
logger.info("Hello, info message")
logger.warn("Hello, warn message")
logger.error("Hello, error message")
You can then validate the log entries in the Log Analytics workspace linked to the Fabric workspace.
Hope this helps. Please let me know if you have any further queries.
@DennesTorres is the real king as always.
The only real pain is that (as of today) the tables are ingested into Azure Log Analytics as "Custom Table (classic)", and those tables cannot be exported to Azure Event Hubs, which is the only real-time source that makes sense for a Fabric Eventstream pipeline.
Hi,
I never managed to make the code explained here work. But support for a Log Analytics emitter was added to Fabric some time ago (a configuration sketch follows below this post). I made a video about it: https://www.youtube.com/watch?v=F8w59GYDEBA&list=PLNbt9tnNIlQ5TB-itSbSdYd55-2F1iuMK&index=17&t=15s
Kind Regards,
Dennes
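For reference, the emitter Dennes mentions is configured through Spark properties on a Fabric environment rather than from notebook code. A minimal sketch follows, assuming the property names from the Synapse/Fabric Azure Log Analytics emitter documentation; the emitter name "LA" and the placeholder values are illustrative:
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
spark.synapse.diagnostic.emitter.LA.workspaceId: <your Log Analytics workspace ID>
spark.synapse.diagnostic.emitter.LA.secret: <your Log Analytics workspace key>
With the environment attached to the notebook, driver and executor logs should then flow to the workspace without any logging code in the notebook itself.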
Your solution is gold, but it still creates custom tables in Azure Log Analytics, and as of today that kind of table cannot be used to feed Azure Event Hubs. Therefore, it is not possible to stream those logs from Azure Log Analytics into a Fabric Lakehouse or Eventhouse.
A partial workaround is to set the log retention period in Azure Log Analytics to 2 years and query the logs with KQL from Power BI or directly in Azure Log Analytics.
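As an illustration, a KQL query along the following lines keeps each refresh to a recent time window instead of scanning the full history, which also softens the refresh problem described later in the thread. The table name is hypothetical: classic custom tables carry a _CL suffix, and the exact table and column names depend on the emitter:
// Hypothetical classic custom table; adjust to your emitter's output
SparkLoggingEvent_CL
| where TimeGenerated > ago(1d)       // only the last day, not the full history
| where Message contains "[Gluten]"   // match the logger prefix used above
| project TimeGenerated, Message
| order by TimeGenerated desc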
The code posted above works fine up to that point. The real pain is that the custom tables generated in Log Analytics cannot be exported to Azure Event Hubs, and consequently cannot be consumed in a Fabric Eventstream pipeline either.
See: Log Analytics workspace data export in Azure Monitor - Azure Monitor | Microsoft Learn
Because of this, if I want to consume the custom log data in Power BI I have to continuously refresh the semantic model to retrieve fresh data, and every time you click refresh it queries ALL of the historical logs in Azure Log Analytics.
Hi @Anonymous,
This example seems to be made for Databricks; I couldn't manage to make it work in Fabric.
The first problem is the import: this library doesn't exist, and pip install doesn't work for it either. Is there another way to make this import work?
I tried to work around the import in two different ways:
First attempt:
Hello, I am trying the same thing; did it work for you? Thanks!
Thank you, this seems great!
Hi @DennesTorres
We haven't heard from you since the last response and wanted to check whether your query has been resolved. If not, please reply with more details and we will try to help.
Thanks