I see the OpenLineage libraries are included by default as built-in libraries in Fabric Spark. When a notebook reads from and writes to OneLake, does it emit lineage events automatically? According to Copilot it does, and lineage visualization in Purview is optional. Where are those events stored? I see a SparkLineage folder in OneLake, but it is always empty. I am not able to find clear documentation on this topic. Any comments are appreciated. Thank you.
Hi @RenatoDM
The `SparkLineage` folder in OneLake is not populated by default. Its presence suggests compatibility with OpenLineage standards, but explicit configuration is required.
To emit granular OpenLineage events (e.g., column-level lineage), you must:
• Implement a SparkListener to intercept Spark execution plans.
• Configure diagnostic emitters to route logs to Azure Storage or Log Analytics.
Native Purview integration captures basic item-level lineage (e.g., notebook → Lakehouse table) but doesn't populate `SparkLineage`.
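As a rough sketch of what that explicit configuration can look like: OpenLineage's Spark integration is typically enabled through Spark properties. The listener class name below is the one documented by the OpenLineage project; the transport URL and namespace are placeholders you would replace with your own collector endpoint, and whether these properties can be set in your Fabric environment (e.g., via environment Spark settings) is an assumption to verify.

```
# Attach the OpenLineage listener to the Spark session
spark.extraListeners=io.openlineage.spark.agent.OpenLineageSparkListener

# Route lineage events over HTTP to a collector of your choice
# (placeholder URL -- point this at your own endpoint, e.g. Marquez)
spark.openlineage.transport.type=http
spark.openlineage.transport.url=https://your-lineage-endpoint.example.com

# Logical namespace used to group the emitted lineage events (placeholder)
spark.openlineage.namespace=my-fabric-workspace
```

Note that property names vary between OpenLineage versions (older releases used `spark.openlineage.host`/`spark.openlineage.url`), so check the version bundled with your Spark runtime before relying on these exact keys.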