Hi,
I am working on a Medallion Architecture in Fabric.
I have Bronze and Silver layers in a Lakehouse (Bronze: data loaded from my SQL Server using a Pipeline; Silver: enriched zone).
For the Lakehouse, I used a Notebook to transform the data and saved it as Delta tables. (One approach I'm considering is automating this with Notebooks based on timestamps, so I can load incrementally and call the notebooks.)
For my Gold layer, I would prefer a Warehouse.
Is there a way to connect notebooks to the Warehouse and automate the process from a pipeline? My flow is: get the fact and dimension tables from the Silver layer, save them as views, and automatically update the Warehouse tables when the pipeline runs.
Any suggestions on this would be highly helpful!
Thanks in advance.
Well, with Notebooks you're working in the Spark environment, which is outside the scope of the Warehouse: you can read from Warehouse Delta tables, but you can't (currently) write to them from Spark. You can, however, use pyodbc or JDBC in a notebook to connect to the Warehouse SQL endpoint and issue SQL commands.
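To make that concrete, here is a minimal sketch of the pyodbc route from a notebook. The endpoint, database, and procedure names are placeholders, not real values: copy the actual SQL connection string from your Warehouse's settings in the Fabric portal, and pick the Entra ID authentication mode that fits your setup (interactive here; a service principal is more typical for automated runs).

```python
def build_connection_string(server: str, database: str) -> str:
    """Build an ODBC connection string for a Fabric Warehouse SQL endpoint,
    authenticating with Microsoft Entra ID. ActiveDirectoryInteractive is
    one option; swap in service-principal auth for unattended pipelines."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Encrypt=yes;"
        "Authentication=ActiveDirectoryInteractive;"
    )

# Placeholder values -- replace with your Warehouse's real endpoint and name.
conn_str = build_connection_string(
    "your-endpoint.datawarehouse.fabric.microsoft.com",
    "YourWarehouse",
)

# Uncomment to run against a real endpoint from a notebook:
# import pyodbc
# with pyodbc.connect(conn_str) as conn:
#     with conn.cursor() as cur:
#         cur.execute("EXEC dbo.usp_load_gold;")  # hypothetical procedure name
#         conn.commit()
```

The notebook itself can then be called from a Data Pipeline notebook activity, which gives you the automation you're after.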
My approach has been to load the Warehouse using SQL stored procedures triggered from a Data Pipeline, if that helps.
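As a sketch of that stored-procedure pattern (all schema, table, and procedure names below are hypothetical): the T-SQL would live in a Warehouse stored procedure, which a Data Pipeline "Stored procedure" activity calls on a schedule. Since a Fabric Warehouse can query a Lakehouse's SQL analytics endpoint cross-database, the Silver tables can be read directly; a delete-then-insert upsert keyed on an `updated_at` column gives a simple incremental load.

```python
# T-SQL kept as a string so it can be deployed from a notebook via pyodbc,
# or pasted into the Warehouse directly. Names are illustrative only.
GOLD_LOAD_SQL = """
CREATE OR ALTER PROCEDURE dbo.usp_load_gold_dim_customer
AS
BEGIN
    -- Remove Gold rows that are stale relative to Silver.
    -- [SilverLakehouse].dbo.customer is a cross-database reference to the
    -- Lakehouse SQL analytics endpoint (read-only from the Warehouse side).
    DELETE g
    FROM dbo.dim_customer AS g
    INNER JOIN [SilverLakehouse].dbo.customer AS s
        ON g.customer_id = s.customer_id
    WHERE s.updated_at > g.updated_at;

    -- Insert rows missing from Gold (new rows, plus the ones just deleted).
    INSERT INTO dbo.dim_customer (customer_id, name, updated_at)
    SELECT s.customer_id, s.name, s.updated_at
    FROM [SilverLakehouse].dbo.customer AS s
    LEFT JOIN dbo.dim_customer AS g
        ON s.customer_id = g.customer_id
    WHERE g.customer_id IS NULL;
END
"""
```

The pipeline then only needs a Stored procedure activity pointing at `dbo.usp_load_gold_dim_customer`; no notebook is required for the Gold load itself.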
How are you populating your Warehouse?
Hi @AndyDDC ,
Thanks for your reply!
I have my Bronze and Silver tables in a Lakehouse, and would like a separate Warehouse for the Gold layer.
For the Silver layer, I used a Notebook (which I can automate with a Pipeline).
I would like to know how I can use my Warehouse for the Gold layer and automate it, if possible, based on incremental loads. (For normal table creation, we can use T-SQL in the Warehouse to build the Gold-layer tables.)
Is there a way to connect Notebooks to the Warehouse so that loading can be automated based on incremental data?
Would Notebooks be useful for the Warehouse? I'd like to know more about that.
Any other approach to using a Warehouse as the Gold layer is also fine for me.
Thanks in advance!