I'm looking for best practices on how to track and deploy changes to Lakehouse managed tables and Lakehouse SQL views between workspaces.
How do you handle this dynamically and with low maintenance effort?
Situation:
In Fabric Warehouses and Databases, the Git integration tracks all database objects such as tables, views, procedures, etc. With this Git integration we can use Fabric deployment pipelines to transfer changes from DEV (Workspace A) to PROD (Workspace B).
In Lakehouses, we are missing this object-level tracking in Git, as stated here: Lakehouse deployment pipelines and git integration - Microsoft Fabric | Microsoft Learn
Important
Only the Lakehouse container artifact is tracked in git in the current experience. Tables (Delta and non-Delta) and Folders in the Files section aren't tracked and versioned in git.
Solution Approaches:
Does anyone have other options and/or practical experience with what works best, or maybe even a working notebook solution?
I can't be the first one with this requirement?! Spoiler: my background is SQL data warehousing, not lakehousing, so maybe my warehousing approach needs to be adapted to the Lakehouse world. We are currently implementing a lakehouse solution with hundreds of Lakehouse tables and views. The solution relies on quite extensive SQL endpoint views (much of the transformation logic lives in views; these views act as input for a notebook MERGE into silver-layer Lakehouse tables). Tracking all these tables/views with a manual notebook is not really feasible and is time-intensive.
Side note: we moved to the Lakehouse approach in the hope that it would be more capacity-efficient than our previous approach of building a DWH on Fabric Database, because we found out that the database consumption was eating our complete F64 capacity on data loads. But now the challenge is how to apply a working CI/CD concept to the Lakehouse.
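For context, here is a minimal sketch of the MERGE pattern described above, assuming the view output has already been landed as a Spark DataFrame (for example via a staging table or a JDBC/ODBC read of the SQL analytics endpoint); the table and column names are hypothetical, not the actual solution:

```python
# Minimal sketch: merge transformed source data into a silver-layer Delta table.
# Assumptions: a default Lakehouse is attached to the notebook, the silver table
# already exists, and the source table/column names below are hypothetical.
from delta.tables import DeltaTable

# However the view output is obtained in your setup (staging table, JDBC read of
# the SQL analytics endpoint, ...), it ends up as a DataFrame here.
source_df = spark.read.table("staging_customer_changes")   # hypothetical source

silver = DeltaTable.forName(spark, "silver_customer")       # hypothetical target

(
    silver.alias("t")
    .merge(source_df.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```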
Hi @jochenj ,
Thanks for reaching out to the Microsoft fabric community forum.
Consider this as a workaround and try it once: create a notebook in your development workspace that automatically extracts metadata for all Lakehouse managed tables and SQL endpoint views.
You can serialize the output (e.g., as .json or .sql files) and store it in the Files section of your Lakehouse or in a dedicated folder within OneLake. This makes the metadata portable and ready for deployment to other environments.
If I misunderstood your needs or you still have problems with it, please feel free to let us know.
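A minimal sketch of such an extraction notebook, assuming a default Lakehouse is attached (so its Files section is mounted at /lakehouse/default/Files) and that Spark-managed Delta tables are the target; the output path is an example, not a required location:

```python
# Sketch: dump the DDL of all tables visible to Spark in the attached Lakehouse
# into JSON files under the Files section, so the metadata can be versioned and
# replayed in another workspace.
# Assumptions: default Lakehouse attached; output folder name is an example.
import json
import os

output_dir = "/lakehouse/default/Files/ddl_export"   # lands in the Lakehouse Files section
os.makedirs(output_dir, exist_ok=True)

for tbl in spark.catalog.listTables():                # managed/external tables in the current schema
    ddl = spark.sql(f"SHOW CREATE TABLE {tbl.name}").collect()[0][0]
    with open(f"{output_dir}/{tbl.name}.json", "w") as f:
        json.dump({"table": tbl.name, "ddl": ddl}, f, indent=2)

# Note: SQL analytics endpoint views are not visible to Spark; extracting them
# would need a separate query such as SELECT * FROM INFORMATION_SCHEMA.VIEWS run
# against the endpoint (e.g. via pyodbc), which is outside this sketch.
```

In the target workspace, a companion notebook could read these files back and execute the stored DDL (adjusting any workspace-specific names or paths) to recreate the objects there.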
Best Regards,
Menaka.
Community Support Team
Hi @jochenj ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
Hi @jochenj ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution so that other community members can find it easily.
Thank you.