
Shawn_Eary
Advocate V

Version Control for Lakehouse File Uploads

When I upload files into an MS Fabric-backed lakehouse, I don't think any version control is used:
Manually Upload Large CSV files to a Microsoft Fabric Lakehouse - YouTube
https://www.youtube.com/watch?v=Ln4mpuknuco

I'm worried that someday, out of confusion, someone will accidentally replace one of my uploaded CSV files with a corrupt or blank version. With Git or SharePoint, when a user checks in a corrupt file over a good version of the same file, you have a way to revert to the previous good version, but I don't see any way to do that with MS Fabric.

How do I configure my MS Fabric Lakehouse to create a new version of hello_world.csv each time it is uploaded?

Example: If someone uploads a file named hello_world.csv into my lakehouse 7 times, then I want a repo to save all 7 versions, with the latest version staying on top until I invoke a Git or SharePoint command to revert to an older one.
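Fabric doesn't do this out of the box, but to make the desired behavior concrete, here is a minimal local sketch of that versioning scheme (the `_versions` folder layout and the function names are hypothetical illustrations, not Fabric APIs): each upload archives the existing copy before overwriting, so the latest version stays on top and older ones remain recoverable.

```python
import shutil
from pathlib import Path

def versioned_upload(src: Path, dest_dir: Path) -> Path:
    """Copy src into dest_dir, archiving any existing copy first.

    Prior copies are kept as _versions/<name>.v<N>, so the latest
    version always "stays on top" at dest_dir/<name>.
    (Hypothetical local sketch, not a Fabric feature.)
    """
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    versions = dest_dir / "_versions"
    if dest.exists():
        versions.mkdir(exist_ok=True)
        n = len(list(versions.glob(f"{src.name}.v*"))) + 1
        shutil.copy2(dest, versions / f"{src.name}.v{n}")
    shutil.copy2(src, dest)
    return dest

def revert(name: str, dest_dir: Path, version: int) -> None:
    """Restore archived version N of `name` as the current copy."""
    shutil.copy2(dest_dir / "_versions" / f"{name}.v{version}",
                 dest_dir / name)
```

With 7 uploads of hello_world.csv, `_versions` would hold v1 through v6 (the first upload has nothing to archive) plus the current copy on top, and `revert("hello_world.csv", dest_dir, 3)` would bring back the third upload.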

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @Shawn_Eary ,
The internal team has updated me regarding version control in Fabric.

Git integration is for version control on code, not data files. OneLake isn't connected to Git integration; it is essentially ADLS Gen2.
You can go through this link for reference: Solved: Delta lake time travel in Fabric SQL endpoints? - Microsoft Fabric Community
If you want file versioning instead, you can follow the steps below:

1) You can connect to Blob Storage using the Dataflows or Data Factory connectors, or from a Python notebook using the Storage APIs directly.

2) You can use Git to version your files: commit them to a repo, then connect to the repo from a Python notebook.

Hope this helps. Please let us know if you have any further queries.
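The Git approach in step 2 can be sketched from a notebook with plain subprocess calls (the repo path and helper names here are hypothetical local examples; in practice you would clone a remote repo the notebook can reach):

```python
import subprocess
from pathlib import Path

def git(repo: Path, *args: str) -> str:
    """Run a git command inside `repo` and return its stdout."""
    out = subprocess.run(["git", "-C", str(repo), *args],
                         check=True, capture_output=True, text=True)
    return out.stdout.strip()

def commit_csv(repo: Path, name: str, content: str) -> None:
    """Write (or overwrite) a CSV in the repo and commit it,
    so every upload becomes a revertible Git revision."""
    (repo / name).write_text(content)
    git(repo, "add", name)
    git(repo, "commit", "-m", f"upload {name}")
```

Each commit is then a recoverable version: `git log -- hello_world.csv` lists them, and `git checkout <sha> -- hello_world.csv` restores an older one.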


4 REPLIES
Anonymous
Not applicable

Hi @Shawn_Eary ,

Thanks for using the Fabric community and reporting this.

I have reached out to the internal team for help on this. I will update you once I hear from them.

Appreciate your patience.


Hi,

Do you have any steps or examples showing how to implement what you suggested?

Anonymous
Not applicable

Hi @Shawn_Eary ,
Glad to hear you were able to reach a resolution. We hope you keep using this forum and encourage others to do the same.

Thanks
Nikhila N
