Is there a way to lock individual files or directories on a fork when using Git?

We are a team of 60+ developers working on the same product and are moving from SVN to Git and GitHub. We have a process in SVN wherein individual files are locked, and whenever a developer wants to commit code, they need to get the file unlocked by its owner. Three of us own the 150+ files in total. The unlocking is preceded by a code review.

On GitHub, we are planning to use a fork-clone model: each project a group of developers works on gets its own fork, each developer clones that fork, writes code and commits to origin, and the feature lead raises a pull request to upstream.
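As a rough sketch of that setup (the repository URLs below are placeholders, not our actual repos):

$ git clone https://github.com/team/project-fork.git
$ cd project-fork
$ git remote add upstream https://github.com/company/project.git
$ git fetch upstream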

Though this seems fine, the problem is that when a big project gets delivered it brings in a large volume of changes for review, which increases the load on the file owners. This also tends to happen in the later cycles of development, so the project itself can be jeopardized.

One method we thought might work is to run hooks when git push is made to the origin (the fork), leaving one final review when the pull request goes to upstream.

However, we could not find any GitHub extensions or push hooks for this. Is there a quick way (read: an existing extension) to do this with GitHub, or should we use the same hooks that we would use with plain Git?


This use case is one of the reasons Git is so much better than SVN: rebase! If you follow a good Git workflow, you rebase from upstream before submitting your pull requests. You don't need to worry about file locking, stomping on another person's commits, merge conflicts, etc. A rebase sets your work aside, applies the remote commits, and then replays your work on top.
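As a rough sketch (the remote and branch names are placeholders for whatever your repos actually use):

$ git fetch upstream
$ git rebase upstream/master
$ git push --force-with-lease origin my-feature

The --force-with-lease push is needed because rebasing rewrites the feature branch's history.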

I think this just takes rethinking your process and relying on the strengths of Git rather than force-fitting a Subversion workflow on top of it. Your fork-clone model might need another look as well. Most often every developer has their own fork, and you can share repos via remotes between teams if you want; but contributors sharing the same origin sets up some bad habits.
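For example, to review a teammate's work without sharing the same origin, you can add their fork as an extra remote (the URL and branch names here are made up):

$ git remote add alice https://github.com/alice/project.git
$ git fetch alice
$ git checkout -b review-feature-x alice/feature-x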

Gitflow is a very popular Git workflow, and GitHub themselves have some nice tips and share their own workflow.
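A minimal sketch of a Gitflow-style feature branch using plain Git commands (branch names follow the usual develop/feature/* convention and are only illustrative):

$ git checkout develop
$ git checkout -b feature/invoice-export
# ...commit your work on the feature branch...
$ git checkout develop
$ git merge --no-ff feature/invoice-export
$ git branch -d feature/invoice-export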

No chance. If a file is not mergeable and you need to lock it, use a centralized solution instead of Git, e.g. SVN or ClearCase.

Git does not provide any locking functionality, since it is decentralized. However, if you host your code on GitLab Enterprise Edition Premium, you can use the web interface to lock individual files or folders, achieving exactly what you want to do.

If you do not want to host your project on someone else's server (their website), you can also download GitLab and host it on your own webserver.

If you are using Git LFS (which is supported by some Git hosting providers, like GitHub), you can use File Locking.

Mark a file type as lockable by editing the .gitattributes file:

# Make MS Word files lockable
*.docx lockable
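Alternatively, git lfs track can write that entry for you; the --lockable flag should be available in recent LFS versions, but verify it against yours:

$ git lfs track "*.docx" --lockable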

And lock it with:

$ git lfs lock example.docx

You can unlock your files with git lfs unlock example.docx and those of somebody else by adding --force.
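To see which files are currently locked and by whom, list the active locks on the remote:

$ git lfs locks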

Not exactly locking, but GitHub has introduced a concept called "Code Owners". It allows you to restrict parts of your codebase so that changes touching them are only merged after review by the designated code owners.
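For example, a CODEOWNERS file in the repository root (or in .github/) maps path patterns to required reviewers; the paths and usernames below are invented for illustration:

# These owners are requested for review on matching changes
*.sql       @db-owner
/core/      @alice @bob
docs/*      @docs-team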

This is possible. git-lfs 2.0 introduced the ability to lock files; see https://github.com/git-lfs/git-lfs/wiki/File-Locking. Support for this feature is available starting from TFS 2017.2: https://learn.microsoft.com/en-us/vsts/release-notes/.

You can use LFS to lock individual files, or instead just mark the files as lockable in the .gitattributes file:

https://github.com/git-lfs/git-lfs/wiki/File-Locking

For reference:

Git is a distributed version control tool, so centrally locking files goes against its design philosophy, and it is not possible out of the box.

However, there are ways by which we can achieve the same result.

One is using git-lfs, which is meant for large file storage. But this requires you to migrate the Git repo: installing git-lfs alters the repo, and uninstalling it requires migrating again, as stated in its documentation. This is already explained more clearly in other answers.
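As a sketch of the migration step this refers to (the file pattern is just an example; check the flags against your git-lfs version):

$ git lfs migrate import --include="*.xml"
$ git lfs migrate export --include="*.xml"

import moves matching files into LFS storage; export moves them back out if you later stop using LFS.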

The other solution is to use Git itself to achieve the same result, for example using git pull's upload-pack to trigger a (manually written) command on the server that handles file locking in some way, and adding (pre|post)-receive hooks to reject changes pushed by anyone other than whoever locked the file initially. You could also alter the git-upload-pack command itself to warn when locked files are touched at pull time or on a repo update, but I don't recommend that, as it will make upgrading Git difficult. Check out this Git repo of mine, where I achieve this using git pull's upload-pack.
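A minimal sketch of the kind of pre-receive hook described above, assuming the server keeps locks in a plain-text file of "path user" lines and that the pushing user's name is exposed in an environment variable you control (both are assumptions of this sketch, not Git features):

#!/bin/sh
# pre-receive hook: reject pushes that touch files locked by someone else.
# Lock file format (assumed): one "path user" pair per line.
LOCKS_FILE=/srv/git/locks.txt
PUSHER="${PUSH_USER:-$USER}"   # how the pusher is identified depends on your setup

while read oldrev newrev refname; do
    # Skip branch creation/deletion to keep the sketch simple.
    zero=0000000000000000000000000000000000000000
    [ "$oldrev" = "$zero" ] && continue
    [ "$newrev" = "$zero" ] && continue

    for f in $(git diff --name-only "$oldrev" "$newrev"); do
        owner=$(awk -v p="$f" '$1 == p { print $2 }' "$LOCKS_FILE" 2>/dev/null)
        if [ -n "$owner" ] && [ "$owner" != "$PUSHER" ]; then
            echo "error: $f is locked by $owner; push rejected" >&2
            exit 1
        fi
    done
done
exit 0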

For background: in our project we have import/export functionality that generates XMLs on export, and when multiple people push multiple XMLs it becomes extremely difficult to merge them manually, because the structure changes on every export. Even if two XMLs are technically the same, from the file's and Git's perspective they are always very different, so we used to coordinate in chat groups to stop working on an XML when someone else was already on it. Now, using git pull's upload-pack, we coordinate in the team only for alerting (and sometimes not at all) and let this script of mine handle the locks, so even someone who missed the alert can keep working without worry.