Sounds like a flawed workflow if this didn’t go through at least code review. Was it committed directly to master?
Curious to know what kind of system relies on hashes not changing? Technically the hashes don’t change, but a new set of commits is made. The history diverges, and you can still keep the old master around if you need it for some time, even cherry-pick patches to it…
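A rough sketch of what I mean (branch names are made up and the commit sha is a placeholder):

    # keep the current history around under a new name before rewriting
    git branch old-master master
    # ...rewrite master, then backport an urgent fix onto the frozen history
    git checkout old-master
    git cherry-pick <commit-sha>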
The guy who managed git at work before me didn’t quite have the knowledge I do, and he wasn’t using LFS. In one of the main repos a 200 MB binary was pushed 80+ times, and that’s not the only file it happened to. Even if you do a shallow clone, you eventually need to add the commit depth back. It’s a nightmare.
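Roughly the shallow-clone dance I mean (the URL is a placeholder and the depths are just illustrative):

    # grab only the most recent commit to avoid the huge history up front
    git clone --depth 1 https://example.com/big-repo.git
    cd big-repo
    # later, pull in more history when you actually need it
    git fetch --deepen=100
    # or give up and download everything
    git fetch --unshallow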
Special shout out to the person who committed a gigabyte memory dump a few years ago. Even with a shallow clone, it’s pretty darn slow now.
We can’t rewrite history to remove it since other things rely on the commit IDs not changing.
Oh well.
If you want to be the team’s hero, I’ve had good luck removing old commits using git filter-repo: https://hibbard.eu/erase-sensitive-files-git-history-filter-repo/
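For example (the size threshold and file path below are just placeholders, and filter-repo rewrites history, so everyone re-clones afterwards):

    # drop every blob over 100 MB from the entire history
    git filter-repo --strip-blobs-bigger-than 100M
    # or surgically remove one offending file everywhere it appears
    git filter-repo --path path/to/huge.bin --invert-paths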
Oh right. Oof.
I would be working to arrange an accident for those other things. Those other things probably need to be retired.
Why did you merge it?