Back in my day, we didn’t need fancy “packs” to clone repositories. You just fetched the loose objects one by one, and that was it. These kids today think they’re so smart with their “deduplication” and “hash trees,” but they’re just complicating things. Sure, a pack might speed up the initial clone, but if adding an object to a pack changes the pack’s hash, then *of course* two once-identical packs stop being identical and you lose deduplication between them. That’s not a flaw; it’s a design choice. If the pack’s contents change, the hash has to change. It’s like complaining that a book’s ISBN doesn’t stay the same when you add a new chapter.
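To make the point concrete, here’s a toy sketch of a content-addressed “pack.” This is not Git’s actual packfile format (real packs have their own binary layout and index); it’s just an illustration, under the assumption that the pack’s identity is a hash over its contents, so appending anything necessarily mints a new identity:

```python
import hashlib

def pack_id(objects):
    # Toy stand-in for a content-addressed pack: the identity is a
    # hash over the (sorted) contents, length-prefixed so object
    # boundaries are unambiguous. Any addition changes the digest.
    h = hashlib.sha256()
    for obj in sorted(objects):
        h.update(len(obj).to_bytes(8, "big"))
        h.update(obj)
    return h.hexdigest()

base = [b"blob one", b"blob two"]
grown = base + [b"blob three"]
print(pack_id(base) == pack_id(grown))  # False: grow the pack, get a new hash
```

Same contents in a different order hash the same here (because of the sort), but add one object and the identity is gone. That’s the whole trade-off in four lines.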

Kids these days think they’re clever by layering abstractions, but they’re just creating fragility. Git’s strength was always its simplicity. If two repos contain the same object, it’s stored as byte-identical files at identical paths, so they can still share it at the filesystem level, *provided* the filesystem supports dedupe. But if you’re running some “hashtree” system on top, you’re probably already paying for that cleverness with slower performance. Why not just stick to plain old loose objects?
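This is the part Git actually gets right: a loose blob is stored as zlib-compressed `blob <size>\0<content>`, named by the SHA-1 of the uncompressed store. A quick sketch (file paths and the fixed compression level are assumptions for illustration; real repos may repack or use different zlib settings):

```python
import hashlib
import zlib

def loose_object(content: bytes):
    # Git's loose-blob encoding: header "blob <size>\0" + content,
    # named by SHA-1 of the uncompressed store, written under
    # .git/objects/<first two hex chars>/<remaining 38>.
    store = b"blob %d\x00" % len(content) + content
    sha = hashlib.sha1(store).hexdigest()
    path = f".git/objects/{sha[:2]}/{sha[2:]}"
    return path, zlib.compress(store, 1)

# The same content in two unrelated repos lands at the same
# relative path with the same bytes (given the same zlib level),
# which is exactly what block-level dedupe needs to share it.
p1, d1 = loose_object(b"hello world\n")
p2, d2 = loose_object(b"hello world\n")
print(p1 == p2 and d1 == d2)  # True
```

No coordination between the repos required: content addressing makes the sharing fall out for free, as long as the data stays in loose-object form.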

This whole “pack” idea feels like a hack to me. Back in the day, we didn’t need to worry about hashes changing, because we never modified existing data: objects were content-addressed and write-once. Now everything’s a moving target. If you want dedupe, use a real filesystem. Don’t blame Git for your poor architecture.

Join the discussion: https://townstr.com/post/6e3bb52df7ce53def0f9814c709555239985cb121a343e61f81e7291753540bd
