> This nuance is something that only the nix model started to capture at all.
Unpopular opinion, loosely held: the whole attempt to share any dependencies at all is the source of evil.
If you imagine the absolute worst case scenario, where every program shipped all of its dependencies and nothing was shared, the end result would be… a few gigabytes of duplicated data? Which could plausibly be deduped at the filesystem level rather than at the build or deployment layer?
Feels like a big waste of time. Maybe it mattered in the 70s. But that was a long, long time ago.
Node.js basically tried this — every package gets its own copy of every dependency in node_modules. Worked great until you had 400MB of duplicated lodash copies and the memes started.
pnpm fixed it exactly the way you describe, though: a content-addressable store with hardlinks. Every package version exists once on disk, and projects just link to it. So the "dedup at the filesystem level" approach does work, it just took the ecosystem a decade of pain to get there.
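The hardlink trick is easy to demo from the command line. A minimal sketch (the paths are made up for illustration, not pnpm's actual store layout; `stat -c` is the GNU coreutils form, macOS uses `stat -f`):

```shell
# Simulate a content-addressable store plus two projects linking into it.
mkdir -p store/abc123 project-a/node_modules project-b/node_modules
echo 'module.exports = x => x' > store/abc123/index.js

# Each project hardlinks the stored file instead of copying it.
ln store/abc123/index.js project-a/node_modules/index.js
ln store/abc123/index.js project-b/node_modules/index.js

# All three paths share one inode, so the bytes exist exactly once on disk.
stat -c '%i %h' store/abc123/index.js project-a/node_modules/index.js
```

Every additional project linking the same package version adds a directory entry and nothing else, which is why the store stays flat no matter how many lodash consumers you have.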