So, for years, at least in the Unix world, we've been using shared libraries to avoid duplicating the same code in every binary. This is usually tightly coupled with a package manager: if myprogram needs libcrypto.so, it declares a dependency on it, and the package manager makes sure the library is installed.
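As a concrete illustration (assuming a Linux system where `ldd` is available), you can ask any dynamically linked binary which shared objects it depends on:

```shell
# ldd resolves each of the binary's DT_NEEDED entries against the system
# library search path -- the same lookup the runtime loader does at startup.
ldd /bin/sh
# prints one line per shared dependency, e.g.
#   libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x...)
```

Every binary on the system that names libc.so.6 shares the one installed copy, which is exactly what the package manager's dependency graph tracks.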
Static linking has its uses: redistributing a standalone binary, or pinning a program to a specific version of a library it needs to work.
Now, lately, some new (admittedly interesting) programming languages have gained popularity. I've come across Go, Rust and Nim, though there are probably others.
All three come with a package manager of their own to fetch and build dependencies, which is fine in itself. The problem is that it's very difficult, if not impossible, to build those dependencies as shared libraries and link the resulting binaries dynamically.
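To be fair, all three toolchains do expose some form of shared-object output, but none of it is the default and much of it is limited or unstable. A rough sketch of the relevant switches (mylib/myprogram are hypothetical placeholder names):

```shell
# Rust: a crate can opt into shared-library output via Cargo.toml --
# "dylib" uses the (unstable) Rust ABI, "cdylib" exports only a C ABI:
#   [lib]
#   crate-type = ["dylib"]
cargo build

# Go: build packages (including the standard library) as shared objects
# once, then link binaries against them:
go install -buildmode=shared std
go build -linkshared ./myprogram

# Nim: emit a shared library instead of an executable:
nim c --app:lib mylib.nim
```

Distro packagers still can't rely on this: Rust's `dylib` ABI is not stable across compiler releases, and Go's `-buildmode=shared` is only supported on a handful of platforms — which is largely the point being made here.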
Sure, this might be a good idea if you just want to redistribute a binary, or if you want your dependencies at a very specific version.
But how will system package managers handle this? Sure, they can take the lazy way out and ship the static binary, but that defeats much of the point of a package manager (might as well drag and drop things into (/usr)/bin). With shared libraries, a security fix in something like libcrypto means updating a single package; with static binaries, every package that embeds the library has to be rebuilt and reshipped.
Why can't these languages/compilers at least make it easy to build shared objects and dynamically linked binaries "the olde way"?