The point is that you solve language and integration problems through composition, not through consolidation.
Modularity, not frameworks.
Composability, not monolithic integrated designs.
You design and specify the interfaces, and allow any implementation that fulfils the agreed-upon interfaces and behaviour.
Yes. But that's not how software is developed nowadays. Most projects just aggregate dependencies into something that can be shipped, and that's all.
In modular designs, correctly tracking and fulfilling dependencies is key. Interpreted languages like Python can apply much more logic when choosing which dynamic library to load than lower-level systems programming languages like C, which rely on the system's dynamic loader.
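For illustration, here is a minimal sketch of that flexibility in Python, using only the standard ctypes module. The candidate list and its ordering are made up for the example; in practice any run-time condition (OS, CPU features, configuration) could drive the choice.

```python
import ctypes
import ctypes.util

def load_first(candidates):
    """Try each shared-library name in order; return the first that loads."""
    for name in candidates:
        if not name:
            continue
        try:
            return ctypes.CDLL(name)
        except OSError:
            continue
    raise OSError(f"none of {candidates!r} could be loaded")

# The preference order is the application's to decide: here, a versioned
# system soname first, then whatever the platform's search logic suggests.
libm = load_first(["libm.so.6", ctypes.util.find_library("m")])
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(2.0))
```

The same fallback logic in C would mean hand-written dlopen() loops; in Python it is a dozen lines you can extend with arbitrary conditions.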
If we consider portable binaries, say across x86-64 ELF systems using the SysV ABI (Linux, the BSDs, and so on), the optimum would be a dynamic-linking 'hint' or 'script' file, something like a shell script, that could specify the paths to the dynamic libraries based on the current run-time environment, mixing system-provided and application-provided libraries as needed. This has been tried many times before, of course (one approach I don't really like is GNU libtool), but the approach that seems to work best is a launcher script that sets environment variables defining those paths. The downside of the script approach is that it cannot really deal with individual libraries, unless you use a dedicated library directory you populate with symlinks. Checking and populating that directory in a script or loader program does incur a bit of startup latency, though.
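A sketch of that launcher-plus-symlink-directory idea, written in Python for brevity; every name here (libfoo.so.1, app.bin, run-libs) is hypothetical, and a real launcher would derive the candidate paths from the run-time environment rather than a hard-coded table:

```python
import os
import sys

# Hypothetical layout: the launcher sits next to the real binary and a
# bundled lib/ directory.  All names are illustrative.
APP_DIR = os.path.dirname(os.path.abspath(sys.argv[0]))
LIB_DIR = os.path.join(APP_DIR, "run-libs")   # the dedicated symlink directory

# soname -> ordered candidate paths (application-provided first, then system).
WANTED = {
    "libfoo.so.1": [os.path.join(APP_DIR, "lib", "libfoo.so.1"),
                    "/usr/lib/libfoo.so.1"],
}

def populate(libdir, wanted):
    """Create or refresh one symlink per soname, pointing at the first
    candidate path that exists.  This loop is the startup-latency cost."""
    os.makedirs(libdir, exist_ok=True)
    for soname, candidates in wanted.items():
        link = os.path.join(libdir, soname)
        for path in candidates:
            if os.path.exists(path):
                if os.path.islink(link) or os.path.exists(link):
                    os.remove(link)
                os.symlink(path, link)
                break

def launch(binary, argv):
    """Prepend LIB_DIR to LD_LIBRARY_PATH and replace this process with the
    real binary, which then resolves its libraries via the symlinks."""
    env = dict(os.environ)
    old = env.get("LD_LIBRARY_PATH")
    env["LD_LIBRARY_PATH"] = LIB_DIR if not old else LIB_DIR + ":" + old
    os.execve(binary, argv, env)
```

Launching would then be populate(LIB_DIR, WANTED) followed by launch(os.path.join(APP_DIR, "app.bin"), sys.argv). A small C loader could do the same with symlink() and execve(), at the cost of writing the candidate-selection logic by hand.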
These are the kinds of problems that are more important to me than which language does a trivial loop the fastest. I can implement whatever I need in whatever language I have available, but I'd like it to be modular and maintainable: reinventing the wheel again and again is dull, and only good for those who like the job security of being a wheel reinventor.