I had very similar thoughts, the major difference being that C# is no worse than other languages.
It is similar to Java in that it compiles to CIL, a stack-based bytecode, which is touted as "portable". We all know how portable Java is, and how portable .Net/Mono is: not very, once you start trying to get actual work done instead of running simple examples. Instead of acknowledging differences between operating systems and hardware architectures, it applies a heavy abstraction layer (again, a bytecode interpreter/JIT compiler) to hide them.
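To make "stack-based bytecode" concrete, here is a minimal sketch, in Java rather than C# since the execution model is essentially the same: operands are pushed onto an evaluation stack and instructions pop and push it, with no machine registers visible at this level.

    public class Add {
        static int add(int a, int b) {
            return a + b;
        }
        public static void main(String[] args) {
            System.out.println(add(2, 3));   // prints 5
        }
    }
    // `javap -c Add` shows add() as roughly:
    //   0: iload_0    // push the first argument onto the operand stack
    //   1: iload_1    // push the second argument
    //   2: iadd       // pop both, push their sum
    //   3: ireturn    // pop the sum and return it
    // The CIL for the equivalent C# method would be roughly ldarg.0 / ldarg.1 / add / ret.

The interpreter/JIT then has to map that stack machine onto whatever registers the real hardware has, which is exactly the abstraction layer complained about above.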
Shortly before C# was released, Anders Hejlsberg gave a talk about it at work. We were all underwhelmed and regarded it as a typical me-too embrace-extend-extinguish MS ploy. I never programmed in it, and never regretted it. I have never had much faith in Mono either.
For this reason alone, C# is not a language one should base their understanding of programming languages on.
At the language level, C# is Java with added insecurities.
At the implementation level, C# is Java with longer application installation times due to static (install-time) optimisation, and without the ability to optimise based on how the code is actually interacting with the data at run time.
Beyond that, it is better to ignore the details, and free up your brain to concentrate on significantly different languages.
Portability and adaptation to different architectures, and an understanding of the different approaches (paradigms) different languages take, are utterly paramount for understanding software engineering in a way one can apply beyond a single niche (Windows C# programming). C# is designed to avoid having to do that.
All modern languages are designed to conceal the underlying architecture.
If we ignore the single-vendor control and the reliance on CIL, and just look at the language specification, say the latest version standardized by ECMA as ECMA-334 (C# 6.0, which Microsoft published in 2015, while the ECMA-334 edition only appeared in 2022 and Microsoft has already published C# 11.0), you could compare it to older versions of C++. In particular, it does not have specific-size integer type names; instead it codifies that 'short' is 16-bit, 'int' is 32-bit, 'long' is 64-bit, and the 'char' type holds a UTF-16 code unit; i.e. the sizes are pinned rather than left to an ILP32 or LP64 data model.
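To make the fixed-size point concrete, a minimal sketch, again in Java rather than C# since Java pins its primitive widths the same way (which is rather the point of the comparison):

    public class FixedWidths {
        public static void main(String[] args) {
            // Fixed by the language spec on every platform, like C#'s short/int/long/char.
            System.out.println("short: " + Short.SIZE + " bits");     // always 16
            System.out.println("int:   " + Integer.SIZE + " bits");   // always 32
            System.out.println("long:  " + Long.SIZE + " bits");      // always 64
            System.out.println("char:  " + Character.SIZE + " bits"); // always 16, a UTF-16 code unit
            // In C/C++ the same names depend on the platform's data model:
            // sizeof(long) is 4 on ILP32/LLP64 and 8 on LP64.
        }
    }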
In short, meh.
Such delays in standardisation are, ahem, standard. But there are worse phenomena: I remember a Usenet post triumphantly announcing the first complete C++ compiler six years after the spec was released. That alone confirmed my earlier decision that C++ was fundamentally the wrong direction!
I didn't know about that integer issue, because I decided not to trouble my brain with C# details. I presume it was a consequence of the security holes in C# programs containing C/C++ components.
Java ... my Java cup says "program once, use many" ... and I laugh every time I use it to drink my coffee.
Exactly.
If we do not learn from the past, we're doomed to repeat the same mistakes.
Java's portability is far better than that of C/C++.
How much that matters in any given case is up for discussion.