Two things have clearly changed over time. First, the body of knowledge is continually growing, so the set of necessary, or at least useful, core competencies has grown and will continue to grow. To me, core competencies means things that will likely be useful over an entire career: maths, Ohm's law, networks, written communication, and so on. An engineer graduating in the 1930s would not have needed, or been exposed to, all of the maths associated with Lebesgue integration (impulse functions, Laplace transforms, and so on). One graduating in the early 1950s would have had no exposure to programming languages. Engineering-level understanding of quantum mechanics has become more and more important since the 1960s/1970s. And so on. Addition of these skills has not generally allowed elimination of earlier skills (differential equations, Fourier transforms, etc.).
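To make that newer maths concrete, here is a standard textbook pair (my example, added for illustration): the impulse function is defined by its sifting property, and its Laplace transform being 1 is what makes impulse-response/transfer-function analysis so convenient:

\[ \int_{-\infty}^{\infty} f(t)\,\delta(t - t_0)\,dt = f(t_0) \]
\[ \mathcal{L}\{\delta(t)\} = \int_{0^-}^{\infty} \delta(t)\,e^{-st}\,dt = 1 \]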
The second thing is that the pace of technology change has quickened. The application-level technologies that might have lasted an entire career in the 1930s just don't happen anymore. Knowing how to bias and operate a vacuum tube was an application technology that was broadly useful in the field for 30 or more years. Use of large arrays of discrete logic chips lasted 20 years or less. Programming languages/development environments seem to have half-lives measured in months or years, not decades. (Remember Algol, Smalltalk, APL, etc.?)
Both of these changes are the result of good things, but they put more pressure on identifying an appropriate foundation and absorbing it during degree studies, and they up the ante on continued learning, whether through formal programs or through independent learning such as this forum.
Summary: choose technology wisely, and it will remain mainstream for the majority of your career.
Most of the things I learned in my late-70s EE degree are still relevant. I did, however, take a great deal of care in choosing my degree, and there were many that I thought were worthless.
Returning to electronics after 20 years away, I am very surprised at how little has changed. Most of the changes can be classified as smaller/faster/cheaper, e.g. an Arduino-class MCU is equivalent to a Z80 + memory + peripherals, plus an ICE for debugging - exactly what I was using 33 years ago. And it is still programmed in C! And the x86 is still important.
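To illustrate "still programmed in C": a minimal sketch of the memory-mapped I/O idiom that looks equally at home on a 1980s Z80 board or a modern MCU. The addresses and register names here are hypothetical, for illustration only, not from any particular chip.

/* Memory-mapped I/O in C - the same idiom then and now.
 * PORT_DATA/PORT_STATUS and the 0x4000 addresses are made up. */
#include <stdint.h>

#define PORT_DATA   (*(volatile uint8_t *)0x4000u)  /* hypothetical output port  */
#define PORT_STATUS (*(volatile uint8_t *)0x4001u)  /* hypothetical status reg   */
#define TX_READY    0x01u                           /* hypothetical ready bit    */

/* Busy-wait until the port is ready, then write one byte. */
static void port_write(uint8_t byte)
{
    while ((PORT_STATUS & TX_READY) == 0u)
        ;                       /* spin until the ready bit is set */
    PORT_DATA = byte;           /* volatile access: the write actually happens */
}

The only thing that has really changed is that <stdint.h> gives you guaranteed widths instead of hoping 'char' is 8 bits.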
Analogue is pretty much the same, except that frequencies can now be significantly higher. You need to understand microwave circuit techniques for analogue and quantised analogue (a.k.a. digital) circuitry. The best general-purpose book is still The Art Of Electronics - I have both the first and third editions, so I can compare them!
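A quick worked example of why microwave thinking now matters (my numbers, using the standard rule of thumb): treat a trace as a transmission line once it is longer than roughly a tenth of a wavelength. On FR-4 (\( \varepsilon_r \approx 4.4 \)) at 1 GHz:

\[ \lambda \approx \frac{c}{f\sqrt{\varepsilon_r}} \approx \frac{3\times 10^{8}}{10^{9} \times 2.1} \approx 14\,\text{cm}, \qquad \frac{\lambda}{10} \approx 1.4\,\text{cm} \]

so even a short PCB trace needs transmission-line treatment at today's edge rates.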
As for languages, I have used many: assemblers, HDLs, and procedural, mathematical, and OO languages. If you understand their foundations and the strengths/weaknesses of each, you can pick up just about anything very quickly.
I chose carefully and ignored this month's fashionable language, which meant I have principally used C (1982 to date), Smalltalk (1988-1994), and Java (1996 to date). I ran away from monstrosities such as Delphi (= not C), C++ (if that's the answer, then the question is wrong), and Perl (for the same reason as APL!). As for HDLs, there's not too much difference between HiLo (early 80s) and Verilog/VHDL.
Development machines were originally PDP-11 + Unix (made by Microsoft, who else), then Sun, then HP-UX, and now Linux. (Still running that abortion, the X Window System!)
So what has changed significantly? More DSP. Faster ADCs/DACs. Very low power. IDEs (modern IDEs have only now caught up with early-90s Smalltalk IDEs). The net.
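"More DSP" in practice often means code like the following: a minimal fixed-point FIR filter sketch in C. The coefficients are a simple 4-tap moving average in Q15, chosen for illustration, not a real filter design.

/* Fixed-point FIR filter sketch. Each coefficient is 0.25 in Q15
 * (0.25 * 32768 = 8192), so together they sum to 1.0: a moving average. */
#include <stdint.h>

#define NTAPS 4
static const int16_t coeff[NTAPS] = { 8192, 8192, 8192, 8192 };

/* Process one input sample; 'state' holds the last NTAPS inputs. */
int16_t fir_step(int16_t state[NTAPS], int16_t in)
{
    int32_t acc = 0;
    for (int i = NTAPS - 1; i > 0; i--)
        state[i] = state[i - 1];            /* shift the delay line */
    state[0] = in;
    for (int i = 0; i < NTAPS; i++)
        acc += (int32_t)coeff[i] * state[i]; /* Q15 multiply-accumulate */
    return (int16_t)(acc >> 15);             /* scale back from Q15 */
}

Running loops like this at megasample rates once needed dedicated hardware; now it fits in a corner of any commodity MCU.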