1. Unis do teach a lot of theory, but that doesn't mean you can't go practical. I came from an undergraduate university that almost hates capitalism and money, and only does pure theoretical research funded by state grants. Still, I managed to build my own dorm lab and did a number of interesting projects, ranging from simple MCU prank toys to an ion mobility spectrometer. Now, as a PhD candidate, I can still do 0201 soldering and PCB milling or etching with primitive tools. Theoretical education never contradicts practical capability.
I fully agree. My opinion is that if you're so inclined, it's best to learn both theory and practice at the places where each is taught best.
Considering how much material there is on YouTube about hobby electronics, it's not hard to get started.
2. Is math important? No. My transcript shows calculus II before calculus I, which means I failed calculus I and had to retake it. I HATE MATH (weird for an Asian, but it's true). I know my business, and I don't touch control theory or RF magic. Outside those math-intensive fields, I fail to see why math is that important for EE.
I think it depends on the field. I have a friend who is in telecommunications, RF and microwave engineering. His courses are a grind: he literally gets 700 pages of Maxwell's equations and other derivations to follow (with little to no textual explanation) for a single exam. People are lucky to pass with a 7; an 8 would get you the most prestigious graduation project from the professor straight away.
3. You don't have to pick one field. You can know and be competent in more than one. As for the 10 fields you've listed, I've done them all. While I won't claim to be an expert in all of them, if there's an entry-level job (BS level, not graduate level) in any of them, I could take it on.
Exactly. In practice, aiming to be a "T-shaped" engineer is good. The added value from a grad degree should come from specialization, although many of the same principles are spread across different fields, and with a degree in hand you've proven that if you can learn one, you can learn another.
IC design is just designing circuits without ideal op-amps; it's all about designing the op-amp yourself. With a solid understanding of circuit analysis and some knowledge of control theory, it's not that bad. Virtuoso/ADE crashing drives me mad more than the design itself does.
Actually funny. I asked one of the PhDs in analog IC design what a typical project looks like. He said that, given a CMOS circuit idea, they would of course first try to come up with an interesting circuit implementation on paper and in the simulator, identifying correct biasing and establishing the critical areas of the circuit.
Then, once the schematic is done, about 90% of the work grind is in creating a good chip layout for the design. Modern chips have such small feature sizes that parasitic capacitance is trouble everywhere, so they keep re-simulating the parasitics extracted from the layout over and over again.
90%? Okay, perhaps I'm taking his figure a bit too literally, but I tend to believe that schematic capture and initial simulation are a minority of the work.
Depending on the field you operate in, I think that analog IC design for radio transceivers, for example, is rather interesting. At my university they are certainly experimenting with extremely high-linearity transceivers and filters for wireless receivers (e.g. receiving multiple radio bands at once with one radio frontend, or receiving and transmitting on the same RF channel at the same time).
I've never heard anyone talk about op-amps, only about nullors and gm's.
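To illustrate how far basic circuit analysis gets you before a simulator even enters the picture, here's a back-of-the-envelope square-law hand calculation for a single MOSFET: the kind of gm and intrinsic-gain estimate that precedes any schematic capture. All the bias numbers below are made-up illustration values, not from anyone's actual project.

```python
import math

def gm_square_law(i_d, v_ov):
    """Transconductance of a square-law MOSFET in saturation: gm = 2*Id/Vov."""
    return 2.0 * i_d / v_ov

def intrinsic_gain(gm, lam, i_d):
    """Intrinsic gain gm*ro, with output resistance ro = 1/(lambda*Id)."""
    r_o = 1.0 / (lam * i_d)
    return gm * r_o

# Assumed example operating point
i_d = 100e-6   # 100 uA drain bias current
v_ov = 0.2     # 200 mV overdrive voltage
lam = 0.1      # channel-length modulation parameter, 1/V

gm = gm_square_law(i_d, v_ov)       # -> 1 mS
a0 = intrinsic_gain(gm, lam, i_d)   # -> 2/(Vov*lambda) = 100 V/V
print(f"gm = {gm*1e3:.2f} mS, intrinsic gain = {a0:.0f} V/V")
```

Note that the intrinsic gain simplifies to 2/(Vov·lambda), independent of bias current, which is why hand analysis like this is still useful for first-cut sizing.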
Digital design is, most of the time, reusing IP blocks. When you do need to write one, pick an HDL and make something. Fuck the debate between VHDL and Verilog: either will be fine, and unless you are a professional digital IP designer, your task will usually be easy; you will probably spend more time on the tools than on the design.
Not so sure; it depends on what you're doing, I suppose. Most companies pitching their digital design projects showed how the architecture of a large data acquisition system matters most, alongside achieving the required computations per second (e.g. multiply-accumulate operations or FFTs per second) set by the algorithm engineers.
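As a rough sketch of what that computations-per-second budgeting looks like, here's a hand-wavy estimate of the MAC throughput a streaming radix-2 FFT would demand. The sample rate and FFT size are assumed example numbers, not from any real project.

```python
import math

def fft_macs(n):
    """Approximate complex MAC count of a radix-2 FFT: (N/2) * log2(N)."""
    return (n // 2) * int(math.log2(n))

# Assumed example workload
sample_rate = 100e6   # 100 Msample/s continuous input stream
fft_size = 1024       # non-overlapping 1024-point FFTs

ffts_per_second = sample_rate / fft_size
macs_per_second = ffts_per_second * fft_macs(fft_size)
print(f"{macs_per_second / 1e6:.0f} MMAC/s sustained")  # -> 500 MMAC/s
```

A budget like this is what drives the architecture discussion: 500 MMAC/s fits comfortably in a handful of FPGA DSP slices, but scale the sample rate up an order of magnitude and the memory and interconnect architecture starts to dominate.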
Academically speaking, there is research going on into how tools and design flow can be improved using EDSLs. We don't want to keep mapping and folding discrete algorithms onto functional units manually, explicitly spelling out what a controller should look like. VHDL and Verilog are in many respects still very medieval tools, with standards going back to the 80s and 90s. Certainly HDL tooling can't be as "hip and modern" as today's web languages, but there are plenty of tools available that use a higher-level language and generate VHDL/Verilog as input to virtually all FPGA and ASIC toolchains out there.
PCB design is about knowing the tool and the theory. SI and PI are tough sciences distilled into simple rules of thumb: learn them, practice them, and you will be fine. As for the tool, I use Altium, and I can't see why it's hard to learn. I started using a pirated copy of Protel 99 SE when I was 11, and to me Eagle is harder to use than Altium. I can cook up a simple board in hours and a fairly complicated ARM+RAM board in less than a week.
I think PCB design is a rather practical design art. Knowing your transmission lines helps when implementing modern communication standards, and knowing your EMC do's and don'ts is essential if you want to do PCB and product design for a living. Apart from an EMC course (from the RF group, so a grind), I don't think uni will give you much practice in PCB design at all. But given how cheap Chinese fabrication is, there's basically no excuse not to practice it.
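For what it's worth, one of those SI rules of thumb fits in a few lines of code: the classic IPC-2141 microstrip impedance approximation. The stackup numbers below are just an example, not a recommendation, and the formula is only a rough first cut before a field solver.

```python
import math

def microstrip_z0(h, w, t, er):
    """IPC-2141 approximation of microstrip characteristic impedance (ohms).
    h: dielectric height, w: trace width, t: copper thickness (same units).
    Only a rule of thumb; roughly valid for 0.1 < w/h < 2.0."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

# Example stackup: 1 oz copper over 0.2 mm of FR-4
z0 = microstrip_z0(h=0.2, w=0.35, t=0.035, er=4.3)   # all dimensions in mm
print(f"Z0 ~ {z0:.1f} ohm")
```

Running a formula like this across a few candidate trace widths is how you get in the neighborhood of a 50-ohm trace before handing the stackup to the fab or a proper 2D solver.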