A bit silly arguing against something I kind of agree with, but…
On one hand yes, a basis. But what is a basis? Theoretical? Foundational? Occupational?
That the phenomena observed are explicable and predictable. Engineers just use those phenomena to solve problems.
By that definition, any empirical observation an engineer or 'lower' worker makes consistently, and believes they have a good handle on describing, fits that same bill. Perhaps with less clarity, or more, depending on how it works for them.
Your story about selenium is missing the preamble - ...
Yes, because I know it had a basis in some kind of science, but that doesn't really support your original claim that Einstein's "... explanation of the photoelectric effect not only underpinned quantum mechanics (the basis for transistors) but also gave basis for the engineering of image sensors, phototelegraphy, etc etc.".
You mention the work of physicists, like the rather interesting "Effect of Moonlight on Selenium" in 1876, and Maxwell experimenting with selenium and his letter of 1874. But I can then go back to Edmond Becquerel, "credited with the discovery of the photovoltaic effect" in 1839. Wikipedia says he was a physicist, but then I notice he was 19 at the time, experimenting in his father's lab - what pigeonhole does he go in?
We can keep digging back into the past to find some earlier work and say "but, but, physics!" (or "but, but, engineering!"), but as penfold said, there seems to be a chicken and egg argument. Winding back in time through the chicken/egg oscillation, we soon get back to something that's not chicken, not bird egg; murkier, maybe something more like the fruiting body of a slime mould (still chicken-and-eggish), or with multiple phases (like instars in insects), even some which can reverse, or become chaotic:
https://en.wikipedia.org/wiki/Mating_of_yeast
(Another odd factoid stumbled on during this thread.) Eventually we reach a point where the oscillation becomes indiscernible.
My point is that image sensor technology etc. predated quantum theory, even though it is easy to assume quantum theory had to be prerequisite knowledge. We can keep going back and make the same mistake. Winding back over examples to find "basis" support for an argument like physics vs engineering, or for a decisive origin, is arbitrary. In fact, in bringing up this "preamble", you seem to be confirming this point.
... While the first FET patent, from 1925, didn't get noticed, it's remarkable how clearly Lilienfeld describes its operation as based on nascent quantum theory:
https://worldwide.espacenet.com/patent/search/family/035202468/publication/US1745175A?q=pn%3DUS1745175
"The basis of the invention resides apparently in the fact that the conducting layer at the particular point selected introduces a resistance varying with the electric field at so this point; and in this connection it may be assumed that the atoms (or molecules) of a conductor are of the nature of bipoles. In order for an electron, therefore, to travel in the electric field, the bipoles are obliged to become organized in this field substantially with their axes parallel or lying in the field of flow. - Any disturbance in this organization, as by heat inovement, magnetic field, electrostatic cross-field, etc., will serve to increase the resistance of the conductor; and in the instant case, the conductivity of the layer is influenced by the electric field. Owing to the fact that this layer is extremely thin the field is permitted to penetrate the entire volume thereof and thus will change the conductivity throughout the entire cross-section of this conducting portion."
It is, but there is no wild mathematical treatment there, and it's not really "quantum" in the sense the word is used today. The electrons are still seen as classical particles. What made FETs "possible" (in extreme volumes) was purity and passivation, including the breakthrough of "Egyptian engineer Mohamed Atalla".
I'm sure quantum theory made a good showing, but it's not intruding much into these stories.
I can only go off what I know about history. The unfortunate thing about history is that there is no "control" in the experiment of our history to tell whether something would or would not have happened without X-Y-Z.
What I do know is that MASSIVE technology advancements ALWAYS follow a PARADIGM SHIFT (term from my History of Science class) in theory.
I'm not entirely buying it. There are lots of narratives like that in belief systems, and science and technology are no exception. We will always find exemplars, things or people, that 'bucked the trend' in the right or wrong way. I see technological progress as remarkably smooth in spite of these major events. It would be interesting to see whether Moore's law applies backwards in time - from what I know about the evolution of computers, it might well.
It doesn't fundamentally alter your point, but it does suggest that these patterns we like to hold high are not what they seem. There is an understandable fundamental bias in academia (to believe in itself). I can see why it feels compelled to be the "self-appointed defender of the orthodoxy" (to borrow more words from the crackpot index). Universities are magnets for intellectual types, and to an extent are the tail that wags the dog of society. Like most people, I tolerate or even embrace that, but there is a limit to how much of it we will swallow. (Who am I kidding, there is no limit, not in the short term anyway.)
And I'd say we got the ol' 1-2 punch from the paradigm shift from Maxwellian Theory (that made global, wireless communications and AC power possible) to Quantum Theory (that made transistors possible, enough said!).
I'm trying but failing to let that one go. Transistors were no more or less "possible" after the discovery.
It's entirely possible one was even accidentally made by some unpublished crystal set experimenter, trying to inject DC bias as close to the rectifying point as possible under a microscope, while messing up the circuit in a way they never bothered checking because it worked so well. Or worked out and published in some obscure way, which comes to light now, rewriting history (or at least Wikipedia). "Physics" (the belief; science, the consensus) holds that that is not a true discovery, as if physics feels invalidated - the language on Wikipedia is reflective of that. Engineering would see it as no big deal; history would still call it a "discovery".
Like me probing away at the SiC "stone" (even now I don't know if it is a crystal) to replicate the blue LED. I don't need to understand everything at the quantum level (some might say I do) to do that. Nor does it imply I need to be dumb as a box of rocks and incapable of understanding it or stumbling onto a plausible explanation. Nor does it have to be a big thing or accepted in the halls of academia (where some would say it does).
I think what triggered me when you said "possible" is that you don't appear to be using it in a qualified way (or dangling it as bait - beyond the intended meaning itself, which you could assume would get a bite!). Alternators and transformers predate Maxwell's publication; I think it was bsfeechannel who was trying to argue that Maxwell's subsequent inspirational encapsulation of things which were already 'possible' somehow enabled them, rendering it foundational after the event. Global communications, yes: while I would say made "possible" is a stretch, Maxwell's theory directly inspired and led to it - that can't be anything but foundational.
But people were trying to make transistorish things, like uber Hall-effect amplifiers, well before quantum theory got a foothold, and by my reasoning that initial desire and "scent" of expectation (they knew something like it ought to be possible) was more foundational than the theories, which progressed in no small part because of that same effort.
I'm not certain we'd have the iPhone ever. There are a lot of 'wrong' ways to make an iPhone. It's much more likely we'd find all the wrong ways before the right ways without a governing theory to predict what to do in the next experiment... which is in the physics.
I think that relies on a misconception that engineers do not work scientifically, or methodically, and so are doomed to fumble about indefinitely, exploring each wrong option with equal probability as the right one(s). If that were true, then iPhones would still be at version 0.1, for reasons much more fundamental than the radio module not working, or even CPUs not having been invented.
It reminds me of my youth, when I tried to drive through random suburban streets to get to the top of a hill at night (in the days before maps on phones). I said just drive "up" and you'll eventually find your way to the top. My friend (also an engineer) was critical, saying "there's only one way to somewhere and nearly infinite wrong turns". Then I said "all roads lead to Rome"; he said "no, all roads lead from Rome, but only one leads to it". I can't remember how it was sorted out, except that we got to the top OK, and didn't have a map. Then, driving along a ridge, he said "don't look over the edge, people drive where they're looking". Narratives. Everyone learns different things, and can end up thinking about the same thing in different ways.
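For what it's worth, that "just drive up" strategy is greedy hill climbing. A toy Python sketch (the grid and height function are entirely made up for illustration):

def height(x, y):
    # A single smooth hill centred at (10, 10); purely illustrative terrain.
    return -((x - 10) ** 2 + (y - 10) ** 2)

def drive_up(x, y):
    # From (x, y), keep taking whichever neighbouring step gains the most height.
    while True:
        neighbours = [(x + dx, y + dy)
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if (dx, dy) != (0, 0)]
        best = max(neighbours, key=lambda p: height(*p))
        if height(*best) <= height(x, y):
            return (x, y)  # no uphill neighbour: a summit (possibly only a local one)
        x, y = best

print(drive_up(0, 3))  # -> (10, 10)

On a suburb with one hill we were both going to be fine; his "nearly infinite wrong turns" only bite when there are many hills and you can stall on the wrong one.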
As sort of an aside, I think I neglected the fundamental importance of experiment at school and university. I either just believed everything I was told, formed quiet reservations, or considered labs a redundant, if interesting, waste of time (it seemed unreasonable for a lab to refute something that had been taught). Nowadays I question everything I am told and believe only evidence, and even that is a risk.
I certainly don't want to say physics had no part or is dead. The only thing I want to target is this (I assume) taught notion that science begat physics begat engineering that doesn't seem to exist outside of academia and governmental ivory towers. The commercial world is completely indifferent to that, and simply assumes that physics is one of the parts of engineering.
I work in both academia and the commercial world. The predecessor to ABET defined engineering as,
"The creative application of scientific principles to design or develop structures, machines, apparatus, or manufacturing processes, or works utilizing them singly or in combination; or to construct or operate the same with full cognizance of their design; or to forecast their behavior under specific operating conditions; all as respects an intended function, economics of operation and safety to life and property."
Emphasis mine. So, if physics isn't foundational to what we do, what even is engineering???
Perhaps it might be helpful if you elucidate on that question.
I've seen a similar definition trotted out by newbie engineers or students, and had a quiet chuckle (or hope) that they'll soon learn. It's not terribly wrong, but it kind of misses the point of what engineering is. You do whatever you need to do to make things work: respond to marketing, know when to copy those before you and what to check or trust, deal with overly formulaic expectations or meaningless requirements, work out how to recognise mistakes without making them, which theories deserve to be thrown out the window and when to do that (like the resistivity of metals). That definition looks like the output of a committee - it looks like a theory.
I possibly read "scientific principles" differently from you. I take it to mean the principles of science, which, in practical terms, means tested and understood. Not in the sense of selecting from a menu of established laws, equations and textbook smarts, except in loose terms.
You must use the tools at your disposal, and you must test. You must think, and be objective. These are all no-brainers. Engineering is advanced hacking? It's certainly not going through a menu of checklist items provided by an academic recipe-writer.
Given that the definition is a fairy tale, and the premise is unfalsifiable, I'm not going to do a good job of elucidating on your question "what even is engineering?" Perhaps a job for physicists who don't care about physics?
Claims that "physics had to come before technology" can be made arbitrarily, e.g. fax machines might have stepper motor drivers, image sensor chips and even lasers. But given that optical fax existed in the late 1800s, those claims need adjustment. They might still be correct, but they don't carry much meaning.
I'm not sure what you're saying here.
I'm saying the premise and supporting evidence are arbitrary. Either side can be supported by picking which facts to use.
Probably the real "physicists versus engineers" debate is whether to use i or j for the complex number. And the answer is obviously j because what the heck do you call current then?
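As an aside, the engineers' j has even leaked into programming languages. A minimal sketch (component values are arbitrary, purely for illustration) of a routine impedance calculation in Python, whose literal syntax for the imaginary unit is, pointedly, j:

import math, cmath

R = 1000.0   # resistance, ohms (arbitrary example value)
C = 100e-9   # capacitance, farads (arbitrary example value)
f = 1000.0   # frequency, Hz

omega = 2 * math.pi * f       # angular frequency, rad/s
Z = R + 1 / (1j * omega * C)  # series RC: Z = R + 1/(jwC)

print("|Z| = %.1f ohm, phase = %.1f deg" % (abs(Z), math.degrees(cmath.phase(Z))))
# prints: |Z| = 1879.6 ohm, phase = -57.9 deg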
You can probably guess my response even when you wrote that! Engineers don't use j (or i). Is there any place in engineering, anywhere, where sqrt(-1) has any physical relevance at all? The only place I've ever seen it doing something useful (beyond being an arcane convenience for mathematicians) is in a Feynman lecture, where it quasi-continuously described a wave function inside and out of an energy well or something (I can't find it now).