Author Topic: Let's bamboozle the programmers: Computing Amplifiers book from 1966


Offline RoGeorge (Topic starter)

  • Super Contributor
  • ***
  • Posts: 6446
  • Country: ro
A nice weekend book:  "Applications Manual for Computing Amplifiers for Modeling, Measuring, Manipulating, and Much Else" George A. Philbrick Researches, 1966

Webpage ToC:  https://www.analog.com/en/education/education-library/applications-manual-computing-amplifiers.html
Full book PDF:  https://www.analog.com/media/en/training-seminars/design-handbooks/Application-Manual-for-computing-amplifiers/application-manual-computing_amplifiers.pdf
« Last Edit: April 10, 2022, 03:35:18 pm by RoGeorge »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9907
  • Country: us
Obviously, I haven't read the entire document, but section 11.10 deals with integrators, which leads into analog computing, and it goes on from there.  That is one of my favorite topics!

Lord Kelvin is the guy who came up with the use of integrators to solve differential equations.
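
The whole trick fits in a few lines of code.  Here is a minimal sketch (mine, in Python, not from the book) of Kelvin's method: two chained integrators and a feedback loop solving y'' = -y, which is exactly what an analog computer patches with two op-amp integrators and an inverter.

[code]
# Minimal sketch (mine, not from the book): Kelvin's integrator
# method for y'' = -y.  The output of one integrator feeds the
# next, and the negated result feeds back to the input -- the
# same loop an analog computer patches with two op-amp
# integrators and an inverter.

dt = 0.001          # integration time step, in "machine seconds"
y, dydt = 1.0, 0.0  # initial conditions: y(0) = 1, y'(0) = 0

for step in range(10_000):
    d2ydt2 = -y            # summing junction computes y'' = -y
    dydt += d2ydt2 * dt    # first integrator:  y'' -> y'
    y += dydt * dt         # second integrator: y'  -> y
    if step % 1000 == 0:
        print(f"t = {step * dt:5.2f}   y = {y:+.4f}")  # traces cos(t)
[/code]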
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 8057
  • Country: us
  • Retired, now restoring antique test equipment
Another interesting book in my collection of "many a quaint and curious volume of forgotten lore" from before that:
R M Howe:  Design Fundamentals of Analog Computer Components, D Van Nostrand Co, 1961
(my copy "withdrawn" from the library of Michigan Technological University, after having been checked out only three times according to the "date due" sheet)
All amplifiers therein use vacuum tubes.  On p. 74, "Transistors show promise, but at present their disadvantages of low input impedance, low gain, and low output voltage seem to outweigh the advantages of their small size, high efficiency, high reliability, and no filament requirements."  Of course, the vacuum tube amplifiers needed mechanical choppers for high accuracy.
The largest examples, occupying three or more 19" rack bays, are from Beckman, Goodyear, or Reeves, and some use plugboards for programming.
I have posted this anecdote before:
I was doing some work at a Boeing facility, which received a monthly house-organ magazine from Boeing corporate.  There were lots of interesting historical photographs, mostly of aircraft.  One month, they showed an engineer (wearing white shirt and tie) working with an analog computer in a single 19" rack, about 7 feet high, to his left, while to his right was a blackboard identifying which pot dial corresponded to which variable.
The photo caption indicated that this computer generated so much heat that "400 vacuum tubes were required to remove the heat".
Next month, there was an interesting collection of letters to the editor by mature employees and retirees.
« Last Edit: April 10, 2022, 04:10:47 pm by TimFox »
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6514
  • Country: fi
    • My home page and email address
If a software dev is bamboozled by analog logic (or even more esoteric stuff like fluidics or any other unconventional computing), they're not really a software developer, are they?  Not in the sense of 'software' as John W. Tukey coined the term in 1958, at least.

All kidding aside, that is actually an important, serious point.

A proper software developer, a proper software engineer, should be able to handle many different forms of expressing logic and control expressions; otherwise, they're just "foo-language programmers", aren't they?  Understanding that logic and computation can be performed in many wildly different ways is extremely important to software design, as otherwise you're just reusing the same hammer and treating everything as nails.  It is like understanding that you can have a toolbox with many wildly different tools, rather than just the one you're holding right now, no matter how amazing a supertool it might seem to you.

Mathematical transforms are a crucial tool.  The reason humans used to use slide rules is the logarithmic-antilogarithmic transform, which converts multiplication into addition and division into subtraction of terms.  Similarly, the Fourier transform and its inverse convert signals between the time domain and the frequency domain.  Transforms turn hard problems easy and easy problems hard; and by chaining them – the Unix philosophy didn't pop out of nothing, it has long practical precedent! – you can solve really complicated, hard problems, and even make it all seem simple.
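
To make that concrete, here's a short Python sketch (mine, standard library only) of the slide rule's transform: multiply by adding logarithms, then transform back with the antilog.

[code]
# The slide rule's transform: multiplication becomes addition in
# the log domain, and the antilog (exp) transforms the result back.
import math

a, b = 3.7, 12.4
product = math.exp(math.log(a) + math.log(b))  # "sliding" = adding lengths
print(product, a * b)  # both ~45.88, up to floating-point error
[/code]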

Many of the unconventional computing methods involve such transforms.  You can even model the different paradigms in programming – imperative or functional; procedural or event-driven; and so on – as transforms applied to the underlying problem, to deal with it in a more appropriate domain.

Thus, being surprised and delighted to discover unconventional computing, is natural.  But being bamboozled by it?  If you are, then I'm afraid you're not a proper software person, just a whatever-programming-language programmer.
 
The following users thanked this post: DiTBho

Offline RoGeorge (Topic starter)

  • Super Contributor
  • ***
  • Posts: 6446
  • Country: ro
You're right there, I'm certainly not even a programmer, let alone a software developer.   ;D

Though, the SW dev you are describing looks more like a SW researcher, from academia rather than from a SW company.  In SW production, software development is more prosaic execution than philosophical thinking.

The fun philosophical realization is that everything around us computes.  Even a pebble sitting still computes by its existence - by computing, so to speak, the laws of physics that made the pebble possible and that keep it sitting there like it does.

Absolutely everything computes something, in its own way; it's just that we usually don't take advantage (in an explicit manner) of the computing properties of the world.  We don't usually harvest the computation in a numerical form, but we indirectly use/take advantage of those pebble-specific computations as properties of the given object.

For example, we harvest the computation result of a water molecule by simply drinking the water  ^-^  instead of writing down in a table all those numbers computed by that water molecule.  The slide rule is a good example where we explicitly harvest the computation power in the form of numbers, and not as the object's properties (though one may improperly use the slide rule as a pointing stick instead of a multiplier  ::) ).

In my understanding, the need for numbers and computation (in a mathematical way) came first from the need to keep an inventory (to count the sheep), then from the need to simulate "something" in order to predict the future evolution/behavior of that given something.  And in the last century computation turned into a goal in itself, and now we have armies of computer science engineers, and thinking trends like the Unix philosophy.  (Maybe the Unix philosophy should explicitly include the rule "Unix will not try to stop you from doing wrong things, because that would also stop you from doing smart things".)

We are now at the point where we start realizing the importance of yet another thing coming after computation, the so-called big data.  Huge piles of data, which are nothing but stored computation results  :-// .  And suddenly we are in the ML (Machine Learning) and AI (Artificial Intelligence) realm, where a trained NN (Neural Network) computes in a very different way than a slide rule or a digital computer.

Very different in the sense that there is a new factor coming into play, that new factor being the training data, more precisely the context (the world) from which the training data was collected.  If we look at a computer as a number-crunching machine, then a trained NN will also crunch numbers, but the computation laws of a NN do not come from math, but from the training data.  The numbers in the training data were also capturing the behavior of an entire world, the world from which the training data was harvested.

It's interesting to compare how a NN based computation differs from an ALU (Arithmetic Logic Unit) based computation:

ALU based computation
- requires an algorithm for the given computation/problem
- the programmer must completely describe the algorithm
- produces exact results, relevant for each and every input data

NN based computation
- requires a pile of data harvested from a situation/world similar to the problem to solve, and the data doesn't have to be a complete description of that world
- the programmer must describe only the goal function, not the algorithm of how to achieve that goal
- produces good enough but inexact results, relevant only from a statistical standpoint over all the possible input data

From a bird's eye view, we might as well conclude that the ALU type is based on mathematically exact truths, like true/false, while the NN type of computation is based on statistical and contextual truths, like good/bad.
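
To make the contrast above concrete, here is a toy Python sketch (my own illustration; the polynomial least-squares fit merely stands in for NN training): the "ALU way" computes x² from the algorithm, while the "NN way" only ever sees noisy samples and a goal function.

[code]
# Toy illustration: "ALU way" vs "NN way" of computing y = x**2.
# The least-squares polynomial fit is only a stand-in for training
# a real neural network, but the contrast holds: the model never
# sees the algorithm, only noisy data and a goal (minimize squared
# error).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)         # training inputs from "the world"
y = x**2 + rng.normal(0.0, 0.05, 200)   # noisy observations, no formula given

coeffs = np.polyfit(x, y, deg=4)        # "training": fit the model to the data

x_test = 0.5
print("ALU-style :", x_test**2)                   # exact: 0.25, every time
print("NN-style  :", np.polyval(coeffs, x_test))  # near 0.25, only statistically
[/code]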

(In fact there is no such thing as good or bad by itself.  Good and bad make sense only in the context of a given goal.  The same thing can be either good or bad, depending on the given goal.  Good is whatever suits that goal, bad is whatever stands against the goal.  Sure, one can say that setting a goal on which we define good/bad is similar to setting the math axioms on which we define what is true/false, but the fuzziness difference between good/bad and true/false still remains.)

For a NN, the rules (axioms) are extracted from the training pile of data, and thus whatever situations happen more often in the training data will become the truths for the trained NN.  A corollary of this is that, for a NN, the definition of truth can be changed by simply cherry-picking the training data.  Whoever controls the data can create a totally new/different reality for a NN.

The funny thing is, we humans are the NN kind of computing machine, and thus we have the saying "A lie repeated a thousand times becomes truth" to remind us of all the above in just one line.

Sorry for the long rambling slip, it's a "rainy Sunday afternoon"TM outside.  ;D


Pic embedded from:  http://readingjimwilliams.blogspot.com/2011/09/book-1-chapter-7.html

\[ \star \ \star \ \star \]
[quote user=TimFox]Another interesting book in my collection of "many a quaint and curious volume of forgotten lore" from before that:
R M Howe:  Design Fundamentals of Analog Computer Components, D Van Nostrand Co, 1961[/quote]

Couldn't find that one on https://www.pdfdrive.com/ but found a few other Analog Computation books from around the same decade.  Some are very well structured, and it was very entertaining to browse them.  :)

\[ \star \ \star \ \star \]
[quote user=rstofer][analog computing] is one of my favorite topics![/quote]

Totally believe you about that.  :)


pic stolen from https://www.eevblog.com/forum/testgear/what-did-you-do-with-or-to-your-oscilloscope-today/msg3030544/#msg3030544

It's funny how the "analog electronics" term we use nowadays was coined from "analog computation", where the word "analog" denoted the similarity between a physical dynamic system and the corresponding differential equation(s) describing that system.  It just happened that "the analog" also produced continuous and smooth signals.  Or at least that's how I remember reading it somewhere.

[quote user=https://www.vocabulary.com/dictionary/analogue]The word analogue (also spelled analog) comes from the Greek ana, meaning "up to," and logos, meaning, among other things, "ratio" and "proportion." In 1946, it entered computer language as an adjective to describe a type of signal that is continuous in amplitude.[/quote]

Analog circuits could just as well have been called "similar circuits", and the term "Analog Engineer" in EE would have become "Similar Engineer".

\[ \star \ \star \ \star \]
Just imagine the dialog:
- I'm a mechanical engineer working for Ford, and you?
- I'm a similar engineer working for Similar Devices.

 :-DD

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6514
  • Country: fi
    • My home page and email address
[quote user=RoGeorge]Though, the SW dev you are describing looks more like a SW researcher, from academia rather than from a SW company.  In SW production, software development is more prosaic execution than philosophical thinking.[/quote]
I draw the line between "developer" and "programmer".  Developers are given problems they then solve.  Programmers implement logic they're given, in a specific programming language.  Creativity is the difference.  Software Engineering is a separate axis, and is about whether sound engineering principles are used in the process.  Others use other names and definitions, of course.

Even simple application development involves "philosophical thinking", unless you are creating throwaway single-use code.  One reason the world is so full of crappy software is that vendors still hire "programmers" to do development work.  If you hire schoolkids to build a brick wall, and then pay a professional plasterer to make it look nice, it should not be a surprise when it falls down on someone.  Even if it looks very nice.

(If it matters, I have done quite a few years of software development myself, and also ran an IT company for a few years.  Oh, and I did not mean to imply, nor do I think, that you are in any way wrong.  I just thought it might be interesting to read why I think a software developer should be surprised and delighted by things like this, and not weirded out or bamboozled at all.)
« Last Edit: April 10, 2022, 03:14:12 pm by Nominal Animal »
 
The following users thanked this post: golden_labels

Offline RoGeorge (Topic starter)

  • Super Contributor
  • ***
  • Posts: 6446
  • Country: ro
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #6 on: April 10, 2022, 03:35:50 pm »
Got it, different definitions.  Where I happened to work before, the "programmers" were called "software devs", while the "software developers" as you describe them were rather called "software designers", or at least that was my understanding while looking in from the HW realm.  Changed the title.
« Last Edit: April 10, 2022, 03:41:31 pm by RoGeorge »
 

Offline RoGeorge (Topic starter)

  • Super Contributor
  • ***
  • Posts: 6446
  • Country: ro
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #7 on: April 10, 2022, 04:00:12 pm »
rstofer, while admiring your analog computer in the above pic, I noticed the cover of the book sitting underneath it:  that's "Introduction to Analog Computer Programming - Dale I. Rummer", isn't it?

I know because I accidentally found that book yesterday at http://archive.computerhistory.org/resources/access/text/2017/02/102628334-05-01-acc.pdf, while looking for the other title TimFox mentioned.  ;D

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14863
  • Country: fr
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #8 on: April 10, 2022, 05:08:57 pm »
"Analog computers" are actually getting a little bit of revival, but let's still put that in perspective. ::)
For some specific computations, they can be more efficient than the digital equivalent in terms of hardware. Downside is, they are of course not "exact" by nature, but guess what, that's fine for some applications. For instance, neural networks. Some companies/labs are working on such chips:
https://rain.ai/
https://mythic.ai/

But these days, it takes "niche" applications like this and careful, silicon-level implementations to actually make sense.  Otherwise, they would never compete with digital computing, especially regarding power efficiency.  And it remains to be seen whether those approaches will actually end up successful and kinda "mainstream".

Apart from the low-level implementation (you do not usually expect software engineers to master low-level analog hardware design), the concepts should actually not be a major problem for them - ideally - since it's essentially just maths.  And software is also just applied maths.  Now it's unfortunate that many software engineers are not that good at maths anymore, and to me, that's a problem (and that may be one of the reasons why it's still hard to talk about software development as "engineering", strictly speaking).  But as the OP mentioned "programmers", then I guess it's all the more clear that the OP probably meant your basic software "coder", who only really knows about sequential imperative programming anyway, with maths skills that often do not exceed basic arithmetic by much.

Now as I said, if we're not just talking about the concepts of analog computing, but actual low-level implementation, then of course most software devs are going to be lost, just as much as they would be looking at the schematic of a simple amplifier, or even that of a logic gate at the transistor level, even while knowing what its function is.
« Last Edit: April 10, 2022, 05:10:51 pm by SiliconWizard »
 

Offline TimFox

  • Super Contributor
  • ***
  • Posts: 8057
  • Country: us
  • Retired, now restoring antique test equipment
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #9 on: April 10, 2022, 06:15:18 pm »
Before analog computers became unfashionable, and digital computers became "blazing fast", there was a common statement:
"Analog computers take a long time to program, but give their results immediately.
Digital computers are faster to program, but take time to compute."
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6514
  • Country: fi
    • My home page and email address
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #10 on: April 11, 2022, 07:21:13 am »
All designers and problem solvers should realize the power of transformations: how one can transform the problem in one domain to another, where it is much easier to solve.

It can be a deceptively difficult concept, because we do it so often naturally.  Consider things like musical instruments, say a guitar or a violin.  Many humans discover the function of the sound box as kids: how it makes sounds louder.  If we did not intuitively understand the power of transformations, we'd instead just try bigger or thicker strings to produce a louder sound, wouldn't we?

Specifically, it is too easy to limit oneself too strictly wrt. transformations, either by defining them as just a mathematical operation, or by not being able to "think outside the box".  I consider the latter just a missed opportunity to transform the problem into a more suitable domain.  (Which also means I do believe it is a skill that can be taught; one that many learn by themselves through experience and working in many different problem-solving domains.)

This is also why I firmly believe that software designers should be surprised but delighted to discover unconventional logic, including analog logic.  It may not be useful in their work, but it helps overcome the natural built-in barriers we have against unconventional problem transformations, i.e. "thinking outside the box".
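
A concrete case of "transform, solve, transform back", as a Python sketch (my example, using NumPy's FFT): direct convolution of two n-point signals costs O(n²) operations, but transformed to the frequency domain it becomes a pointwise multiplication, O(n log n) overall.

[code]
# Transform, solve, transform back: convolution via the FFT.
# Direct convolution is O(n^2); in the Fourier domain it is a
# pointwise multiplication, O(n log n) overall with the FFT.
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.5, 0.5])

n = len(a) + len(b) - 1                  # length of the full convolution
fast = np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(b, n), n)

print(np.convolve(a, b))    # direct method: [0.5 1.5 2.5 1.5]
print(np.round(fast, 6))    # via the FFT:   same numbers
[/code]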
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 4012
  • Country: gb
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #11 on: April 11, 2022, 08:45:42 am »
nano-bots (MEMS-based) will be more mechanical computers than digital computers.
More similar to Babbage's machine than to von Neumann's machine  :o :o :o
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 4012
  • Country: gb
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #12 on: April 11, 2022, 08:53:47 am »
Von Neumann, Alan Turing and Claude Shannon are the most important conceptual inventors of the stored-program digital computer.

But talking about the Hungarian-born American mathematician János Neumann (1903-1957), the pioneering gifts von Neumann gave us apply in all the directions that he influenced:
  • quantum theory
  • automata theory
  • economics
  • game theory
  • defense planning

Impressed  :o :o :o
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline DiTBho

  • Super Contributor
  • ***
  • Posts: 4012
  • Country: gb
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #13 on: April 11, 2022, 09:20:59 am »
Anyway, the main problem with human beings is that consciousness is hosted in shells of flesh and bones, organic tissue subject to inevitable metabolic stress, while *Computing* requires mathematical powers, and mathematical powers start to decline at the age of a quarter of a century, after which experience can conceal the deterioration for a time, but it's like entropy ... you can talk as long as you want, the bloody entropy will grow all the same.

So, even considering a genius mind (which is a rarity), human beings are mathematically inefficient at designing computers, because the human mind can only produce great results from the age of 10 to the age of 25, which at first looks like a great thing, but actually only 15 years of autonomy isn't a great deal, due to primary and secondary needs that massively limit productivity.

(Humans need to eat, drink, dream, work, google for things, and have holidays (and sex), because they easily get tired ... 90% of their time is busy with primary and secondary needs.)
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9907
  • Country: us
Re: Let's bamboozle the programmers: Computing Amplifiers book from 1966
« Reply #14 on: April 11, 2022, 01:24:00 pm »
[quote user=RoGeorge]rstofer, while admiring your analog computer in the above pic, I noticed the cover of the book sitting underneath it:  that's "Introduction to Analog Computer Programming - Dale I. Rummer", isn't it?

I know because I accidentally found that book yesterday at http://archive.computerhistory.org/resources/access/text/2017/02/102628334-05-01-acc.pdf, while looking for the other title TimFox mentioned.  ;D[/quote]

Yes, it is Rummer.  I'm going to grab the PDF as well.

That patch job is for two slightly different versions of damped harmonic motion.  One is the classic mass-spring-damper and the other is the swinging-door problem (like behind the counter in a restaurant).  Chapter 6 of the book...

In addition to the two Comdyna computers, I have also built Dr. Vogel's computer (see attached)
http://www.analogmuseum.org/english/homebrew/vogel/

I remember spending time on the mass-spring-damper problem in grad school, and the analytic solution wasn't much help.  It is so much easier to make adjustments to the constants while viewing the results in real time.  Of course, this predates the IBM PC by a few years as well.
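
For anyone who wants to play along without a Comdyna, here's a rough Python sketch (mine, not from Rummer's book) of the same experiment: integrate m·x'' + c·x' + k·x = 0 and tweak the constants, the way you'd turn the coefficient pots.

[code]
# Rough sketch: the mass-spring-damper m*x'' + c*x' + k*x = 0,
# integrated the same way the analog computer does it.  Edit
# m, c, k and rerun -- the digital equivalent of turning the
# coefficient pots.
m, c, k = 1.0, 0.4, 10.0   # mass, damping, spring constant (the "pots")
x, v = 1.0, 0.0            # initial displacement and velocity
dt = 0.001

for step in range(8_000):
    acc = -(c * v + k * x) / m   # equation of motion solved for x''
    v += acc * dt                # integrator 1: x'' -> x'
    x += v * dt                  # integrator 2: x'  -> x
    if step % 1000 == 0:
        print(f"t = {step * dt:4.1f}   x = {x:+.3f}")  # decaying oscillation
[/code]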
« Last Edit: April 11, 2022, 01:31:54 pm by rstofer »
 

