Hope you had some fun tinkering with that simple analog temperature comparator from the previous part. That circuit was stripped down to the bone – a classic “how do you even do linear signal processing with op-amps” example. But the world doesn’t end at a comparator. You can build 4–20 mA → voltage converters, voltage → frequency converters, even a basic PID controller… But today let’s zero in on what used to actually be called an analog process computer.

Analog machines: algorithm written in hardware

The basic principle is dead simple: if you can write your requirements down mathematically, you can implement them with a handful of op-amps, resistors and capacitors. In other words – you’re building an algorithm out of discrete components.

Example 1: simple temperature watchdog

Temperature > 100 °C → turn on the fans. Just add a plain NPN transistor switch to the comparator output and you’ve got yourself a ready-made controller:

Changing the algorithm from the temperature comparator to a watchdog

Simple temperature watchdog diagram
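Here’s a hypothetical software model of that watchdog, just to make the logic explicit. The 100 °C setpoint comes from the article; the hysteresis band is my assumption, added because real comparator circuits use positive feedback to stop the output chattering right at the threshold:

```python
SETPOINT_C = 100.0   # trip temperature from the example above
HYSTERESIS_C = 2.0   # assumed feedback hysteresis (not from the original circuit)

def watchdog(temp_c: float, fan_on: bool) -> bool:
    """Return the new fan state, mimicking a comparator with hysteresis."""
    if temp_c > SETPOINT_C:
        return True                      # comparator trips -> NPN switch turns fan on
    if temp_c < SETPOINT_C - HYSTERESIS_C:
        return False                     # well below setpoint -> fan off
    return fan_on                        # inside the hysteresis band: hold state

fan = False
for t in (95.0, 99.5, 101.0, 99.5, 97.0):
    fan = watchdog(t, fan)
```

Note how the hysteresis means the fan started at 101 °C stays on at 99.5 °C and only drops out below 98 °C – the same trick the analog feedback resistor plays.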

Example 2: multiple sensors → single alarm output

You’ve got six temperature sensors scattered around different spots, and you want an alarm the moment any one of them crosses its limit? Take six identical comparators, slap 1N4148 diodes on the outputs and you’ve got a real-world diode-OR:

Combining multiple comparators together
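The same diode-OR, sketched in software (the per-channel setpoints below are illustrative, not from the schematic):

```python
SETPOINTS_C = [100, 100, 90, 110, 100, 95]   # one assumed limit per channel

def group_alarm(temps_c):
    """True if ANY channel exceeds its setpoint -- the diode-OR in software."""
    return any(t > sp for t, sp in zip(temps_c, SETPOINTS_C))
```

One tripped channel is enough: `group_alarm([80, 85, 91, 100, 99, 90])` fires because channel 3 reads 91 °C against a 90 °C limit, exactly like one conducting diode pulling the common alarm line.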

That’s exactly how the first serious supervisory systems were built from the 1960s through the 1980s. It was a total game-changer: the watchkeeping crew stopped running around with a pen and notepad logging values from every corner of the facility. They had one central “panel” in the control room and could see everything live, all at once.

Where’s the Processor in all this?

Exactly… there isn’t one. No von Neumann state machine, no RAM, no multitasking, no context switching, no scheduling delays. Every analog block runs its job continuously and in true parallel.

For our experiments I picked the LM393 – because odds are you’ve got one lying around in a drawer somewhere. It’s a cheap, ubiquitous chip that’s been in production since 1976. But build those same circuits on something like an AD8000 and you’d get your algorithm’s result in ~3–5 ns.

And whether you have 1 or 30 measurement channels – all of them finish at basically the same instant. Dream on about timings like that on a von Neumann machine with its memory bottlenecks.

The biggest pain: changes

When you need to tweak the algorithm – say, shift the setpoint by 2 degrees or flip “above” to “below” – you have to physically change the electronic circuit. With 3–4 blocks it’s manageable: throw in a few multi-turn pots, switches and jumpers.

With 40–50 cards – it turns into a nightmare. Scalability becomes a multi-act tragedy. Obviously a tragedy only from today’s perspective, when you usually make those changes in software while sipping your morning coffee.

A real implementation example: Autronica

I worked on several of these computers, including Autronica systems on ships.

Ship's machinery was supervised by algorithm written in op-amps and copper.

Autronica system card: applied analog signal computing

Classic setup:

  • dozens of Eurocard 100×160 mm boards
  • each card handles one independent channel (temperature, pressure, level, rpm, etc.)
  • each compares the sensor signal against a setpoint (multi-turn pots or fixed resistor dividers)
  • when something crosses the limit – an alarm LED lights up on that card, plus a group alarm trigger and a master signal to the operator.
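The rack logic above can be sketched like this – one object per Eurocard, with the backplane OR-combining the card outputs into the group alarm. Channel names and setpoints here are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Card:
    channel: str      # e.g. "intermediate shaft bearing temp" (illustrative)
    setpoint: float   # trip threshold, set by the multi-turn pot

    def led(self, reading: float) -> bool:
        """The per-card alarm LED: lit when the sensor exceeds the setpoint."""
        return reading > self.setpoint

def rack_status(cards, readings):
    """Per-card LEDs plus the backplane group alarm (OR of all cards)."""
    leds = {c.channel: c.led(r) for c, r in zip(cards, readings)}
    return leds, any(leds.values())
```

Each card stays a fully independent channel – exactly why a single failed card never took down the rest of the rack.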

Before these systems arrived the crew did scheduled “engine room rounds” and wrote down readings by hand. I remember one case during sea trials after a yard repair: the intermediate shaft bearing started seizing. It melted in roughly 3 minutes from the first temperature alarm. In the old-fashioned, rounds-based world there’s no way you catch that in time – you’d end up with catastrophic damage.

The mixed-signal era

In the 80s people started bolting small 8-bit micros (Motorola 6800, Zilog Z80) onto those analog racks. What did the processor do?

```mermaid
flowchart TD
    A[polling for active alarm line of analog computer] --> B[CPU]
    C[RTC clock] --> B
    B --> D[Alarm for operator]
    B --> E[Printing log]
    F[alarm corresponding text message] --> B
```

It polled which alarm line went active, looked up the pre-programmed message text for that channel, displayed it on a monitor and printed it to the event logger (basically a proto-syslog on paper). Pretty useful – it let you analyze history and trace the root cause when several events cascaded at once. Especially when the whole chain ended in a blackout and nothing was lit anymore.
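A rough sketch of what such an 8-bit supervisor did, in modern Python. The message table and the shape of the alarm-line input are stand-ins, not the real firmware:

```python
MESSAGES = {                      # pre-programmed text per channel (assumed)
    0: "ME LO PRESS LOW",
    1: "INTERMED SHAFT BRG TEMP HIGH",
}

def poll_once(lines, previously_active, log, clock):
    """One polling pass: log every alarm line that JUST went active,
    stamped with the RTC time. Returns the new set of active channels."""
    for ch, active in enumerate(lines):
        if active and ch not in previously_active:
            log.append(f"{clock} {MESSAGES.get(ch, f'CH {ch} ALARM')}")
    return {ch for ch, a in enumerate(lines) if a}
```

Tracking `previously_active` is what keeps a stuck alarm line from spamming the printer on every pass – only edges get logged, which is exactly what made the paper trail readable during a cascade.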

Aftermath

Analog signal processing is still very much alive today and gets used everywhere you need:

  • micro / pico-volt signals
  • noise floor < 1 nV/√Hz
  • input impedance > 10¹² Ω
  • or simply the absolute lowest propagation delay

One of the hottest topics right now is using analog computers for machine learning. Research suggests they could solve some of the biggest headaches: insane power consumption, von Neumann memory bottlenecks, latency and cost. It turns out you can build analog circuits that exploit basic electrical laws – Ohm’s and Kirchhoff’s – to perform neural-network MAC (multiply-accumulate) operations. Where’s the real power of analog computers hiding? Weights become physical quantities (conductances) stored directly inside the compute element itself!

If you enjoyed the topic of analog computers that refuse to fade into history, or you feel there’s more to analog tech you should know about and I could write something on it – just let me know!