The Unbelievable Zombie Comeback of Analog Computing
Illustration: Khyati Trehan

Charles Platt

Backchannel
Mar 30, 2023 6:00 AM



Computers have been digital for half a century. Why would anyone want to
resurrect the clunkers of yesteryear?

When old tech dies, it usually stays dead. No one expects rotary phones or
adding machines to come crawling back from oblivion. Floppy diskettes, VHS
tapes, cathode-ray tubes—they shall rest in peace. Likewise, we won’t see old
analog computers in data centers anytime soon. They were monstrous beasts:
difficult to program, expensive to maintain, and limited in accuracy.

Or so I thought. Then I came across this confounding statement:

Bringing back analog computers in much more advanced forms than their historic
ancestors will change the world of computing drastically and forever.



Seriously?

I found the prediction in the preface of a handsome illustrated book titled,
simply, Analog Computing. Reissued in 2022, it was written by the German
mathematician Bernd Ulmann—who seemed very serious indeed.



I’ve been writing about future tech since before WIRED existed and have written
six books explaining electronics. I used to develop my own software, and some of
my friends design hardware. I’d never heard anyone say anything about analog, so
why would Ulmann imagine that this very dead paradigm could be resurrected? And
with such far-reaching and permanent consequences?



I felt compelled to investigate further.

For an example of how digital has displaced analog, look at photography. In a
pre-digital camera, continuous variations in light created chemical reactions on
a piece of film, where an image appeared as a representation—an analogue—of
reality. In a modern camera, by contrast, the light variations are converted to
digital values. These are processed by the camera’s CPU before being saved as a
stream of 1s and 0s—with digital compression, if you wish.
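
The conversion step is easy to sketch in a few lines of Python. This is a generic
illustration of analog-to-digital conversion, not any particular camera's
pipeline, and the 8-bit sample depth is an assumption chosen for simplicity.

```python
# A continuously varying light level is sampled and rounded to the nearest of
# 256 discrete values: the quantization at the heart of "going digital."
import math

def light_level(t: float) -> float:
    """A stand-in for continuously varying brightness, between 0.0 and 1.0."""
    return 0.5 + 0.5 * math.sin(t)

samples = [light_level(t / 10) for t in range(8)]     # sample the analog signal
digital = [round(level * 255) for level in samples]   # quantize to 0..255

print(digital)                    # a stream of whole numbers, ready to store
print(format(digital[0], "08b"))  # each one is ultimately just 1s and 0s
```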



Engineers began using the word analog in the 1940s (shortened from analogue;
they like compression) to refer to computers that simulated real-world
conditions. But mechanical devices had been doing much the same thing for
centuries.

The Antikythera mechanism was an astonishingly complex piece of machinery used
thousands of years ago in ancient Greece. Containing at least 30 bronze gears,
it displayed the everyday movements of the moon, sun, and five planets while
also predicting solar and lunar eclipses. Because its mechanical workings
simulated real-world celestial events, it is regarded as one of the earliest
analog computers.

As the centuries passed, mechanical analog devices were fabricated for earthlier
purposes. In the 1800s, an invention called the planimeter consisted of a little
wheel, a shaft, and a linkage. You traced a pointer around the edge of a shape
on a piece of paper, and the area of the shape was displayed on a scale. The
tool became an indispensable item in real-estate offices when buyers wanted to
know the acreage of an irregularly shaped piece of land.

Other gadgets served military needs. If you were on a battleship trying to aim a
16-inch gun at a target beyond the horizon, you needed to assess the orientation
of your ship, its motion, its position, and the direction and speed of the wind;
clever mechanical components allowed the operator to input these factors and
adjust the gun appropriately. Gears, linkages, pulleys, and levers could also
predict tides or calculate distances on a map.


In the 1940s, electronic components such as vacuum tubes and resistors were
added, because a fluctuating current flowing through them could be analogous to
the behavior of fluids, gases, and other phenomena in the physical world. A
varying voltage could represent the velocity of a Nazi V2 missile fired at
London, for example, or the orientation of a Gemini space capsule in a 1963
flight simulator.

But by then, analog had become a dying art. Instead of using a voltage to
represent the velocity of a missile and electrical resistance to represent the
air resistance slowing it down, a digital computer could convert variables to
binary code—streams of 1s and 0s that were suitable for processing. Early
digital computers were massive mainframes full of vacuum tubes, but then
integrated circuit chips made digital processing cheaper, more reliable, and
more versatile. By the 1970s, the analog-digital difference could be summarized
like this:

[Chart: the analog-digital difference, circa the 1970s]

The last factor was a big deal, as the accuracy of analog computers was always
limited by their components. Whether you used gear wheels or vacuum tubes or
chemical film, precision was limited by manufacturing tolerances and
deteriorated with age. Analog was always modeled on the real world, and the
world was never absolutely precise.



When I was a nerdy British schoolboy with a mild case of OCD, inaccuracy
bothered me a lot. I revered Pythagoras, who told me that a triangle with sides
of 3 centimeters and 4 centimeters adjacent to a 90-degree angle would have a
diagonal side of 5 centimeters, precisely. Alas, my pleasure diminished when I
realized that his proof only applied in a theoretical realm where lines were of
zero thickness.

In my everyday realm, precision was limited by my ability to sharpen a pencil,
and when I tried to make measurements, I ran into another bothersome feature of
reality. Using a magnifying glass, I compared the ruler that I’d bought at a
stationery store with a ruler in our school’s physics lab, and discovered that
they were not exactly the same length.

How could this be? Seeking enlightenment, I checked the history of the metric
system. The meter was the fundamental unit, but it had been birthed from a
bizarre combination of nationalism and whimsy. After the French Revolution, the
new government instituted the meter to get away from the imprecision of the
ancien régime. The French Academy of Sciences defined it as the longitudinal
distance from the equator, through Paris, to the North Pole, divided by 10
million. In 1799, the meter was solemnified like a religious totem in the form
of a platinum bar at the French National Archives. Copies were made and
distributed across Europe and to the Americas, and then copies were made of the
copies’ copies. This process introduced transcription errors, which eventually
led to my traumatic discovery that rulers from different sources might be
visibly unequal.

Similar problems impeded any definitive measurement of time, temperature, and
mass. The conclusion was inescapable to my adolescent mind: If you were hoping
for absolute precision in the physical realm, you couldn’t have it.


My personal term for the inexact nature of the messy, fuzzy world was muzzy. But
then, in 1980, I acquired an Ohio Scientific desktop computer and found prompt,
lasting relief. All its operations were built on a foundation of binary
arithmetic, in which a 1 was always exactly a 1 and a 0 was a genuine 0, with no
fractional quibbling. The 1 of existence, and the 0 of nothingness! I fell in
love with the purity of digital and learned to write code, which became a
lifelong refuge from muzzy math.

Of course, digital values still had to be stored in fallible physical
components, but margins of error took care of that. In a modern 5-volt digital
chip, 1.5 volts or lower would represent the number 0 while 3.5 volts or greater
would represent the number 1. Components on a decently engineered motherboard
would stay within those limits, so there shouldn’t have been any
misunderstandings.
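
A few lines of Python make the point about margins of error concrete. The
1.5-volt and 3.5-volt thresholds are the ones from the example above; the
function itself is only an illustration of the idea, not any real chip's
specification.

```python
def logic_level(voltage):
    """Map a measured voltage on a 5-volt chip to a logic level."""
    if voltage <= 1.5:
        return 0      # anything at or below 1.5 V reads as a clean 0
    if voltage >= 3.5:
        return 1      # anything at or above 3.5 V reads as a clean 1
    return None       # the forbidden zone; a decent motherboard stays out of it

# A slightly drifted "0" and a slightly sagging "1" still decode correctly.
print(logic_level(0.4))   # -> 0
print(logic_level(4.2))   # -> 1
print(logic_level(2.5))   # -> None (out of spec)
```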

Consequently, when Bernd Ulmann predicted that analog computers were due for a
zombie comeback, I wasn’t just skeptical. I found the idea a bit … disturbing.

Hoping for a reality check, I consulted Lyle Bickley, a founding member of the
Computer History Museum in Mountain View, California. Having served for years as
an expert witness in patent suits, Bickley maintains an encyclopedic knowledge
of everything that has been done and is still being done in data processing.



“A lot of Silicon Valley companies have secret projects doing analog chips,” he
told me.

Really? But why?



“Because they take so little power.”

Bickley explained that when, say, brute-force natural-language AI systems
distill millions of words from the internet, the process is insanely power
hungry. The human brain runs on a small amount of electricity, he said, about 20
watts. (That’s the same as a light bulb.) “Yet if we try to do the same thing
with digital computers, it takes megawatts.” For that kind of application,
digital is “not going to work. It’s not a smart way to do it.”

Bickley said he would be violating confidentiality to tell me specifics, so I
went looking for startups. Quickly I found a San Francisco Bay Area company
called Mythic, which claimed to be marketing the “industry-first AI analog
matrix processor.”

Mike Henry cofounded Mythic at the University of Michigan in 2013. He’s an
energetic guy with a neat haircut and a well-ironed shirt, like an old-time IBM
salesman. He expanded on Bickley’s point, citing the brain-like neural network
that powers GPT-3. “It has 175 billion synapses,” Henry said, comparing
processing elements with connections between neurons in the brain. “So every
time you run that model to do one thing, you have to load 175 billion values.
Very large data-center systems can barely keep up.”
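
Henry's point is easy to check on the back of an envelope. The parameter count
is the one he cites; the 16 bits of precision per weight is an assumption made
for illustration.

```python
# Rough data-movement arithmetic for a GPT-3-class model.
params = 175_000_000_000       # weights ("synapses") in the model Henry describes
bytes_per_weight = 2           # assume 16-bit values; real deployments vary

total_gigabytes = params * bytes_per_weight / 1e9
print(f"{total_gigabytes:.0f} GB of weights")   # roughly 350 GB

# Every one of those bytes has to be fetched from memory each time the model
# runs, which is why even very large data-center systems can barely keep up.
```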

That’s because, Henry said, they are digital. Modern AI systems use a type of
memory called static RAM, or SRAM, which requires constant power to store data.
Its circuitry must remain switched on even when it’s not performing a task.
Engineers have done a lot to improve the efficiency of SRAM, but there’s a
limit. “Tricks like lowering the supply voltage are running out,” Henry said.


Mythic’s analog chip uses less power by storing neural weights not in SRAM but
in flash memory, which doesn’t consume power to retain its state. And the flash
memory is embedded in a processing chip, a configuration Mythic calls
“compute-in-memory.” Instead of consuming a lot of power moving millions of
bytes back and forth between memory and a CPU (as a digital computer does), some
processing is done locally.
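
Here is a toy numerical sketch of the general principle behind compute-in-memory:
weights are stored as conductances, inputs arrive as voltages, and Ohm's law plus
Kirchhoff's current law perform the multiply-accumulate physically. This is the
textbook idea, not Mythic's actual circuit, and the numbers are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

weights = rng.uniform(0, 1, size=(4, 8))   # conductances held in flash cells
inputs = rng.uniform(0, 1, size=8)         # activations applied as voltages

# Each output line simply sums the currents I = G * V flowing into it; the
# analog array does this in place, with no weights shuttled to a CPU.
currents = weights @ inputs
print(currents)
```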



What bothered me was that Mythic seemed to be reintroducing the accuracy
problems of analog. The flash memory was not storing a 1 or 0 with comfortable
margins of error, like old-school logic chips. It was holding intermediate
voltages (as many as 256 of them!) to simulate the varying states of neurons in
the brain, and I had to wonder whether those voltages would drift over time.
Henry didn’t seem to think they would.

I had another problem with his chip: The way it worked was hard to explain.
Henry laughed. “Welcome to my life,” he said. “Try explaining it to venture
capitalists.” Mythic’s success on that front has been variable: Shortly after I
spoke to Henry, the company ran out of cash. (More recently it raised $13
million in new funding and appointed a new CEO.)

I next went to IBM. Its corporate PR department connected me with Vijay
Narayanan, a researcher in the company’s physics-of-AI department. He preferred
to interact via company-sanctioned email statements.

For the moment, Narayanan wrote, “our analog research is about customizing AI
hardware, particularly for energy efficiency.” So, the same goal as Mythic.
However, Narayanan seemed rather circumspect on the details, so I did some more
reading and found an IBM paper that referred to “no appreciable accuracy loss”
in its memory systems. No appreciable loss? Did that mean there was some loss?
Then there was the durability issue. Another paper mentioned “an accuracy above
93.5 percent retained over a one-day period.” So it had lost 6.5 percent in just
one day? Was that bad? What should it be compared to?

So many unanswered questions, but the biggest letdown was this: Both Mythic and
IBM seemed interested in analog computing only insofar as specific analog
processes could reduce the energy and storage requirements of AI—not perform the
fundamental bit-based calculations. (The digital components would still do
that.) As far as I could tell, this wasn’t anything close to the second coming
of analog as predicted by Ulmann. The computers of yesteryear may have been
room-sized behemoths, but they could simulate everything from liquid flowing
through a pipe to nuclear reactions. Their applications shared one attribute.
They were dynamic. They involved the concept of change.



Engineers began using the word analog in the 1940s to refer to computers that
simulated real-world conditions.

Illustration: Khyati Trehan

Another childhood conundrum: If I held a ball and dropped it, the force of
gravity made it move at an increasing speed. How could you figure out the total
distance the ball traveled if the speed was changing continuously over time? You
could break its journey down into seconds or milliseconds or microseconds, work
out the speed at each step, and add up the distances. But if time actually
flowed in tiny steps, the speed would have to jump instantaneously between one
step and the next. How could that be true?

Later I learned that these questions had been addressed by Isaac Newton and
Gottfried Leibniz centuries ago. They’d said that velocity does change in
increments, but the increments are infinitely small.

So there were steps, but they weren’t really steps? It sounded like an evasion
to me, but on this iffy premise, Newton and Leibniz developed calculus, enabling
everyone to calculate the behavior of countless naturally changing aspects of
the world. Calculus is a way of mathematically modeling something that’s
continuously changing, like the distance traversed by a falling ball, as a
sequence of infinitely small differences: a differential equation.
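
For the idealized falling ball (ignoring air resistance, with g for the
gravitational acceleration), the differential equation and the distance it
yields look like this:

```latex
\frac{dv}{dt} = g, \qquad
v(t) = g\,t, \qquad
s(t) = \int_0^t v(\tau)\,d\tau = \tfrac{1}{2}\,g\,t^{2}
```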



That math could be used as the input to old-school analog electronic
computers—often called, for this reason, differential analyzers. You could plug
components together to represent operations in an equation, set some values
using potentiometers, and the answer could be shown almost immediately as a
trace on an oscilloscope screen. It might not have been ideally accurate, but in
the muzzy world, as I had learned to my discontent, nothing was ideally
accurate.

To be competitive, a true analog computer that could emulate such versatile
behavior would have to be suitable for low-cost mass production—on the scale of
a silicon chip. Had such a thing been developed? I went back to Ulmann’s book
and found the answer on the penultimate page. A researcher named Glenn Cowan had
created a genuine VLSI (very large-scale integrated circuit) analog chip back in
2003. Ulmann complained that it was “limited in capabilities,” but it sounded
like the real deal.

Glenn Cowan is a studious, methodical, amiable man and a professor in electrical
engineering at Montreal’s Concordia University. As a grad student at Columbia
back in 1999, he had a choice between two research topics: One would entail
optimizing a single transistor, while the other would be to develop an entirely
new analog computer. The latter was the pet project of an adviser named Yannis
Tsividis. “Yannis sort of convinced me,” Cowan told me, sounding as if he wasn’t
quite sure how it happened.

Initially, there were no specifications, because no one had ever built an analog
computer on a chip. Cowan didn’t know how accurate it could be and was basically
making it up as he went along. He had to take other courses at Columbia to fill
the gaps in his knowledge. Two years later, he had a test chip that, he told me
modestly, was “full of graduate-student naivete. It looked like a breadboarding
nightmare.” Still, it worked, so he decided to stick around and make a better
version. That took another two years.


A key innovation of Cowan’s was making the chip reconfigurable—or programmable.
Old-school analog computers had used clunky patch cords on plug boards. Cowan
did the same thing in miniature, between areas on the chip itself, using a
preexisting technology known as transmission gates. These can work as
solid-state switches to connect the output from processing block A to the input
of block B, or block C, or any other block you choose.
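
A minimal sketch of that reconfigurability, with a routing table standing in for
the on-chip transmission gates. The block names and operations here are invented
for illustration; they are not Cowan's actual block set.

```python
blocks = {
    "A": lambda x: 2.0 * x,    # imagine a gain stage
    "B": lambda x: x + 1.0,    # imagine an offset stage
    "C": lambda x: -x,         # imagine an inverter
}

# "Programming" the chip means choosing which output feeds which input.
routing = ["A", "C", "B"]      # A's output into C, C's output into B

def run(signal):
    for name in routing:
        signal = blocks[name](signal)
    return signal

print(run(1.5))   # same hardware, different behavior for each routing
```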

His second innovation was to make his analog chip compatible with an
off-the-shelf digital computer, which could help to circumvent limits on
precision. “You could get an approximate analog solution as a starting point,”
Cowan explained, “and feed that into the digital computer as a guess, because
iterative routines converge faster from a good guess.” The end result of his
great labor was etched onto a silicon wafer measuring a very respectable 10
millimeters by 10 millimeters. “Remarkably,” he told me, “it did work.”
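
The second idea is easy to demonstrate with any iterative routine. In the sketch
below, Newton's method for a square root stands in for the digital solver, and a
near-correct starting value stands in for the approximate analog answer; both
are illustrative choices, not the chip's actual workload.

```python
def newton_sqrt(target, guess, tol=1e-12):
    """Return (root, iterations) for x*x = target, starting from `guess`."""
    x, steps = guess, 0
    while abs(x * x - target) > tol:
        x = 0.5 * (x + target / x)   # standard Newton update for square roots
        steps += 1
    return x, steps

print(newton_sqrt(2.0, guess=100.0))   # a bad guess takes many iterations
print(newton_sqrt(2.0, guess=1.44))    # a near-correct guess takes only a few
```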

When I asked Cowan about real-world uses, inevitably he mentioned AI. But I’d
had some time to think about neural nets and was beginning to feel skeptical. In
a standard neural net setup, known as a crossbar configuration, each cell in the
net connects with four other cells. They may be layered to allow for extra
connections, but even so, they’re far less complex than the frontal cortex of
the brain, in which each individual neuron can be connected with 10,000 others.
Moreover, the brain is not a static network. During the first year of life, new
neural connections form at a rate of 1 million per second. I saw no way for a
neural network to emulate processes like that.

Glenn Cowan’s second analog chip wasn’t the end of the story at Columbia.
Additional refinements were necessary, but Yannis Tsividis had to wait for
another graduate student who would continue the work.



In 2011 a soft-spoken young man named Ning Guo turned out to be willing. Like
Cowan, he had never designed a chip before. “I found it, um, pretty
challenging,” he told me. He laughed at the memory and shook his head. “We were
too optimistic,” he recalled ruefully. He laughed again. “Like we thought we
could get it done by the summer.”

In fact, it took more than a year to complete the chip design. Guo said Tsividis
had required a “90 percent confidence level” that the chip would work before he
would proceed with the expensive process of fabrication. Guo took a chance,
and he named the result the HCDC, meaning hybrid continuous discrete computer.
Guo’s prototype was then incorporated on a board that could interface with an
off-the-shelf digital computer. From the outside, it looked like an accessory
circuit board for a PC.

When I asked Guo about possible applications, he had to think for a bit. Instead
of mentioning AI, he suggested tasks such as simulating a lot of moving
mechanical joints that would be rigidly connected to each other in robotics.
Then, unlike many engineers, he allowed himself to speculate.


There are diminishing returns on the digital model, he said, yet it still
dominates the industry. “If we applied as many people and as much money to the
analog domain, I think we could have some kind of analog coprocessing happening
to accelerate the existing algorithms. Digital computers are very good at
scalability. Analog is very good at complex interactions between variables. In
the future, we may combine these advantages.”

The HCDC was fully functional, but it had a problem: It was not easy to use.
Fortuitously, a talented programmer at MIT named Sara Achour read about the
project and saw it as an ideal target for her skills. She was a specialist in
compilers—programs that convert a high-level programming language into machine
language—and could add a more user-friendly front end in Python to help people
program the chip. She reached out to Tsividis, and he sent her one of the few
precious boards that had been fabricated.

When I spoke with Achour, she was entertaining and engaging, delivering
terminology at a manic pace. She told me she had originally intended to be a
doctor but switched to computer science after having pursued programming as a
hobby since middle school. “I had specialized in math modeling of biological
systems,” she said. “We did macroscopic modeling of gene protein hormonal
dynamics.” Seeing my blank look, she added: “We were trying to predict things
like hormonal changes when you inject someone with a particular drug.”

Changes was the key word. She was fully acquainted with the math to describe
change, and after two years she finished her compiler for the analog chip. “I
didn’t build, like, an entry-level product,” she said. “But I made it easier to
find resilient implementations of the computation you want to run. You see, even
the people who design this type of hardware have difficulty programming it. It’s
still extremely painful.”

I liked the idea of a former medical student alleviating the pain of chip
designers who had difficulty using their own hardware. But what was her take on
applications? Are there any?



“Yes, whenever you’re sensing the environment,” she said. “And reconfigurability
lets you reuse the same piece of hardware for multiple computations. So I don’t
think this is going to be relegated to a niche model. Analog computation makes a
lot of sense when you’re interfacing with something that is inherently analog.”
Like the real world, with all its muzziness.

Going back to the concept of dropping a ball, and my interest in finding out how
far it travels during a period of time: Calculus solves that problem easily,
with a differential equation—if you ignore air resistance. The proper term for
this is “integrating velocity with respect to time.”

But what if you don’t ignore air resistance? The faster the ball falls, the more
air resistance it encounters. But gravity remains constant, so the ball’s speed
doesn’t increase at a steady rate but tails off until it reaches terminal
velocity. You can express this in a differential equation too, but it adds
another layer of complexity. I won’t get into the mathematical notation (I
prefer to avoid the pain of it, to use Sara Achour’s memorable term), because
the take-home message is all that matters. Every time you introduce another
factor, the scenario gets more complicated. If there’s a crosswind, or the ball
collides with other balls, or it falls down a hole to the center of the Earth,
where gravity is zero—the situation can get discouragingly complicated.


Now suppose you want to simulate the scenario using a digital computer. It’ll
need a lot of data points to generate a smooth curve, and it’ll have to
continually recalculate all the values for each point. Those calculations will
add up, especially if multiple objects become involved. If you have billions of
objects—as in a nuclear chain reaction, or synapse states in an AI engine—you’ll
need a digital processor containing maybe 100 billion transistors to crunch the
data at billions of cycles per second. And in each cycle, the switching
operation of each transistor will generate heat. Waste heat becomes a serious
issue.
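
Here is a minimal sketch of that brute-force digital approach, applied to the
falling ball with air resistance. The drag model (a force proportional to the
square of the velocity) and the constants are assumptions chosen for
illustration.

```python
g = 9.81      # gravity, m/s^2
k = 0.05      # drag coefficient per unit mass, 1/m (assumed)
dt = 0.001    # time step: smaller steps give a smoother curve but more work

v, t = 0.0, 0.0
while t < 10.0:
    a = g - k * v * v     # acceleration shrinks as drag grows
    v += a * dt           # recompute the velocity at every single step
    t += dt

print(f"speed after 10 s: {v:.2f} m/s")
print(f"terminal velocity: {(g / k) ** 0.5:.2f} m/s")  # where drag balances gravity
```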

Using a new-age analog chip, you just express all the factors in a differential
equation and type it into Achour’s compiler, which converts the equation into
machine language that the chip understands. The brute force of binary code is
minimized, and so is the power consumption and the heat. The HCDC is like an
efficient little helper residing secretly amid the modern hardware, and it’s
chip-sized, unlike the room-sized behemoths of yesteryear.

Now I should update the basic analog attributes:

[Updated chart: the analog attributes, revisited]

You can see how the designs by Tsividis and his grad students have addressed the
historic disadvantages in my previous list. And yet, despite all this,
Tsividis—the prophet of modern analog computing—still has difficulty getting
people to take him seriously.

Born in Greece in 1946, Tsividis developed an early dislike for geography,
history, and chemistry. “I felt as if there were more facts to memorize than I
had synapses in my brain,” he told me. He loved math and physics but ran into a
different problem when a teacher assured him that the perimeter of any circle
was three times the diameter plus 14 centimeters. Of course, it should be
(approximately) 3.14 times the diameter of the circle, but when Tsividis said
so, the teacher told him to be quiet. This, he has said, “suggested rather
strongly that authority figures are not always right.”



He taught himself English, started learning electronics, designed and built
devices like radio transmitters, and eventually fled from the Greek college
system that had compelled him to learn organic chemistry. In 1972 he began
graduate studies in the United States, and over the years became known for
challenging orthodoxy in the field of computer science. One well-known circuit
designer referred to him as “the analog MOS freak,” after he designed and
fabricated an amplifier chip in 1975 using metal-oxide semiconductor technology,
which absolutely no one believed was suitable for the task.

These days, Tsividis is polite and down to earth, with no interest in wasting
words. His attempt to bring back analog in the form of integrated chips began in
earnest in the late ’90s. When I talked to him, he told me he had 18 boards with
analog chips mounted on them, a couple more having been loaned out to
researchers such as Achour. “But the project is on hold now,” he said, “because
the funding ended from the National Science Foundation. And then we had two
years of Covid.”


I asked what he would do if he got new funding.

“I would need to know, if you put together many chips to model a large system,
then what happens? So we will try to put together many of those chips and
eventually, with the help of silicon foundries, make a large computer on a
single chip.”

I pointed out that development so far has already taken almost 20 years.

“Yes, but there were several years of breaks in between. Whenever there is
appropriate funding, I revive the process.”

I asked him whether the state of analog computing today could be compared to
that of quantum computing 25 years ago. Could it follow a similar path of
development, from fringe consideration to common (and well-funded) acceptance?

It would take a fraction of the time, he said. “We have our experimental
results. It has proven itself. If there is a group that wants to make it
user-friendly, within a year we could have it.” And at this point he is willing
to provide analog computer boards to interested researchers, who can use them
with Achour’s compiler.



What sort of people would qualify?

“The background you need is not just computers. You really need the math
background to know what differential equations are.”

I asked him whether he felt that his idea was, in a way, obvious. Why hadn’t it
resonated yet with more people?

“People do wonder why we are doing this when everything is digital. They say
digital is the future, digital is the future—and of course it’s the future. But
the physical world is analog, and in between you have a big interface. That’s
where this fits.”



In a digital processor crunching data at billions of cycles per second, the
switching operation of each transistor generates heat.

Illustration: Khyati Trehan

When Tsividis mentioned offhandedly that people applying analog computation
would need an appropriate math background, I started to wonder. Developing
algorithms for digital computers can be a strenuous mental exercise, but
calculus is seldom required. When I mentioned this to Achour, she laughed and
said that when she submits papers to reviewers, “Some of them say they haven’t
seen differential equations in years. Some of them have never seen differential
equations.”


And no doubt a lot of them won’t want to. But financial incentives have a way of
overcoming resistance to change. Imagine a future where software engineers can
command an extra $100K per annum by adding a new bullet point to a résumé:
“Fluent in differential equations.” If that happens, I’m thinking Python
developers will soon be signing up for remedial online calculus classes.

Likewise, in business, the determining factor will be financial. There’s going
to be a lot of money in AI—and in smarter drug molecules, and in agile robots,
and in a dozen other applications that model the muzzy complexity of the
physical world. If power consumption and heat dissipation become really
expensive problems, and shunting some of the digital load into miniaturized
analog coprocessors is significantly cheaper, then no one will care that analog
computation used to be done by your math-genius grandfather using a big steel
box full of vacuum tubes.

Reality really is imprecise, no matter how much I would prefer otherwise, and
when you want to model it with truly exquisite fidelity, digitizing it may not
be the most sensible method. Therefore, I must conclude:



Analog is dead.

Long live analog.

--------------------------------------------------------------------------------

This article appears in the May issue. Subscribe now.

Let us know what you think about this article. Submit a letter to the editor at
mail@wired.com.





Charles Platt is a contributing editor at Make and the author of Make:
Electronics, among other books.