Future Computers Will Be Radically Different (Analog Computing)

Veritasium
Education · 3 min read · 22 min video
Mar 1, 2022 · 12,836,253 views
TL;DR

Analog computers are back, offering speed and efficiency for AI tasks where digital computing hits limits.

Key Insights

1. Analog computers, once dominant, are resurging due to AI's demands for faster, more energy-efficient computation.

2. AI, particularly neural networks, relies heavily on matrix multiplication, a task analog computers excel at due to their inherent physical properties.

3. Digital computing faces physical limitations (the end of Moore's Law) and inefficiencies (the Von Neumann bottleneck) for large-scale AI.

4. Analog computers, while less precise and not general-purpose, offer significant power savings and speed for specific tasks like AI inference.

5. New analog chips repurpose components like flash memory cells to perform computations directly, offering massive performance gains.

6. The future of computing may involve a hybrid approach, leveraging analog for AI-specific tasks and digital for general computation.

FROM DOMINANCE TO OBSCURITY: THE HISTORY OF ANALOG COMPUTING

For centuries, analog computers were the most powerful calculating devices, instrumental in complex tasks like predicting eclipses and guiding artillery. Their operation was based on physical analogies, with voltages representing continuous physical quantities. However, the invention of transistors and the subsequent rise of digital computers, with their precision and general-purpose capabilities, led to the decline of analog technology. Today, a confluence of factors is driving a re-evaluation of analog systems.

FUNDAMENTAL ADVANTAGES OF ANALOG COMPUTATION

Analog computers excel in speed and energy efficiency. For instance, adding two numbers digitally requires numerous transistors, while an analog computer can achieve this by simply connecting wires. Similarly, multiplication, a complex digital operation, can be performed by an analog computer using a resistor and current. This makes them incredibly powerful for specific, demanding computations without consuming significant power.
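A rough software model of these two analog primitives (illustrative only, not a circuit simulation): addition follows Kirchhoff's current law, where currents meeting at a node simply sum, and multiplication follows Ohm's law, where the voltage across a resistor is V = I × R.

```python
# Illustrative model of the two analog primitives described above.

def analog_add(currents):
    """Kirchhoff's current law: currents (in amps) meeting at a
    node sum with no logic gates involved."""
    return sum(currents)

def analog_multiply(current, resistance):
    """Ohm's law: V = I * R, so a resistor effectively multiplies
    a current (amps) by its resistance (ohms)."""
    return current * resistance

print(analog_add([0.002, 0.003]))    # two currents sum to ~5 mA
print(analog_multiply(0.002, 1000))  # ~2 V across a 1 kOhm resistor
```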

THE LIMITATIONS THAT DROVE DIGITAL'S ASCENDANCY

Despite their strengths, analog computers have inherent drawbacks. They are not general-purpose machines: you cannot run a word processor on one. Because their inputs and outputs are continuous, exact repetition is difficult; results are inexact and vary from run to run, typically by around 1%, due to component tolerances. These limits on precision and repeatability were key reasons for their obsolescence as digital computing matured.
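The repeatability problem can be illustrated by simulating a multiply whose "resistor" deviates from its nominal value by up to roughly 1% (a hypothetical model with made-up values, not measured data):

```python
import random

# Each "resistor" is off from its nominal value by up to ~1%
# (component tolerance), so the same analog multiplication gives
# slightly different answers on every run.

def noisy_multiply(current, nominal_resistance, tolerance=0.01):
    actual_r = nominal_resistance * (1 + random.uniform(-tolerance, tolerance))
    return current * actual_r

random.seed(0)
results = [noisy_multiply(0.002, 1000) for _ in range(5)]
print(results)  # all near 2 V, but no two runs agree exactly
```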

ARTIFICIAL INTELLIGENCE: THE DEMAND FOR A NEW PARADIGM

The field of artificial intelligence, particularly neural networks, has experienced explosive growth. Inspired by the human brain, these networks perform complex calculations, often involving massive matrix multiplications. Training and running these networks demand immense computational power and energy, pushing traditional digital computers to their limits. This demand has created a perfect storm for analog computing to make a comeback.
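Why matrix multiplication dominates can be seen in a minimal sketch of one neural-network layer (an illustration, not any particular framework's API): each output neuron is a weighted sum of the inputs, followed by a nonlinearity.

```python
# A single dense layer: a matrix-vector product plus a nonlinearity.
# This operation dominates both training and inference.

def dense_layer(weights, activations):
    """weights: list of rows (one per output neuron);
    activations: input vector. Returns ReLU(W . x)."""
    out = []
    for row in weights:
        s = sum(w * x for w, x in zip(row, activations))
        out.append(max(0.0, s))  # ReLU activation
    return out

W = [[0.5, -1.0], [2.0, 0.25]]
x = [1.0, 2.0]
print(dense_layer(W, x))  # -> [0.0, 2.5]
```

Stacking many such layers, each with millions of weights, is what makes modern networks so computationally demanding.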

COMPUTATIONAL BOTTLENECKS: WHY DIGITAL STRUGGLES WITH NEURAL NETWORKS

Modern digital computers struggle to handle large-scale neural networks efficiently. In the Von Neumann architecture, memory and processing are separate, so every weight and activation must be fetched across a bus, which consumes considerable time and energy (the Von Neumann bottleneck). Meanwhile, Moore's Law, which has driven transistor miniaturization for decades, is approaching its physical limits. Together these factors make serial digital processing a poor fit for the parallel, data-intensive workloads of AI.
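A toy accounting of the bottleneck, under the simplifying assumption that each weight crosses the memory bus once per multiply-accumulate (real caches complicate this, but the proportionality holds for large networks):

```python
# In a matrix-vector product, the number of weight fetches grows
# in lockstep with the number of multiply-accumulate operations:
# data movement scales with compute, and the bus becomes the limit.

def vonneumann_cost(rows, cols):
    weight_fetches = rows * cols  # each weight read from memory once
    mac_ops = rows * cols         # one multiply-accumulate per weight
    return weight_fetches, mac_ops

fetches, macs = vonneumann_cost(1024, 1024)
print(fetches, macs)  # fetch count equals op count
```

In-memory analog computation sidesteps this by performing the multiply where the weight is stored, so the weights never move.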

ANALOG CHIPS: REPURPOSING TECHNOLOGY FOR AI

Startups are developing analog chips that repurpose existing technologies, like digital flash memory cells, to perform AI computations. These cells, originally designed for storing binary data, are modified to act as variable resistors. By controlling the number of electrons in the floating gate, these cells can represent weights in a neural network. Input activations are applied as voltages, and the resulting current sums up the weighted activations, effectively performing matrix multiplication.
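The scheme described above can be sketched numerically (illustrative values, not any vendor's actual design): each cell's stored weight is a conductance G, each input activation is a voltage V, Ohm's law gives each cell's current I = G × V, and Kirchhoff's law sums the currents along each output wire.

```python
# Simulating an analog crossbar: a full matrix-vector multiply
# happens in one physical step as currents sum on each output wire.

def crossbar_matvec(conductances, voltages):
    """conductances[i][j]: conductance (siemens) of the cell joining
    output row i to input column j; voltages: input activations (V).
    Returns one summed current (amps) per output row."""
    return [sum(g * v for g, v in zip(row, voltages))
            for row in conductances]

G = [[1e-6, 2e-6],
     [3e-6, 0.5e-6]]  # weights encoded as conductances
V = [0.8, 0.2]        # input activations as voltages
print(crossbar_matvec(G, V))  # output currents, one per row wire
```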

EFFICIENCY AND PERFORMANCE GAINS IN ANALOG AI

These new analog chips can achieve remarkable computational speeds, performing trillions of operations per second while consuming only a few watts of power. While digital systems can match or exceed raw operation counts, they often require significantly more power and larger hardware footprints. Analog computing offers a compelling advantage for power-constrained applications and specific AI workloads, such as those in augmented reality or smart home devices.
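A back-of-envelope comparison using the figures cited in this summary (25 trillion ops/s at ~3 W for the analog chip versus up to 100 trillion ops/s at ~100 W for digital systems; illustrative arithmetic only):

```python
# Operations per second per watt, from the cited figures.
mythic_ops, mythic_watts = 25e12, 3.0
digital_ops, digital_watts = 100e12, 100.0

mythic_eff = mythic_ops / mythic_watts     # ops/s per watt, analog
digital_eff = digital_ops / digital_watts  # ops/s per watt, digital

print(mythic_eff / digital_eff)  # analog advantage in ops per watt
```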

HYBRID FUTURES AND THE NATURE OF INTELLIGENCE

The future of computing likely involves a hybrid approach, combining the strengths of both analog and digital systems. Analog excels at the repetitive, high-volume computations of AI, while digital systems handle general tasks and provide necessary precision. As we strive for true artificial intelligence, mirroring human cognition, the all-at-once, continuous nature of analog computation may prove essential, suggesting that the ultimate intelligence might be a blend.

Analog vs. Digital Computing for Basic Operations

Data extracted from this episode

| Operation | Digital Computer (Transistors) | Analog Computer |
| --- | --- | --- |
| Add two 8-bit numbers | ~50 transistors | Connect two wires |
| Multiply two numbers | ~1,000 transistors (switching zeros and ones) | Pass current through a resistor (voltage across the resistor: V = I × R) |

ImageNet Challenge Performance Over Time

Data extracted from this episode

| Year | Best Performer | Top-Five Error Rate |
| --- | --- | --- |
| 2010 | Unknown | 28.2% |
| 2011 | Unknown | 25.8% |
| 2012 | AlexNet (University of Toronto) | 16.4% |
| 2015 | Unknown (100-layer neural network) | 3.6% (better than human performance) |

Mythic AI Analog Chip Performance vs. Digital Systems

Data extracted from this episode

| Metric | Mythic AI Analog Chip | Newer Digital Systems |
| --- | --- | --- |
| Operations per second | 25 trillion | 25–100 trillion |
| Power consumption | ~3 watts | 50–100 watts (for comparable systems) |

Common Questions

What is an analog computer?

An analog computer uses continuous physical phenomena, such as voltage, to represent data and solve problems. Instead of zeros and ones, it uses continuously varying voltages that directly mimic the behavior of physical systems, allowing for faster computation with less power.
