Future Computers Will Be Radically Different (Analog Computing)
Key Moments
Analog computers are back, offering speed and efficiency for AI tasks where digital computing hits limits.
Key Insights
Analog computers, once dominant, are resurging due to AI's demands for faster, more energy-efficient computation.
AI, particularly neural networks, heavily relies on matrix multiplication, a task analog computers excel at due to their inherent properties.
Digital computing faces physical limitations (Moore's Law) and inefficiencies (Von Neumann bottleneck) for large-scale AI.
Analog computers, while less precise and non-general purpose, offer significant power savings and speed for specific tasks like AI inference.
New analog chips repurpose components like flash memory cells to perform computations directly, offering massive performance gains.
The future of computing may involve a hybrid approach, leveraging analog for AI-specific tasks and digital for general computation.
FROM DOMINANCE TO OBSCURITY: THE HISTORY OF ANALOG COMPUTING
For centuries, analog computers were the most powerful calculating devices, instrumental in complex tasks like predicting eclipses and guiding artillery. Their operation was based on physical analogies, with voltages representing continuous physical quantities. However, the invention of transistors and the subsequent rise of digital computers, with their precision and general-purpose capabilities, led to the decline of analog technology. Today, a confluence of factors is driving a re-evaluation of analog systems.
FUNDAMENTAL ADVANTAGES OF ANALOG COMPUTATION
Analog computers excel in speed and energy efficiency. For instance, adding two numbers digitally requires numerous transistors, while an analog computer can achieve this by simply connecting wires. Similarly, multiplication, a complex digital operation, can be performed by an analog computer using a resistor and current. This makes them incredibly powerful for specific, demanding computations without consuming significant power.
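These two physical tricks can be modeled numerically. The sketch below is a hypothetical illustration, not real circuit code: Ohm's law (V = I × R) turns a resistor into a multiplier, and Kirchhoff's current law (currents entering a node sum) turns a wire junction into an adder.

```python
# Hypothetical numerical model of analog arithmetic (not real circuit code).

def analog_multiply(current_amps: float, resistance_ohms: float) -> float:
    """Ohm's law: the voltage across a resistor is the product I * R."""
    return current_amps * resistance_ohms

def analog_add(*currents_amps: float) -> float:
    """Kirchhoff's current law: currents joining at a node simply sum."""
    return sum(currents_amps)

# Multiplying 3 by 4: drive 3 A through a 4-ohm resistor, read 12 V.
print(analog_multiply(3.0, 4.0))   # 12.0
# Adding 2 + 5 + 1: join three wires carrying those currents, read 8 A.
print(analog_add(2.0, 5.0, 1.0))   # 8.0
```

The point of the sketch is that the "computation" is free: the physics of the components does the arithmetic, with no transistor switching involved.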
THE LIMITATIONS THAT DROVE DIGITAL'S ASCENDANCY
Despite their strengths, analog computers have inherent drawbacks. They are not general-purpose machines: they cannot run arbitrary software such as a word processor. Because their inputs and outputs are continuous, exactly repeating a computation is difficult, and component tolerances introduce variability on the order of 1%. These limitations in precision and repeatability were key reasons for their obsolescence as digital computing matured.
ARTIFICIAL INTELLIGENCE: THE DEMAND FOR A NEW PARADIGM
The field of artificial intelligence, particularly neural networks, has experienced explosive growth. Inspired by the human brain, these networks perform complex calculations, often involving massive matrix multiplications. Training and running these networks demand immense computational power and energy, pushing traditional digital computers to their limits. This demand has created a perfect storm for analog computing to make a comeback.
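The matrix multiplication at the heart of a neural network can be written in a few lines. The sketch below is illustrative: one layer computes each output as a weighted sum of its inputs, i.e., a matrix-vector product.

```python
# One neural-network layer is essentially a matrix-vector multiplication:
# each output neuron is a weighted sum of the inputs (illustrative sketch).

def layer_forward(weights, inputs):
    """Multiply a weight matrix (list of rows) by an input vector."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

weights = [[0.5, -1.0, 2.0],   # two output neurons, three inputs each
           [1.0,  0.0, 0.5]]
inputs = [2.0, 1.0, 4.0]
print(layer_forward(weights, inputs))  # [8.0, 4.0]
```

A large network repeats this pattern across millions of weights and many layers, which is why matrix multiplication dominates the computational cost of training and inference.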
NEURAL NETWORKS AND COMPUTATIONAL BOTTLENECKS
Modern digital computers face significant challenges in handling large-scale neural networks efficiently. The Von Neumann bottleneck, the separation of processing and memory that forces data to be shuttled back and forth between the two, consumes considerable time and energy. Furthermore, Moore's Law, which has driven transistor miniaturization for decades, is approaching its physical limits. Together, these factors highlight the inefficiency of serial processing for the parallel, data-intensive workloads AI requires.
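The bottleneck can be made concrete by counting arithmetic against data movement. The toy accounting below (an illustrative model, not a benchmark) shows that a naive n×n matrix multiply performs 2n³ arithmetic operations but also issues on the order of 2n³ memory reads, so each value fetched from memory supports only about one operation, and the processor spends much of its time waiting on memory.

```python
# Toy accounting of arithmetic vs. memory traffic for a naive n x n
# matrix multiply (illustrative model, not a benchmark).

def naive_matmul_costs(n: int):
    flops = 2 * n ** 3   # one multiply + one add per inner-loop step
    reads = 2 * n ** 3   # naive code re-fetches A[i][k] and B[k][j] each step
    return flops, reads

flops, reads = naive_matmul_costs(1024)
print(flops / reads)  # 1.0 -> roughly one arithmetic op per value fetched
```

Caches and blocking improve this ratio in practice, but the underlying fetch-then-compute structure remains, which is exactly what in-memory analog computation sidesteps.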
ANALOG CHIPS: REPURPOSING TECHNOLOGY FOR AI
Startups are developing analog chips that repurpose existing technologies, like digital flash memory cells, to perform AI computations. These cells, originally designed for storing binary data, are modified to act as variable resistors. By controlling the number of electrons in the floating gate, these cells can represent weights in a neural network. Input activations are applied as voltages, and the resulting current sums up the weighted activations, effectively performing matrix multiplication.
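The flash-cell scheme can be sketched as a crossbar model. In this hypothetical simulation, stored conductances (the reciprocal of resistance) play the role of neural-network weights, input voltages play the role of activations, and the current collected on each output wire is a dot product, since I = Σ G·V. Names and values below are illustrative.

```python
# Sketch of an analog crossbar: each flash cell acts as a programmable
# conductance G (in siemens); applying input voltages V makes each
# output wire collect a current I = sum(G * V), i.e., a dot product.

def crossbar_matmul(conductances, voltages):
    """Currents on each output wire: Ohm's law per cell, Kirchhoff per wire."""
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

G = [[0.001, 0.002],   # weights stored as cell conductances
     [0.004, 0.000]]
V = [1.0, 0.5]         # input activations applied as voltages
print(crossbar_matmul(G, V))  # output currents in amps: [0.002, 0.004]
```

The entire matrix-vector product happens in a single physical step, in place, with no data shuttled between separate memory and processor units.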
EFFICIENCY AND PERFORMANCE GAINS IN ANALOG AI
These new analog chips can achieve remarkable computational speeds, performing trillions of operations per second while consuming only a few watts of power. While digital systems can match or exceed raw operation counts, they often require significantly more power and larger hardware footprints. Analog computing offers a compelling advantage for power-constrained applications and specific AI workloads, such as those in augmented reality or smart home devices.
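The efficiency argument comes down to operations per watt. Using the figures quoted in the comparison table later in this summary (25 trillion operations per second at roughly 3 W for the analog chip, versus up to 100 trillion operations per second at 50-100 W for digital systems), a rough calculation:

```python
# Rough operations-per-watt comparison using the figures quoted in this
# summary (approximate numbers, not a controlled benchmark).
analog_ops_per_watt = 25e12 / 3        # ~8.3 trillion ops/s per watt
digital_ops_per_watt = 100e12 / 100    # 1.0 trillion ops/s per watt
print(analog_ops_per_watt / digital_ops_per_watt)  # roughly 8x
```

Even granting digital systems their best-case numbers, the analog chip delivers several times more work per watt, which is the deciding metric for battery-powered and thermally constrained devices.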
HYBRID FUTURES AND THE NATURE OF INTELLIGENCE
The future of computing likely involves a hybrid approach, combining the strengths of both analog and digital systems. Analog excels at the repetitive, high-volume computations of AI, while digital systems handle general tasks and provide necessary precision. As we strive for true artificial intelligence, mirroring human cognition, the all-at-once, continuous nature of analog computation may prove essential, suggesting that the ultimate intelligence might be a blend.
Analog vs. Digital Computing for Basic Operations
Data extracted from this episode
| Operation | Digital Computer (Transistors) | Analog Computer |
|---|---|---|
| Add two 8-bit numbers | ~50 | Connect two wires |
| Multiply two numbers | ~1,000 (switching zeros and ones) | Pass a current through a resistor (V = I × R) |
ImageNet Challenge Performance Over Time
Data extracted from this episode
| Year | Best Performer | Top-Five Error Rate |
|---|---|---|
| 2010 | Unknown | 28.2% |
| 2011 | Unknown | 25.8% |
| 2012 | AlexNet (University of Toronto) | 16.4% |
| 2015 | Unknown (100-layer neural network) | 3.6% (Better than human performance) |
Mythic AI Analog Chip Performance vs. Digital Systems
Data extracted from this episode
| Metric | Mythic AI Analog Chip | Newer Digital Systems |
|---|---|---|
| Operations per second | 25 trillion | 25 - 100 trillion |
| Power Consumption | ~3 Watts | 50-100 Watts (for comparable systems) |
Common Questions
What is an analog computer?
An analog computer uses continuous physical phenomena, such as voltage, to represent data and solve problems. Instead of zeros and ones, it uses continuously varying voltages that directly mimic the behavior of physical systems, allowing certain computations to run faster and with less power.
Topics
Mentioned in this video
Von Neumann bottleneck: A limitation of digital computers in which the separation of processing and memory units leads to delays and energy consumption during data transfer.
Analog computers: Computing devices that use continuous physical phenomena, such as voltage, to model problems, in contrast with digital computers, which use discrete zeros and ones.