Key Moments

Scott Aaronson on Computational Complexity Theory and Quantum Computers

Y Combinator · Science & Technology · 75 min video · 8 min read · Jun 29, 2018
TL;DR

With 100 qubits a quantum computer can hold 2^100 states in superposition, but measuring it yields a random outcome unless interference is precisely choreographed, which is what makes genuine speed advantages hard to achieve.

Key Insights

1

Quantum computers can explore 2^100 states simultaneously with 100 qubits, but an equal superposition of all states results in a random answer upon measurement if no other steps are taken.

2

The entire hope for quantum computing speed advantage lies in choreographing interference patterns to cancel wrong answers while reinforcing the right one, a task complicated by not knowing the correct answer beforehand.

3

Quantum computing is fundamentally science, testing quantum mechanics in a new regime of universal computation and error correction, rather than just a technology aiming for immediate business applications.

4

Quantum error correction and fault tolerance, discovered in the 1990s, reduced the challenge of building scalable quantum computers from achieving perfect qubits to 'merely a staggeringly hard engineering problem'.

5

An application for near-term quantum computers (50-70 qubits) could be generating cryptographically secure random bits, using computational hardness assumptions to verify that the device's outputs are genuinely random.

6

Shadow tomography uses careful manipulation of quantum states to extract information about a large number of measurements from a limited number of state copies, by exploiting principles similar to differential privacy.

The misconception of quantum computing's simultaneous computation

A common misconception about quantum computers is that their ability to explore 2^n states simultaneously with n qubits means they can instantly try all possible solutions at once. While it's true that a quantum computer uses qubits that can be in a superposition of zero and one, allowing for exploration of exponentially many states, this doesn't directly translate to instantaneous problem-solving. The core of quantum mechanics involves amplitudes, which are numbers associated with each possible state. Unlike probabilities, amplitudes can be negative or complex, leading to interference. Probabilities only add positively, increasing likelihood with more paths, but amplitudes can cancel each other out destructively. While creating an equal superposition of all possible answers (like all cryptographic keys or optimization solutions) is possible, the act of measurement collapses this superposition into a single outcome. The probability of observing a particular outcome is the squared absolute value of its amplitude. Without further manipulation, this leads to a random answer. The real power of quantum computing lies in exploiting interference: choreographing patterns where amplitudes for wrong answers cancel out, while amplitudes for the correct answer reinforce each other.
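To make the amplitude picture concrete, here is a minimal numpy sketch (an illustration of the general principle, not code from the talk). It puts a small register into an equal superposition, where the Born rule makes every measurement outcome equally likely, and then applies the same gates again so that destructive interference cancels every amplitude except the one for the starting state.

```python
# Minimal sketch: amplitudes, the Born rule, and interference (toy illustration).
import numpy as np

n = 3                                          # toy register of 3 qubits -> 2^3 = 8 basis states
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on one qubit

# Hadamard on every qubit = tensor product of single-qubit Hadamards.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)

state = np.zeros(2 ** n)
state[0] = 1.0                                 # start in |00...0>
state = H_all @ state                          # equal superposition: amplitude 1/sqrt(2^n) everywhere

probs = np.abs(state) ** 2                     # Born rule: probability = |amplitude|^2
print(probs)                                   # uniform -> measuring now gives a random answer

state = H_all @ state                          # interfere again: "wrong" amplitudes cancel
print(np.abs(state) ** 2)                      # all the weight is back on |00...0>
```

Running it shows the point of the paragraph: the equal superposition alone buys nothing (a measurement is uniformly random), while the second layer of gates choreographs cancellations that concentrate all the probability on a single outcome.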

Quantum computers as a path to fundamental scientific discovery

Scott Aaronson emphasizes that quantum computing is fundamentally a scientific endeavor. It's not about overthrowing existing laws of physics but about taking quantum mechanics, as written in 1926, and testing it rigorously in entirely new regimes, such as universal quantum computation and quantum error correction. Skeptics argue that inherent quantum errors will prevent scaling, or that a new law of physics might emerge to halt progress. However, the development of quantum error correction and fault tolerance in the 1990s shifted the perceived barrier from fundamental impossibility to an extremely difficult engineering challenge. Even if building a perfect large-scale quantum computer proves impossible, the attempt itself could lead to the discovery of new 'impossibility principles,' which would be scientifically significant. For Aaronson, the ultimate application of a quantum computer is not breaking codes or solving optimization problems, but proving skeptics wrong and uncovering the computational capabilities inherent in nature. This pursuit tests the limits of our understanding of reality.

Generating secure randomness with near-term quantum devices

One promising near-term application for quantum computers, potentially realizable with around 50-70 qubits, is the generation of cryptographically secure random bits. Current methods for obtaining public randomness, like random.org or NIST's Randomness Beacon, face trust issues, as the hardware could be compromised or generate pseudo-random numbers with backdoors (as seen with a past NIST standard). Similarly, secret random bits would require owning a quantum computer. A novel approach leverages quantum mechanics, specifically Bell inequalities, to ensure randomness. If entangled particles, measured far apart, show correlations impossible classically, it implies inherent randomness—unless communication faster than light occurred, which is impossible. A more recent realization allows guaranteed randomness from a single device, assuming certain computational hardness. A user could send challenges (arbitrary quantum circuits) to a quantum computer (e.g., Google's 70-qubit machine), receive samples from the resulting probability distribution, and statistically verify that the samples are indeed random and not deterministically generated. This verification relies on cryptographic assumptions: if a certain problem is hard for a quantum computer, then quickly generating samples that pass tests implies true randomness.
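The flavor of the statistical check can be illustrated with a toy, purely classical simulation (my own sketch, not Aaronson's actual certified-randomness protocol): the verifier knows the ideal output distribution of the challenge circuit, which is only feasible to compute at toy sizes, and checks that the device's samples score higher than uniform guessing would.

```python
# Toy sketch of a cross-entropy-style check on samples returned for a random "challenge".
import numpy as np

rng = np.random.default_rng(0)
n_qubits, n_samples = 8, 2000
dim = 2 ** n_qubits

# Stand-in for a random challenge circuit: a Haar-ish random unitary via QR decomposition.
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, _ = np.linalg.qr(z)
ideal = np.abs(q[:, 0]) ** 2             # ideal output distribution of the challenge

honest = rng.choice(dim, size=n_samples, p=ideal)    # device that really sampled the circuit
uniform = rng.integers(dim, size=n_samples)           # device that ignored the challenge

def score(samples):
    """Mean ideal probability of the returned samples, rescaled so uniform guessing scores ~1."""
    return dim * ideal[samples].mean()   # ~2 for honest sampling, ~1 for uniform guessing

print(score(honest), score(uniform))
```

In the real proposal the verifier cannot classically simulate the circuit at full size; the argument instead leans on the hardness assumption described above, so this sketch only conveys what "statistically verifying the samples" looks like.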

Shadow tomography: Measuring quantum states efficiently

Measurement in quantum mechanics is typically destructive, collapsing the quantum state. Shadow tomography is a new procedure designed to overcome this by allowing the estimation of a quantum state's behavior across a vast number of potential measurements, even when only a limited number of copies of the state are available. If one had exponentially many copies, one could fully characterize the state. However, with few copies, this is impossible. Shadow tomography enables the extraction of information about exponentially many 'yes/no' measurement outcomes by carefully manipulating and partially reusing quantum states. The partial damage during measurement is minimized when the measurement basis aligns well with the state's inherent properties, so the state is not severely altered. This technique builds on previous work and has connections to classical computer science, specifically differential privacy. The process involves a careful measurement procedure that reuses states, potentially over a long time, to learn about multiple measurement outcomes without complete destruction of the state.
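The "gentle measurement" idea underlying this reuse can be seen in a few lines of linear algebra. The following numpy sketch (my own illustration, not Aaronson's shadow tomography procedure) shows that when a yes/no measurement is almost certain to accept a state, observing the "yes" outcome barely disturbs it.

```python
# Minimal sketch of gentle measurement: a near-certain yes/no measurement barely damages the state.
import numpy as np

rng = np.random.default_rng(1)
dim = 16

psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)                     # the unknown state

# A projective yes/no measurement whose "yes" direction is, by construction, close to psi.
noise = 0.05 * (rng.normal(size=dim) + 1j * rng.normal(size=dim))
v = psi + noise
v /= np.linalg.norm(v)
P = np.outer(v, v.conj())                      # projector onto the "yes" outcome

p_yes = np.real(psi.conj() @ P @ psi)          # probability of observing "yes"
post = P @ psi
post /= np.linalg.norm(post)                   # state after observing "yes"

fidelity = np.abs(post.conj() @ psi) ** 2      # how much of the original state survives
print(f"P(yes) = {p_yes:.3f}, fidelity with original state = {fidelity:.3f}")
# When P(yes) is close to 1, the fidelity is also close to 1: the measurement was gentle.
```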

The P vs. NP problem: The quest for efficient problem-solving

The P vs. NP problem is a central unsolved question in computer science and mathematics, essentially asking whether every problem whose solution can be quickly verified (NP) can also be quickly solved (P). 'Efficiently' typically means solving the problem in polynomial time, where the number of computational steps grows at most as a fixed power of the problem's size. Many problems, like factoring large numbers (crucial for cryptography), are in NP but are not known to be in P: it is easy to verify a proposed factorization by multiplying the factors back together, but finding those factors in the first place appears to be much harder. The intuition for most people is that P is not equal to NP; solving Sudoku or breaking codes seems intuitively harder than verifying a solution. However, proving this formally requires understanding the space of all possible algorithms. If P were equal to NP, it would imply that many currently intractable problems have efficient solutions, with profound implications for cryptography, optimization, and artificial intelligence. The problem is one of the Clay Millennium Prize Problems, with a million-dollar prize offered for its resolution.
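The asymmetry between finding and checking can be illustrated with a short sketch (my own example, using subset-sum rather than the problems mentioned in the talk): verifying a proposed solution takes linear time, while the obvious way to find one tries all 2^n subsets.

```python
# Toy illustration of the P vs. NP asymmetry using subset-sum.
from itertools import combinations

def verify(numbers, target, subset):
    """Polynomial-time check: does the proposed subset of indices really sum to the target?"""
    return set(subset) <= set(range(len(numbers))) and sum(numbers[i] for i in subset) == target

def brute_force_solve(numbers, target):
    """Exponential-time search: try every one of the 2^n subsets of indices."""
    for r in range(len(numbers) + 1):
        for subset in combinations(range(len(numbers)), r):
            if sum(numbers[i] for i in subset) == target:
                return subset
    return None

numbers = [3, 34, 4, 12, 5, 2]
target = 9
solution = brute_force_solve(numbers, target)   # e.g. indices (2, 4) -> 4 + 5 = 9
print(solution, verify(numbers, target, solution))
```

Whether every problem with a fast `verify` also has a fast `solve` is exactly the P vs. NP question.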

The intersection of computer science, physics, and AI

There's a growing intersection between computer science, physics, and mathematics. Fields like statistical physics have revealed deep connections between condensed matter physics and combinatorial optimization. Quantum computing stands as a major cross-disciplinary area, forcing rapid exchange of knowledge and terminology. This interdisciplinary nature is also evident in theoretical physics with concepts like the holographic principle, which suggests a duality between a theory in a higher dimension (including gravity) and a theory on its boundary in fewer dimensions. This duality is now understood to be related to quantum error-correcting codes, foundational to quantum computing. The firewall paradox, a puzzle concerning black hole information, is explored within this framework. In the realm of AI, the discussion touches on managing its growth responsibly, ethical considerations, and the long-term implications of artificial general intelligence (AGI). While current AI applications raise issues of job displacement, privacy, and bias, the prospect of superintelligence raises existential questions about alignment and ensuring AI shares human values. However, Aaronson notes that in the next 20-50 years, he worries more about 'super stupidity'—self-inflicted existential threats like climate change or nuclear war—than superintelligent AI.

Busy beaver numbers and mathematical unknowability

The busy beaver function (BB(n)) is the maximum number of steps an n-state Turing machine can take before halting, among machines that do eventually halt when started on a blank tape. This function eventually grows faster than any computable function. Its values encode profound mathematical truths; determining BB(n) for a specific n could potentially settle unresolved mathematical conjectures like the Riemann hypothesis. Fascinatingly, the standard axioms of set theory (ZFC) can only determine a finite number of BB(n) values. Beyond a certain point, which is not precisely known but is estimated to be around 500-800 states, the values of BB(n) become undecidable within ZFC. A project by Adam Yedidia designed an 8,000-state Turing machine that halts if and only if there is a contradiction in set theory, so set theory cannot prove whether it halts (by Gödel's incompleteness theorems). Subsequent work by hobbyists has reduced the state count required for such independence to under 800, demonstrating the practical implications of these theoretical limits on computation and provability.
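To make the definition concrete, here is a tiny Turing-machine simulator (my own sketch) run on the known 2-state, 2-symbol champion, which takes 6 steps and leaves 4 ones on the tape before halting, giving BB(2) = 6 in the "steps" sense. For larger n no such simulation can settle BB(n), because deciding whether an arbitrary machine halts is undecidable.

```python
# Minimal Turing-machine simulator run on the 2-state busy beaver champion.
from collections import defaultdict

# Transition table: (state, symbol read) -> (symbol to write, head move, next state).
champion_2_state = {
    ("A", 0): (1, +1, "B"),
    ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"),
    ("B", 1): (1, +1, "HALT"),
}

def run(machine, max_steps=10_000):
    """Run a Turing machine from a blank tape; return (steps taken, ones written), or None."""
    tape = defaultdict(int)        # blank tape of zeros, unbounded in both directions
    pos, state, steps = 0, "A", 0
    while state != "HALT":
        if steps >= max_steps:     # give up: in general, halting is undecidable
            return None
        write, move, state = machine[(state, tape[pos])]
        tape[pos] = write
        pos += move
        steps += 1
    return steps, sum(tape.values())

print(run(champion_2_state))       # -> (6, 4)
```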

The enduring value of blogs and advice for young scientists

Scott Aaronson champions blogs as a crucial platform for detailed discourse, aligning with the internet's original promise of reasoned discussion and knowledge sharing, in contrast to the ephemeral and often reactive nature of social media like Twitter. He reminisces about his 15+ years of blogging, emphasizing its role in developing and defending arguments. For young people interested in science, Aaronson recommends Paul Graham's essay 'Why Nerds Are Unpopular,' which speaks to the value of intellectual curiosity over popularity and the eventual discovery of communities that will appreciate one's interests. Aaronson himself left high school early to pursue higher education, facing social challenges but finding fulfillment in intellectual pursuits. He advises aspiring scientists to embrace their interests, be persistent, and recognize that finding like-minded individuals and opportunities is possible, especially in the modern interconnected world. He also highlights the importance of engaging deeply with foundational problems, citing his blog post critiquing Integrated Information Theory of consciousness as an example where rigorous discussion led to progress.

Common Questions

Can a quantum computer solve hard search problems by trying all possible solutions at once?

A common misconception is that quantum computers can solve hard search problems instantaneously by trying all solutions at once. While they utilize qubits that can be in superposition, the probabilistic nature of quantum measurement and the need for constructive interference mean they don't simply 'try everything simultaneously' in a brute-force manner.

Topics

Mentioned in this video

People
Scott Aaronson

The speaker, a professor of Computer Science and blogger known for his work in quantum computing and computational complexity theory.

Richard Feynman

A Nobel Prize-winning physicist whose work on quantum mechanics, including the diagrams that bear his name, is highly respected.

Rana Adhikari

A guest on a previous podcast episode discussing LIGO, a project in fundamental science.

Leonard Susskind

Physicist discussing the holographic principle and quantum gravity, with whom Aaronson has done a podcast.

Gerard 't Hooft

Co-developed the idea of black hole complementarity in the 1990s as a response to the information paradox.

Sean Carroll

A friend of Aaronson's who discusses cosmology and experiments related to the Big Bang's low entropy state.

David Chalmers

Philosopher of consciousness who participated in the discussion on Aaronson's blog regarding Integrated Information Theory.

Guy Rothblum

A classical computer scientist from the Weizmann Institute who collaborated with Scott Aaronson on research connecting differential privacy and quantum mechanics.

Sarah Constantin

Blogger who wrote an insightful post about blogs being in keeping with the original promise of the internet.

Paul Graham

Author of 'Why Nerds Are Unpopular,' an influential essay for Aaronson regarding the experience of being a 'nerd' in school.

Alan Turing

Pioneer of theoretical computer science and artificial intelligence, known for the Turing machine concept.

Adam Yedidia

Aaronson's former master's student at MIT who designed an 8,000-state Turing machine to explore Gödel's incompleteness theorems.

Michael Bloomberg

A Twitter user who asked a question about the Busy Beaver numbers and their independence from ZF set theory.

Stefan O'Rear

A hobbyist who improved upon the Turing machine design for Busy Beaver numbers, reducing the state count significantly.

Giulio Tononi

A neuroscientist and psychiatrist who proposed Integrated Information Theory.

Software & Apps
random.org

A website that provides purportedly genuinely random numbers generated from atmospheric noise.

Randomness Beacon

A public service run by NIST that continuously broadcasts random bits generated using quantum mechanical processes.

Maple

A computer algebra system used by the speaker for numerical optimizations.

shadow tomography

A new procedure developed by Scott Aaronson for measuring quantum states, allowing for more information to be extracted than with traditional methods.

jigsaw puzzle

A puzzle consisting of small pieces that must be fitted together to form a complete picture; used as an analogy for the P vs. NP problem.

Sudoku puzzle

A logic-based number-placement puzzle used as an analogy for the P vs. NP problem.

Turing Machines

A mathematical model of computation proposed by Alan Turing: an abstract machine that reads and writes symbols on an unbounded tape. It is used to define complexity classes such as P and NP and is the basis for the Busy Beaver numbers.

Tumblr

A microblogging and social networking website, characterized as being designed for sharing photos and social signaling rather than discourse.

Usenet

An early online discussion system that predates the Web, mentioned as a precursor to blogs and the original promise of the internet for discourse.

Concepts
Bell inequality

An inequality that any local 'hidden-variable' (classical) theory must obey; entangled quantum particles can violate it, demonstrating correlations that cannot be explained by classical physics.

P versus NP problem

A major unsolved problem in computer science concerning whether every problem whose solution can be quickly verified can also be quickly solved.

Hawking radiation

Black body radiation predicted to be released by black holes, a result of quantum effects near the event horizon.

ZF set theory

A foundational axiomatic system for mathematics, used here in the context of limitations on proving the values of the Busy Beaver function.

Riemann hypothesis

A conjecture about the location of the non-trivial zeros of the Riemann zeta function; its truth or falsehood is connected to the Busy Beaver function.

Gödel's Incompleteness Theorems

Theorems stating that any sufficiently powerful consistent formal system, such as set theory, cannot prove all true statements of arithmetic; relevant to the unknowability of certain Busy Beaver numbers.

Integrated Information Theory

A theory of consciousness proposed by Giulio Tononi and others, which Scott Aaronson critiqued on his blog.

Holographic Principle

A principle suggesting that the description of a volume of space can be seen as encoded on a lower-dimensional boundary to the region, like a hologram.

Probability

Used in quantum mechanics to describe the likelihood of observing a particular outcome, calculated from the squared absolute value of the amplitude.

Global warming

Presented as a current major challenge that humanity should solve, analogous to practicing for the problem of AI alignment.

Black hole information paradox

The puzzle concerning whether information that falls into a black hole is lost forever or somehow preserved and released, related to Stephen Hawking's work.

Firewall paradox

A modern refinement of the black hole information paradox, posing a technical puzzle about reconciling different perspectives on what happens at the event horizon.

Differential Privacy

A field in computer science focused on data mining techniques that mathematically guarantee that individual user data is not overly revealed, often by adding random noise.

Phasor

A complex number with magnitude and phase used to represent amplitude in quantum mechanics, different from classical probabilities.

Quantum Error Correction

Techniques used to protect quantum information from errors caused by decoherence and other environmental factors, crucial for building scalable quantum computers.

Black hole complementarity

A viewpoint suggesting that information falling into a black hole can be viewed in two consistent but different ways: as passing through the event horizon or as being processed at the horizon.

Busy Beaver numbers

A sequence of numbers representing the maximum number of steps (or output) a halting Turing machine of a given size can produce. It grows faster than any computable function and encodes deep mathematical truths.

Quantum Mechanics

The fundamental theory describing the physical properties of nature at the scale of atoms and subatomic particles, central to quantum computing.

Interference

In quantum mechanics, amplitudes can interfere constructively (reinforce) or destructively (cancel out), a key principle for quantum computation.

Prime factorization

The process of finding the prime numbers that multiply together to make the original number, used as an example of a problem in NP that is not known to be in P and that underlies much of cryptography.

Fascism

Mentioned as a potential risk and sign of 'super stupidity' that Aaronson worries more about in the near-term future than superintelligence.

Amplitude

A complex number assigned to each possible state of a quantum system, related to probability and capable of interference.

Quantum Entanglement

A phenomenon in which two or more quantum particles become correlated so strongly that no classical, local description can account for their joint measurement outcomes, regardless of the distance separating them.

superposition

A fundamental quantum mechanical principle where a system can exist in multiple states simultaneously, each with a certain amplitude.

Quantum Fault Tolerance

Achieving reliable quantum computation even in the presence of imperfect qubits and operations, a key development of the 1990s.
