So You Think Quantum Computing Is Bunk?

[Cartoon: “You measurin’ ME?”]
Scott Aaronson (MIT)
Quantum Computing
When I first heard about QC (around 1996), I was
certain it was bunk!
But to find the “catch,” I’d first have to figure out
what the deal was with quantum mechanics itself…
Quantum Mechanics in 1 Slide
“Like probability theory, but over the complex numbers”
Probability Theory: linear transformations that conserve the 1-norm of probability vectors are stochastic matrices:
$$\begin{pmatrix} s_{11} & \cdots & s_{1n} \\ \vdots & \ddots & \vdots \\ s_{n1} & \cdots & s_{nn} \end{pmatrix}\begin{pmatrix} p_1 \\ \vdots \\ p_n \end{pmatrix}=\begin{pmatrix} q_1 \\ \vdots \\ q_n \end{pmatrix},\qquad p_i \ge 0,\quad \sum_{i=1}^{n} p_i = 1$$
Quantum Mechanics: linear transformations that conserve the 2-norm of amplitude vectors are unitary matrices:
$$\begin{pmatrix} u_{11} & \cdots & u_{1n} \\ \vdots & \ddots & \vdots \\ u_{n1} & \cdots & u_{nn} \end{pmatrix}\begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{pmatrix}=\begin{pmatrix} \beta_1 \\ \vdots \\ \beta_n \end{pmatrix},\qquad \alpha_i \in \mathbb{C},\quad \sum_{i=1}^{n} |\alpha_i|^2 = 1$$
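A minimal NumPy sketch of the analogy (illustrative; the example matrices and vectors are arbitrary choices, not from the slides):

```python
import numpy as np

# Column-stochastic matrix: nonnegative entries, each column sums to 1.
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])
p = np.array([0.3, 0.7])           # probability vector (1-norm = 1)
q = S @ p
print(np.sum(q))                   # 1.0: the 1-norm is conserved

# Unitary matrix (here, the Hadamard gate).
U = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
a = np.array([0.6, 0.8j])          # amplitude vector (2-norm = 1)
b = U @ a
print(np.linalg.norm(b))           # 1.0: the 2-norm is conserved
```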
Interference
“The source of all quantum weirdness”
$$\frac{1}{\sqrt{2}}\begin{pmatrix}1 & 1\\ 1 & -1\end{pmatrix}\begin{pmatrix}1\\ 0\end{pmatrix}=\begin{pmatrix}\tfrac{1}{\sqrt{2}}\\ \tfrac{1}{\sqrt{2}}\end{pmatrix},\qquad \frac{1}{\sqrt{2}}\begin{pmatrix}1 & 1\\ 1 & -1\end{pmatrix}\begin{pmatrix}\tfrac{1}{\sqrt{2}}\\ \tfrac{1}{\sqrt{2}}\end{pmatrix}=\begin{pmatrix}1\\ 0\end{pmatrix}$$
Applying the same matrix twice returns $|0\rangle$: the two paths to $|1\rangle$ carry amplitudes $+\tfrac{1}{2}$ and $-\tfrac{1}{2}$ and cancel.
[Figure: the possible states of a single quantum bit, or qubit, drawn on the unit circle, including $|0\rangle$, $|1\rangle$, $\frac{|0\rangle+|1\rangle}{\sqrt{2}}$, and $\frac{|0\rangle-|1\rangle}{\sqrt{2}}$]
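The same cancellation in a few lines of NumPy, using the matrix above (a sanity-check sketch, not part of the slides):

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # the 2x2 unitary from the slide (Hadamard)

ket0 = np.array([1.0, 0.0])            # |0>
once = H @ ket0                        # (|0> + |1>)/sqrt(2): both outcomes equally likely
twice = H @ once                       # back to |0>: the |1> paths cancel

print(once)    # [0.7071 0.7071]
print(twice)   # [1. 0.] (up to rounding): destructive interference
```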
Measurement
If you ask $\alpha|0\rangle+\beta|1\rangle$ whether it’s $|0\rangle$ or $|1\rangle$, it answers $|0\rangle$ with probability $|\alpha|^2$ and $|1\rangle$ with probability $|\beta|^2$. And it sticks with its answer from then on! Measurement is a “destructive” process.
Product state of two qubits:
$$\left(\alpha|0\rangle+\beta|1\rangle\right)\left(\gamma|0\rangle+\delta|1\rangle\right)=\alpha\gamma|00\rangle+\alpha\delta|01\rangle+\beta\gamma|10\rangle+\beta\delta|11\rangle$$
Entangled state (can’t be written as a product state):
$$\frac{|00\rangle+|11\rangle}{\sqrt{2}}$$
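A minimal simulation of Born-rule measurement, assuming nothing beyond the rules just stated (the amplitudes and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(state):
    """Born rule: sample a basis state with probability |amplitude|^2."""
    probs = np.abs(state) ** 2
    return rng.choice(len(state), p=probs)

# Single qubit a|0> + b|1>
a, b = 0.6, 0.8
qubit = np.array([a, b])
counts = np.bincount([measure(qubit) for _ in range(10000)], minlength=2)
print(counts / 10000)   # ~[0.36, 0.64] = [|a|^2, |b|^2]

# Entangled pair (|00> + |11>)/sqrt(2): amplitudes indexed 00, 01, 10, 11
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
outcomes = [measure(bell) for _ in range(10)]
print([format(o, "02b") for o in outcomes])  # only '00' or '11': perfectly correlated
```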
The “deep mystery” of QM: Who decides when a “measurement” happens? An “outsider’s view”:
$$\big(\alpha|0\rangle+\beta|1\rangle\big)\,|\text{World}\rangle \;\xrightarrow{\ \text{Unitary}\ }\; \alpha|0\rangle|\text{World}_0\rangle+\beta|1\rangle|\text{World}_1\rangle$$
The qubit simply gets entangled with your own body (and lots of other stuff), so that it collapses to $|0\rangle$ or $|1\rangle$ “relative to you”
“Many Worlds? Or Many Words?”
Quantum Computing
“Quantum Mechanics on Steroids”
A general entangled state of n qubits requires ~2^n amplitudes to specify:
$$|\psi\rangle=\sum_{x\in\{0,1\}^n}\alpha_x|x\rangle$$
This presents an obvious practical problem when using conventional computers to simulate quantum mechanics.
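To put numbers on the problem (a back-of-the-envelope sketch, assuming one 16-byte complex double per amplitude):

```python
# Storage needed to hold all 2^n amplitudes of an n-qubit state,
# assuming 16 bytes per complex amplitude.
for n in (10, 30, 50, 300):
    amps = 2 ** n
    print(f"n={n:>3}: 2^{n} amplitudes, {amps * 16:.3e} bytes")
# n= 10: ~16 KB; n= 30: ~17 GB; n= 50: ~18 PB;
# n=300: more amplitudes than atoms in the observable universe
```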
Feynman 1981: So then why not turn things around, and
build computers that themselves exploit superposition?
Shor 1994: Such a computer could do more than simulate
QM—e.g., it could factor integers in polynomial time
Where we are: A QC has now factored 21 into 3×7, with
high probability (Martín-López et al. 2012)
Why is scaling up so hard? Decoherence!
The famous Fault-Tolerance Theorem suggests we only
need to get decoherence down to some finite level (~1%
per qubit per gate time?) to do arbitrarily long quantum
computations
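For a sense of scale (standard textbook fault-tolerance estimates, not numbers specific to this talk): with physical error rate p per gate below the threshold p_th, concatenating an error-correcting code k times drives the logical error rate down doubly exponentially:

```latex
% Standard concatenated-code scaling from the threshold theorem:
\[
  p_{\text{logical}} \;\approx\; p_{\mathrm{th}}
      \left(\frac{p}{p_{\mathrm{th}}}\right)^{2^{k}}
\]
% Example (illustrative numbers): p = 0.1% and p_th = 1% give
% p/p_th = 0.1, so k = 3 levels of concatenation already yield
% p_logical ~ 1e-2 * (0.1)^8 = 1e-10 per gate.
```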
Many discussions of the feasibility of QC focus entirely
on the Fault-Tolerance Theorem and its assumptions
My focus is different! For I take it as obvious that, if QC
is impossible, there must exist a deeper explanation
than “such-and-such error-correction schemes might
not work against every conceivable kind of noise”
A few physicists and computer scientists remain
vocally skeptical that scalable QC is possible…
‘t Hooft, Kalai, Goldreich, Wolfram, Alicki, Dyakonov, Levin
(And perhaps a much larger number are “silently skeptical”?)
One historical analogy: People thought
Charles Babbage’s idea was cute but would
never work in practice
And they were right—for ~130 years!
3 “Skeptical Positions” and My Responses
1. The difficulties are immense! QC might not be practical
for a very long time (and will have limited applications even if built)
My response: Agreement
(Skeptical position I won’t address in this talk: BPP = BQP)
2. QC will fail because quantum mechanics itself is wrong
My response: Awesome! A revolution in physics—even
better than QC. Count me in
3. Quantum mechanics is fine, but QC won’t work because of
some “principle of unavoidable noise” (?) on top of QM
My response: Also wonderful! Explain your principle,
why it’s true, and why it kills QC. Does it imply a fast
classical simulation of “realistic” quantum systems?
Common Reasons for QC Skepticism
1. “Sounds too good to be true / like science fiction”
Response: Would any science-fiction writer have imagined a
computer that solved factoring, discrete log, and a few
other special problems, but not NP-complete problems?
2. Annoyance at hype/misrepresentations in popular press
Response: Tell me about it…
3. The Extended Church-Turing Thesis rules out QC
Response: The ECT was a CS encroachment onto physics’ turf …
we can’t cry foul if physics counterattacks us!
4. “n qubits couldn’t possibly encode 2^n bits”
5. Underlying skepticism of QM itself (or modern physics in general?)
The “2^n Bits Is Too Many” Argument
Shouldn’t we search for a more “reasonable” theory that
agrees with QM on existing experiments, but:
Lets us feasibly prepare only a singly-exponential number of
states, not a doubly-exponential number?
Predicts that in a volume of size n, only poly(n) bits can be
reliably stored and retrieved, not exp(n) bits?
Lets us summarize the results of exp(n) possible measurements
on an n-qubit state using only poly(n) classical bits?
Predicts that n-qubit states should be “PAC-learnable” with only
poly(n) samples, not exp(n)?
Such a theory exists! It’s called quantum mechanics
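The bit-storage item, for example, is backed by Holevo’s theorem (a standard fact, stated here for reference rather than on the original slide):

```latex
% Holevo's theorem: for any encoding of a random variable X into an
% n-qubit state rho, and any measurement yielding outcome Y,
\[
  I(X\!:\!Y) \;\le\; S(\rho) \;\le\; n ,
\]
% so n qubits, despite their ~2^n amplitudes, yield at most n
% reliably retrievable classical bits.
```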
OK, but suppose QC is impossible.
Obvious question: What’s the criterion that tells us
which quantum-mechanical experiments can be
done, and which ones can’t?
Possibility 1: Precision in Amplitudes.
“The major problem is the requirement that basic quantum equations
hold to multi-hundredth if not millionth decimal positions where the
significant digits of the relevant quantum amplitudes reside. We have
never seen a physical law valid to over a dozen decimals … Are quantum
amplitudes still complex numbers to such accuracies or do they become
quaternions, colored graphs, or sick-humored gremlins?” —Leonid Levin
Obvious Response: The Fault-Tolerance Theorem! Linearity plus quantum error correction means the amplitudes never need to be maintained to anything like that precision.
Possibility 2: OK, small amplitudes might be fine for
separable states—but entanglement is an illusion.
Obvious Response: The Bell Inequality
(and its experimental violation)
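A short NumPy check of what “violation” means quantitatively: the singlet state’s CHSH correlations reach 2√2, while any local hidden-variable theory is capped at 2. (A sketch using the standard optimal measurement angles; not from the slides.)

```python
import numpy as np

# Spin measurement along angle t in the X-Z plane: A(t) = cos(t) Z + sin(t) X
Z = np.array([[1, 0], [0, -1]])
X = np.array([[0, 1], [1, 0]])
A = lambda t: np.cos(t) * Z + np.sin(t) * X

singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)

def E(ta, tb):
    """Quantum correlation <A(ta) (x) A(tb)> in the singlet state."""
    return singlet @ np.kron(A(ta), A(tb)) @ singlet

a, a2, b, b2 = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # 2.828... = 2*sqrt(2), beating the classical bound of 2
```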
Possibility 3: Fine, 2 or 3 particles might be entangled, but a
thousand particles could never be entangled!
That doesn’t work either…
Buckyball double-slit experiments
High-temperature superconductors
Needed: A “Sure/Shor separator” (A. 2004),
between the many-particle quantum states
we’re sure we can create and those that suffice
for things like Shor’s algorithm
[Figure: many-particle quantum states arranged along a spectrum, with a “PRINCIPLED LINE” separating the states we’re sure we can create from those that suffice for Shor’s algorithm]
My Candidate: “Tree Size”
[Figure: an example state written as a tree whose internal nodes are + and ⊗ gates, whose leaves are single-qubit states such as $|0_1\rangle$, $|1_1\rangle$, $|0_2\rangle$, $|1_2\rangle$, and whose edges carry amplitudes such as $\sqrt{2/7}$, $\sqrt{3/7}$, and $\tfrac{1}{\sqrt{2}}$. A state’s tree size is the size of the smallest such tree that represents it.]
Symmetrized states of n identical fermions/bosons can be shown to
have tree size $n^{\Omega(\log n)}$
(using the breakthrough lower bound of [Raz 2004] on the multilinear
formula size of the permanent and determinant)
The $n^{\Omega(\log n)}$ lower bound probably also holds for 2D and 3D spin lattices
(Indeed, in all these cases, the true tree size is probably $\exp(n)$)
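For contrast, a standard observation (not from the slides): product states always have small tree size, which is what makes tree size a candidate separator:

```latex
% Any product state has tree size O(n): one tensor node with n
% single-qubit children suffices,
\[
  |\psi\rangle \;=\; \bigotimes_{i=1}^{n}
      \left(\alpha_i|0_i\rangle + \beta_i|1_i\rangle\right)
  \quad\Longrightarrow\quad \mathrm{TS}(|\psi\rangle) = O(n),
\]
% whereas symmetrized fermion/boson states need trees of size
% n^{Omega(log n)}, putting them on the far side of a tree-size
% Sure/Shor separator.
```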
“God, Dice, Yadda Yadda”
A completely different way quantum mechanics might be
“not the whole story”: What if there were “deeper,
underlying” physical laws, and quantum mechanics was
“merely a statistical tool” derivable from those laws?
Note: If quantum mechanics were exactly
derivable, this still wouldn’t kill QC! But maybe
it could tell us where to look for a breakdown?
Recently, I became interested in ψ-epistemic theories, an
attempt to formalize the above “Einsteinian impulse”…
A d-dimensional ψ-epistemic theory is defined by:
- A set Λ of “ontic states” (ontic = philosopher-speak for “real”)
- For each pure state $|\psi\rangle\in H_d$, a probability measure $\mu_\psi$ over ontic states
- For each orthonormal basis $B=(v_1,\ldots,v_d)$ of $H_d$ and each $i\in[d]$, a “response function” $R_{i,B}:\Lambda\to[0,1]$, satisfying
$$\sum_{i=1}^{d} R_{i,B}(\lambda)=1 \ \text{ for all } \lambda\in\Lambda \qquad \text{(Conservation of Probability)}$$
$$\int_\Lambda R_{i,B}(\lambda)\,d\mu_\psi(\lambda)=\left|\langle v_i|\psi\rangle\right|^2 \ \text{ for all } i,B,|\psi\rangle \qquad \text{(Born Rule)}$$
Can trivially satisfy these axioms by setting $\Lambda=H_d$, letting $\mu_\psi$ be the point measure concentrated on $|\psi\rangle$ itself, and setting $R_{i,B}(\psi)=|\langle v_i|\psi\rangle|^2$. This gives a completely uninteresting restatement of quantum mechanics (called the “Beltrametti–Bugajski theory”).
More Interesting Example: Kochen-Specker Theory
Response functions $R_{i,B}(\lambda)$:
deterministically return the basis
vector closest to the ontic state $\lambda$
Accounts beautifully for one
qubit ψ-epistemically!
(One qutrit: already a problem…)
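A Monte Carlo sketch of the qubit case, in my rendering of the standard Kochen-Specker construction (assumed details: ontic states are Bloch vectors λ with density proportional to max(ψ·λ, 0), and the response deterministically picks the nearer basis vector):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_ontic(psi_hat, n):
    """Sample n ontic states (unit Bloch vectors) with density
    proportional to max(psi_hat . lam, 0), via rejection sampling."""
    out = []
    while len(out) < n:
        v = rng.normal(size=3)
        v /= np.linalg.norm(v)            # uniform on the sphere
        c = v @ psi_hat
        if c > 0 and rng.random() < c:    # accept with probability cos(angle)
            out.append(v)
    return np.array(out)

psi = np.array([0.0, 0.0, 1.0])           # Bloch vector of |psi>
theta = 1.0                                # measurement axis, angle theta from psi
b = np.array([np.sin(theta), 0.0, np.cos(theta)])

lam = sample_ontic(psi, 20000)
p_model = np.mean(lam @ b > 0)             # deterministic "closest basis vector" response
p_born = np.cos(theta / 2) ** 2            # Born rule prediction

print(p_model, p_born)                     # both ~0.77: they agree to Monte Carlo error
```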
Observation: If $\langle\psi|\phi\rangle=0$, then $\mu_\psi$ and $\mu_\phi$ can’t overlap
Call the theory maximally nontrivial if (as above) $\mu_\psi$ and $\mu_\phi$
overlap whenever $|\psi\rangle$ and $|\phi\rangle$ are not orthogonal
PBR (Pusey-Barrett-Rudolph 2011) No-Go Theorem
Suppose we assume $\mu_{|\psi\rangle\otimes|\phi\rangle}=\mu_{|\psi\rangle}\times\mu_{|\phi\rangle}$
(“ψ-epistemic theories must behave well under tensor product”)
Then there’s a 2-qubit entangled measurement M, such
that the only way to explain M’s behavior on the 4 states
$|0\rangle|0\rangle$, $|0\rangle|+\rangle$, $|+\rangle|0\rangle$, $|+\rangle|+\rangle$
is using a “trivial” theory that doesn’t mix $|0\rangle$ and $|+\rangle$.
(Can be generalized to any pair of states, not just $|0\rangle$ and $|+\rangle$)
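A quick numerical look at the measurement in question, using the explicit basis from the PBR paper (quoted from the paper, not the slide): each of the four product states is orthogonal to exactly one outcome, i.e. each state can never trigger “its own” outcome.

```python
import numpy as np

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# The PBR entangled measurement basis on two qubits:
M = [
    (np.kron(ket0, ket1) + np.kron(ket1, ket0)) / np.sqrt(2),
    (np.kron(ket0, minus) + np.kron(ket1, plus)) / np.sqrt(2),
    (np.kron(plus, ket1) + np.kron(minus, ket0)) / np.sqrt(2),
    (np.kron(plus, minus) + np.kron(minus, plus)) / np.sqrt(2),
]

# The four product states |00>, |0+>, |+0>, |++>, in matching order:
states = [np.kron(a, b) for a in (ket0, plus) for b in (ket0, plus)]

# Row i: outcome probabilities |<M_j|state_i>|^2. The diagonal is all
# zeros: state i never yields outcome i, which powers the PBR argument.
for s in states:
    print(np.round([abs(m @ s) ** 2 for m in M], 3))
```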
Bell’s Theorem: Can’t “locally” simulate all separable
measurements on a fixed entangled state
PBR Theorem: Can’t “locally” simulate a fixed entangled
measurement on all separable states (at least nontrivially so)
But suppose we drop PBR’s tensor assumption. Then:
Theorem (A.-Bouland-Chua-Lowther ‘13): There’s a maximally nontrivial ψ-epistemic theory in any finite dimension d
Albeit an extremely weird one!
Solves the main open problem of Lewis et al. ‘12
Ideas of the construction:
Cover $H_d$ with ε-nets, for all ε = 1/n
Mix the states in pairs of small balls $(B_\psi, B_\phi)$, where $|\psi\rangle, |\phi\rangle$ both belong to some ε-net
(“Mix” = make their ontic distributions overlap)
To mix all non-orthogonal states, take a “convex combination” of countably many such theories
On the other hand, suppose we want our theory to be
symmetric—meaning, roughly, that the measures $\mu_\psi$ and the
response functions are invariant under unitary changes of basis
Theorem (ABCL’13): There’s
no symmetric, maximally nontrivial ψ-epistemic theory
in dimensions d ≥ 3
Our proof, in the general case,
uses some measure theory
and differential geometry
(and strangely, currently works
only with complex amplitudes,
not real ones)
If scalable QC is indeed possible, are there any
experiments that could help demonstrate that—short
of actually building a general-purpose QC?
Some possibilities:
- Keep 1 qubit coherent for an extremely long time
(Current record: ~15 minutes in ion traps)
- Quantum adiabatic optimization
(the “D-Wave approach”)
- BosonSampling
(and other restricted QC proposals)
BosonSampling [A.-Arkhipov 2011]
“For when you only need your QC to overthrow the
Extended Church-Turing Thesis, not do anything useful”
n identical photons are generated, sent through a network of
beamsplitters, then measured to see where they are
The result: A sample from a distribution {p_x}, such that each
probability p_x equals $|\mathrm{Per}(A_x)|^2$, for some known n×n complex
matrix $A_x$ (Permanent: famous #P-complete problem)
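For concreteness, a naive sketch of the quantity involved (the matrix below is a random stand-in, not a submatrix of any actual experiment’s beamsplitter unitary):

```python
import numpy as np
from itertools import permutations

def permanent(A):
    """Naive O(n! * n) permanent; Ryser's formula improves this to
    O(2^n * n), but every known algorithm stays exponential."""
    n = A.shape[0]
    return sum(np.prod([A[i, s[i]] for i in range(n)])
               for s in permutations(range(n)))

rng = np.random.default_rng(0)
A = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
print(abs(permanent(A)) ** 2)   # an (unnormalized) BosonSampling probability
```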
Theorem: A classical computer can’t sample the same
distribution in polynomial time, unless $\mathsf{P}^{\#\mathsf{P}}=\mathsf{BPP}^{\mathsf{NP}}$.
We conjecture that this extends even to approximate/noisy classical
simulations. Leads to a beautiful complexity-theoretic open problem:
Is it #P-complete to approximate Per(A), with high probability over an
n×n matrix A of independent N(0,1) Gaussians?
Recent BosonSampling demonstrations with 3-4 photons
[Broome et al., Tillmann et al., Walmsley et al., Crespi et al.]
If this could be scaled to ~20-30 photons, it would probably
BosonSample faster than a classical simulation of itself…
Main engineering challenge: Deterministic generation of
single photons, for synchronized arrival at the detectors
Conclusion
I don’t know for sure that scalable QC is possible
But I do know that the popular framing of the question
gets it exactly backwards!
Believing that QC can work doesn’t make you a
starry-eyed visionary, but a scientific conservative
Doubting that QC can work doesn’t make you a
cautious realist, but a scientific radical