
Montreal Summer School 2005


Published on December 31, 2007

Author: Roxie

Source: authorstream.com


Quantum Information Theory
Patrick Hayden (McGill)
4 August 2005, Canadian Quantum Information Summer School

Overview
- Part I: What is information theory? Entropy, compression, noisy coding and beyond. What does it have to do with quantum mechanics? Noise in the quantum mechanical formalism: density operators, the partial trace, quantum operations. Some quantum information theory highlights.
- Part II: Resource inequalities. A skeleton key.

Information (Shannon) theory
- A practical question: how best to make use of a given communications resource?
- A mathematico-epistemological question: how to quantify uncertainty and information?
- Shannon solved the first by considering the second: "A mathematical theory of communication" [1948].

Quantifying uncertainty
- Entropy: H(X) = −Σ_x p(x) log₂ p(x).
- Proportional to the entropy of statistical physics.
- The term was suggested by von Neumann (more on him later).
- One can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X and Y, etc.
- Operational point of view…

Compression
- Source of independent copies of X: X₁, X₂, …, Xₙ.
- If X is binary, a typical output looks like 0000100111010100010101100101: about nP(X=0) 0's and nP(X=1) 1's.
- n copies of X can be compressed to a binary string of length ~nH(X).

Typicality in more detail
- Let xⁿ = x₁x₂…xₙ with xⱼ ∈ X.
- We say that xⁿ is ε-typical with respect to p(x) if:
  - for all a ∈ X with p(a) > 0, |N(a|xⁿ)/n − p(a)| < ε/|X|;
  - for all a ∈ X with p(a) = 0, N(a|xⁿ) = 0.
- For every ε > 0, the probability that a random string Xⁿ is ε-typical goes to 1 as n → ∞.
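The entropy formula and the typicality test above are easy to check numerically. A minimal sketch in Python; the function names and the fair-coin example are mine, not from the slides:

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def is_typical(xs, p, eps):
    """Check whether the string xs is eps-typical for the distribution p
    over the alphabet {0, ..., len(p)-1}, using the |N(a|x^n)/n - p(a)| < eps/|X| window."""
    n = len(xs)
    for a, pa in enumerate(p):
        freq = xs.count(a) / n
        if pa == 0 and freq > 0:
            return False
        if pa > 0 and abs(freq - pa) >= eps / len(p):
            return False
    return True

p = [0.5, 0.5]
print(shannon_entropy(p))                 # 1.0 bit for a fair coin
print(is_typical([0, 1, 0, 1], p, 0.1))   # exact empirical frequencies -> True
```

For a fair coin H(X) = 1 bit, so ~n-bit strings compress to ~n bits (no savings); a biased coin with p = (0.9, 0.1) gives H(X) ≈ 0.469 and correspondingly shorter compressed strings.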
If xn is -typical, 2-n[H(X)+]· p(xn) · 2-n[H(X)-] The number of -typical strings is bounded above by 2n[H(X)+] Quantifying information:  Quantifying information H(X) H(Y|X) H(X|Y) = H(X,Y)-H(Y) = EYH(X|Y=y) I(X;Y) = H(X) – H(X|Y) = H(X)+H(Y)-H(X,Y) H(X,Y) Sending information through noisy channels:  Sending information through noisy channels Statistical model of a noisy channel: Data processing inequality:  Data processing inequality Alice Bob time I(X;Y) ¸ I(Z;Y) Optimality in Shannon’s theorem:  Optimality in Shannon’s theorem Shannon’s noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through  is given by the formula Assume there exists a code with rate R and perfect decoding. Let M be the random variable corresponding to the uniform distribution over messages. nR = H(M) = I(M;M’) · I(M;Yn) · I(Xn;Yn) · j=1n I(Xj,Yj) · n¢maxp(x) I(X;Y) Shannon theory provides:  Shannon theory provides Practically speaking: A holy grail for error-correcting codes Conceptually speaking: A operationally-motivated way of thinking about correlations What’s missing (for a quantum mechanic)? 
- Features from linear structure: entanglement and non-orthogonality.

Quantum Shannon theory provides
- A general theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits…
- It relies on a major simplifying assumption: computation is free.
- And a minor simplifying assumption: noise and data have regular structure.

Before we get going: some unavoidable formalism
We need quantum generalizations of:
- probability distributions (density operators),
- marginal distributions (partial trace),
- noisy channels (quantum operations).

Mixing quantum states: the density operator
- Draw |ψₓ⟩ with probability p(x), then perform a measurement in the basis {|0⟩, |1⟩}.
- Probability of outcome j: qⱼ = Σ_x p(x) |⟨j|ψₓ⟩|² = Σ_x p(x) tr[|j⟩⟨j| · |ψₓ⟩⟨ψₓ|] = tr[|j⟩⟨j| ρ], where ρ = Σ_x p(x) |ψₓ⟩⟨ψₓ|.
- The outcome probability is linear in ρ.

Properties of the density operator
- ρ is Hermitian: ρ† = [Σ_x p(x) |ψₓ⟩⟨ψₓ|]† = Σ_x p(x) [|ψₓ⟩⟨ψₓ|]† = ρ.
- ρ is positive semidefinite: ⟨φ|ρ|φ⟩ = Σ_x p(x) |⟨φ|ψₓ⟩|² ≥ 0.
- tr[ρ] = 1: tr[ρ] = Σ_x p(x) tr[|ψₓ⟩⟨ψₓ|] = Σ_x p(x) = 1.
- Ensemble ambiguity: I/2 = ½[|0⟩⟨0| + |1⟩⟨1|] = ½[|+⟩⟨+| + |−⟩⟨−|].

The density operator: examples
Which of the following are density operators? (Check: Hermitian, positive semidefinite, unit trace.)

The partial trace
- Suppose ρ_AB is a density operator on A ⊗ B and Alice measures {Mₖ} on A.
- The outcome probability is qₖ = tr[(Mₖ ⊗ I_B) ρ_AB].
- Define ρ_A = tr_B[ρ_AB] = Σⱼ ⟨j|_B ρ_AB |j⟩_B.
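The three defining properties of a density operator can be verified directly for a concrete ensemble. A sketch; the ensemble {(½, |0⟩), (½, |+⟩)} is my illustrative choice, not from the slides:

```python
import numpy as np

# rho = sum_x p(x) |psi_x><psi_x| for the ensemble {(1/2, |0>), (1/2, |+>)}
ket0 = np.array([1.0, 0.0])
ketplus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ketplus, ketplus)

print(np.allclose(rho, rho.conj().T))              # Hermitian -> True
print(np.all(np.linalg.eigvalsh(rho) >= -1e-12))   # positive semidefinite -> True
print(np.isclose(np.trace(rho), 1.0))              # unit trace -> True

# Measuring in {|0>,|1>}: Pr(j) = tr(|j><j| rho), i.e. the diagonal of rho.
print(rho[0, 0], rho[1, 1])  # 0.75 and 0.25
```

The same diagonal (0.75, 0.25) would arise from many different ensembles, which is exactly the ensemble-ambiguity point above: the density operator, not the ensemble, is what the measurement statistics see.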
- Then qₖ = tr[Mₖ ρ_A]: the reduced state ρ_A describes the outcome statistics of every experiment Alice can perform alone.

Purification
- Suppose ρ_A is a density operator on A. Diagonalize it: ρ_A = Σᵢ λᵢ |i⟩⟨i|.
- Let |ψ⟩ = Σᵢ λᵢ^½ |i⟩_A |i⟩_B. Then ρ_A = tr_B[|ψ⟩⟨ψ|]; |ψ⟩ is called a purification of ρ_A.
- Symmetry: ρ_A and ρ_B have the same non-zero eigenvalues.

Quantum (noisy) channels: analogs of p(y|x)
What reasonable constraints should such a channel Λ: A → B satisfy?
- It takes density operators to density operators.
- Convex linearity: a mixture of input states is mapped to the corresponding mixture of output states.
- All such maps can, in principle, be realized physically.
- Require that (Λ ⊗ I_C)(ρ_AC) always be a density operator too. This doesn't come for free! Let T be the transpose map on A. If |ψ⟩ = (|00⟩_AC + |11⟩_AC)/√2, then (T ⊗ I_C)(|ψ⟩⟨ψ|) has negative eigenvalues.
- The resulting set of transformations on density operators are known as trace-preserving, completely positive (TPCP) maps.

Quantum channels: examples
- Adjoining an ancilla: ρ ↦ ρ ⊗ |0⟩⟨0|.
- Unitary transformations: ρ ↦ UρU†.
- Partial trace: ρ_AB ↦ tr_B[ρ_AB].
- That's it! All channels can be built out of these operations.

Further examples
- The depolarizing channel: ρ ↦ (1−p)ρ + p I/2.
- The dephasing channel: ρ ↦ Σⱼ |j⟩⟨j| ρ |j⟩⟨j|, equivalent to measuring {|j⟩} and then forgetting the outcome.

One last thing you should see...
- What happens if a measurement is preceded by a general quantum operation?
- This leads to more general types of measurements: positive operator-valued measures (forevermore, POVMs): {Mₖ} such that Mₖ ≥ 0 and Σₖ Mₖ = I.
- Probability of outcome k: tr[Mₖ ρ].

POVMs: what are they good for?
- The states |ψ₀⟩ = |0⟩ and |ψ₁⟩ = |+⟩ are non-orthogonal, so projective measurements cannot distinguish them perfectly.
- Let N = 1/(1 + 1/√2).
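The partial trace and the purification symmetry above can be sketched numerically; the helper `partial_trace_B` and the spectrum (0.75, 0.25) are my own illustrative choices:

```python
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """rho_A = tr_B[rho_AB], by reshaping into a (dA, dB, dA, dB) tensor
    and tracing out the two B indices."""
    t = rho_AB.reshape(dA, dB, dA, dB)
    return np.trace(t, axis1=1, axis2=3)

# Purification: |psi> = sum_i sqrt(lambda_i) |i>_A |i>_B for spectrum lambda.
lam = np.array([0.75, 0.25])
psi = np.zeros(4)
psi[0] = np.sqrt(lam[0])   # component on |0>_A |0>_B
psi[3] = np.sqrt(lam[1])   # component on |1>_A |1>_B
rho_AB = np.outer(psi, psi)

rho_A = partial_trace_B(rho_AB, 2, 2)
print(np.diag(rho_A))      # recovers the spectrum [0.75, 0.25]

# Symmetry: rho_A and rho_B share their non-zero eigenvalues.
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)
print(np.allclose(np.linalg.eigvalsh(rho_A), np.linalg.eigvalsh(rho_B)))  # True
```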
- Exercise: show that M₀ = N|1⟩⟨1|, M₁ = N|−⟩⟨−|, M₂ = I − M₀ − M₁ is a POVM.
- Note (for the pair |ψ₀⟩ = |0⟩, |ψ₁⟩ = |+⟩): outcome 0 implies the state was |ψ₁⟩, since M₀|0⟩ = 0; outcome 1 implies the state was |ψ₀⟩, since M₁|+⟩ = 0; outcome 2 is inconclusive.
- Instead of imperfect distinguishability all of the time, the POVM provides perfect distinguishability some of the time.

Notions of distinguishability
Basic requirement: quantum channels do not increase "distinguishability".
- Fidelity: F(ρ,σ) = max |⟨ψ_ρ|ψ_σ⟩|², maximized over purifications; equivalently F(ρ,σ) = [tr|ρ^½σ^½|]². F = 0 for perfectly distinguishable states, F = 1 for identical states. Monotonicity: F(Λ(ρ), Λ(σ)) ≥ F(ρ,σ).
- Trace distance: T(ρ,σ) = ‖ρ−σ‖₁. T = 2 for perfectly distinguishable states, T = 0 for identical states. Operationally, T(ρ,σ) = 2 max |p(k=0|ρ) − p(k=0|σ)|, maximized over measurements {Mₖ}. Monotonicity: T(ρ,σ) ≥ T(Λ(ρ), Λ(σ)).
- The statements made today hold for both measures.

Back to information theory!

Quantifying uncertainty
- Let ρ = Σ_x p(x) |ψₓ⟩⟨ψₓ| be a density operator.
- von Neumann entropy: H(ρ) = −tr[ρ log ρ], equal to the Shannon entropy of the eigenvalues of ρ.
- Analog of a joint random variable: ρ_AB describes the composite system A ⊗ B, with H(A) = H(ρ_A) = H(tr_B ρ_AB).

Quantifying uncertainty: examples
- H(|ψ⟩⟨ψ|) = 0
- H(I/2) = 1
- H(ρ ⊗ σ) = H(ρ) + H(σ)
- H(I/2ⁿ) = n
- H(pρ ⊕ (1−p)σ) = H(p, 1−p) + pH(ρ) + (1−p)H(σ)

Compression
- Source of independent copies of ρ_AB: ρ_AB ⊗ ρ_AB ⊗ … ⊗ ρ_AB.
- Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A. [Schumacher; Petz]

The typical subspace
- Diagonalize ρ = Σ_x p(x) |eₓ⟩⟨eₓ|. Then ρ^⊗n = Σ_{xⁿ} p(xⁿ) |e_{xⁿ}⟩⟨e_{xⁿ}|.
- The ε-typical projector Π_t is the projector onto the span of those |e_{xⁿ}⟩ with xⁿ typical.
- tr[ρ^⊗n Π_t] → 1 as n → ∞.

Quantifying information
- H(A|B) = H(AB) − H(B).
- For a maximally entangled state, H(AB) = 0 while ρ_B = I/2 gives H(B) = 1, so H(A|B) = 0 − 1 = −1.
- Conditional entropy can be negative!
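The negativity of the conditional entropy is easy to verify numerically for the maximally entangled pair; a sketch (the function name is mine):

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(rho) = -tr[rho log2 rho] = Shannon entropy of the eigenvalues."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# Maximally entangled state |phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)
# Trace out A (axes 0 and 2 of the (2,2,2,2) tensor) to get rho_B = I/2.
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

H_AB = von_neumann_entropy(rho_AB)   # 0: the joint state is pure
H_B = von_neumann_entropy(rho_B)     # 1: the marginal is maximally mixed
print(H_AB - H_B)                    # H(A|B) = -1
```

Classically H(X|Y) ≥ 0 always, so this is a genuinely quantum effect: the joint state is perfectly known (pure) while each marginal is maximally uncertain.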
H(AB) Quantifying information:  Quantifying information H(A) H(B|A) H(A|B) = H(AB)-H(B) I(A;B) = H(A) – H(A|B) = H(A)+H(B)-H(AB) ¸ 0 H(AB) Sending classical information through noisy channels:  Sending classical information through noisy channels Physical model of a noisy channel: (Trace-preserving, completely positive map) Sending classical information through noisy channels:  Sending classical information through noisy channels B­ n 2nH(B) X1,X2,…,Xn Sending classical information through noisy channels:  Sending classical information through noisy channels B­ n 2nH(B) X1,X2,…,Xn 2nH(B|A) 2nH(B|A) 2nH(B|A) Distinguish using well-chosen POVM Data processing inequality (Strong subadditivity):  Data processing inequality (Strong subadditivity) Alice Bob time U I(A;B)  I(A;B) ¸ I(A;B) Optimality in the HSW theorem:  Optimality in the HSW theorem Assume there exists a code with rate R with perfect decoding. Let M be the random variable corresponding to the uniform distribution over messages. nR = H(M) = I(M;M’) · I(A;B) where m Sending quantum information through noisy channels:  Sending quantum information through noisy channels Physical model of a noisy channel: (Trace-preserving, completely positive map) Entanglement and privacy: More than an analogy:  Entanglement and privacy: More than an analogy p(y,z|x) x = x1 x2 … xn y=y1 y2 … yn z = z1 z2 … zn How to send a private message from Alice to Bob? AC93 Can send private messages at rate I(X;Y)-I(X;Z) Entanglement and privacy: More than an analogy:  Entanglement and privacy: More than an analogy UA’->BE­ n |xiA’ |iBE = U­ n|xi How to send a private message from Alice to Bob? D03 Can send private messages at rate I(X:A)-I(X:E) Entanglement and privacy: More than an analogy:  Entanglement and privacy: More than an analogy UA’->BE­ n x px1/2|xiA|xiA’ x px1/2|xiA|xiBE How to send a private message from Alice to Bob? 
- [SW97, D03] Private messages can be sent at rate I(X;A) − I(X;E) = H(A) − H(E).

Conclusions: Part I
- Information theory can be generalized to analyze quantum information processing.
- It yields a rich theory of surprising conceptual simplicity.
- It gives an operational approach to thinking about quantum mechanics: compression, data transmission, superdense coding, subspace transmission, teleportation.

Some references
Part I: standard textbooks:
- Cover & Thomas, Elements of Information Theory.
- Nielsen & Chuang, Quantum Computation and Quantum Information (and references therein).
- Devetak, "The private classical capacity and quantum capacity of a quantum channel", quant-ph/0304127.
Part II: papers available at arxiv.org:
- Devetak, Harrow & Winter, "A family of quantum protocols", quant-ph/0308044.
- Horodecki, Oppenheim & Winter, "Quantum information can be negative", quant-ph/0505062.
