# Quantum mutual information

In quantum information theory, **quantum mutual information**, or **von Neumann mutual information**, after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of Shannon mutual information.

## Motivation

For simplicity, it will be assumed that all objects in the article are finite-dimensional.

The definition of quantum mutual entropy is motivated by the classical case. For a probability distribution of two variables *p*(*x*, *y*), the two marginal distributions are

- [math]\displaystyle{ p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y). }[/math]

The classical mutual information *I*(*X*:*Y*) is defined by

- [math]\displaystyle{ I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)) }[/math]

where *S*(*q*) denotes the Shannon entropy of the probability distribution *q*.

One can calculate directly

- [math]\displaystyle{ \begin{align} S(p(x)) + S(p(y)) &= - \left (\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y) \right ) \\ &= -\left (\sum_x \left ( \sum_{y'} p(x,y') \log \sum_{y'} p(x,y') \right ) + \sum_y \left ( \sum_{x'} p(x',y) \log \sum_{x'} p(x',y) \right ) \right ) \\ &= -\left (\sum_{x,y} p(x,y) \left (\log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y) \right ) \right )\\ &= -\sum_{x,y} p(x,y) \log p(x) p(y) \end{align} }[/math]

So the mutual information is

- [math]\displaystyle{ I(X:Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x) p(y)}. }[/math]

But this is precisely the relative entropy between *p*(*x*, *y*) and *p*(*x*)*p*(*y*). In other words, if we assume the two variables *x* and *y* to be uncorrelated, mutual information is the *discrepancy in uncertainty* resulting from this (possibly erroneous) assumption.

It follows from the property of relative entropy that *I*(*X*:*Y*) ≥ 0 and equality holds if and only if *p*(*x*, *y*) = *p*(*x*)*p*(*y*).
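As a numerical check of the two classical expressions above, here is a minimal numpy sketch; the joint distribution `p_xy` is a hypothetical example, not from the original text:

```python
import numpy as np

# A hypothetical joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def shannon_entropy(p):
    """Shannon entropy in bits, with the convention 0 log 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)) ...
mi = shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy.ravel())

# ... equals the relative entropy D(p(x,y) || p(x)p(y)).
kl = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))
```

Both quantities agree and are positive here, since the example distribution is correlated; they vanish exactly when *p*(*x*, *y*) = *p*(*x*)*p*(*y*).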

## Definition

The quantum mechanical counterparts of classical probability distributions are density matrices.

Consider a quantum system that can be divided into two parts, A and B, such that independent measurements can be made on either part. The state space of the entire quantum system is then the tensor product of the spaces for the two parts.

- [math]\displaystyle{ H_{AB} := H_A \otimes H_B. }[/math]

Let *ρ*^{AB} be a density matrix acting on states in *H*_{AB}. The von Neumann entropy S(*ρ*) of a density matrix *ρ* is the quantum mechanical analog of the Shannon entropy.

- [math]\displaystyle{ S(\rho) = - \operatorname{Tr} \rho \log \rho. }[/math]
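Since *ρ* is Hermitian, S(*ρ*) can be computed from its eigenvalues. A minimal numpy sketch (the two example states are illustrative, not from the original text):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), in bits, from the eigenvalues of rho.

    Zero eigenvalues contribute nothing (convention 0 log 0 = 0).
    """
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

# A pure state has zero entropy; the maximally mixed qubit has entropy 1 bit.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
mixed = np.eye(2) / 2
```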

For a probability distribution *p*(*x*,*y*), the marginal distributions are obtained by summing (or integrating) out the variable *x* or *y*. The corresponding operation for density matrices is the partial trace. So one can assign to *ρ*^{AB} a state on the subsystem *A* by

- [math]\displaystyle{ \rho^A = \operatorname{Tr}_B \; \rho^{AB} }[/math]

where Tr_{B} is the partial trace with respect to system *B*. This is the **reduced state** of *ρ*^{AB} on system *A*. The **reduced von Neumann entropy** of *ρ*^{AB} with respect to system *A* is

- [math]\displaystyle{ \;S(\rho^A). }[/math]

*S*(*ρ ^{B}*) is defined in the same way.
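The partial trace itself can be sketched in numpy by reshaping the density matrix into a four-index tensor and contracting the *B* indices; the product-state example below is illustrative, and for it the reduction simply recovers *ρ*^{A}:

```python
import numpy as np

def partial_trace_B(rho_AB, dim_A, dim_B):
    """Tr_B of a density matrix on H_A ⊗ H_B, returning the reduced state rho^A."""
    rho = rho_AB.reshape(dim_A, dim_B, dim_A, dim_B)
    # Contract the two B indices: sum_j rho[i, j, k, j].
    return np.einsum('ijkj->ik', rho)

# Product state rho^AB = rho^A ⊗ rho^B: tracing out B recovers rho^A exactly.
rho_A = np.array([[0.7, 0.0],
                  [0.0, 0.3]])
rho_B = np.eye(2) / 2
rho_AB = np.kron(rho_A, rho_B)
```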

It can now be seen that the definition of quantum mutual information, corresponding to the classical definition, should be as follows.

- [math]\displaystyle{ \; I(A\!:\!B) := S(\rho^A) + S(\rho^B) - S(\rho^{AB}). }[/math]

Quantum mutual information can be interpreted the same way as in the classical case: it can be shown that

- [math]\displaystyle{ I(A\!:\!B) = S(\rho^{AB} \| \rho^A \otimes \rho^B) }[/math]

where [math]\displaystyle{ S(\cdot \| \cdot) }[/math] denotes quantum relative entropy.
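Both expressions can be verified numerically for a maximally entangled state, for which I(*A*:*B*) equals 2 bits. A minimal sketch (the Bell state is a standard example, chosen here for illustration):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

def log2m(rho):
    """Matrix logarithm (base 2) of a full-rank Hermitian matrix."""
    ev, U = np.linalg.eigh(rho)
    return U @ np.diag(np.log2(ev)) @ U.conj().T

# Bell state |Φ+> = (|00> + |11>)/√2, a maximally entangled two-qubit state.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)

# Reduced states via the partial trace.
r = rho_AB.reshape(2, 2, 2, 2)
rho_A = np.einsum('ijkj->ik', r)  # trace out B
rho_B = np.einsum('ijil->jl', r)  # trace out A

# Mutual information from the entropy formula ...
mi = entropy(rho_A) + entropy(rho_B) - entropy(rho_AB)

# ... and from the relative entropy S(rho^AB || rho^A ⊗ rho^B).
# rho^A ⊗ rho^B is full rank here, so log2m is well defined.
sigma = np.kron(rho_A, rho_B)
rel = -entropy(rho_AB) - np.trace(rho_AB @ log2m(sigma)).real

# Both equal 2 bits for the Bell state.
```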


Original source: https://en.wikipedia.org/wiki/Quantum mutual information.