Entropy

Entropy is a function that computes the von Neumann entropy or Rényi entropy of a density matrix. That is, given a density matrix $\rho$, it computes the following quantity:

 * $$S(\rho) := -\mathrm{Tr}\big(\rho\log_2(\rho)\big)$$

(i.e., the von Neumann entropy) or the following quantity:

 * $$S_\alpha(\rho) := \frac{1}{1-\alpha}\log_2\big(\mathrm{Tr}(\rho^\alpha)\big)$$

(i.e., the Rényi-$\alpha$ entropy).
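To make the two formulas concrete, here is a minimal NumPy sketch (not QETLAB's MATLAB implementation) that evaluates either quantity from the eigenvalues of $\rho$; the function name and signature below mirror the Syntax section but are otherwise an assumption for illustration:

```python
import numpy as np

def entropy(rho, base=2, alpha=1):
    """Von Neumann (alpha = 1) or Renyi-alpha entropy of a density matrix."""
    # A density matrix is Hermitian with non-negative eigenvalues summing to 1.
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]  # drop zero eigenvalues (0*log(0) = 0 by convention)
    if alpha == 1:
        s = -np.sum(lam * np.log(lam))                 # von Neumann entropy
    else:
        s = np.log(np.sum(lam ** alpha)) / (1 - alpha)  # Renyi-alpha entropy
    return s / np.log(base)  # convert from natural log to the requested base
```

For example, `entropy(np.eye(2) / 2)` returns 1, since the maximally-mixed qubit state has one bit of entropy in base 2.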

Syntax

 * ENT = Entropy(RHO)
 * ENT = Entropy(RHO,BASE)
 * ENT = Entropy(RHO,BASE,ALPHA)

Argument descriptions

 * RHO: A density matrix to have its entropy computed.
 * BASE (optional, default 2): The base of the logarithm used in the entropy calculation.
 * ALPHA (optional, default 1): A non-negative real parameter that determines which entropy is computed (ALPHA = 1 corresponds to the von Neumann entropy; otherwise the Rényi-ALPHA entropy is computed).

The extreme cases: pure states and maximally-mixed states
A pure state has entropy zero:
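This fact can be checked numerically; the following is a NumPy sketch for illustration and does not call the QETLAB Entropy function itself:

```python
import numpy as np

# Pure state |psi><psi| for |psi> = (|0> + |1>)/sqrt(2).
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)

# Its spectrum is {1, 0}; with the convention 0*log(0) = 0, S(rho) = 0.
lam = np.linalg.eigvalsh(rho)
lam = lam[lam > 1e-12]
S = -np.sum(lam * np.log2(lam))
print(S)  # 0, up to numerical round-off
```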

A $d$-by-$d$ maximally-mixed state has entropy $\log_2(d)$:
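Again as a NumPy sketch (not a QETLAB call), for $d = 4$ the maximally-mixed state $I/d$ gives entropy $\log_2(4) = 2$:

```python
import numpy as np

d = 4
rho = np.eye(d) / d  # maximally-mixed state on a d-dimensional space

lam = np.linalg.eigvalsh(rho)  # d copies of 1/d
S = -np.sum(lam * np.log2(lam))
print(S)  # log2(4) = 2
```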

All other states have entropy somewhere between these two extremes:
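For instance (a NumPy sketch with an arbitrarily chosen intermediate state, not a QETLAB call), the qubit state $\mathrm{diag}(3/4, 1/4)$ has entropy strictly between 0 and $\log_2(2) = 1$:

```python
import numpy as np

# A state that is neither pure nor maximally mixed.
rho = np.diag([0.75, 0.25])
lam = np.linalg.eigvalsh(rho)
S = -np.sum(lam * np.log2(lam))
print(S)  # about 0.8113, strictly between 0 and log2(2) = 1
```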