Computes the von Neumann or Rényi entropy of a density matrix

Other toolboxes required: none
Function category: Information theory

Entropy is a function that computes the von Neumann entropy or Rényi entropy of a density matrix. That is, given a density matrix \(\rho\), it computes the following quantity:

\[S(\rho) := -\mathrm{Tr}\big(\rho\log_2(\rho)\big)\]

(i.e., the von Neumann entropy) or the following quantity:

\[S_\alpha(\rho) := \frac{1}{1-\alpha}\log_2\big(\mathrm{Tr}(\rho^\alpha)\big)\]

(i.e., the Rényi-\(\alpha\) entropy).
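
Both quantities depend only on the eigenvalue spectrum of \(\rho\). The following minimal MATLAB sketch illustrates this (it is not QETLAB's actual implementation; it assumes a density matrix rho and a parameter alpha \(\neq\) 1 are already in the workspace):

  lam = eig(rho);                              % eigenvalues of a density matrix are real and non-negative
  lam = lam(lam > 1e-12);                      % drop (numerically) zero eigenvalues, since 0*log(0) = 0
  S_vn = -sum(lam.*log2(lam));                 % von Neumann entropy, in bits
  S_renyi = log2(sum(lam.^alpha))/(1 - alpha); % Renyi-alpha entropy, in bits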

Syntax

  • ENT = Entropy(RHO)
  • ENT = Entropy(RHO,BASE)
  • ENT = Entropy(RHO,BASE,ALPHA)

Argument descriptions

  • RHO: The density matrix whose entropy is to be computed.
  • BASE (optional, default 2): The base of the logarithm used in the entropy calculation.
  • ALPHA (optional, default 1): A non-negative real parameter that determines which entropy is computed: ALPHA = 1 gives the von Neumann entropy, and any other value gives the Rényi-ALPHA entropy (see the examples after this list).
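
For example (the outputs are omitted here since RHO is generated randomly; the choice of BASE = exp(1) below is just one way to obtain the entropy in nats rather than bits):

>> rho = RandomDensityMatrix(4);
>> Entropy(rho)          % von Neumann entropy in bits (BASE = 2, ALPHA = 1)
>> Entropy(rho,exp(1))   % von Neumann entropy in nats
>> Entropy(rho,2,2)      % Renyi-2 entropy, equal to -log2(trace(rho^2))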


The extreme cases: pure states and maximally-mixed states

A pure state has entropy zero:

>> Entropy(RandomDensityMatrix(4,0,1)) % entropy of a random 4-by-4 rank-1 density matrix

ans =

   7.3396e-15 % numerical round-off error: this is effectively zero

A d-by-d maximally-mixed state has entropy \(\log_2(d)\):

>> Entropy(eye(4)/4)

ans =

     2
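
This holds in any dimension; for example (a quick check, with the output omitted):

>> Entropy(eye(8)/8)  % equals log2(8) = 3, up to numerical round-off
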
All other states have entropy somewhere between these two extremes:

>> Entropy(RandomDensityMatrix(4))

The output here is random (it depends on the randomly-generated density matrix), but it almost surely lies strictly between 0 and \(\log_2(4) = 2\).


The Rényi-\(\alpha\) entropy approaches the von Neumann entropy as \(\alpha \rightarrow 1\) (this can be seen by applying l'Hôpital's rule to the definition of \(S_\alpha\)), which is why ALPHA = 1 corresponds to the von Neumann entropy.
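
As a quick sanity check of this limit (again, the outputs are omitted since the density matrix is random, but both differences below should be very close to zero):

>> rho = RandomDensityMatrix(4);
>> Entropy(rho,2,0.999) - Entropy(rho)
>> Entropy(rho,2,1.001) - Entropy(rho)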

Source code

Click here to view this function's source code on GitHub.