Where does the formula for the entropy of a Gaussian distribution come from?
The formula being

    S = \frac{k}{2} \left( 1 + \log(2\pi) \right) + \frac{1}{2} \log \left( m^2 \det A^{-1} \right).

The way to understand and prove these relationships is to think of the matrix in its diagonal basis. Any symmetric matrix can be diagonalized into the form

    \begin{pmatrix}
    \lambda_1 & 0 & 0 & 0 & \cdots \\
    0 & \lambda_2 & 0 & 0 & \cdots \\
    0 & 0 & \lambda_3 & 0 & \cdots \\
    \vdots & \vdots & \vdots & \vdots & \ddots
    \end{pmatrix}

where \lambda_1, \lambda_2, \lambda_3, \ldots are its eigenvalues. This diagonalization involves an orthogonal transformation, i.e. a rotation, and such transformations preserve both traces and determinants, and entropies too. So whenever you have an integral of the form

    \int d^k w \, \exp\left( -\tfrac{1}{2} w^T A w \right),

you can, for convenience of thought, pretend that A is in fact a diagonal matrix, in which case the integral has the separable form

    \int d^k w \, \exp\left( -\tfrac{1}{2} \sum_i \lambda_i w_i^2 \right).

Such separable integrals are straightforward: they factor into a product of one-dimensional Gaussian integrals, so the entropy of the k-dimensional Gaussian is the sum of k one-dimensional entropies. All that remains is to compute the entropy of a one-dimensional Gaussian, and that is left as an exercise for the reader. (A numeric sanity check of the full formula is given at the end of this section.)

How to derive w_{MP} = A^{-1} B w_{ML}?

Write out the expression for M(w). Differentiate it with respect to w, set the derivative to zero, and solve for w: the most probable parameters w_{MP} are the ones at which \nabla M vanishes.
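Here is a sketch of the remaining steps, assuming the standard evidence-framework definitions (the question does not restate them, so treat them as assumptions): M(w) = \alpha E_W(w) + \beta E_D(w), with weight-decay term E_W(w) = \tfrac{1}{2} w^T w, with B = \beta \nabla\nabla E_D the curvature of the data term, and with A = \nabla\nabla M = \alpha I + B. For a model linear in w, E_D is exactly quadratic about its minimum w_{ML}, so

    \beta E_D(w) = \beta E_D(w_{ML}) + \tfrac{1}{2} (w - w_{ML})^T B (w - w_{ML}),

and hence

    \nabla M(w) = \alpha w + B (w - w_{ML}).

Setting \nabla M(w_{MP}) = 0 gives (\alpha I + B) w_{MP} = B w_{ML}, i.e.

    w_{MP} = A^{-1} B w_{ML}.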
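And here is the promised numeric sanity check of the entropy formula, a minimal sketch in Python. It assumes natural logarithms, takes the measure constant m = 1, and uses an arbitrary 4 x 4 positive-definite A chosen for illustration; the point is that the full formula agrees with the sum of one-dimensional entropies computed in the diagonal basis, exactly as the rotation argument above says it must.

    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.normal(size=(4, 4))
    A = G @ G.T + 4.0 * np.eye(4)   # arbitrary symmetric positive-definite A
    k = A.shape[0]

    # Full formula (natural logs, measure m = 1):
    # S = k/2 (1 + log 2 pi) + 1/2 log det A^{-1}
    S_full = 0.5 * k * (1.0 + np.log(2.0 * np.pi)) \
             + 0.5 * np.log(np.linalg.det(np.linalg.inv(A)))

    # Diagonal-basis view: a rotation to the eigenbasis of A leaves the
    # entropy unchanged, and each eigenvalue lambda_i contributes the
    # entropy of a one-dimensional Gaussian of variance 1/lambda_i,
    # namely 1/2 log(2 pi e / lambda_i).
    eigenvalues = np.linalg.eigvalsh(A)
    S_diag = np.sum(0.5 * np.log(2.0 * np.pi * np.e / eigenvalues))

    print(S_full, S_diag)   # the two values agree to numerical precision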