
Where does the formula for the entropy of a Gaussian distribution come from?


The formula in question is

$$ S = \frac{k}{2}\bigl(1 + \log 2\pi\bigr) + \frac{1}{2}\log\bigl(m^2 \det A^{-1}\bigr). $$

The way to understand and prove these relationships is to think of the matrix in its diagonal basis. Any symmetric matrix can be diagonalized into the form

$$ \begin{pmatrix} \lambda_1 & 0 & 0 & \cdots \\ 0 & \lambda_2 & 0 & \cdots \\ 0 & 0 & \lambda_3 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} $$

where $\lambda_1$, $\lambda_2$ and $\lambda_3$ are its first three eigenvalues. This diagonalization involves an orthogonal transformation, i.e. a rotation, and such transformations preserve traces, determinants, and entropies. So whenever you have an integral of the form

$$ \int d^k w \, \exp\Bigl(-\tfrac{1}{2}\, w^{\top} A\, w\Bigr), $$

you can (for convenience of thought) pretend that $A$ is in fact a diagonal matrix, in which case the integral takes the separable form

$$ \int d^k w \, \exp\Bigl(-\tfrac{1}{2} \textstyle\sum_i \lambda_i w_i^2\Bigr). $$

Such separable integrals are straightforward: the $k$-dimensional problem factorizes into $k$ independent one-dimensional ones, so the entropy is a sum of one-dimensional Gaussian entropies. All that remains is to compute the entropy of a one-dimensional Gaussian, and that is left as an exercise for the reader.

How to derive $w_{\mathrm{MP}} = A^{-1} B\, w_{\mathrm{ML}}$?

Write out the expression for $M(w)$, differentiate with respect to $w$, and set the derivative to zero.
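The cut-off answer above can be completed along the following lines. This is a sketch under assumptions not stated in the post: that $M(w)$ is the usual quadratic objective with a data term whose Hessian is $B$ and whose minimum is $w_{\mathrm{ML}}$, a weight-decay term with coefficient $\alpha$, and $A \equiv B + \alpha I$ (the standard Bayesian linear-regression setting).

```latex
% Sketch only: the forms of M(w), B, alpha, and A = B + alpha*I are
% assumed definitions, not taken from the original post.
\begin{align*}
M(w) &= \tfrac{1}{2}\,(w - w_{\mathrm{ML}})^{\top} B\,(w - w_{\mathrm{ML}})
        + \tfrac{\alpha}{2}\, w^{\top} w + \text{const} \\
\nabla_{w} M &= B\,(w - w_{\mathrm{ML}}) + \alpha\, w = 0 \\
\Rightarrow\ (B + \alpha I)\, w_{\mathrm{MP}} &= B\, w_{\mathrm{ML}}
\quad\Longrightarrow\quad
w_{\mathrm{MP}} = A^{-1} B\, w_{\mathrm{ML}},
\qquad A \equiv B + \alpha I .
\end{align*}
```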
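Returning to the entropy formula at the top of this answer: here is a short numerical sanity check (a sketch, not part of the original answer), taking $m = 1$ and natural logarithms so entropy is in nats, with a made-up precision matrix $A$. It confirms that the determinant form, the eigenvalue (diagonal-basis) form, and SciPy's reference entropy all agree.

```python
# Numerical check of S = k/2 (1 + ln 2*pi) + 1/2 ln det A^{-1},
# where A is the precision matrix (inverse covariance) of a
# k-dimensional Gaussian. The matrix A below is made up for illustration.
import numpy as np
from scipy.stats import multivariate_normal

k = 3
rng = np.random.default_rng(0)
R = rng.standard_normal((k, k))
A = R @ R.T + k * np.eye(k)  # a random symmetric positive-definite precision matrix

# Closed form, using the determinant of A^{-1}:
S_closed = 0.5 * k * (1 + np.log(2 * np.pi)) \
    + 0.5 * np.log(np.linalg.det(np.linalg.inv(A)))

# Diagonal-basis argument: ln det A^{-1} = -sum_i ln lambda_i, so the
# entropy separates into a sum of one-dimensional Gaussian entropies,
# one per eigenvalue of A.
lam = np.linalg.eigvalsh(A)
S_eig = np.sum(0.5 * (1 + np.log(2 * np.pi)) - 0.5 * np.log(lam))

# Library reference: differential entropy of N(0, A^{-1}).
S_ref = multivariate_normal(mean=np.zeros(k), cov=np.linalg.inv(A)).entropy()

print(S_closed, S_eig, S_ref)  # all three agree to numerical precision
```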
