The derivation of Gaussian distribution entropy.

This post was automatically translated from Chinese by an LLM. If you find any translation errors, please leave a comment to help me improve the translation. Thanks!

This article mainly derives the entropy of a univariate Gaussian distribution:


\[ \begin{equation} p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} \end{equation} \]

\[ \begin{align} H(x) & = -\int_{-\infty}^{+\infty} p(x) \ln p(x) \,\mathrm{d}x \notag \\ & = -\int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} \cdot \ln\left( \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} \right) \mathrm{d}x \notag \\ & = -\int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} \cdot \left( -\ln \sqrt{2\pi\sigma^2} - \frac{(x-\mu)^2}{2\sigma^2} \right) \mathrm{d}x \notag \\ & = \int_{-\infty}^{+\infty} \frac{\ln \sqrt{2\pi\sigma^2}}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} \,\mathrm{d}x + \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}} \cdot \frac{(x-\mu)^2}{2\sigma^2} \,\mathrm{d}x \notag \\ & = \ln\sqrt{2\pi\sigma^2} + \frac{1}{\sqrt{\pi}}\int_{-\infty}^{+\infty} t^2 e^{-t^2} \,\mathrm{d}t \notag \\ & = \ln\sqrt{2\pi\sigma^2} + \frac12 \notag \\ & = \frac12\left(\ln(2\pi\sigma^2)+1\right) \end{align} \] The first integral reduces to $\ln\sqrt{2\pi\sigma^2}$ because the Gaussian density integrates to $1$; the second uses the substitution $t = \frac{x-\mu}{\sqrt{2}\sigma}$ (so that $\mathrm{d}x = \sqrt{2}\sigma\,\mathrm{d}t$) together with the Gaussian integral: \[ \begin{equation} \int_{-\infty}^{+\infty} e^{-x^2}\,\mathrm{d}x = \sqrt\pi \end{equation} \]
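As a sanity check on the closed form $H = \frac12(\ln(2\pi\sigma^2)+1)$, we can integrate $-p(x)\ln p(x)$ numerically and compare. This is just an illustrative sketch (the function names and the midpoint-rule truncation to $\mu \pm 12\sigma$ are my own choices, not part of the derivation):

```python
import math

def gaussian_entropy_numeric(mu, sigma, span=12.0, n=200000):
    """Approximate -∫ p(x) ln p(x) dx with a midpoint Riemann sum.

    The integration interval is truncated to [mu - span*sigma, mu + span*sigma];
    the Gaussian tails beyond that contribute negligibly.
    """
    a = mu - span * sigma
    h = 2 * span * sigma / n
    total = 0.0
    norm = math.sqrt(2 * math.pi * sigma ** 2)
    for i in range(n):
        x = a + (i + 0.5) * h
        p = math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / norm
        total -= p * math.log(p) * h
    return total

def gaussian_entropy_closed_form(sigma):
    """The closed form derived above: H = (ln(2*pi*sigma^2) + 1) / 2."""
    return 0.5 * (math.log(2 * math.pi * sigma ** 2) + 1)

# For the standard normal (mu=0, sigma=1), H ≈ 1.4189; the numeric and
# closed-form values should agree to high precision.
print(gaussian_entropy_numeric(0.0, 1.0), gaussian_entropy_closed_form(1.0))
```

Note that the entropy depends only on $\sigma$, not on $\mu$: shifting a density does not change its differential entropy, which the numeric check confirms for any `mu` you pass in.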

\[ \begin{align} \int_{-\infty}^{+\infty} x^2 e^{-x^2}\,\mathrm{d}x & = -\frac12 \int_{-\infty}^{+\infty} x(-2x)e^{-x^2}\,\mathrm{d}x \notag \\ &=-\frac12 \int_{-\infty}^{+\infty} x\,\mathrm{d}e^{-x^2} \notag \\ &=-\frac12 \left( x e^{-x^2}\Big|_{-\infty}^{+\infty}-\int_{-\infty}^{+\infty} e^{-x^2}\,\mathrm{d}x \right) \notag \\ &=\frac12 \int_{-\infty}^{+\infty} e^{-x^2}\,\mathrm{d}x \notag \\ &=\frac {\sqrt{\pi}}{2} \end{align} \] The boundary term $x e^{-x^2}\big|_{-\infty}^{+\infty}$ vanishes because the exponential decays faster than $x$ grows.
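The integration-by-parts result $\int_{-\infty}^{+\infty} x^2 e^{-x^2}\,\mathrm{d}x = \frac{\sqrt\pi}{2}$ can likewise be checked numerically. A minimal sketch (the function name and the truncation to $[-10, 10]$ are my own assumptions; the integrand decays like $e^{-x^2}$, so that interval is ample):

```python
import math

def moment_integral(L=10.0, n=100000):
    """Midpoint-rule approximation of ∫ x² e^{-x²} dx over [-L, L]."""
    h = 2 * L / n
    total = 0.0
    for i in range(n):
        x = -L + (i + 0.5) * h
        total += x * x * math.exp(-x * x) * h
    return total

# Should both print roughly 0.8862 (= sqrt(pi) / 2).
print(moment_integral(), math.sqrt(math.pi) / 2)
```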