64 Entropy of the Normal distribution
Consider a univariate Normal distribution with location parameter \(\mu\) and scale parameter \(\sigma\). Because it is continuous, we cannot define an entropy in the usual discrete sense, but can instead use the analogous concept of differential entropy, which is defined for a univariate continuous distribution as
\[\begin{align} h[f(y)] = -\int\mathrm{d}y\,f(y)\,\ln f(y), \end{align} \tag{64.1}\]
though the choice of the base of the logarithm is arbitrary.
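As a quick sanity check on this definition, here is a minimal sketch (assuming SciPy is installed; the helper `differential_entropy` is our own, not a library function) that evaluates Eq. 64.1 by numerical quadrature for a Normal distribution and compares the result to SciPy's built-in `entropy()` method, which returns the differential entropy in nats.

```python
import numpy as np
import scipy.integrate
import scipy.stats

def differential_entropy(pdf, lower, upper):
    """Evaluate h[f] = -∫ dy f(y) ln f(y) by quadrature on [lower, upper]."""
    integrand = lambda y: -pdf(y) * np.log(pdf(y))
    h, _ = scipy.integrate.quad(integrand, lower, upper)
    return h

# Illustrative parameter values
mu, sigma = 1.5, 2.0
norm = scipy.stats.norm(loc=mu, scale=sigma)

# Integrate over ±25σ, which holds essentially all of the probability
# mass while keeping the pdf safely above floating-point underflow
h_numerical = differential_entropy(norm.pdf, mu - 25 * sigma, mu + 25 * sigma)
print(h_numerical)    # quadrature estimate of Eq. 64.1
print(norm.entropy()) # SciPy's closed-form value, for comparison
```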
a) Just using your intuition, do you think the entropy of the Normal distribution is a function of \(\mu\)?
b) Compute the entropy of a Normal distribution. (Hint: It may help to look at Gaussian integrals. But you do not have to explicitly integrate if you recall the moments of a Normal distribution, which, admittedly, are themselves computed by evaluating Gaussian integrals.) Was your intuition correct?
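If you would like to check your intuition for part (a), and later your closed-form answer for part (b), numerically, the sketch below (again assuming SciPy) evaluates the differential entropy while varying \(\mu\) with \(\sigma\) held fixed, and then varying \(\sigma\) with \(\mu\) held fixed. The parameter values are arbitrary choices for illustration.

```python
import scipy.stats

# Vary μ with σ fixed: does the entropy change?
for mu in (-10.0, 0.0, 10.0):
    print(f"mu = {mu:6.1f}, sigma = 1.0 -> h = {scipy.stats.norm(mu, 1.0).entropy():.4f}")

# Vary σ with μ fixed: does the entropy change?
for sigma in (0.5, 1.0, 2.0):
    print(f"mu =    0.0, sigma = {sigma:3.1f} -> h = {scipy.stats.norm(0.0, sigma).entropy():.4f}")
```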