Univariate Normal

Usual parameterisation

Probability density function

The parameters of a univariate Gaussian distribution are $\mu$, its mean, and $\sigma^2$, its variance.

It can also be parameterised by its precision $\lambda = \frac{1}{\sigma^2}$.
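
The corresponding probability density function is

$p(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$

or, in terms of the precision,

$p(x \mid \mu, \lambda) = \sqrt{\frac{\lambda}{2\pi}} \exp\left(-\frac{\lambda}{2}(x - \mu)^2\right)$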

Maximum likelihood estimators

Let $(x_n)_{n=1}^{N}$ be a set of $N$ observed realisations of a Normal distribution.

$\hat{\mu} \mid (x_n)$ $= \overline{x} = \frac{1}{N}\sum_{n=1}^N x_n$ $\sim \mathcal{N}\left(\mu, \frac{\sigma^2}{N}\right)$
$\hat{\sigma}^2 \mid (x_n)$ $= \overline{x^2} - \overline{x}^2 = \frac{1}{N}\sum_{n=1}^N (x_n - \overline{x})^2$ $\sim \frac{\sigma^2}{N} \chi^2\left(N-1\right)$
$\hat{\sigma}^2 \mid \mu, (x_n)$ $= \overline{x^2} + \mu(\mu - 2\overline{x}) = \frac{1}{N}\sum_{n=1}^N (x_n - \mu)^2$ $\sim \frac{\sigma^2}{N} \chi^2\left(N\right)$
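
These estimators follow from maximising the log-likelihood

$\ln p\left((x_n) \mid \mu, \sigma^2\right) = -\frac{N}{2}\ln\left(2\pi\sigma^2\right) - \frac{1}{2\sigma^2}\sum_{n=1}^N (x_n - \mu)^2$

whose derivatives with respect to $\mu$ and $\sigma^2$ vanish at $\mu = \overline{x}$ and $\sigma^2 = \frac{1}{N}\sum_{n=1}^N (x_n - \overline{x})^2$ (or at $\sigma^2 = \frac{1}{N}\sum_{n=1}^N (x_n - \mu)^2$ when $\mu$ is known).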

Conjugate prior

We list here the distributions that can be used as conjugate priors for the parameters of a univariate Normal distribution:

$\mu \mid \lambda$ Univariate Normal $\mathcal{N}_\lambda$
$\lambda \mid \mu$ Gamma $\mathcal{G}_\mathcal{N}$
$\sigma^2 \mid \mu$ Inverse-Gamma $\mathrm{Inv-}\mathcal{G}_\mathcal{N}$
$\mu, \lambda$ Normal-Gamma $\mathcal{N}\mathcal{G}$
$\mu, \sigma^2$ Normal-Inverse-Gamma $\mathcal{N}\mathrm{Inv-}\mathcal{G}$

Update equations can be found in the Conjugate prior article.

Kullback-Leibler divergence

Let $p = \mathcal{N}(\mu_1, \sigma_1^2)$ and $q = \mathcal{N}(\mu_2, \sigma_2^2)$ be two univariate Normal distributions. The KL-divergence can be written as
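
$\mathrm{KL}(p \,\|\, q) = H(p, q) - H(p)$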

where $H(p, q)$ is the cross-entropy and $H(p) = H(p, p)$ the entropy. We have
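
$H(p, q) = \frac{1}{2}\ln\left(2\pi\sigma_2^2\right) + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2}$
$H(p) = \frac{1}{2}\ln\left(2\pi e \sigma_1^2\right)$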

Consequently
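
$\mathrm{KL}(p \,\|\, q) = \frac{1}{2}\left[\ln\frac{\sigma_2^2}{\sigma_1^2} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{\sigma_2^2} - 1\right]$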

Or, if a parameterisation based on the precision is used,
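
$\mathrm{KL}(p \,\|\, q) = \frac{1}{2}\left[\ln\frac{\lambda_1}{\lambda_2} + \frac{\lambda_2}{\lambda_1} + \lambda_2\left(\mu_1 - \mu_2\right)^2 - 1\right]$

where $\lambda_1 = 1/\sigma_1^2$ and $\lambda_2 = 1/\sigma_2^2$.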

“Normal mean conjugate” parameterisation

When the Normal distribution is used as a conjugate prior for the mean of another Normal distribution with known precision $\lambda$, it makes sense to parameterise it in terms of its expected value, $\mu_0$, and degrees of freedom, $n_0$ (the usual pseudo-count convention is assumed here):
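
$\mu \sim \mathcal{N}\left(\mu_0, (n_0\lambda)^{-1}\right)$

Under this convention, $n_0$ acts as a number of prior pseudo-observations, each contributing a precision $\lambda$: after observing $N$ points with mean $\overline{x}$, the posterior is of the same form with $n_0 \leftarrow n_0 + N$ and $\mu_0 \leftarrow \frac{n_0\mu_0 + N\overline{x}}{n_0 + N}$ (see the Conjugate prior article for the general update equations).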

Kullback-Leibler divergence

For two such distributions $p = \mathcal{N}\left(\mu_1, (n_1\lambda)^{-1}\right)$ and $q = \mathcal{N}\left(\mu_2, (n_2\lambda)^{-1}\right)$ sharing the known precision $\lambda$, the KL-divergence can be written as
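
$\mathrm{KL}(p \,\|\, q) = \frac{1}{2}\left[\ln\frac{n_1}{n_2} + \frac{n_2}{n_1} + n_2\lambda\left(\mu_1 - \mu_2\right)^2 - 1\right]$

which follows from the precision-based expression above with $\lambda_1 = n_1\lambda$ and $\lambda_2 = n_2\lambda$.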


Created by Yaël Balbastre on 6 April 2018. Last edited on 6 April 2018.