WeiYa's Work Yard

A traveler with endless curiosity, who fell into the ocean of statistics, tries to write down his ideas and notes to save himself.

The Normal Model

Tags: Bayesian Inference

The normal model

\[p(y\mid \theta,\sigma^2)=\frac{1}{\sqrt{2\pi \sigma^2}}e^{-\frac{1}{2}\left(\frac{y-\theta}{\sigma}\right)^2},\quad -\infty<y<\infty\]
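The density can be coded directly from the formula; a minimal sketch (the function name and test point are mine, not from the note):

```python
import math

def normal_pdf(y, theta, sigma2):
    """Density of the normal model p(y | theta, sigma^2)."""
    return math.exp(-0.5 * (y - theta) ** 2 / sigma2) / math.sqrt(2 * math.pi * sigma2)

# standard normal (theta = 0, sigma^2 = 1) at y = 0: 1/sqrt(2*pi)
print(round(normal_pdf(0.0, 0.0, 1.0), 4))  # 0.3989
```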

Inference for the mean, conditional on the variance

Joint inference for the mean and variance

Posterior inference is based on Bayes' rule:

\[p(\theta,\sigma^2\mid y_1,\ldots,y_n)=p(y_1,\ldots,y_n\mid \theta,\sigma^2)p(\theta,\sigma^2)/p(y_1,\ldots,y_n)\]

and the joint prior distribution can be factored as

\[p(\theta,\sigma^2)=p(\theta\mid\sigma^2)p(\sigma^2)\]

The inverse-gamma distribution: if the precision $1/\sigma^2$ has a gamma distribution,

\[1/\sigma^2\sim \text{gamma}(a,b),\]

then the variance $\sigma^2$ has an inverse-gamma distribution,

\[\sigma^2\sim \text{inverse-gamma}(a,b).\]
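The gamma/inverse-gamma correspondence is easy to check by simulation; a sketch with arbitrarily chosen $a$, $b$ (note that Python's `random.gammavariate` takes a scale, not a rate):

```python
import random

random.seed(0)
a, b = 3.0, 2.0  # shape and rate, chosen arbitrarily for illustration

# draw precisions 1/sigma^2 ~ gamma(a, rate=b); gammavariate takes (shape, scale)
precisions = [random.gammavariate(a, 1.0 / b) for _ in range(200_000)]
variances = [1.0 / p for p in precisions]

# if 1/sigma^2 ~ gamma(a, b), then sigma^2 ~ inverse-gamma(a, b),
# whose mean is b / (a - 1) = 1.0 here
print(sum(variances) / len(variances))
```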

Posterior inference proceeds from the following model:

\(\begin{align} 1/\sigma^2 & \sim \text{gamma}(\nu_0/2, \nu_0\sigma^2_0/2)\\ \theta\mid \sigma^2 &\sim \text{normal}(\mu_0,\sigma^2/\kappa_0)\\ Y_1,\ldots,Y_n\mid \theta,\sigma^2 &\overset{\text{i.i.d.}}{\sim} \text{normal}(\theta, \sigma^2) \end{align}\)
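Under this conjugate setup the posterior has the same form, with standard parameter updates $(\kappa_n, \mu_n, \nu_n, \sigma^2_n)$ that the note does not derive. A Monte Carlo sketch with hypothetical hyperparameters and data, chosen only for illustration:

```python
import random
from statistics import mean, variance

random.seed(1)

# hypothetical prior hyperparameters and data, for illustration only
mu0, k0 = 1.9, 1.0      # prior mean for theta and its "prior sample size"
s20, nu0 = 0.010, 1.0   # prior variance guess and its "prior sample size"
y = [1.64, 1.70, 1.72, 1.74, 1.82, 1.82, 1.82, 1.90, 2.08]

n, ybar, s2 = len(y), mean(y), variance(y)

# standard conjugate updates for the normal model (not derived in the note)
kn = k0 + n
mun = (k0 * mu0 + n * ybar) / kn
nun = nu0 + n
s2n = (nu0 * s20 + (n - 1) * s2 + k0 * n * (ybar - mu0) ** 2 / kn) / nun

# Monte Carlo samples from the joint posterior p(theta, sigma^2 | y_1..y_n):
# draw 1/sigma^2 from its gamma posterior, then theta given sigma^2
post = []
for _ in range(10_000):
    prec = random.gammavariate(nun / 2, 2 / (nun * s2n))  # 1/sigma^2 | y
    sigma2 = 1 / prec
    theta = random.gauss(mun, (sigma2 / kn) ** 0.5)       # theta | sigma^2, y
    post.append((theta, sigma2))

print(mun)  # the marginal posterior of theta is centered at mu_n
```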

Bias, variance and mean squared error

\[\text{MSE}(\hat\theta\mid\theta_0)=\text{Var}(\hat\theta\mid\theta_0)+\text{Bias}^2(\hat\theta\mid\theta_0)\]
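The decomposition can be verified empirically. A sketch using a shrinkage estimator of the mean (the true values $\theta_0$, $\sigma$, the shrinkage weight $w$, and the guess $\mu_0$ are hypothetical choices of mine):

```python
import random
from statistics import mean

random.seed(2)
theta0, sigma, n = 2.0, 1.0, 10  # hypothetical true values, for illustration

# biased estimator: shrink the sample mean toward a prior guess mu0 = 0
w, mu0 = 0.9, 0.0
def estimate(sample):
    return w * mean(sample) + (1 - w) * mu0

est = [estimate([random.gauss(theta0, sigma) for _ in range(n)])
       for _ in range(50_000)]

m = mean(est)
mse = mean((e - theta0) ** 2 for e in est)
bias = m - theta0
var = mean((e - m) ** 2 for e in est)

# decomposition: MSE = Var + Bias^2 (an exact identity for empirical moments)
print(round(mse, 3), round(var + bias ** 2, 3))
```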

Prior specification based on expectations

The normal model can be written in exponential family form

\[p(y\mid\phi) = h(y)c(\phi)\exp(\phi^Tt(y))\]

where
  • $t(y)=(y,y^2)$
  • $\phi = (\theta/\sigma^2,-(2\sigma^2)^{-1})$
  • $c(\phi)=\vert \phi_2\vert^{1/2}\exp(\phi_1^2/(4\phi_2))$
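The two parametrizations can be checked against each other numerically. The sketch below assumes $c(\phi)=\vert\phi_2\vert^{1/2}\exp(\phi_1^2/(4\phi_2))$ and $h(y)=\pi^{-1/2}$, one consistent choice obtained by completing the factorization of the normal density:

```python
import math

def normal_pdf(y, theta, sigma2):
    """Normal density in the usual (theta, sigma^2) parametrization."""
    return math.exp(-0.5 * (y - theta) ** 2 / sigma2) / math.sqrt(2 * math.pi * sigma2)

def expfam_pdf(y, theta, sigma2):
    """Same density via the exponential family factorization h(y)c(phi)exp(phi^T t(y))."""
    phi1, phi2 = theta / sigma2, -1 / (2 * sigma2)  # natural parameter from the note
    c = abs(phi2) ** 0.5 * math.exp(phi1 ** 2 / (4 * phi2))
    h = math.pi ** -0.5  # base measure implied by this choice of c(phi)
    return h * c * math.exp(phi1 * y + phi2 * y ** 2)  # t(y) = (y, y^2)

# the two parametrizations agree at an arbitrary point
print(normal_pdf(1.3, 2.0, 0.5), expfam_pdf(1.3, 2.0, 0.5))
```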

A conjugate prior distribution is \(p(\phi\mid n_0,t_0)\propto c(\phi)^{n_0}\exp(n_0t_0^T\phi),\)

where $t_0=(E(Y), E(Y^2))$.

The normal model for non-normal data


Published in categories Note