Approximations to the Log-Likelihood of the Nonlinear Mixed-Effects Model
Gaussian quadrature is used to approximate integrals of functions with respect to a given kernel by a weighted average of the integrand evaluated at predetermined abscissas.
In particular, for the kernel $\exp(-x^2)$, the quadrature rule is called Gauss-Hermite quadrature.
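As a quick illustration (a minimal sketch, not from the paper), NumPy's `hermgauss` supplies the abscissas and weights, and the substitution $x = \sqrt{2}\,z$ converts an expectation under $N(0,1)$ into an integral against the $\exp(-x^2)$ kernel:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

nodes, weights = hermgauss(20)  # 20 predetermined abscissas and weights

# Example: E[cos(X)] for X ~ N(0, 1).  Substituting x = sqrt(2) z turns
# the Gaussian density into the exp(-x^2) kernel, up to a 1/sqrt(pi) factor.
approx = np.sum(weights * np.cos(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)
exact = np.exp(-0.5)            # E[cos(X)] = exp(-1/2) for standard normal X
print(approx, exact)            # agree to many decimal places
```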
For the nonlinear mixed-effects model, the $j$-th observation on the $i$-th subject is modeled as
\[y_{ij} = f(\phi_{ij}, x_{ij}) + \varepsilon_{ij}, \quad i=1,\ldots, M,\; j=1,\ldots, n_i,\]
where $f$ is a nonlinear function of a subject-specific parameter vector $\phi_{ij}$ and a predictor vector $x_{ij}$, and the within-subject errors $\varepsilon_{ij}$ are i.i.d. $N(0, \sigma^2)$.
The subject-specific parameter vector is modeled as
\[\phi_{ij} = A_{ij}\beta + B_{ij} b_i, \quad b_i \sim N(0, \sigma^2 D),\]
where $A_{ij}$ and $B_{ij}$ are design matrices, $\beta$ is the fixed-effects vector, and $b_i$ is the random-effects vector for subject $i$. Maximum likelihood estimation is based on the marginal density of $y$,
\[p(y\mid \beta, D, \sigma^2) = \int p(y\mid b, \beta, D, \sigma^2)\, p(b)\, db.\]
In general this integral has no closed-form expression when the model function $f$ is nonlinear in $b_i$, so different approximations have been proposed for evaluating it.
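To make the integral concrete, here is a hedged sketch (not the paper's code) that evaluates one subject's marginal log-likelihood by Gauss-Hermite quadrature; the exponential-decay model function, the scalar random effect, and all parameter values are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.stats import norm

def f(phi, x):
    return np.exp(-phi * x)           # toy nonlinear model function (assumed)

def marginal_loglik_i(y, x, beta, sigma, sigma_b, deg=30):
    """log p(y_i | beta, sigma, sigma_b) with a scalar b_i ~ N(0, sigma_b^2)."""
    z, w = hermgauss(deg)
    b = np.sqrt(2.0) * sigma_b * z    # map the exp(-z^2) kernel to N(0, sigma_b^2)
    # conditional likelihood p(y_i | b) at each node, product over observations j
    lik = np.array([np.prod(norm.pdf(y, loc=f(beta + bk, x), scale=sigma))
                    for bk in b])
    return np.log(np.sum(w * lik) / np.sqrt(np.pi))

# simulate one subject under the assumed truth, then evaluate
rng = np.random.default_rng(0)
x = np.linspace(0.1, 2.0, 8)
y = f(1.0 + rng.normal(0.0, 0.3), x) + rng.normal(0.0, 0.05, size=x.size)
print(marginal_loglik_i(y, x, beta=1.0, sigma=0.05, sigma_b=0.3))
```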
The paper (Pinheiro and Bates, 1995) considers four different approximations:
- Lindstrom and Bates's (1990) LME method, which takes a first-order Taylor expansion of the model function $f$ around the conditional (on $D$) modes of the random effects
- a modified Laplacian approximation
- importance sampling
- Gaussian quadrature
The Gaussian quadrature rule here can be viewed as a deterministic version of Monte Carlo integration. Because importance sampling tends to be much more efficient than simple Monte Carlo integration, the paper also considers the analogue of importance sampling in the Gaussian quadrature context, denoted adaptive Gaussian quadrature.
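The sketch below (again an illustrative assumption, not the paper's implementation) adapts the previous example: it recenters the abscissas at the mode $\hat b$ of the integrand $p(y\mid b)\,p(b)$ and rescales them by the curvature there, the quadrature counterpart of choosing a good importance distribution; the finite-difference Hessian is a shortcut for exposition.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def f(phi, x):
    return np.exp(-phi * x)             # same toy model as above (assumed)

def log_g(b, y, x, beta, sigma, sigma_b):
    # log integrand: conditional log-likelihood plus random-effect log-density
    return (np.sum(norm.logpdf(y, loc=f(beta + b, x), scale=sigma))
            + norm.logpdf(b, scale=sigma_b))

def adaptive_loglik_i(y, x, beta, sigma, sigma_b, deg=10):
    z, w = hermgauss(deg)
    neg = lambda b: -log_g(b, y, x, beta, sigma, sigma_b)
    b_hat = minimize_scalar(neg).x      # mode of the integrand
    eps = 1e-4                          # curvature via central differences
    h = (neg(b_hat + eps) - 2 * neg(b_hat) + neg(b_hat - eps)) / eps**2
    b_k = b_hat + np.sqrt(2.0 / h) * z  # recentered, rescaled abscissas
    logs = np.array([log_g(bk, y, x, beta, sigma, sigma_b) for bk in b_k])
    return np.log(np.sum(w * np.exp(z**2 + logs))) + 0.5 * np.log(2.0 / h)

rng = np.random.default_rng(0)
x = np.linspace(0.1, 2.0, 8)
y = f(1.0 + rng.normal(0.0, 0.3), x) + rng.normal(0.0, 0.05, size=x.size)
print(adaptive_loglik_i(y, x, beta=1.0, sigma=0.05, sigma_b=0.3))
```

With the grid matched to the integrand, far fewer abscissas (here 10 rather than 30) typically achieve comparable accuracy.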