# Fast and Flexible Methods for Monotone Polynomial Fitting


We investigate an isotonic parameterisation for monotone polynomials.

Algorithms based on the new parameterisation estimate the fitted monotone polynomials much faster than algorithms based on previous isotonic parameterisations, which in turn makes the use of standard bootstrap methodology feasible.

We investigate the use of the bootstrap under monotonicity constraints to obtain confidence bands for the fitted curves and show that an adjustment by using either the “m out of n” bootstrap or a post-hoc symmetrization of the confidence bands is necessary to achieve more uniform coverage probabilities.

Anyone who needs to fit a monotone regression curve typically has to choose between:

- a parametric nonlinear regression model, in particular those models developed in the growth curve literature, or
- a shape-constrained smoothing technique; popular methods involve either spline smoothing or kernel smoothing. Incorporating shape constraints into spline smoothing has been well studied; by way of contrast, kernel smoothing techniques for general shape constraints are somewhat less often used.

## Methodology

The usual parameterisation for a polynomial regression function is

\[p(x) = \beta_0 +\beta_1x +\beta_2x^2 +\cdots +\beta_q x^q,\]

which we fit by minimizing

\[RSS(\beta) = \sum_{i=1}^n (y_i-p(x_i))^2.\]

Instead, we consider a parameterisation of isotonic polynomials of the form

\[p(x) = \delta +\alpha \int_0^x\check p(u)du\]where $\check p(u)$ is required to be non-negative.
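As a baseline, the unconstrained least-squares fit that minimizes $RSS(\beta)$ can be sketched in NumPy; the data, degree, and noise level below are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical data: noisy observations of an increasing quadratic.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

# Unconstrained fit: minimize RSS(beta) over degree-q polynomials by
# ordinary least squares on the Vandermonde design matrix.
q = 2
X = np.vander(x, q + 1, increasing=True)  # columns: 1, x, ..., x^q
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
p = X @ beta                              # fitted values p(x_i)
rss = float(np.sum((y - p) ** 2))
```

Note that nothing in this fit enforces monotonicity of $p$; that is exactly what the isotonic parameterisation above is designed to guarantee.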

Let $\check p(t) = \gamma_0 + \gamma_1t +\cdots +\gamma_{q-1}t^{q-1}$; then from $\gamma$ we can readily calculate $\beta$ as

\[\beta = (\delta, \alpha\gamma_0, \alpha\frac{\gamma_1}{2},\ldots, \alpha \frac{\gamma_{q-1}}{q})^T\]
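As a quick numerical check of this mapping, the sketch below uses hypothetical values of $\delta$, $\alpha$ and $\gamma$ (with $\check p(t) = 1 + t^2 \ge 0$, so the resulting polynomial is monotone) and verifies that $p$ built from $\beta$ agrees with $\delta + \alpha \int_0^x \check p(u)\,du$:

```python
import numpy as np

# Hypothetical derivative coefficients: p_check(t) = 1 + t^2 >= 0,
# so with alpha > 0 the resulting p is monotone increasing.
gamma = np.array([1.0, 0.0, 1.0])  # degree q - 1 = 2, i.e. q = 3
delta, alpha = 0.5, 2.0

# beta = (delta, alpha*gamma_0/1, alpha*gamma_1/2, ..., alpha*gamma_{q-1}/q)^T
k = np.arange(1, gamma.size + 1)
beta = np.concatenate(([delta], alpha * gamma / k))

# Check: p(x) from beta equals delta + alpha * integral_0^x p_check(u) du.
x = 0.7
p_from_beta = np.polyval(beta[::-1], x)  # polyval wants descending powers
integral = sum(g * x ** (j + 1) / (j + 1) for j, g in enumerate(gamma))
assert np.isclose(p_from_beta, delta + alpha * integral)
```

Because the mapping from $(\delta, \alpha, \gamma)$ to $\beta$ is just a rescaling of coefficients, converting a fitted isotonic parameterisation back to the usual polynomial form is essentially free.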