

Multiple Isotonic Regression

Tags: Isotonic Regression, Minimax, Directed Graph

The first two sections are based on a good tutorial on isotonic regression, and the third section consists of the slides for a talk given by Prof. Cun-Hui Zhang at the 11th ICSA International Conference on Dec. 21st, 2019.

Isotonic Regression

Intuitively, linear ordering isotonic regression can be thought of as approximating a given series of one-dimensional observations with a non-decreasing sequence, while general isotonic regression approximates a given series of values subject to a given partial ordering.

Formally, given values $a \in \mathbb{R}^n$ and the associated positive weights $w \in \mathbb{R}^n_{>0}$, solve

$$\min_y \sum_{i=1}^n w_i (y_i - a_i)^2 \quad \text{subject to}\quad y_i \le y_j\quad \forall (i,j) \in E\,,$$

where the set of constraints $E$ forms a directed acyclic graph.
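Before specializing, it may help to see the general problem written out as a small quadratic program. Here is a minimal sketch using `cvxpy` (an assumption; any QP solver would do), on a made-up three-node instance:

```python
import cvxpy as cp
import numpy as np

# Toy instance (made up for illustration): a small directed acyclic graph
# with edges encoding y_0 <= y_1, y_1 <= y_2, and y_0 <= y_2.
a = np.array([3.0, 1.0, 2.0])  # observed values
w = np.array([1.0, 1.0, 1.0])  # positive weights
E = [(0, 1), (1, 2), (0, 2)]   # edge (i, j) encodes the constraint y_i <= y_j

y = cp.Variable(len(a))
objective = cp.Minimize(cp.sum(cp.multiply(w, cp.square(y - a))))
constraints = [y[i] <= y[j] for i, j in E]
cp.Problem(objective, constraints).solve()
print(y.value)  # approximately [2., 2., 2.] for this instance
```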

In particular, the aforementioned linear ordering isotonic regression is the special case with constraints $y_1 \le y_2 \le \cdots \le y_n$. There is a simple linear-time algorithm for this special case, called the Pool Adjacent Violators Algorithm (PAVA).

Pool Adjacent Violators Algorithm

For an optimal solution, if $a_i \ge a_{i+1}$, then $y_i = y_{i+1}$.

It is quite clear from the following sketch.

*(Sketch: two adjacent violators pooled into a single weighted-average value.)*

Formally, we can prove it by contradiction: suppose $y_i < y_{i+1}$. Since $a_i \ge a_{i+1}$, we cannot have both $y_i \ge a_i$ and $y_{i+1} \le a_{i+1}$, for otherwise $y_i \ge a_i \ge a_{i+1} \ge y_{i+1}$. If $y_i < a_i$, slightly increasing $y_i$ preserves feasibility and decreases the objective; if $y_{i+1} > a_{i+1}$, slightly decreasing $y_{i+1}$ does the same. Either way $y$ would not be optimal, a contradiction.

Since $y_i = y_{i+1}$, we can combine $(w_i, a_i)$ and $(w_{i+1}, a_{i+1})$ into a new point,

$$\left(w_i + w_{i+1},\ \frac{w_i a_i + w_{i+1} a_{i+1}}{w_i + w_{i+1}}\right)\,.$$
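For example, combining $(w_i, a_i) = (1, 3)$ and $(w_{i+1}, a_{i+1}) = (2, 0)$ yields the pooled point $(3, 1)$, since $(1\cdot 3 + 2\cdot 0)/(1 + 2) = 1$.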

We repeat the combining procedure until the pooled values satisfy $a_i \le a_{i+1}$ for every adjacent pair; the solution then exactly equals the corresponding combined points, with each $y_i$ taking the value of the block that contains it.
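To make this concrete, here is a minimal sketch of the weighted PAVA in Python (separate from the R/Julia toy implementations mentioned below), maintaining a stack of pooled blocks:

```python
import numpy as np

def pava(a, w=None):
    """Weighted PAVA for the linear ordering y_1 <= ... <= y_n."""
    a = np.asarray(a, dtype=float)
    w = np.ones_like(a) if w is None else np.asarray(w, dtype=float)
    blocks = []  # each block: [pooled value, total weight, count]
    for ai, wi in zip(a, w):
        blocks.append([ai, wi, 1])
        # Pool backwards while adjacent blocks violate the ordering
        # (i.e., a_i >= a_{i+1}), replacing them by their weighted average.
        while len(blocks) > 1 and blocks[-2][0] >= blocks[-1][0]:
            v2, w2, c2 = blocks.pop()
            v1, w1, c1 = blocks.pop()
            blocks.append([(w1 * v1 + w2 * v2) / (w1 + w2), w1 + w2, c1 + c2])
    # Each y_i equals the value of the block containing it.
    return np.concatenate([np.full(c, v) for v, _, c in blocks])

print(pava([1.0, 3.0, 2.0, 4.0]))  # pooled result: [1.0, 2.5, 2.5, 4.0]
```

Each point is pushed once and popped at most once, which is why the algorithm runs in linear time.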

Actually, I had implemented PAVA for a toy example in R and Julia before, but I had completely forgotten about it (my bad).

Multiple Isotonic Regression

Pay attention to the notation difference: $y_i, x_i$ in the slides correspond to $a_i, y_i$ in the previous section, respectively, and there is additionally a non-decreasing function $f$ to be estimated. A more formal description can be found in their paper.
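As a quick one-dimensional illustration of this function-estimation viewpoint, scikit-learn's `IsotonicRegression` (which implements PAVA) fits a non-decreasing $\hat f$ from observations $(x_i, y_i)$; the quadratic signal and noise level below are made up for the example:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = x**2 + rng.normal(scale=0.1, size=x.size)  # noisy monotone signal

iso = IsotonicRegression(increasing=True)
f_hat = iso.fit_transform(x, y)  # fitted non-decreasing values at each x
```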


Published in categories ICSA-2019