
WeiYa's Work Yard

A traveler with endless curiosity, who fell into the ocean of statistics, tries to write down his ideas and notes to save himself.

M-estimator

Posted on (Update: )
Tags: M-estimation

Let $M_n$ be random functions and let $M$ be a fixed function of $\theta$ such that for every $\varepsilon > 0$

$$\sup_{\theta\in\Theta}\vert M_n(\theta) - M(\theta)\vert \overset{p}{\rightarrow} 0\,,\qquad \sup_{\theta:\,d(\theta,\theta_0)\ge\varepsilon} M(\theta) < M(\theta_0)\,.$$

Then any sequence of estimators $\hat\theta_n$ with $M_n(\hat\theta_n)\ge M_n(\theta_0) - o_P(1)$ converges in probability to $\theta_0$.

The first condition says that the sequence $M_n$ converges uniformly to a nonrandom map $M:\Theta\rightarrow\overline{\mathbb{R}}$. The second condition requires that this map attain its maximum at a unique point $\theta_0$, so that only parameters close to $\theta_0$ may yield a value of $M(\theta)$ close to the maximum value $M(\theta_0)$. Thus, $\theta_0$ should be a well-separated point of maximum of $M$; a counterexample is shown in the following figure.

Pay attention to the $o_P(1)$ in the inequality; bear in mind that it is shorthand for a sequence of random variables that converges to zero in probability.
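As a quick numerical sketch (my own toy example, not part of the original notes): take $M(\theta) = -\mathrm{E}(X-\theta)^2$, which has a unique well-separated maximum at $\theta_0 = \mathrm{E}X$, and $M_n(\theta) = -\mathbb{P}_n(X-\theta)^2$, whose maximizer is the sample mean; near-maximizers of $M_n$ over a grid then converge to $\theta_0$.

```python
import numpy as np

# Sketch: M(theta) = -E(X - theta)^2 has a unique, well-separated maximum
# at theta_0 = E X; the random criterion M_n(theta) = -P_n (X - theta)^2
# is maximized by the sample mean, so near-maximizers approach theta_0.
rng = np.random.default_rng(0)
theta0 = 2.0

def theta_hat(n):
    x = rng.normal(loc=theta0, scale=1.0, size=n)
    grid = np.linspace(theta0 - 3.0, theta0 + 3.0, 4001)
    # expand -P_n(X - theta)^2 = -(P_n X^2 - 2 theta P_n X + theta^2)
    Mn = -(np.mean(x ** 2) - 2.0 * grid * np.mean(x) + grid ** 2)
    return grid[np.argmax(Mn)]

for n in (100, 10_000, 1_000_000):
    print(n, theta_hat(n))   # the grid argmax approaches theta_0 = 2
```

The grid search stands in for an abstract near-maximizer; any $\hat\theta_n$ with $M_n(\hat\theta_n)\ge M_n(\theta_0)-o_P(1)$ would do.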

Let $\Psi_n$ be random vector-valued functions and let $\Psi$ be a fixed vector-valued function of $\theta$ such that for every $\varepsilon>0$

$$\sup_{\theta\in\Theta}\Vert \Psi_n(\theta)-\Psi(\theta)\Vert \overset{p}{\rightarrow} 0\,,\qquad \inf_{\theta:\,d(\theta,\theta_0)\ge\varepsilon}\Vert \Psi(\theta)\Vert > 0 = \Vert \Psi(\theta_0)\Vert\,.$$

Then any sequence of estimators $\hat\theta_n$ such that $\Psi_n(\hat\theta_n)=o_P(1)$ converges in probability to $\theta_0$.

Let $\Theta$ be a subset of the real line, and let $\Psi_n$ be random functions and $\Psi$ a fixed function of $\theta$ such that $\Psi_n(\theta)\rightarrow\Psi(\theta)$ in probability for every $\theta$. Assume that each map $\theta\mapsto\Psi_n(\theta)$ is continuous and has exactly one zero $\hat\theta_n$, or is nondecreasing with $\Psi_n(\hat\theta_n)=o_P(1)$. Let $\theta_0$ be a point such that $\Psi(\theta_0-\varepsilon) < 0 < \Psi(\theta_0+\varepsilon)$ for every $\varepsilon>0$. Then $\hat\theta_n$ converges in probability to $\theta_0$.
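A one-dimensional illustration of this lemma (my own example): $\Psi_n(\theta)=\mathbb{P}_n(1\{X\le\theta\}-1/2)$ is nondecreasing in $\theta$, and $\Psi(\theta)=F(\theta)-1/2$ satisfies $\Psi(\theta_0-\varepsilon)<0<\Psi(\theta_0+\varepsilon)$ at the population median $\theta_0$, so a near-zero found by bisection is consistent.

```python
import numpy as np

# Sketch: Psi_n(theta) = P_n(1{X <= theta} - 1/2) is nondecreasing in theta,
# and Psi(theta) = F(theta) - 1/2 changes sign at the population median
# theta_0 = 0, so a near-zero of Psi_n converges to theta_0.
rng = np.random.default_rng(1)
x = rng.normal(size=100_000)        # population median theta_0 = 0

def Psi_n(theta):
    return np.mean(x <= theta) - 0.5

# bisection keeps Psi_n(lo) < 0 <= Psi_n(hi), exploiting monotonicity
lo, hi = -10.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if Psi_n(mid) < 0.0:
        lo = mid
    else:
        hi = mid
theta_hat = 0.5 * (lo + hi)
print(theta_hat)                    # close to theta_0 = 0
```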

Let $X_1,\ldots,X_n$ be a sample from some distribution $P$, and let a random and a "true" criterion function be of the form

$$\Psi_n(\theta) = \frac{1}{n}\sum_{i=1}^n\psi_\theta(X_i) = \mathbb{P}_n\psi_\theta\,,\qquad \Psi(\theta) = P\psi_\theta\,.$$

Assume that the estimator $\hat\theta_n$ is a zero of $\Psi_n$ and converges in probability to a zero $\theta_0$ of $\Psi$. Because $\hat\theta_n\rightarrow\theta_0$, expand $\Psi_n(\hat\theta_n)$ in a Taylor series around $\theta_0$. Assume for simplicity that $\theta$ is one-dimensional; then

$$0 = \Psi_n(\hat\theta_n) = \Psi_n(\theta_0) + (\hat\theta_n-\theta_0)\dot\Psi_n(\theta_0) + \frac12(\hat\theta_n-\theta_0)^2\ddot\Psi_n(\tilde\theta_n)\,,$$

where $\tilde\theta_n$ is a point between $\hat\theta_n$ and $\theta_0$. This can be rewritten as

$$\sqrt{n}(\hat\theta_n-\theta_0) = \frac{-\sqrt{n}\,\Psi_n(\theta_0)}{\dot\Psi_n(\theta_0) + \frac12(\hat\theta_n-\theta_0)\ddot\Psi_n(\tilde\theta_n)}\,.$$

By the central limit theorem applied to the numerator and the law of large numbers applied to the denominator,

$$\sqrt{n}(\hat\theta_n-\theta_0) \rightsquigarrow N\left(0,\ \frac{P\psi_{\theta_0}^2}{(P\dot\psi_{\theta_0})^2}\right)\,,$$

and in higher dimensions

$$\sqrt{n}(\hat\theta_n-\theta_0) \rightsquigarrow N_k\left(0,\ (P\dot\psi_{\theta_0})^{-1}\,P\psi_{\theta_0}\psi_{\theta_0}^T\,\left((P\dot\psi_{\theta_0})^{-1}\right)^T\right)\,,$$

where the invertibility of the matrix $P\dot\psi_{\theta_0}$ is a condition.
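The sandwich formula can be checked by simulation. Below is a sketch with an example of my own choosing: the Huber-type score $\psi_\theta(x)=\mathrm{clip}(x-\theta,-1,1)$ with $X\sim N(0,1)$, so $\theta_0=0$ by symmetry; the Monte Carlo variance of $\sqrt{n}(\hat\theta_n-\theta_0)$ is compared with a plug-in estimate of $P\psi_{\theta_0}^2/(P\dot\psi_{\theta_0})^2$.

```python
import numpy as np

# Sketch: Z-estimator with Huber-type score psi_theta(x) = clip(x - theta, -1, 1);
# for X ~ N(0,1) the zero of P psi_theta is theta_0 = 0 by symmetry.
rng = np.random.default_rng(2)
n, reps = 500, 2000

def solve(x):
    # bisection on the nonincreasing map theta -> P_n clip(x - theta, -1, 1)
    lo, hi = -10.0, 10.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if np.mean(np.clip(x - mid, -1.0, 1.0)) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

est = np.array([solve(rng.normal(size=n)) for _ in range(reps)])
mc_var = n * est.var()                          # Monte Carlo variance

# plug-in sandwich from one large sample at theta_0 = 0
x = rng.normal(size=1_000_000)
num = np.mean(np.clip(x, -1.0, 1.0) ** 2)       # P psi_{theta_0}^2
den = np.mean(np.abs(x) <= 1.0) ** 2            # (P psi-dot_{theta_0})^2
print(mc_var, num / den)                        # both near 1.11
```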

For each $\theta$ in an open subset of Euclidean space, let $x\mapsto\psi_\theta(x)$ be a measurable vector-valued function such that, for every $\theta_1$ and $\theta_2$ in a neighborhood of $\theta_0$ and a measurable function $\dot\psi$ with $P\dot\psi^2<\infty$,

$$\Vert\psi_{\theta_1}(x) - \psi_{\theta_2}(x)\Vert \le \dot\psi(x)\,\Vert\theta_1-\theta_2\Vert\,.$$

Assume that the map $\theta\mapsto P\psi_\theta$ is differentiable at a zero $\theta_0$, with nonsingular derivative matrix $V_{\theta_0}$. If $\mathbb{P}_n\psi_{\hat\theta_n}=o_P(n^{-1/2})$ and $\hat\theta_n\overset{p}{\rightarrow}\theta_0$, then

$$\sqrt{n}(\hat\theta_n-\theta_0) = -V_{\theta_0}^{-1}\frac{1}{\sqrt{n}}\sum_{i=1}^n\psi_{\theta_0}(X_i) + o_P(1)\,.$$

In particular, the sequence $\sqrt{n}(\hat\theta_n-\theta_0)$ is asymptotically normal with mean zero and covariance matrix $V_{\theta_0}^{-1}\,P\psi_{\theta_0}\psi_{\theta_0}^T\,(V_{\theta_0}^{-1})^T$.
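The asymptotic linear representation can be seen numerically. The sketch below uses an example of my own: the smooth, Lipschitz score $\psi_\theta(x)=\tanh(x-\theta)$ (so $\dot\psi\equiv 1$ works), with $X\sim N(0,1)$ and $\theta_0=0$ by symmetry; the exact zero of $\Psi_n$ is compared with $\theta_0 - V_{\theta_0}^{-1}\mathbb{P}_n\psi_{\theta_0}$.

```python
import numpy as np

# Sketch: psi_theta(x) = tanh(x - theta) is Lipschitz in theta with
# psi-dot(x) = 1; for X ~ N(0,1), theta_0 = 0 and V_{theta_0} = -E sech^2(X).
rng = np.random.default_rng(4)
n = 10_000
x = rng.normal(size=n)

def Psi_n(theta):
    return np.mean(np.tanh(x - theta))

# bisection on the strictly decreasing map theta -> Psi_n(theta)
lo, hi = -5.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if Psi_n(mid) > 0.0:
        lo = mid
    else:
        hi = mid
theta_hat = 0.5 * (lo + hi)

V = -np.mean(1.0 / np.cosh(x) ** 2)      # estimate of V_{theta_0}
linear = -np.mean(np.tanh(x)) / V        # -V^{-1} P_n psi_{theta_0}
print(theta_hat, linear)                 # nearly identical for this n
```

The gap between the two printed values is the $o_P(1/\sqrt{n})$ remainder of the theorem (here of order $1/n$, since the score is smooth).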

The function $\theta\mapsto\operatorname{sign}(x-\theta)$ is not Lipschitz in $\theta$, so the Lipschitz condition is apparently still stronger than necessary.

For each $\theta$ in an open subset of Euclidean space, let $x\mapsto m_\theta(x)$ be a measurable function such that $\theta\mapsto m_\theta(x)$ is differentiable at $\theta_0$ for $P$-almost every $x$ with derivative $\dot m_{\theta_0}(x)$ and such that, for every $\theta_1$ and $\theta_2$ in a neighborhood of $\theta_0$ and a measurable function $\dot m$ with $P\dot m^2<\infty$,

$$\vert m_{\theta_1}(x) - m_{\theta_2}(x)\vert \le \dot m(x)\,\Vert\theta_1-\theta_2\Vert\,.$$

Furthermore, assume that the map $\theta\mapsto Pm_\theta$ admits a second-order Taylor expansion at a point of maximum $\theta_0$ with nonsingular symmetric second-derivative matrix $V_{\theta_0}$. If $\mathbb{P}_n m_{\hat\theta_n} \ge \sup_\theta\mathbb{P}_n m_\theta - o_P(n^{-1})$ and $\hat\theta_n\overset{p}{\rightarrow}\theta_0$, then

$$\sqrt{n}(\hat\theta_n-\theta_0) = -V_{\theta_0}^{-1}\frac{1}{\sqrt{n}}\sum_{i=1}^n\dot m_{\theta_0}(X_i) + o_P(1)\,.$$

In particular, the sequence $\sqrt{n}(\hat\theta_n-\theta_0)$ is asymptotically normal with mean zero and covariance matrix $V_{\theta_0}^{-1}\,P\dot m_{\theta_0}\dot m_{\theta_0}^T\,V_{\theta_0}^{-1}$.
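A classical instance of this theorem (sketched here with my own code): $m_\theta(x)=-\vert x-\theta\vert$ is maximized at the median, $\dot m_{\theta_0}(x)=\operatorname{sign}(x-\theta_0)$, and $V_{\theta_0}=-2f(\theta_0)$, giving asymptotic variance $1/(4f(\theta_0)^2)$, which equals $\pi/2$ for the standard normal median ($\theta_0=0$).

```python
import numpy as np

# Sketch: for m_theta(x) = -|x - theta| the maximizer of P_n m_theta is the
# sample median; the theorem gives asymptotic variance 1/(4 f(theta_0)^2),
# i.e. pi/2 ~ 1.571 for the standard normal median.
rng = np.random.default_rng(5)
f0 = 1.0 / np.sqrt(2.0 * np.pi)          # N(0,1) density at theta_0 = 0

# Monte Carlo variance of sqrt(n) * (median - theta_0) vs pi/2
n, reps = 400, 4000
med = np.array([np.median(rng.normal(size=n)) for _ in range(reps)])
print(n * med.var(), np.pi / 2)

# asymptotic linear representation: median ~ P_n sign(X) / (2 f(theta_0))
x = rng.normal(size=10_000)
print(np.median(x), np.mean(np.sign(x)) / (2.0 * f0))
```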

Median

Misspecified Model


Published in categories Note