Optimal instruments

From HandWiki
Revision as of 15:29, 6 February 2024 by AIposter
Short description: Technique for improving the efficiency of estimators in conditional moment models

In statistics and econometrics, optimal instruments are a technique for improving the efficiency of estimators in conditional moment models, a class of semiparametric models that generate conditional expectation functions. To estimate parameters of a conditional moment model, the statistician can derive an expectation function (defining "moment conditions") and use the generalized method of moments (GMM). However, there are infinitely many moment conditions that can be generated from a single model; optimal instruments provide the most efficient moment conditions.

As an example, consider the nonlinear regression model

[math]\displaystyle{ y = f(x, \theta) + u }[/math]
[math]\displaystyle{ E[u\mid x]=0 }[/math]

where y is a scalar (one-dimensional) random variable, x is a random vector with dimension k, and θ is a k-dimensional parameter. The conditional moment restriction [math]\displaystyle{ E[u\mid x]=0 }[/math] is consistent with infinitely many moment conditions. For example:

[math]\displaystyle{ E[ux] = E[u x^2] = E[u x^3] = \dots = 0 }[/math]

More generally, for any vector-valued function z of x, it will be the case that

[math]\displaystyle{ E[z(x) (y - f(x, \theta))] = 0 }[/math].

That is, z defines a finite set of orthogonality conditions.
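This multiplicity can be checked numerically. The sketch below (an assumed simulation setup, not from the article) draws data from a simple nonlinear model and verifies that, at the true parameter, several different choices of z(x) all produce sample moments close to zero:

```python
import numpy as np

# Assumed illustrative model: y = exp(theta * x) + u with E[u | x] = 0.
# At the true theta, every function z(x) yields a valid moment condition
# E[z(x) * (y - f(x, theta))] = 0; we check a few by simulation.
rng = np.random.default_rng(0)
n = 200_000
theta = 0.5
x = rng.normal(size=n)
u = rng.normal(size=n)             # independent of x, so E[u | x] = 0
y = np.exp(theta * x) + u

residual = y - np.exp(theta * x)   # equals u at the true parameter
for z in (x, x**2, x**3):
    print(np.mean(z * residual))   # each sample moment is close to 0
```

Each choice of z gives a usable GMM estimator; the question addressed in the remainder of the article is which choice is most efficient.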

A natural question to ask, then, is whether an asymptotically efficient set of conditions is available, in the sense that no other set of conditions achieves lower asymptotic variance.[1] Both econometricians[2][3] and statisticians[4] have extensively studied this subject.

The answer is generally yes: optimal instruments exist, and their form has been derived for a wide range of estimators. Takeshi Amemiya was among the first to work on this problem, characterizing optimal instruments for nonlinear simultaneous equation models with homoskedastic and serially uncorrelated errors.[5] The form of the optimal instruments was characterized by Lars Peter Hansen,[6] and results for nonparametric estimation of optimal instruments are provided by Newey.[7] A result for nearest neighbor estimators was provided by Robinson.[8]

In linear regression

The technique of optimal instruments can be used to show that, in a conditional moment linear regression model with iid data, the optimal GMM estimator is generalized least squares. Consider the model

[math]\displaystyle{ y = x^\mathrm T \theta + u }[/math]
[math]\displaystyle{ E[u \mid x] = 0 }[/math]

where y is a scalar random variable, x is a k-dimensional random vector, and θ is a k-dimensional parameter vector. As above, the moment conditions are

[math]\displaystyle{ E[z(x) (y - x^\mathrm T \theta)] = 0 }[/math]

where z = z(x) is an instrument set of dimension p (p ≥ k). The task is to choose z to minimize the asymptotic variance of the resulting GMM estimator. If the data are iid, the asymptotic variance of the GMM estimator is

[math]\displaystyle{ (E[x z^\mathrm T]^\mathrm T E[\sigma^2(x) z z^\mathrm T]^{-1} E[z x^\mathrm T])^{-1} }[/math]

where [math]\displaystyle{ \sigma^2(x) \equiv E[u^2 \mid x] }[/math].

The optimal instruments are given by

[math]\displaystyle{ z^*(x) = \frac{x}{\sigma^2(x)} }[/math]

which produces the asymptotic variance matrix

[math]\displaystyle{ \left( E \left[ \frac{x x^\mathrm T}{\sigma^2(x)} \right] \right)^{-1}. }[/math]
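This expression follows by substituting z* into the general variance formula. Writing [math]\displaystyle{ M \equiv E[x x^\mathrm T / \sigma^2(x)] }[/math], we have

[math]\displaystyle{ E[x z^{*\mathrm T}] = E\left[\frac{x x^\mathrm T}{\sigma^2(x)}\right] = M, \qquad E[\sigma^2(x)\, z^* z^{*\mathrm T}] = E\left[\frac{x x^\mathrm T}{\sigma^2(x)}\right] = M, }[/math]

so the asymptotic variance reduces to [math]\displaystyle{ (M^\mathrm T M^{-1} M)^{-1} = M^{-1} }[/math].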

These are the optimal instruments because for any other z, the matrix

[math]\displaystyle{ (E[x z^\mathrm T]^\mathrm T E[\sigma^2(x) z z^\mathrm T]^{-1} E[z x^\mathrm T])^{-1} - \left( E \left[ \frac{x x^\mathrm T}{\sigma^2(x)} \right] \right)^{-1} }[/math]

is positive semidefinite.
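The efficiency ranking can be illustrated numerically. The sketch below uses an assumed design (not from the article): regressors x = (1, w) with w uniform on [0, 1] and σ²(x) = (1 + w)², for which the population moment matrices have closed forms. It compares the optimal asymptotic variance with the variance obtained from the suboptimal choice z = x:

```python
import numpy as np
from math import log

# Assumed design: x = (1, w), w ~ Uniform(0,1), sigma^2(x) = (1 + w)^2.
# Population moment matrices computed by direct integration over w.
M = np.array([[0.5,          log(2) - 0.5],
              [log(2) - 0.5, 1.5 - 2 * log(2)]])  # E[x x' / sigma^2(x)]
A = np.array([[1.0, 0.5],
              [0.5, 1 / 3]])                      # E[z x'] = E[x x'] for z = x
B = np.array([[7 / 3,  17 / 12],
              [17 / 12, 31 / 30]])                # E[sigma^2(x) z z'] for z = x

V_opt = np.linalg.inv(M)                          # variance with optimal instruments
V_z = np.linalg.inv(A.T @ np.linalg.inv(B) @ A)   # variance with z = x
gap = V_z - V_opt
print(np.linalg.eigvalsh(gap))                    # nonnegative: the gap is PSD
```

Both eigenvalues of the gap matrix are nonnegative, so the instrument choice z = x is weakly less efficient in every direction.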

Given iid data [math]\displaystyle{ (y_1, x_1), \dots, (y_N, x_N) }[/math], the GMM estimator corresponding to [math]\displaystyle{ z^*(x) }[/math] is

[math]\displaystyle{ \widetilde\theta = \left( \sum_{i=1}^N \frac{x_i x_i^\mathrm T}{\sigma^2(x_i)} \right)^{-1} \sum_{i=1}^N \frac{x_i y_i}{\sigma^2(x_i)} }[/math]

which is the generalized least squares estimator. (It is infeasible because σ²(·) is unknown; a feasible version replaces σ²(·) with a consistent estimate.)[1]
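The efficiency gain can be seen in a small Monte Carlo study. The sketch below (an assumed simulation design, not from the article) computes this weighted estimator with the true σ²(·) alongside OLS in a heteroskedastic model; weighting by 1/σ²(x) reduces the sampling variance of the slope estimate:

```python
import numpy as np

# Assumed simulation design: y = x'theta + u, sigma^2(x) = (1 + 5w)^2,
# with sigma^2(.) treated as known so the GLS estimator is computable.
rng = np.random.default_rng(2)
theta = np.array([1.0, 2.0])       # true parameter

def one_draw(n=500):
    """Draw one heteroskedastic sample and return (GLS, OLS) estimates."""
    w = rng.uniform(size=n)
    x = np.column_stack([np.ones(n), w])
    sigma2 = (1.0 + 5.0 * w) ** 2              # conditional variance sigma^2(x)
    y = x @ theta + rng.normal(size=n) * np.sqrt(sigma2)
    xw = x / sigma2[:, None]                   # optimal instruments x / sigma^2(x)
    gls = np.linalg.solve(xw.T @ x, xw.T @ y)
    ols = np.linalg.solve(x.T @ x, x.T @ y)
    return gls, ols

estimates = [one_draw() for _ in range(2000)]
gls_slopes = [g[1] for g, o in estimates]
ols_slopes = [o[1] for g, o in estimates]
print(np.var(gls_slopes), np.var(ols_slopes))  # GLS slope varies less than OLS
```

Both estimators are consistent, but across repeated samples the GLS slope estimates are visibly less dispersed than the OLS ones, matching the asymptotic variance ranking above.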

References

  1. 1.0 1.1 Arellano, M. (2009). "Generalized Method of Moments and Optimal Instruments". Class notes. https://www.cemfi.es/~arellano/gmm-estimation.pdf. 
  2. Chamberlain, G. (1987). "Asymptotic Efficiency in Estimation with Conditional Moment Restrictions". Journal of Econometrics 34 (3): 305–334. doi:10.1016/0304-4076(87)90015-7. 
  3. Newey, W. K. (1988). "Adaptive Estimation of Regression Models via Moment Restrictions". Journal of Econometrics 38 (3): 301–339. doi:10.1016/0304-4076(88)90048-6. 
  4. Liang, K-Y.; Zeger, S. L. (1986). "Longitudinal Data Analysis using Generalized Linear Models". Biometrika 73 (1): 13–22. doi:10.1093/biomet/73.1.13. 
  5. Amemiya, T. (1977). "The Maximum Likelihood and the Nonlinear Three-Stage Least Squares Estimator in the General Nonlinear Simultaneous Equation Model". Econometrica 45 (4): 955–968. doi:10.2307/1912684. 
  6. Hansen, L. P. (1985). "A Method of Calculating Bounds on the Asymptotic Covariance Matrices of Generalized Method of Moments Estimators". Journal of Econometrics 30 (1–2): 203–238. doi:10.1016/0304-4076(85)90138-1. 
  7. Newey, W. K. (1990). "Efficient Instrumental Variables Estimation of Nonlinear Models". Econometrica 58 (4): 809–837. doi:10.2307/2938351. 
  8. Robinson, P. (1987). "Asymptotically Efficient Estimation in the Presence of Heteroskedasticity of Unknown Form". Econometrica 55 (4): 875–891. doi:10.2307/1911033. 

Further reading

  • Tsiatis, A. A. (2006). Semiparametric Theory and Missing Data. Springer Series in Statistics. New York: Springer. ISBN 0-387-32448-8.