For semi-parametric statistical estimation, when an estimating function exists, it often provides an efficient or a good consistent estimator of the parameter of interest against nuisance parameters of infinite dimensions. The present paper elucidates the structure of estimating functions, based on the dual differential geometry of statistical inference and its extension to fibre bundles. The paper studies the following problems. First, when does an estimating function exist and what is the set of all the estimating functions? Second, how are the asymptotic variances of the estimators derived from estimating functions and when are the estimators efficient? Third, how do we adaptively choose a practically good (quasi-)estimating function from the observed data? The concept of m-curvature freeness plays a fundamental role in solving the above problems.
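As a minimal numerical sketch of the phenomenon the abstract describes (not taken from the paper itself), consider the classical Neyman-Scott problem: paired observations with a separate nuisance mean per pair, so the nuisance parameter grows without bound. Maximum likelihood is inconsistent for the variance, while a simple estimating function whose expectation is zero for every value of the nuisance means yields a consistent estimator.

```python
import numpy as np

# Neyman-Scott setup: pairs (x_i1, x_i2) ~ N(mu_i, sigma^2), with the
# per-pair means mu_i acting as a nuisance parameter of growing dimension.
rng = np.random.default_rng(0)
n, sigma2 = 50_000, 4.0
mu = rng.normal(0.0, 10.0, size=n)                  # nuisance means
x = rng.normal(mu[:, None], np.sqrt(sigma2), size=(n, 2))

# MLE (profiling out each mu_i) is inconsistent: it converges to sigma^2/2.
mle = np.mean((x - x.mean(axis=1, keepdims=True)) ** 2)

# Estimating function g(x_i, s) = (x_i1 - x_i2)^2 / 2 - s has zero mean at
# s = sigma^2 regardless of the mu_i; solving sum_i g(x_i, s) = 0 gives a
# consistent estimator of sigma^2.
est = np.mean((x[:, 0] - x[:, 1]) ** 2) / 2.0

print(mle, est)   # mle near sigma2/2, est near sigma2
```

With `sigma2 = 4.0`, the MLE settles near 2.0 while the estimating-equation estimator settles near 4.0, illustrating how an estimating function can remain consistent against an infinite-dimensional nuisance parameter.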