Publications in Scientific Journals:

A. Jung, S. Schmutzhard, F. Hlawatsch:
"The RKHS approach to minimum variance estimation revisited: Variance bounds, sufficient statistics, and exponential families";
IEEE Transactions on Information Theory, vol. 60, no. 7, 2014, pp. 4050–4065.



English abstract:
The mathematical theory of reproducing kernel Hilbert spaces (RKHS) provides powerful tools for minimum variance estimation (MVE) problems. Here, we extend the classical RKHS-based analysis of MVE in several directions. We develop a geometric formulation of five known lower bounds on the estimator variance (Barankin bound, Cramér-Rao bound, constrained Cramér-Rao bound, Bhattacharyya bound, and Hammersley-Chapman-Robbins bound) in terms of orthogonal projections onto a subspace of the RKHS associated with a given MVE problem. We show that, under mild conditions, the Barankin bound (the tightest possible lower bound on the estimator variance) is a lower semi-continuous function of the parameter vector. We also show that the RKHS associated with an MVE problem remains unchanged if the observation is replaced by a sufficient statistic. Finally, for MVE problems conforming to an exponential family of distributions, we derive novel closed-form lower bounds on the estimator variance and show that a reduction of the parameter set leaves the minimum achievable variance unchanged.
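A brief sketch of the RKHS characterization underlying these results may help fix ideas (the notation below is chosen for exposition and is not taken verbatim from the paper). Fix a reference parameter \theta_0 and let \rho(x;\theta) = f(x;\theta)/f(x;\theta_0) denote the likelihood ratio. The kernel associated with the MVE problem is

    R(\theta_1,\theta_2) = E_{\theta_0}[\rho(x;\theta_1)\,\rho(x;\theta_2)],

and a parameter function \gamma(\theta) admits an unbiased estimator with finite variance at \theta_0 if and only if \gamma lies in the RKHS H(R); the minimum achievable variance (the Barankin bound) is then

    BB(\theta_0) = \|\gamma\|_{H(R)}^2 - \gamma^2(\theta_0).

Each of the five bounds above arises by replacing \|\gamma\|_{H(R)}^2 with \|P_U \gamma\|_{H(R)}^2, the squared norm of the orthogonal projection of \gamma onto a suitable subspace U of H(R). For instance, choosing U = span{R(\cdot,\theta_0), R(\cdot,\theta_1)} for a test point \theta_1 recovers the Hammersley-Chapman-Robbins bound (\gamma(\theta_1)-\gamma(\theta_0))^2 / E_{\theta_0}[(\rho(x;\theta_1)-1)^2]. For an exponential family f(x;\theta) = \exp(\theta^T \phi(x) - A(\theta))\,h(x) with log-partition function A, the kernel takes the closed form

    R(\theta_1,\theta_2) = \exp(A(\theta_1+\theta_2-\theta_0) - A(\theta_1) - A(\theta_2) + A(\theta_0)),

valid whenever \theta_1+\theta_2-\theta_0 lies in the natural parameter space; this closed form is what makes explicit bound expressions possible in the exponential-family setting.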


Keywords:
Minimum variance estimation, exponential family, reproducing kernel Hilbert space, RKHS, Cramér-Rao bound, Barankin bound, Hammersley-Chapman-Robbins bound, Bhattacharyya bound, locally minimum variance unbiased estimator


Electronic version of the publication:
http://publik.tuwien.ac.at/files/PubDat_229646.pdf


Created from the Publication Database of the Vienna University of Technology.