

Journal article:

A. Jung, S. Schmutzhard, F. Hlawatsch, Z. Ben-Haim, Y. Eldar:
"Minimum Variance Estimation of a Sparse Vector Within the Linear Gaussian Model: An RKHS Approach";
IEEE Transactions on Information Theory, 60 (2014), 10; pp. 6555 - 6575.



English abstract:
We consider minimum variance estimation within the sparse linear Gaussian model (SLGM). A sparse vector is to be estimated from a linearly transformed version embedded in Gaussian noise. Our analysis is based on the theory of reproducing kernel Hilbert spaces (RKHS). After a characterization of the RKHS associated with the SLGM, we derive a lower bound on the minimum variance achievable by estimators with a prescribed bias function, including the important special case of unbiased estimation. This bound is obtained via an orthogonal projection of the prescribed mean function onto a subspace of the RKHS associated with the SLGM. It provides an approximation to the minimum achievable variance (Barankin bound) that is tighter than any known bound. Our bound holds for an arbitrary system matrix, including the overdetermined and underdetermined cases. We specialize it to compressed sensing measurement matrices and express it in terms of the restricted isometry constant. For the special case of the SLGM given by the sparse signal in noise model, we derive closed-form expressions of the Barankin bound and of the corresponding locally minimum variance estimator. Finally, we compare our bound with the variance of several well-known estimators, namely, the maximum-likelihood estimator, the hard-thresholding estimator, and compressive reconstruction using orthogonal matching pursuit and approximate message passing.
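
The following minimal sketch illustrates the sparse signal in noise model (the SLGM with identity system matrix) and the hard-thresholding estimator whose variance the paper compares against the derived bound. It is not the paper's method: the dimension, sparsity level, noise level, and threshold rule are illustrative assumptions, and the Monte Carlo loop merely estimates the estimator's variance and bias at one fixed parameter vector.

```python
# Sketch: sparse signal in noise model y = x + n with S-sparse x,
# and the hard-thresholding estimator. All numerical values are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

N, S = 50, 3      # ambient dimension and sparsity level (assumed)
sigma = 0.5       # noise standard deviation (assumed)

# S-sparse parameter vector x: nonzero entries at random positions
x = np.zeros(N)
support = rng.choice(N, size=S, replace=False)
x[support] = rng.normal(0.0, 2.0, size=S)

def hard_threshold(y, tau):
    """Keep entries of y whose magnitude exceeds tau, zero the rest."""
    return np.where(np.abs(y) > tau, y, 0.0)

# Monte Carlo estimate of the estimator's variance (trace of its
# covariance matrix) and squared bias at the fixed parameter x
trials = 10_000
estimates = np.empty((trials, N))
for t in range(trials):
    y = x + sigma * rng.normal(size=N)          # observation in noise
    estimates[t] = hard_threshold(y, tau=2.0 * sigma)

variance = estimates.var(axis=0).sum()
sq_bias = np.sum((estimates.mean(axis=0) - x) ** 2)
print(f"total variance ~ {variance:.3f}, squared bias ~ {sq_bias:.3f}")
```

Repeating such a simulation over a grid of parameter vectors is one way to visualize how an estimator's variance compares with a lower bound of the kind derived in the paper.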

Keywords:
Sparsity, compressed sensing, unbiased estimation, denoising, RKHS, Cramér-Rao bound, Barankin bound


Electronic version of the publication:
http://publik.tuwien.ac.at/files/PubDat_231655.pdf


Generated from the publication database of the Technische Universität Wien.