Talks and Poster Presentations (with Proceedings-Entry):
G. Alberti, H. Bölcskei, C. De Lellis, G. Koliander, E. Riegler:
"Lossless linear analog compression";
Talk: IEEE International Symposium on Information Theory (ISIT), 07-15-2016;
in: "Proceedings 2016 IEEE International Symposium on Information Theory (ISIT)",
We establish the fundamental limits of lossless linear analog compression by considering the recovery of random vectors x ∈ ℝ^m from the noiseless linear measurements y = Ax with measurement matrix A ∈ ℝ^{n×m}. Specifically, for a random vector x ∈ ℝ^m of arbitrary distribution we show that x can be recovered with zero error probability from n > inf dim_MB(U) linear measurements, where dim_MB(·) denotes the lower modified Minkowski dimension and the infimum is over all sets U ⊆ ℝ^m with P[x ∈ U] = 1. This achievability statement holds for Lebesgue almost all measurement matrices A. We then show that s-rectifiable random vectors, a stochastic generalization of s-sparse vectors, can be recovered with zero error probability from n > s linear measurements. Classical compressed sensing theory would suggest that n ≥ s is necessary for successful recovery of x. Surprisingly, certain classes of s-rectifiable random vectors can be recovered from fewer than s measurements. Imposing an additional regularity condition on the distribution of s-rectifiable random vectors x, we obtain the expected converse: s measurements are necessary. The resulting class of random vectors appears to be new and is referred to as s-analytic random vectors.
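The claim that n > s measurements suffice (with zero error probability, for almost all A) can be illustrated with a toy sketch. This is not the paper's decoder; it is a brute-force search over all size-s supports, with hypothetical dimensions m, s, n chosen for illustration. For a generic measurement matrix A with n = s + 1 rows, an s-sparse x almost surely produces a y that lies in the column span of the true support's columns and no other size-s subset's columns:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

m, s, n = 6, 2, 3  # ambient dimension, sparsity, measurements (n = s + 1 > s)

# Draw a random s-sparse vector x in R^m.
x = np.zeros(m)
support = rng.choice(m, size=s, replace=False)
x[support] = rng.standard_normal(s)

# Generic (Lebesgue-almost-any) measurement matrix and noiseless measurements.
A = rng.standard_normal((n, m))
y = A @ x

# Brute-force decoder: find a size-s support whose columns fit y exactly.
best = None
for S in itertools.combinations(range(m), s):
    cols = A[:, list(S)]
    z, *_ = np.linalg.lstsq(cols, y, rcond=None)
    if np.linalg.norm(cols @ z - y) < 1e-9:  # exact fit up to numerics
        best = (S, z)
        break

x_hat = np.zeros(m)
x_hat[list(best[0])] = best[1]
```

Exhaustive support search is of course exponential in m; the point is only that n = s + 1 generic measurements already determine x almost surely, matching the n > s achievability statement above.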
Created from the Publication Database of the Vienna University of Technology.