Talks and Poster Presentations (with Proceedings-Entry):

G. Matz:
"The information bottleneck - Principles and applications in communications";
Talk: Joint Workshop on Coding and Communications (JWCC), Santo Stefano Belbo (Italy) (invited); 10-17-2010 - 10-19-2010; in: "Proc. JWCC 2010", (2010), 22.

English abstract:
The information bottleneck method (IBM) was introduced by Tishby et al. in the context of machine learning [1-4]. Using information-theoretic principles, it aims at compressing given data as much as possible while simultaneously preserving as much information as possible about an (unknown) "relevance variable." IBM can be viewed as an extension of rate-distortion theory that does not require the specification of a distortion measure in advance. From a statistics perspective, IBM provides a generalization of the concept of sufficient statistics to probability distributions that do not belong to exponential families. We provide a self-contained introduction to IBM and highlight some relations with classical information-theoretic and statistical concepts such as source coding with side information, sufficient statistics, and canonical correlations. We conclude with an outlook on the use of IBM in the context of digital communications, including relaying and sensor networks.
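To make the trade-off in the abstract concrete: the information bottleneck seeks a compressed representation T of the data X that minimizes I(X;T) - beta*I(T;Y), where Y is the relevance variable and beta controls the compression/relevance trade-off. The sketch below is an illustrative NumPy implementation of the standard iterative self-consistent updates (in the style of Tishby et al.); the function name, toy distribution, and numerical safeguards are the editor's assumptions, not part of the talk.

```python
import numpy as np

def information_bottleneck(p_xy, n_clusters, beta, n_iter=200, seed=0):
    """Iterative information bottleneck (self-consistent updates).

    p_xy       : joint distribution over (X, Y), shape (nx, ny), sums to 1
    n_clusters : cardinality of the bottleneck variable T
    beta       : trade-off parameter (larger beta = preserve more relevance)
    Returns the soft encoder p(t|x), shape (n_clusters, nx).
    """
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)                    # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]         # conditional p(y|x), rows sum to 1

    nx = p_xy.shape[0]
    # random soft assignment p(t|x); columns (over t) normalized to 1
    p_t_given_x = rng.random((n_clusters, nx))
    p_t_given_x /= p_t_given_x.sum(axis=0)

    eps = 1e-12
    for _ in range(n_iter):
        p_t = p_t_given_x @ p_x                                   # p(t)
        # p(y|t) = sum_x p(y|x) p(t|x) p(x) / p(t)
        p_y_given_t = (p_t_given_x * p_x) @ p_y_given_x / p_t[:, None]
        # KL divergence D(p(y|x) || p(y|t)) for every pair (t, x)
        kl = np.einsum(
            'xy,txy->tx',
            p_y_given_x,
            np.log((p_y_given_x[None] + eps) / (p_y_given_t[:, None] + eps)),
        )
        # encoder update: p(t|x) proportional to p(t) exp(-beta * KL)
        # (subtracting the per-column minimum only rescales, for stability)
        p_t_given_x = p_t[:, None] * np.exp(-beta * (kl - kl.min(axis=0)))
        p_t_given_x /= p_t_given_x.sum(axis=0)
    return p_t_given_x
```

For example, for a small joint distribution where two of three X values carry distinct information about a binary Y, the returned encoder is a valid conditional distribution (each column of p(t|x) sums to one) whose soft clusters group X values with similar p(y|x).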

Created from the Publication Database of the Vienna University of Technology.