

Publications in Scientific Journals:

G. Koliander, G. Pichler, E. Riegler, F. Hlawatsch:
"Entropy and Source Coding for Integer-Dimensional Singular Random Variables";
IEEE Transactions on Information Theory, vol. 62, no. 11, pp. 6124-6154, 2016.



English abstract:
Entropy and differential entropy are important quantities in information theory. A tractable extension to singular random variables, which are neither discrete nor continuous, has not been available so far. Here, we present such an extension for the practically relevant class of integer-dimensional singular random variables. The proposed entropy definition contains the entropy of discrete random variables and the differential entropy of continuous random variables as special cases. We show that it transforms in a natural manner under Lipschitz functions, and that it is invariant under unitary transformations. We define joint entropy and conditional entropy for integer-dimensional singular random variables, and we show that the proposed entropy yields useful expressions for the mutual information. As first applications of our entropy definition, we present a result on the minimal expected codeword length of quantized integer-dimensional singular sources and a Shannon lower bound for integer-dimensional singular sources.
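
The abstract states that the proposed entropy contains discrete entropy and differential entropy as special cases. As a point of reference, here is a minimal sketch of those two classical quantities; the function names and the Gaussian example are illustrative choices of mine, not taken from the paper:

```python
import math

def discrete_entropy(probs):
    """Shannon entropy H(X) = -sum p(x) log2 p(x) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gaussian_differential_entropy(sigma):
    """Differential entropy h(X) = (1/2) log2(2*pi*e*sigma^2)
    of a Gaussian random variable with standard deviation sigma."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# A fair coin carries exactly 1 bit of entropy.
print(discrete_entropy([0.5, 0.5]))
# Differential entropy of a standard Gaussian (about 2.05 bits).
print(gaussian_differential_entropy(1.0))
```

A singular random variable (e.g., one supported on a lower-dimensional manifold) has neither a probability mass function nor a Lebesgue density, so neither formula applies directly; the paper's contribution is an entropy definition covering that case.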

Keywords:
Entropy, Random variables, Quantization (signal), Mutual information, Rate-distortion, Density functional theory


"Official" electronic version of the publication (accessed through its Digital Object Identifier, DOI):
http://dx.doi.org/10.1109/TIT.2016.2604248

Electronic version of the publication:
http://publik.tuwien.ac.at/files/publik_253379.pdf


Created from the Publication Database of the Vienna University of Technology.