

Journal article:

G. Koliander, G. Pichler, E. Riegler, F. Hlawatsch:
"Entropy and Source Coding for Integer-Dimensional Singular Random Variables";
IEEE Transactions on Information Theory, 62 (2016), 11; pp. 6124 - 6154.



Abstract (English):
Entropy and differential entropy are important quantities in information theory. A tractable extension to singular random variables, which are neither discrete nor continuous, has not been available so far. Here, we present such an extension for the practically relevant class of integer-dimensional singular random variables. The proposed entropy definition contains the entropy of discrete random variables and the differential entropy of continuous random variables as special cases. We show that it transforms in a natural manner under Lipschitz functions, and that it is invariant under unitary transformations. We define joint entropy and conditional entropy for integer-dimensional singular random variables, and we show that the proposed entropy yields useful expressions for the mutual information. As first applications of our entropy definition, we present a result on the minimal expected codeword length of quantized integer-dimensional singular sources and a Shannon lower bound for integer-dimensional singular sources.
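To illustrate the two classical special cases that the abstract says the proposed definition unifies, the following sketch computes the Shannon entropy of a discrete random variable and the differential entropy of a Gaussian. This is a minimal illustration of the standard textbook quantities only; it does not implement the paper's entropy for singular random variables, and all function names are chosen here for illustration.

```python
import math

def discrete_entropy(pmf):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i) of a discrete pmf, in bits."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def gaussian_differential_entropy(sigma2):
    """Differential entropy h(X) = 0.5 * log2(2*pi*e*sigma^2) of N(0, sigma^2), in bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

# A fair coin has one bit of entropy.
print(discrete_entropy([0.5, 0.5]))        # → 1.0
# A standard Gaussian has about 2.047 bits of differential entropy.
print(gaussian_differential_entropy(1.0))
```

A singular random variable (e.g., one supported on a lower-dimensional manifold) fits neither formula: its pmf-based entropy is undefined and its Lebesgue density does not exist, which is the gap the paper's integer-dimensional entropy is designed to fill.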

Keywords:
Entropy, Random variables, Quantization (signal), Mutual information, Rate-distortion, Density functional theory


"Offizielle" elektronische Version der Publikation (entsprechend ihrem Digital Object Identifier - DOI)
http://dx.doi.org/10.1109/TIT.2016.2604248

Electronic version of the publication:
http://publik.tuwien.ac.at/files/publik_253379.pdf


Created from the publication database of TU Wien (Vienna University of Technology).