arXiv Open Access 2019

Non-Stochastic Information Theory

Anshuka Rangi Massimo Franceschetti

Abstract

In an effort to develop the foundations for a non-stochastic theory of information, the notion of δ-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated, and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest δ-mutual information between received and transmitted codewords over ε-noise channels equals the (ε, δ)-capacity. This notion of capacity generalizes the Kolmogorov ε-capacity to packing sets of overlap at most δ, and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models, and to non-stochastic, memoryless, stationary channels. Finally, sufficient conditions are established for the factorization of the δ-mutual information and to obtain a single-letter capacity expression. Compared to previous non-stochastic approaches, the presented theory admits the possibility of decoding errors as in Shannon's probabilistic setting, while retaining a worst-case, non-stochastic character.
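At δ = 0, the (ε, δ)-capacity described above reduces to a Kolmogorov-style ε-capacity. As background, one common formulation of that baseline quantity is sketched below; the notation (the set A, the metric d, the packing number M_ε) is illustrative and not taken from the paper itself:

```latex
% Kolmogorov $\epsilon$-capacity of a totally bounded set $A$ in a
% metric space $(\mathcal{X}, d)$: the log of the largest number of
% points of $A$ that are pairwise $\epsilon$-distinguishable.
C_\epsilon(A) = \log_2 M_\epsilon(A),
\qquad
M_\epsilon(A) = \max\bigl\{\, n : \exists\, x_1, \dots, x_n \in A,\;
d(x_i, x_j) > \epsilon \ \text{for all } i \neq j \,\bigr\}.
```

Under this reading, the paper's (ε, δ)-capacity relaxes the strict separation requirement: the uncertainty sets associated with distinct codewords may overlap by at most δ, which is what admits decoding errors in a worst-case sense.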


Authors (2)


Anshuka Rangi


Massimo Franceschetti

Citation

Rangi, A., Franceschetti, M. (2019). Non-Stochastic Information Theory. https://arxiv.org/abs/1904.11632

Journal Information
Publication Year
2019
Language
en
Source Database
arXiv
Access
Open Access ✓