Semantic Scholar · Open Access · 1987 · 3498 citations

Information Theory

Rongmei Li R. Kaptein D. Hiemstra Jaap Kamps

Abstract

INFORMATION THEORY rests on the fundamental observation that information and uncertainty are related (Shannon and Weaver, 1949). Intuitively, a code can be used to send information from one agent (the transmitter) to another (the receiver) over a channel just in case the receiver cannot completely anticipate which message the transmitter will send. A "language" that consisted of only one sentence could not be a useful instrument of communication, precisely because there could be neither a real choice (on the part of the transmitter) nor any real uncertainty (on the part of the receiver) about which message could be sent. Entropy is a measure of the uncertainty in a communication system. Given that uncertainty and information can be identified, we can say that a measure of the uncertainty in a system is also a measure of its information content. Suppose that a communication system provides n distinct symbols and that pᵢ is the probability that the i-th symbol occurs; then the entropy, H, is given by:

H = −Σᵢ₌₁ⁿ pᵢ log₂ pᵢ
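The entropy formula above can be sketched as a short function; this is an illustrative computation (the `entropy` helper is not from the source), using the convention that terms with pᵢ = 0 contribute nothing to the sum:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    # Terms with p == 0 are skipped, matching the 0 * log 0 = 0 convention.
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A "language" with only one possible message carries no information:
print(entropy([1.0]))        # 0.0 bits — no uncertainty, no information

# Two equally likely symbols (a fair coin) give maximal uncertainty:
print(entropy([0.5, 0.5]))   # 1.0 bit
```

As the abstract notes, the one-message case yields zero entropy because the receiver can fully anticipate the transmission; entropy grows as the symbol probabilities become more uniform.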

Authors (4)

Rongmei Li

R. Kaptein

D. Hiemstra

Jaap Kamps

Citation Format

Li, R., Kaptein, R., Hiemstra, D., Kamps, J. (1987). Information Theory. https://doi.org/10.1007/978-0-387-35973-1_638

Quick Access

Journal Information
Publication Year
1987
Language
en
Total Citations
3498×
Source Database
Semantic Scholar
DOI
10.1007/978-0-387-35973-1_638
Access
Open Access ✓