FIRST MEASUREMENTS OF "SHANNON ENTROPY".

SHANNON, C(LAUDE) E.

Prediction and Entropy of Printed English. (Manuscript received Sept. 15, 1950).

(New York, American Telephone and Telegraph Company, 1951). 8vo. Volume XXX, 1951 of The Bell System Technical Journal. Bound without the general title-page in a nice full green cloth. Library stamp to front free end-paper and first page of table of contents. Very minor bumping to extremities. A tight and clean copy. Pp. 50-64. [Entire volume: 32, 1255 pp.].


First edition of Shannon's famous article, in which he measures the entropy of English text to lie between 1.0 and 1.5 bits per letter and, when longer stretches of preceding text are taken into account, between 0.6 and 1.3 bits per letter.

"A new method of estimating the entropy and redundancy of a language is described. This method exploits the knowledge of the language statistics possessed by those who speak the language, and depends on experimental results in prediction of the next letter when the preceding text is known. Results of experiments in prediction are given, and some properties of an ideal predictor are developed." (From the introduction to the present article).

"Natural languages are highly redundant; the number of intelligible fifty-letter English sentences is many fewer than 26^50, and the number of distinguishable ten-second phone conversations is far smaller than the number of sound signals that could be generated with frequencies up to 20,000 Hz. This immediately suggests a theory for signal compression. If you can recode the alphabet so that common sequences of letters are abbreviated, while infrequent combinations are spelled out in lengthy fashion, you can dramatically reduce the channel capacity needed to send the data." (Sethna, Statistical Mechanics: Entropy, Order Parameters, and Complexity, Oxford University Press, 2006, p. 100).
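The redundancy described in the quotation above can be illustrated with a minimal sketch (not from Shannon's paper, which instead uses human prediction experiments): even a zeroth-order estimate from single-letter frequencies already gives an entropy well below the log2(26) ≈ 4.7 bits per letter that a uniform alphabet would require; Shannon's method, which exploits dependencies between letters, drives the figure lower still. The sample text here is arbitrary.

```python
from collections import Counter
from math import log2

def entropy_per_letter(text):
    """Zeroth-order entropy estimate in bits per letter, computed from
    single-letter frequencies only (inter-letter dependencies ignored)."""
    letters = [c for c in text.lower() if c.isalpha()]
    n = len(letters)
    counts = Counter(letters)
    # H = -sum p * log2(p) over observed letter frequencies
    return -sum((k / n) * log2(k / n) for k in counts.values())

# Arbitrary English sample for illustration
sample = ("if you can recode the alphabet so that common sequences "
          "of letters are abbreviated you can dramatically reduce "
          "the channel capacity needed to send the data")
h = entropy_per_letter(sample)
print(f"Estimate: {h:.2f} bits/letter, versus log2(26) = {log2(26):.2f}")
```

Even this crude estimate falls noticeably short of 4.7 bits per letter; Shannon's prediction experiments, which let a speaker's knowledge of longer-range statistics enter the estimate, bound the true entropy far lower, as stated above.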

Order-nr.: 42745


DKK 1.450,00