A Mathematical Theory of Communication. Offprint from Bell System Technical Journal, Vol. 27 (July and October). [With:] Notes for “Communication in the Presence of Noise” (Lecture 4, AIEE-IRE series, April 25, 1949). Mimeograph typescript.

New York: American Telephone and Telegraph Company, 1948.

First edition, the rare offprint, of “the most famous work in the history of communication theory” (Origins of Cyberspace). “Probably no single work in this century has more profoundly altered man’s understanding of communication than C. E. Shannon’s article, ‘A mathematical theory of communication’, first published in 1948” (Slepian). “Th[is] paper gave rise to ‘information theory’, which includes metaphorical applications in very different disciplines, ranging from biology to linguistics via thermodynamics or quantum physics on the one hand, and a technical discipline of mathematical essence, based on crucial concepts like that of channel capacity, on the other” (DSB). On the first page of the paper is the first appearance of the term ‘bit’ for ‘binary digit.’

“A half century ago, Claude Shannon published his epic paper ‘A Mathematical Theory of Communication.’ This paper [has] had an immense impact on technological progress, and so on life as we now know it … One measure of the greatness of the [paper] is that Shannon’s major precept that all communication is essentially digital is now commonplace among the modern digitalia, even to the point where many wonder why Shannon needed to state such an obvious axiom” (Blahut & Hajek). “In 1948 Shannon published his most important paper, entitled ‘A mathematical theory of communication’. This seminal work transformed the understanding of the process of electronic communication by providing it with a mathematics, a general set of theorems rather misleadingly called information theory. The information content of a message, as he defined it, has nothing to do with its inherent meaning, but simply with the number of binary digits that it takes to transmit it. Thus, information, hitherto thought of as a relatively vague and abstract idea, was analogous to physical energy and could be treated like a measurable physical quantity. His definition was both self-consistent and unique in relation to intuitive axioms. To quantify the deficit in the information content in a message he characterized it by a number, the entropy, adopting a term from thermodynamics. Building on this theoretical foundation, Shannon was able to show that any given communications channel has a maximum capacity for transmitting information. The maximum, which can be approached but never attained, has become known as the Shannon limit. So wide were its repercussions that the theory was described as one of humanity’s proudest and rarest creations, a general scientific theory that could profoundly and rapidly alter humanity’s view of the world. Few other works of the twentieth century have had a greater impact; he altered most profoundly all aspects of communication theory and practice” (Biographical Memoirs of Fellows of the Royal Society, Vol. 5, 2009). Remarkably, Shannon was initially not planning to publish the paper, and did so only at the urging of colleagues at Bell Laboratories.

The offprint of Shannon’s paper is here accompanied by extremely rare mimeograph notes for Shannon’s April 25, 1949 lecture on communication in the presence of noise, based on his classic paper of the same title published in Proceedings of the IRE 37, issue 1 (Jan. 1949), pp. 10-21. That paper contained Shannon’s proof of the Nyquist-Shannon sampling theorem, a fundamental result in the field of information theory. ABPC/RBH lists two copies of the offprint: Bonhams New York, October 16, 2013; Christie’s New York, February 23, 2005 (OOC copy).

Provenance: from the library of German-American mathematician Cecilie Froehlich (1900-92). Froehlich was the first female faculty member of the City College School of Engineering, then the first female department chair in the US. After a 23-year career at City College, she moved to Pacific University, where she led the mathematics department until her retirement in 1970.

Drawing on his experience at Bell Laboratories, where he had become acquainted with the work of other telecommunication engineers such as Harry Nyquist and Ralph Hartley, Shannon published his paper ‘A Mathematical Theory of Communication’ in two issues of the Bell System Technical Journal. The general approach was pragmatic: he wanted to study ‘the savings due to statistical structure of the original message’ (p. 379), and for that purpose he had to neglect the semantic aspects of information, as Hartley had done for ‘intelligence’ twenty years before. For Shannon, the communication process was stochastic in nature, and the great impact of his work, which accounts for its applications in other fields, was due to the schematic diagram of a general communication system that he proposed. An ‘information source’ outputs a ‘message,’ which is encoded by a ‘transmitter’ into the transmitted ‘signal.’ The received signal, the sum of the transmitted signal and unavoidable ‘noise,’ is decoded by the ‘receiver’ into a reconstructed message, which is delivered to the ‘destination.’ His theory showed that choosing a good combination of transmitter and receiver makes it possible to send the message with arbitrarily high accuracy and reliability, provided the information rate does not exceed a fundamental limit, named the ‘channel capacity.’ The proof of this result was, however, nonconstructive, leaving open the problem of designing codes and decoding methods able to approach this limit.
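In modern notation (a schematic summary rather than Shannon’s own wording), the result may be stated as follows: for a channel of capacity C and a source producing information at rate R,

\[
R < C \;\Longrightarrow\; \text{suitable coding makes the frequency of errors arbitrarily small}; \qquad
R > C \;\Longrightarrow\; \text{no coding scheme can achieve this.}
\]

The theorem guarantees that such codes exist without exhibiting them, which is the sense in which the proof is nonconstructive.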

“The paper was presented as an ensemble of twenty-three theorems that were mostly rigorously proven (but not always, hence the work of A. I. Khinchin and later A.N. Kolmogorov, who based a new probability theory on the information concept). Shannon’s paper was divided into four parts, differentiating between discrete or continuous sources of information and the presence or absence of noise. In the simplest case (discrete source without noise), Shannon presented the [entropy] formula he had already defined in his mathematical theory of cryptography, which in fact can be reduced to a logarithmic mean. He defined the bit, the contraction of ‘binary digit’ (as suggested by John W. Tukey, his colleague at Bell Labs) as the unit for information. Concepts such as ‘redundancy,’ ‘equivocation,’ or channel ‘capacity,’ which existed as common notions, were defined as scientific concepts. Shannon stated a fundamental source-coding theorem, showing that the mean length of a message has a lower limit proportional to the entropy of the source. When noise is introduced, the channel-coding theorem stated that when the entropy of the source is less than the capacity of the channel, a code exists that allows one to transmit a message ‘so that the output of the source can be transmitted over the channel with an arbitrarily small frequency of errors.’ This programmatic part of Shannon’s work explains the success and impact it had in telecommunications engineering. The turbo codes (error correction codes) achieved a low error probability at information rates close to the channel capacity, with reasonable complexity of implementation, thus providing for the first time experimental evidence of the channel capacity theorem” (DSB).
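The entropy formula referred to above takes, for a discrete source emitting symbols with probabilities p_1, …, p_n, the now-familiar form (given here in modern notation, with the base-2 logarithm so that the unit is the bit):

\[
H = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{bits per symbol.}
\]

The source-coding theorem then says that H is the lower limit to the average number of binary digits required per source symbol.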

“The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon’s classic paper ‘A Mathematical Theory of Communication’ in the Bell System Technical Journal in July and October 1948. Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist’s 1924 paper, ‘Certain Factors Affecting Telegraph Speed,’ contains a theoretical section quantifying ‘intelligence’ and the ‘line speed’ at which it can be transmitted by a communication system, giving the relation W = K log m (recalling Boltzmann’s constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley’s 1928 paper, ‘Transmission of Information,’ uses the word information as a measurable quantity, reflecting the receiver’s ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit … Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers. Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs” (Wikipedia, accessed 20 October 2018).
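The relation between these earlier measures and Shannon’s is easily made explicit (a standard observation added here for orientation, not a statement from the text quoted above): when all S symbols are equally likely, Shannon’s entropy reduces to Hartley’s logarithm,

\[
p_i = \tfrac{1}{S} \;\Longrightarrow\; H = -\sum_{i=1}^{S} \tfrac{1}{S} \log \tfrac{1}{S} = \log S,
\]

so that n independent, equiprobable symbols carry n log S units of information, exactly Hartley’s count. Shannon’s contribution was to extend the measure to symbols of unequal probability and to sources with statistical dependence between successive symbols.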

“Shannon had the foresight to overlay the subject of communication with a distinct partitioning into sources, source encoders, channel encoders, channels, and associated channel and source decoders. Although his formalization seems quite obvious in our time, it was not so obvious back then. Shannon further saw that channels and sources could and should be described using the notions of entropy and conditional entropy. He argued persuasively for the use of these notions, both through their characterization by intuitive axioms and by presentation of precise coding theorems. Moreover, he indicated how very explicit, operationally significant concepts such as the information content of a source or the information capacity of a channel can be identified using entropy and maximization of functions involving entropy.

“Shannon’s revolutionary work brought forth this new subject of information theory fully formed but waiting for the maturity that fifty years of aging would bring. It is hard to imagine how the subject could have been created in an evolutionary way, though after the conception its evolution proceeded in the hands of hundreds of authors to produce the subject in its current state of maturity …

“The impact of Shannon’s theory of information on the development of telecommunication has been immense. This is evident to those working at the edge of advancing developments, though perhaps not quite so visible to those involved in routine design. The notion that a channel has a specific information capacity, which can be measured in bits per second, has had a profound influence. On the one hand, this notion offers the promise, at least in theory, of communication systems with frequency of errors as small as desired for a given channel for any data rate less than the channel capacity. Moreover, Shannon’s associated existence proof provided tantalizing insight into how ideal communication systems might someday fulfil the promise. On the other hand, this notion also clearly establishes a limit on the communication rate that can be achieved over a channel, offering communication engineers the ultimate benchmark with which to calibrate progress toward construction of the ultimate communication system for a given channel.

“The fact that a specific capacity can be reached, and that no data transmission system can exceed this capacity, has been the holy grail of modern design for the last fifty years. Without the guidance of Shannon’s capacity formula, modern designers would have stumbled more often and proceeded more slowly. Communication systems ranging from deep-space satellite links to storage devices such as magnetic tapes and ubiquitous compact discs, and from high-speed internets to broadcast high-definition television, came sooner and in better form because of his work. Aside from this wealth of consequences, the wisdom of Claude Shannon’s insights may in the end be his greatest legacy” (Blahut & Hajek).

“The 1948 paper rapidly became very famous; it was published one year later as a book, with a postscript by Warren Weaver regarding the semantic aspects of information” (DSB). The book was titled The Mathematical Theory of Communication, a small but significant title change reflecting the generality of this work.

Shannon’s ‘Communication in the presence of noise’ “is intimately connected to the earlier classic paper [‘A mathematical theory of communication’]. In fact, since a large part of the material in the second paper is essentially an elaboration of matters discussed in the first, and since it is referenced in the first paper, it can be thought of as an elaboration and extension of the earlier paper, adopting an ‘engineering’ rather than strict mathematical point of view. Yet, this paper comprises ideas, notions, and insights that were not reported in the first paper. In retrospect, many of the concepts treated in this paper proved to be fundamental, and they paved the way for future developments in information theory.

“The focus of Shannon’s paper is on the transmission of continuous-time (or ‘waveform’) sources over continuous-time channels. Using the sampling theorem, Shannon shows how waveform signals can be represented by vectors in finite-dimensional Euclidean space. He then exploits this representation to establish important facts concerning communication of a waveform source over a waveform channel in the presence of waveform noise. In particular, he gives a geometric proof of the theorem that establishes the famous formula W log(1 + S) for the capacity of a channel with bandwidth W, additive thermal (i.e., Gaussian) noise, and signal-to-noise ratio S.

“Shannon’s ideas, described in this paper, … affected in a profound fashion the very thinking on the structure, components, design, and analysis of communications systems in general. As such, and through the implementation of state-of-the-art communications systems and their implications on modern communications technology, which plays an ever increasing role in modern society, these ideas of Shannon’s had an influence well beyond the technical professional world of electronics engineers and mathematicians” (Wyner & Shamai, ‘Introduction to “Communication in the presence of noise” by C. E. Shannon,’ Proceedings of the IEEE, Vol. 86, No. 2, February 1998).
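The ‘famous formula’ referred to in the passage just quoted reads in full (with the base-2 logarithm, as is now customary, so that capacity is expressed in bits per second):

\[
C = W \log_2\!\left(1 + \frac{P}{N}\right),
\]

where W is the channel bandwidth in hertz and P/N, the signal-to-noise ratio S of the quotation, is the ratio of average signal power to average noise power within the band.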

The published article ‘Communication in the presence of noise’ contains Shannon’s proof of the ‘Nyquist-Shannon sampling theorem’, stated by Harry Nyquist in 1928 (and earlier by E. T. Whittaker in 1915). It establishes a sufficient condition for a sample rate that permits a discrete sequence of samples to capture all the information from a continuous-time signal of finite bandwidth.
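In its usual modern formulation (paraphrased here rather than quoted from the paper): a signal containing no frequencies above W hertz is completely determined by its values at instants spaced 1/(2W) seconds apart, and it can be reconstructed from those samples by the interpolation

\[
x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{2W}\right) \frac{\sin \pi (2Wt - n)}{\pi (2Wt - n)}.
\]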

We have not been able to identify the lecture series of which the offered notes form a part. Since Shannon begins the lecture with definitions of such basic terms as ‘Information source,’ ‘Channel,’ etc., this would appear to be the first (and perhaps only) lecture in the series delivered by Shannon, the preceding three lectures presumably having been presented by other speakers. The lecture is much more expository in style than the published paper, and much less technical in content, as would be expected for a lecture presented to an audience consisting of researchers in a variety of different fields. The lecture appears to be unpublished. We have been unable to find a record of any other copy, either in commerce or in libraries.

These items are from the library of German-American mathematician Cecilie Froehlich, who worked at the AEG machine factory in Berlin as a scientific consultant until forced to emigrate from Nazi Germany in 1937. “A competent securer of patents, [Froehlich’s] mathematical efforts concerned, among other things, Foucault (eddy) currents, switching operations, rectifiers, mechanical oscillations, and gyroscopes. Her mathematical modeling and calculations of eddy current losses in iron plates and in the support rings of electric machines aroused international attention” (Tobies & Neunzert, Iris Runge: A Life at the Crossroads of Mathematics, Science and Industry [2012], p. 147). She later taught electrical engineering at the College of the City of New York. See Annals of the History of Computing 6 (1984), pp. 152–55.

OOC 880. Blahut & Hajek, Foreword to the book edition, The Mathematical Theory of Communication, University of Illinois Press, 1998. Slepian (ed.), Key papers in the development of information theory, Institute of Electrical and Electronics Engineers, 1974. ‘Origin of the term Bit,’ Annals of the History of Computing 6 (1984), pp. 152-155.



Offprint: 4to (280 x 216mm), pp. 80 (tiny marginal stain on last few leaves). Original printed wrappers, hole punched for ring binder as always (spine skillfully repaired). Typescript: 277 x 213mm, 5 leaves (faint creasing).

Item #5213

Price: $12,500.00
