## Communication in the Presence of Noise. Offprint from Proceedings of the Institute of Radio Engineers, Vol. 37, January 1949.

[New York: The Institute of Radio Engineers, Inc.], 1949.

First edition, **Shannon’s own copy** of the extremely rare offprint, of this continuation and elaboration of Shannon’s epoch-making ‘Mathematical theory of communication’. “In 1948, C. E. Shannon published his classic paper ‘A Mathematical Theory of Communication’ in the *Bell System Technical Journal*. That paper founded the discipline of information theory … Several months later, he published a second paper, ‘Communication in the Presence of Noise,’ in the *Proceedings of the Institute of Radio Engineers*. This second paper is … intimately connected to the earlier classic paper. In fact, … it can be thought of as an elaboration and extension of the earlier paper, adopting an ‘engineering’ rather than strict mathematical point of view. Yet, this [second] paper comprises ideas, notions, and insights that were not reported in the first paper. In retrospect, many of the concepts treated in this [second paper] proved to be fundamental, and they paved the way for future developments in information theory. The focus of Shannon’s paper is on the transmission of continuous-time (or ‘waveform’) sources over continuous-time channels. Using the sampling theorem, Shannon shows how waveform signals can be represented by vectors in finite-dimensional Euclidean space. He then exploits this representation to establish important facts concerning the communication of a waveform source over a waveform channel in the presence of waveform noise. In particular, he gives a geometric proof of the theorem that establishes the famous formula *W* log(1 + *S*) for the capacity of a channel with bandwidth *W*, additive thermal (i.e., Gaussian) noise, and signal-to-noise ratio *S*. Shannon’s paper also calculates the capacity of a colored Gaussian channel and presents the basic ingredients of rate-distortion theory developed later, with focus on continuous-time signals … One of the most profound ideas is coding waveforms with respect to a reconstruction fidelity criterion.
These ideas, which later matured as the rate-distortion theory, provide the theoretical basis to quantization of analog signals (for example, speech coding, vector quantization and the like) which now are ubiquitous in our everyday life (cellular phones for example). Shannon’s ideas, described in this paper in a lucid engineering fashion and complementing his celebrated work [‘Mathematical Theory of Communication’], which established the field of information theory, affected in a profound fashion the very thinking on the structure, components, design, and analysis of communication systems in general … These ideas of Shannon’s had an influence well beyond the technical professional world of electronics engineering and mathematicians” (Wyner & Shamai). Shannon presented this paper at the IRE National Convention, New York, N.Y., on March 24, 1948, and at the IRE New York Section on November 12, 1947. The Institute of Radio Engineers received the original manuscript of this paper on July 23, 1948, and it was printed in their *Proceedings* (Vol. 37, pp. 10-21, January 1949). The Bell System Technical Monograph series (#B-1644: 1949) later reprinted this paper. This copy of the offprint derives from Shannon’s personal files; we know of no copy originating from any other source. Not on OCLC.

*Provenance*: The personal files of Claude E. Shannon (unmarked).

Shannon begins the present paper by describing the five elements that comprise a ‘general communications system’: “1. *An information source*. The source selects one message from a set of possible messages to be transmitted to the receiving terminal … 2. *The transmitter*. This operates on the message in some way and produces a signal suitable for transmission to the receiving point over the channel … 3. *The channel*. This is merely the medium used to transmit the signal from the transmitting to the receiving point … During transmission, or at the receiving terminal, the signal may be perturbed by noise or distortion. Noise and distortion may be differentiated on the basis that distortion is a fixed operation applied to the signal, while noise involves statistical and unpredictable perturbations. Distortion can, in principle, be corrected by applying the inverse operation, while a perturbation due to noise cannot always be removed, since the signal does not always undergo the same change during transmission. 4. *The receiver*. This operates on the received signal and attempts to reproduce, from it, the original message … 5. *The destination*. This is the person or thing for whom the message is intended.”

In Section II Shannon states and proves Theorem 1, the ‘Sampling Theorem’: “If a function contains no frequencies higher than *W* cps, it is completely determined by giving its ordinates at a series of points spaced 1/2*W* seconds apart”. Shannon quickly notes, “This is a fact which is common knowledge in the communication art.” He later adds, “Theorem 1 has been given previously in other forms by mathematicians but in spite of its evident importance seems not to have appeared explicitly in the literature of communication theory.”
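Theorem 1 lends itself to a direct numerical check. The sketch below (assuming NumPy; the band limit and test signal are invented for illustration) rebuilds a band-limited signal from samples spaced 1/2*W* seconds apart via the cardinal-series (sinc) interpolation of the sampling theorem:

```python
import numpy as np

def sinc_reconstruct(samples, W, t):
    # Whittaker/Shannon cardinal series: f(t) = sum_n f(n/2W) * sinc(2W*t - n)
    n = np.arange(len(samples))
    return np.sum(samples * np.sinc(2.0 * W * t[:, None] - n), axis=1)

W = 4.0                                  # band limit in cps (Hz) -- illustrative
rate = 2.0 * W                           # samples spaced 1/2W seconds apart
f = lambda t: np.cos(2 * np.pi * 1.5 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t)
samples = f(np.arange(512) / rate)       # 512 samples cover 64 seconds

t = np.linspace(20.0, 44.0, 97)          # interior points, away from edge effects
err = np.max(np.abs(sinc_reconstruct(samples, W, t) - f(t)))
print(err)                               # small truncation error in the interior
```

With a finite run of samples the series is truncated, so the recovery is exact only in the limit; in the interior of the sampled window the error is already tiny.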

“The sampling theorem, first introduced to the engineering community in the paper, is probably the most useful tool in the analysis of signals and their processing, and has had an enormous spectrum of applications. This theorem, and the notion of essentially time- and band-limited signals, motivated many future studies …

“One of Shannon’s most important ideas, which set the stage for the profound developments in information and communications theory in the last 50 years, was the geometric interpretation of the communications systems in general and messages, code words, and the encoding and decoding procedures in particular. This important approach is summarized in Table 1 of Shannon’s paper, and its essence is the representation of a signal as a point in *N*-dimensional Euclidean space … Thus, within this view, the encoding procedure performed by the transmitter is a mapping from the message space to the coded signal space, while the decoding is viewed as a mapping from the received signal space to the message space.

“This geometric representation not only facilitated the adoption and use of the terminology, results, reasoning, and techniques of vector spaces and their geometry (as explicitly mentioned by Shannon and used to derive the capacity formula) but in fact has been crucial in the understanding and designing of sophisticated codes. These state-of-the-art modern coding methods, about 50 years after Shannon’s paper, come close in performance to the ultimate limits of Shannon’s capacity formula …

“Before using this geometric interpretation to prove the classical Gaussian capacity formula, Shannon in Section VI of his paper gave a qualitative explanation for the threshold effect in bandwidth-expanding analog modulation methods (such as classical wide-band FM) … Shannon went on to present the information compression idea qualitatively, where redundancy and irrelevant information in the message space should be ignored so that only the so-called effective message coordinates are isolated, giving rise to the reduction of dimensionality of the message space. The possibility to reduce the bandwidth of a signal at the expense of increasing the signal resolution, or deteriorating the signal-to-noise ratio, has also been addressed.

“In Section VII of his paper, Shannon exploits the geometric representation to prove Theorem 2, the celebrated capacity formula for the white Gaussian channel *C* … This result, as Shannon explicitly points out, was counterintuitive at the time. The theory and practice of that time supported the belief that reduction of the error probability must come at the cost of gradual monotonic decrease of the corresponding information rate. This belief was derived from the common practice of that time in analog systems and uncoded digital systems, where the increase of output signal-to-noise ratio (or the decrease of bit/symbol error rate) with a given input power implied the reduction of the information transmitted bandwidth (or rate, in the digital communications case). In Shannon’s words, his capacity formula,

‘is a rather surprising result, since one would expect that reducing the frequency of errors would require reducing the rate of transmission and that the rate must approach zero as the error frequency does. Actually, we can send at the rate *C* but reduce errors by using more involved encodings and longer delays at the transmitter and receiver.’
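The capacity formula of Theorem 2, *C* = *W* log₂(1 + *S*/*N*) bits per second, is simple to evaluate. A minimal sketch (the 3 kHz bandwidth and 30 dB signal-to-noise ratio are illustrative figures, not drawn from the paper):

```python
from math import log2

def capacity(W_hz, snr):
    """Shannon capacity of an ideal band-limited AWGN channel, in bits/s:
    C = W * log2(1 + S/N), with snr the linear signal-to-noise power ratio."""
    return W_hz * log2(1 + snr)

snr = 10 ** (30 / 10)                # 30 dB -> linear power ratio of 1000
print(round(capacity(3000, snr)))    # about 29902 bits/s for a 3 kHz channel
```

Note the logarithmic dependence: each doubling of the signal-to-noise ratio buys only one extra bit per second per hertz of bandwidth.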

“Using the observation that the noise turns the coded signal point into a cloud sphere of an essentially fixed radius, … Shannon argued that no reliable transmission at rates larger than capacity is possible. On the other hand, using the fundamental idea of random selection of code books along with the geometrical interpretation, Shannon translated the determination of the probability of error to the determination of the probability of not having any of the other randomly and independently chosen code words closer to the received point than the correct (transmitted) code word. This probability is estimated in a rather straightforward fashion to yield the capacity *C* …
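Shannon’s random-coding argument can be caricatured in a few lines. This toy simulation (all dimensions, powers, and the NumPy phrasing are invented for illustration, not the paper’s construction) draws a random Gaussian codebook, perturbs one codeword with noise, and decodes to the nearest codeword, exactly the geometric picture described above:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 64, 16                        # block length and codebook size (toy values)
P, sigma2 = 1.0, 0.05                # signal power and noise variance (toy values)

# Random selection of the code book: M independent Gaussian points in R^N
codebook = rng.normal(0.0, np.sqrt(P), (M, N))

sent = 3                             # index of the transmitted code word
received = codebook[sent] + rng.normal(0.0, np.sqrt(sigma2), N)

# Minimum-distance decoding: pick the code word nearest the received point
decoded = int(np.argmin(np.linalg.norm(codebook - received, axis=1)))
print(decoded == sent)               # reliable at this comfortable noise level
```

At this noise level the “cloud” around the sent point has radius roughly √(*N*σ²), far smaller than the typical spacing of the random codewords, so decoding succeeds with overwhelming probability.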

“Shannon went on in Section VIII to discuss his results and provide the expression for the capacity of the colored Gaussian channel, which is now known as the ‘water pouring’ technique … Shannon’s result hinges on a sound engineering interpretation of splitting the available band into small subbands over which the noise spectral density is essentially constant. This conclusion that capacity can be achieved by independent narrow-band ideal communications systems, operating on disjoint frequency fragments, propelled the recent use of multicarrier systems for a variety of applications …
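The water-pouring idea can be sketched in a standard discrete form over a handful of subbands (the noise levels and power budget below are invented for the example): power is poured over the noise profile up to a common “water level,” so the quietest subbands get the most power and very noisy ones may get none.

```python
import numpy as np

def water_fill(noise, P):
    """Water-pouring power allocation: given per-subband noise levels and a
    total power budget P, find the water level mu and allocate max(mu - noise, 0)."""
    noise = np.asarray(noise, float)
    s = np.sort(noise)
    for k in range(len(s), 0, -1):
        mu = (P + s[:k].sum()) / k       # candidate level using the k quietest bands
        if mu > s[k - 1]:                # level clears all k of those noise floors
            break
    return np.maximum(mu - noise, 0.0)   # allocation, in the input subband order

alloc = water_fill([1.0, 2.0, 4.0], P=3.0)
print(alloc)        # most power goes to the quietest subband; the noisiest gets none
print(alloc.sum())  # the budget of 3.0 is used exactly
```

With noise levels (1, 2, 4) and a budget of 3, the water level settles at 3.0, giving allocations (2, 1, 0): the noisiest subband sits above the water and is switched off entirely.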

“In the last section in Shannon’s paper dealing with continuous sources, Shannon presents the basic ideas underlying his later development of the rate-distortion theory, that is, the required rate of source coding where the information is to be regenerated with respect to a criterion of fidelity. Hinging on the geometric approach, not only does he provide the rate-distortion expression for the Gaussian memoryless source with respect to the mean square error distortion measure but, in Theorem 5, he gives upper and lower bounds on this rate-distortion function for band-limited non-Gaussian sources as well …
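For the memoryless Gaussian source under the mean-square error criterion, this analysis leads to the rate-distortion function R(D) = ½ log₂(σ²/D) for 0 < D ≤ σ². A minimal sketch (function name and example numbers are our own):

```python
from math import log2

def gaussian_rd(var, D):
    """Rate-distortion function of a memoryless Gaussian source with variance
    `var` under mean-square error: R(D) = 0.5*log2(var/D) for 0 < D <= var."""
    return 0.5 * log2(var / D) if D < var else 0.0

print(gaussian_rd(1.0, 0.25))   # 1.0 bit/sample: quartering the MSE costs 1 bit
print(gaussian_rd(1.0, 2.0))    # 0.0: distortion above the variance is free
```

The second case reflects that reproducing the source by its mean already achieves distortion σ², so no rate at all is needed beyond that point.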

“One of the main reasons that Shannon’s paper received immediate recognition is that it is written from an intuitive engineering point of view rather than as a rigorous mathematical exposition. This paved the way to a rather immediate understanding of the revolutionary ideas of the paper, which dramatically affected the way communications theory is understood and practiced. In this sense, the paper can be viewed as a restatement from an engineering point of view of the fundamental elements of information theory first created in [‘Mathematical theory of communication’]” (Wyner & Shamai).

“Claude Elwood Shannon was born in Petoskey, Michigan, on Sunday, April 30, 1916 … The first sixteen years of Shannon’s life were spent in Gaylord, where he attended the Public School, graduating from Gaylord High School in 1932. As a boy, Shannon showed an inclination toward things mechanical. His best subjects in school were science and mathematics, and at home he constructed such devices as model planes, a radio-controlled model boat and a telegraph system to a friend’s house half a mile away. The telegraph made opportunistic use of two barbed wires around a nearby pasture. He earned spending money from a paper route and delivering telegrams, as well as repairing radios for a local department store. His childhood hero was Edison, who he later learned was a distant cousin … In 1932 he entered the University of Michigan, following his sister Catherine, who had just received a master’s degree in mathematics there. While a senior, he was elected a member of Phi Kappa Phi and an associate member of Sigma Xi. In 1936 he obtained the degrees of Bachelor of Science in Electrical Engineering and Bachelor of Science in Mathematics. This dual interest in mathematics and engineering continued throughout his career” (*Collected Papers*, p. xi).

“After graduating from the University of Michigan in 1936 with bachelor’s degrees in mathematics and electrical engineering, Shannon obtained a research assistant’s position at the Massachusetts Institute of Technology (MIT). There, among other duties, he worked with the noted researcher Vannevar Bush, helping to set up differential equations on Bush’s differential analyzer. A summer internship at American Telephone and Telegraph’s Bell Laboratories in New York City in 1937 inspired much of Shannon’s subsequent research interests. In 1940 he earned both a master’s degree in electrical engineering and a Ph.D. in mathematics from MIT. He joined the mathematics department at Bell Labs in 1941, where he first contributed to work on antiaircraft missile control systems. He remained affiliated with Bell Labs until 1972. Shannon became a visiting professor at MIT in 1956, a permanent member of the faculty in 1958, and professor emeritus in 1978” (Britannica).

Wyner & Shamai (Shitz), ‘Introduction to “Communication in the Presence of Noise” by C. E. Shannon,’ *Proceedings of the IEEE* 86 (1998), pp. 442-446.

4to (283 x 217mm), pp. [1], [10], 11-21, [1, blank]. Self-wrappers, stapled at spine. Near fine.

Item #5549

**Price: $15,000.00**