Engineering Theory and Mathematics in the Early Development of
Information Theory
Lav R. Varshney
School of Electrical and Computer Engineering
Cornell University
Abstract— A historical study of the development of information theory advancements in the first few years after Claude E. Shannon’s seminal work of 1948 is presented. The relationship between strictly mathematical advances, information theoretic evaluations of existing communications systems, and the development of engineering theory for applications is explored. It is shown that the contributions of American communications engineering theorists are directly tied to the socially constructed meanings of information theory held by members of the group. The role played by these engineering theorists is compared with the role of American mathematical scientists, and it is also shown that the early advancement of information theory is linked to the mutual interactions between these two social groups.
I. INTRODUCTION
Although information theory had multiple versions at the time of its conception, including the theories of Norbert Wiener, Donald MacKay, and Dennis Gabor, Claude E. Shannon’s statistical variety of information theory quickly came to predominate and now forms the basis of the field, delineated by its modern definition as the processing, transmission, storage, and use of information, and the foundations of the communication process [1]. Perhaps presumptuously, in 1953 Colin Cherry wrote of information theory’s importance, “Man’s development and the growth of civilisations has depended in the main on progress in… his abilities to receive, communicate and to record his knowledge” [2]; however, the tremendous growth of the field in the first few years after its establishment by Shannon’s seminal paper of 1948 [3] corroborates its perceived importance at the time. After Shannon’s initial work, there were grand visions for the application of information theory to such diverse fields as biology, linguistics, physics, and engineering, and the hope that these fields would be able to contribute advances.
The growth of information theory in American engineering is seen in the establishment of the Institute of Radio Engineers (I.R.E.) Professional Group on Information Theory in May 1951, the three technical sessions devoted to information theory held at the I.R.E. National Convention in March 1952, and a symposium on information theory, with five technical sessions, held in New York City in October 1952 [4]. In addition to emerging American interests in information theory, work was being done by Aleksandr Khinchin [5] and others [6] in the Soviet Union, as well as Philip Woodward [7] and others in the United Kingdom.
This paper will comprehensively trace the early progress of information theory in the American engineering community and also touch on the course that information theory took in the American mathematics community. As the early developments of information theory were being made, social meanings of the nascent field were just being constructed and incorporated by the two different social groups, mathematical scientists and communications engineering theorists. The meanings constructed had significant influence on the direction that future developments took, both in goals and in methodology. Mutual interaction between the two social groups was also highly influenced by these social meanings. In the half century since its birth, Shannon information theory has paved the way for electronic systems such as compact discs, cellular phones, and modems, all built upon the developments and social meanings of its first few years.
II. MATHEMATICAL SCIENCE AND ENGINEERING THEORY
The complex relationship between science and technology pervades the history of electrical and electronic technology. Counter to the widely held belief that technology arises from science or that it is simply the handmaiden of science, numerous occasions have shown that, in fact, the interaction between science and technology has been defined by mutual influence, spurring advances in both scientific understanding and technological development [8]. The invention and enhancement of electrical and electronic communication technologies such as the telegraph, telephone, and radio, and developments in electromagnetic and electronic theory in the nineteenth century and early twentieth century typify the interplay between technologists and physical scientists [9]. Throughout this time, however, the many coding, modulation, and compression techniques developed for telecommunication relied solely on the knowledge, skills, and principles developed by engineers themselves, with little influence from mathematical scientists. By the 1920s, a process of mathematization of communication theory had started, initiated by Harry Nyquist, Ralph V.L. Hartley, and others [10]. By the late 1940s, the necessity for a theory that included the fundamental tradeoffs among transmission rate, bandwidth, signal to noise ratio, and reliability was recognized by many researchers including A.G. Clavier, C.W. Earp, Stanford Goldman, Jacques Laplume, Claude Shannon, William Tuller, and Norbert Wiener [11]-[12]. Claude Shannon’s “A Mathematical Theory of Communication,” published in 1948, came to predominate and formed the basis of the field that is now known as information theory.
Two different social groups have made significant contributions to information theory, communications engineering theorists and mathematical scientists; however, defining these two groups is not a trivial matter. In fact, some have suggested that the merger between science and engineering was an important feature of the period after the Second World War [13]. In 1941, Thornton C. Fry, a mathematician in the Bell System, tried to define mathematicians entirely by the way they thought, rather than the academic credentials they held. In particular, he felt that mathematicians were defined by confidence in reasoning rather than experiment, requirements for perfection in arguments, tendencies for idealization, and desire for generality. Further, he held that the engineer was convinced by experimental evidence, believed the requirement for perfection in argument to be hair-splitting rather than rigorous thinking, and over-idealization to be ignoring facts [14]. Following the general concept of Fry, I will differentiate the two social groups not by academic credentials, but with a definition based on the socially constructed meaning of information theory held. Socially constructed meaning is defined to be the perception of a particular field of inquiry or a particular technology held by the social group.
Within the first few years of the establishment of information theory, communications engineering theorists made many efforts to incorporate, apply, and develop the field. Mathematical scientists, however, did not pursue information theory until the mid-1950s. The specific contributions made by each group, the chronology of the contributions, and the mutual interaction between the two groups can be attributed to the meanings of information theory that were constructed by each. Some engineering theorists saw information theory as science that would allow a theoretical basis for the evaluation of information sources and existing communications systems. Other engineering theorists saw the fundamental bounds on communications that information theory established as an ideal to work towards, through the development of new communication schemes and systems. These engineers felt that it brought a concrete foundation to the task of communications engineering. As Jerome Wiesner, director of the Massachusetts Institute of Technology’s Research Laboratory of Electronics, said in 1953, “Before we had the theory, … we had been dealing with a commodity that we could never see or really define. We were in the situation petroleum engineers would be in if they didn’t have a measuring unit like the gallon. We had intuitive feelings about these matters, but we didn’t have a clear understanding” [15]. There was, however, a group of communications engineers and engineering theorists who held the private opinion that information theory was too abstract for any real purpose or application [16]. Mathematical scientists saw Shannon’s work in a very different light, thinking of it as unimportant at first. Later, the meaning shifted to that of an engineer’s sketch requiring added mathematical rigor to fully construct the field as a true mathematical discipline. Mathematical scientists and communications engineers worked on different aspects of information theory, yet their contributions were complementary. In some instances, the mutual interaction between the two groups resulted in developments that might not have been achieved if only one aspect of information theory had been pursued.
III. SHANNON’S INITIAL FORMULATION
Although the information theory of Claude Shannon was built upon previous ideas, it was in many regards strikingly novel. As Robert J. McEliece, then of the California Institute of Technology’s Jet Propulsion Laboratory, retrospectively wrote in 1977, “While, of course, Shannon was not working in a vacuum in the 1940s, his results were so breathtakingly original that even the communication specialists of the day were at a loss to understand their significance” [17]. Looking back in 1973, John R. Pierce, who saw Shannon as a hero [18], hailed information theory as shedding “about as much light on the problem of the communication engineer as can be shed” [19].
Even a more neutral observer, Fortune magazine, said in 1953 that information theory bore the hallmarks of greatness [20]. Thus an examination of information theory’s conception by Shannon serves to elucidate the origins of the relationship between mathematical and engineering thought in information theory.
Claude Shannon was academically trained as both an electrical engineer and a mathematician, earning bachelor’s degrees in both subjects at the University of Michigan, followed by a master’s degree in electrical engineering and a doctorate in mathematics at the Massachusetts Institute of Technology (M.I.T.). After receiving his doctorate in 1939, Shannon spent one year as a National Research Fellow at the Institute for Advanced Study (I.A.S.) in Princeton, studying mathematics and Boolean algebra under Hermann Weyl, a mathematician and mathematical philosopher [21]-[22]. Shannon said that one of the questions motivating his early work on information theory was whether television could be compressed into a smaller bandwidth [23], but also that the idea of an information measure with an entropy form occurred to him while at I.A.S. [24]. After taking a position at Bell Laboratories, Shannon developed much of his information theory at home on nights and weekends during the 1940-1945 period, and only after much urging from his supervisor, Hendrik Bode, and colleagues did Shannon finally publish his work in 1948 [25]. Certainly a superior understanding of stochastic processes, n-dimensional geometry, and the philosophical meaning of communication was vital to Shannon’s development of information theory, but just as important was his engineering knowledge of various modulation techniques and cryptography. In fact, Shannon’s wartime Bell Laboratories confidential report of 1945, “A Mathematical Theory of Cryptography,” uses many of the information theory concepts that he was simultaneously developing [26]-[27]. The synthesis of mathematical science and engineering knowledge
led to information theory. Shannon’s own perception of information theory was as mathematical science, stating in 1956 that, “The hard core of information theory is, essentially, a branch of mathematics, a strictly deductive system” [28]. Many other engineering theorists followed this social construction to a degree, but recognized its practical importance as well. An article in the Bell Laboratories Record, a monthly magazine with a general electrical engineering audience, referred to information theory as a subject which, “in certain applications becomes a complex mathematical subject” [29]. Joseph L. Doob, a probabilist, however, commented in Mathematical Review that in Shannon’s work, “The discussion is suggestive throughout, rather than mathematical, and it is not always clear that the author’s mathematical intentions are honorable” [30].
The first idea presented by Shannon is that of a general communication system, with an information source that generates messages, a transmitter that encodes messages into signals, a possibly noisy channel through which signals are sent, a receiver that receives signals and decodes messages, and a destination for the messages. Then he considers noiseless channels, and defines the channel capacity to be the logarithm of the number of possible symbols that can be sent in one channel usage. After a survey of various information source models, he introduces the concept of entropy, a statistical measure which defines how much information the source produces, based on the uncertainty of producing various messages. The more uncertain a message is, the more information it conveys. A fundamental theorem of information theory is then established, namely that if and only if the information rate (entropy) of the source is less than the capacity of the channel can the encoder encode the message so that it can be transmitted over the channel. This theorem is extended to noisy channels, and again it is shown that if and only if the rate of the source is less than or equal to the channel capacity, can messages be encoded so that they will be received with arbitrarily small probability of error. That is to say, the source must be matched to the channel for reliable communication. The first part of the paper deals with discrete sources and channels; extensions to continuous channels and sources complete the main ideas of the work [31]. Before this work, it was commonly held that as the channel became noisier, then the rate would have to become smaller and smaller until one could not transmit any information (the conclusion one would draw from investigating repetition coding methods). The great revelation provided by Shannon was that no matter how noisy the channel, some information could still be transmitted [32].
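In present-day notation (an illustrative gloss rather than Shannon’s own symbols), the source entropy and the noiseless channel capacity just described are commonly written as

    H = -\sum_i p_i \log_2 p_i                          (bits per source symbol, with symbol probabilities p_i)
    C = \lim_{T \to \infty} \frac{\log_2 N(T)}{T}       (bits per unit time, with N(T) allowed signals of duration T)

and the coding theorems assert that reliable transmission is possible precisely when the source rate H is below the capacity C.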
IV. INFORMATION THEORY AS SCIENCE
The engineering theorists that constructed information theory as mathematical science needed to characterize information sources in the information theoretic way to apply it to communication systems. Shannon himself was “specially concerned to push the applications to engineering communication” [33]. Towards this goal, Shannon developed methods to measure the redundancy and entropy of language, publishing results in 1951. Using an experimental approach, the statistical nature and redundancy of printed English were determined through the use of human subjects’ predictive powers, rather than the mathematical approach of determining the stochastic process that governs language production. For a given sentence, subjects had to try to guess the next letter based on the previous letters, and the average number of guesses determined the entropy [34]. One of Shannon’s favorite sentences for this type of experiment was “There is no reverse on a motorcycle a friend of mine found this out rather dramatically the other day” [35]. Shannon’s work on cryptography was released in the open literature in 1949. Cryptographic enciphering schemes based on language texts were discussed in the context of information theory, and the entropy of message sources and keys was determined. He found that complete concealment is only possible if the key entropy is at least equal to the message entropy [36]. While Shannon considered language, many communications engineers considered more prominent objectives such as telephone and television systems that, “as one Bell Labs engineer phrase[d] it, ‘ignore the past and pretend each [message] sample is a complete surprise’” [37]. Bell’s Ernest R. Kretzmer experimentally measured statistical quantities characterizing picture signals in 1952, and used these results to estimate the amount by which channel capacity can be reduced for television transmission by the exploitation of the statistical distribution [38]. In the mid-1950s, a reasonably simple code was designed by Michel, Fleckstein, and Kretzmer to transmit drawings and diagrams by facsimile, using only about 12 percent of the rate of a conventional system [39]. The most conspicuous practical success of information theory actually came in color television transmission [40].
At the time of information theory’s origin, the art of communications was at a very advanced state. The communications engineer could use amplitude modulation, frequency modulation, phase modulation, single sideband modulation, or pulse code modulation, or easily invent something new [41]. In order to use the channel capacity aspects of information theory, extended beyond the original by Shannon in his 1949 work, “Communication in the Presence of Noise” [42], it was necessary to characterize existing communication systems. Nelson Blachman did just that by characterizing the channel capacity of various modulation schemes with different noise levels. He found the optimum statistics of communication systems consisting of an amplitude modulated and/or phase modulated transmitter, a transmission medium, and a receiver that responds to the amplitude and/or phase of the signal. He also found that only using amplitude modulation or only using phase modulation reduces the information rate below capacity and that at very high signal to noise ratios, amplitude modulation and phase modulation each account for half of the channel capacity [43].
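The ideal against which such systems were compared is the capacity formula of Shannon’s 1949 paper [42]; in present-day notation, for a channel of bandwidth W with average signal power S and additive white Gaussian noise power N,

    C = W \log_2 \left( 1 + \frac{S}{N} \right)         (bits per second)

so that at very high signal to noise ratio the capacity grows roughly as W \log_2(S/N), the regime in which Blachman found amplitude and phase each accounting for about half of the capacity.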
The derivations of the optimal statistics were more mathematical than the experimental statistics derived for sources, but the mathematics used was of an engineering theory style, rather than a mathematical science style. That is to say, specific modulation schemes were transformed using Fourier analysis, specific probabilistic noise models were introduced, and particular decoding schemes were specified. The results were characterizations for specific conditions, rather than mathematical science results of proving general theorems. In a similar work, Erling D. Sunde developed a compendium of theoretical fundamentals relating to pulse transmission for engineering applications. Again, the characterization, although mathematically-based, was specific to systems with either low-pass, symmetrical band-pass, or asymmetrical band-pass characteristics. Random imperfections in a communication system, as opposed to random noise, were also discussed [44]. This topic is clearly of interest to communications engineers, rather than mathematical scientists who in their assumptions of ideality assume ideal random noise as well. A desire to apply the science of information theory to communications engineering led to the characterization of sources and existing systems in information theoretic terms.
V. INFORMATION THEORY AS AN IDEAL
Another group of engineering theorists perceived the results of information theory not just as a way to characterize current systems, but as an ideal of optimal performance to work towards. The goal was to achieve the maximal information rate. Stephen O. Rice described two encoding schemes in which the ideal rate of information transmission in the presence of noise is approached. He considered two explicit construction schemes for choosing the transmitted signal and gave an exact result for the probability of error of the decoded message. Through numerical approximations, he showed that both schemes approach the ideal rate of transmission in the presence of random noise when the signal length is increased [45]-[46]. David A. Huffman was able to develop an algorithm to construct the optimal instantaneous source code in 1952 [47]. His motivation came from the fact that the coding procedure developed by Shannon and Robert Fano of M.I.T. was not optimum, but approached the optimum
behavior only when the block length approached infinity. No
definite procedure had been conceived for the construction of
an optimum code, until Huffman’s derivation for a project in Fano’s class at M.I.T. Thus, engineering theorists were able to make some progress towards developing constructive, though still theoretical, methods for achieving optimum communication.
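As an illustration of the kind of constructive procedure Huffman introduced, a minimal sketch in modern Python follows; the symbol probabilities, the dictionary representation, and the heap-based merging are expository assumptions, not Huffman’s original 1952 presentation.

import heapq

def huffman_code(probs):
    # probs: dict mapping source symbols to probabilities.
    # Returns a dict mapping each symbol to its binary codeword.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # two least probable subtrees
        p1, _, code1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code0.items()}   # prepend 0 to one subtree
        merged.update({s: "1" + c for s, c in code1.items()})  # and 1 to the other
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Hypothetical four-symbol source, used only for illustration.
print(huffman_code({"a": 0.50, "b": 0.25, "c": 0.15, "d": 0.10}))

For a source like the one shown, the expected codeword length of the resulting prefix-free code comes within one bit per symbol of the source entropy, which is the sense in which Huffman’s construction is optimal among instantaneous codes.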
Various engineers and engineering theorists tried to incorporate information theory-inspired designs into complete communications systems, but “it was shown that there are inherent difficulties in approaching the theoretical maximum rate of information transmission. Drastic attempts to approach the ideal lead to excessive complication of apparatus” [48]. For example, John P. Costas of the General Electric Company in Syracuse, N.Y. noted in 1952 that in most communication systems, the opportunity for coding before transmission exists and that complex coding processes had been developed. However, applications of these coding techniques “will be restricted by the complexity of the terminal equipment required” [49]. Shannon commented on the application of information theoretic ideas to telephone memory design, but conceded that, “incorporating this feature appears too complicated to be practical” [50]. Arthur W. Horton, Jr. and H. Earle Vaughn studied methods of transmitting and receiving short coded messages of about 10 decimal digits in length over telephone lines with simple and reliable equipment and at the highest speed possible with reliable operation. They determined that when the complete message to be sent is short, the advantages of one code over another are not great, and so information theory provides nothing useful [51]. Those engineering theorists who had constructed information theory as an ideal to work towards achieved some successes, but also setbacks, because the theory’s bounds could only be approached with coding of great complexity and delay. In this way the physical devices of communication set tighter bounds on achievable communication than information theory. Later developments in solid-state electronics and digital computers would allow some of information theory’s greater successes to be achieved. After all, “A theory builds no machinery. But inevitably, when good theories are enunciated, they make machinery easier to build” [52].
In a strong departure from most communications engineers, Edgar N. Gilbert of Bell Telephone Laboratories felt that the information theory coding theorems themselves were not applicable to the telephone industry. The reason was that the theorems state that the average error probability becomes arbitrarily small, but that the error probabilities for specific letters in the alphabet are allowed to be high. Consequently, when transmitting phone numbers, for example, it is possible to miss specific numbers. Gilbert concluded in 1952 that error-correcting codes are much more suited for the telephone industry [53]. Although Gilbert’s view differed from many engineering theorists, it corresponded to a certain extent with the view of information theory held by mathematical scientists.
VI. MATHEMATICAL SCIENTISTS’ CONCEPTIONS
In the first few years after the establishment of information theory, communications engineering theorists had pursued applications of information theory to various systems and sources, and had advanced the theory, but mathematical scientists took very little interest in the subject. “A conviction on the part of mathematicians that Shannon had done something important and a motivation to search for proofs more satisfactory to them” [54] had not emerged. Brockway McMillan, a research mathematician and colleague of Shannon at Bell, took the first steps in changing this perception by presenting an exposition and partial critique of Shannon’s models for communication systems in a mathematical journal [55]. Many of the contributions made by mathematical
scientists to information theory dealt with error-correcting codes, first described by Bell’s Richard W. Hamming in 1950 [56]. David Slepian, also of Bell Laboratories, described in 1956 a class of binary signaling alphabets, generalizations of Hamming’s error correcting codes, called group alphabets, for use in binary symmetric channels. Peter Elias of M.I.T. had previously shown that there exist group alphabets which signal at a rate arbitrarily close to the channel capacity of a binary symmetric channel with arbitrarily small probability of error [57]. These works firmly established the relationship between information theory and algebraic coding, and strongly established information theory as a true mathematical discipline within the social group of mathematical scientists. In 1966, Gilbert would comment on Shannon’s paper, “Mathematicians found his paper a gold mine of statistical and combinatorial problems… mainly because he raised difficult mathematical questions which… are unanswered” [58].
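To give the flavor of the codes at issue, a minimal sketch of the (7,4) code from Hamming’s 1950 paper is shown below, in modern Python and in modern textbook form rather than Hamming’s own notation.

# Minimal sketch of a (7,4) Hamming code: 4 data bits, 3 parity bits,
# corrects any single transmission error. Textbook presentation.

def encode(d):
    # d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(r):
    # r: received 7-bit word -> corrected 4 data bits
    r = list(r)
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]   # parity check over positions 1, 3, 5, 7
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]   # parity check over positions 2, 3, 6, 7
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]   # parity check over positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # binary position of the error (0 = none)
    if syndrome:
        r[syndrome - 1] ^= 1         # flip the erroneous bit
    return [r[2], r[4], r[5], r[6]]

# Example: a single channel error is located and corrected.
word = encode([1, 0, 1, 1])
word[5] ^= 1                          # corrupt one bit in transit
assert decode(word) == [1, 0, 1, 1]

Each four-bit block is protected by three parity checks, and any single error is corrected on every word; it is exactly this per-word guarantee, rather than a small average error probability, that Gilbert had argued the telephone application required.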
By the time Jacob Wolfowitz, a professor of mathematics at Cornell University, proved a channel coding theorem about the probability of transmitting any word incorrectly in 1957 [59], a change in perception of information theory in the mathematical science community from useless to incomplete had occurred. He refers to Shannon’s “A Mathematical Theory of Communication” as a “fundamental and already classical paper” [60]. Wolfowitz’s motivations for proving the strong channel coding theorem were twofold; one arose from the need for mathematical rigor, whereas the other derived from engineering theory. Shannon’s proof of the channel coding theorem is based on random codes, but it seemed questionable to mathematicians “whether random codes are properly codes at all” [61]. Wolfowitz went on to say that “the desirability of proving the existence of an error correcting code which satisfies the conclusion of the coding theorem has always been recognized and well understood” [62], referring to Gilbert’s claim that error correcting codes should be used in the telephone industry. The mutual interaction between engineering theorists’ desire for theorems about transmitting error correcting codes and mathematicians’ desire to make the coding theorem more rigorous resulted in Wolfowitz’s strong noisy channel coding theorem.
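The distinction underlying both motivations can be stated briefly in modern notation (an illustrative gloss, not Wolfowitz’s own formulation): for a code with M codewords whose individual error probabilities are \lambda_1, \ldots, \lambda_M, Shannon’s theorem controls the average error

    \bar{\lambda} = \frac{1}{M} \sum_{i=1}^{M} \lambda_i,

whereas the guarantee Gilbert wanted for telephone traffic, and the theorem about transmitting any word incorrectly that Wolfowitz proved, concern the maximal error \lambda_{max} = \max_i \lambda_i.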
VII. CONCLUSIONS
Pre-information theory inventions such as the Morse Code, the vocoder, the compandor, and pulse code modulation [63] demonstrate that even without the support of mathematical science or engineering theory, engineers were able to achieve efficient communications technologies. Shannon, in 1956, opined that while “information theory is indeed a valuable tool in providing fundamental insights into the nature of communication problems and will continue to grow in importance, it is certainly no panacea for the communication engineer” [64], engineering ingenuity is still essential. During the formative period of information theory’s establishment as a discipline, both engineering theory and mathematical science approaches led to developments that would eventually assist the practicing engineer. Engineering theorists pursued information theory for that very purpose, and mathematicians produced results as a byproduct of their quest for mathematical rigor. Although the results of information theory remained strictly on paper during this period, later growth in electronic communication, and particularly digital communication, would come to rely on them.
The socially constructed meanings developed by engineering theorists and mathematicians during information theory’s formative period not only determined the research directions pursued at the time, but also set the course of the field for long afterwards. The mutual interaction and influence
between the two groups that was a hallmark of the early period has also continued. To this day, both mathematicians and engineering theorists contribute to the advancement of information theory in their own ways, based on their own socially constructed meanings, so that more insights become apparent and applications come closer to the optimum. Sometimes, engineering knowledge and ingenuity are necessary to make advances, and sometimes “the mathematical crank does turn to good advantage” [65], yet at other times the interaction of the two is requisite for the advancement of information theory.
ACKNOWLEDGMENT
The author would like to thank Prof. Ronald Kline of Cornell University for reviewing early drafts of this paper and for his guidance and support.
REFERENCES
[1] IEEE Information Theory Society. [Online] Available:
http://golay.uvic.ca/society/benefits.html.
[2] E.C. Cherry, “A History of the Theory of Information,” I.R.E. Trans.
Inform. Theory, vol. 1, no. 1, pp. 22-43, Feb. 1953.
[3] C.E. Shannon, “A Mathematical Theory of Communication,” Bell Sys.
Tech. J., vol. 27, pp. 379-423, 623-656, July and October 1948.
[4] Proceedings of the I.R.E.
[5] A.I. Khinchin, Uspekhi Matematicheskikh Nauk, vol. 7, no. 3, pp. 3-20,
1953.
[6] P.E. Green, Jr., “A bibliography of Soviet Literature on Noise,
Correlation, and Information Theory,” I.R.E. Trans. Inform. Theory, vol. 2, no. 2, pp. 91-94, June 1956.
[7] P.M. Woodward, Probability and Information Theory, with
Applications to Radar, New York, McGraw-Hill, 1953.
[8] R. Kline, “Science and Engineering Theory in the Invention and
Development of the Induction Motor, 1880-1900,” Technology and Culture, vol. 28, no. 2, pp. 283-313, Apr. 1987.
[9] E. Weber and F. Nebeker, The Evolution of Electrical Engineering: A
Personal Perspective, New York: IEEE Press, 1994, ch. 2-4.
[10] W. Aspray, “The Scientific Conceptualization of Information: A
Survey,” Annals of the History of Computing, vol. 7, no. 2, pp. 117-140, Apr. 1985.
[11] E.C. Cherry, “A History of the Theory of Information,” I.R.E. Trans.
Inform. Theory, vol. 1, no. 1, pp. 22-43, Feb. 1953.
[12] S. Verdu, “Fifty Years of Shannon Theory,” IEEE Trans. Inform.
Theory, vol. 44, no. 6, pp. 2057-2078, Oct. 1998.
[13] S.W. Leslie, The Cold War and American Science: The Military-Industrial-Academic Complex at MIT and Stanford, New York:
Columbia University Press, 1993, p. 261.
[14] S. Millman, ed., A History of Engineering and Science in the Bell
System: Communications Sciences (1925-1980), AT&T Bell
Laboratories: 1984, p. 4, quoting T.C. Fry, “Industrial Mathematics,” Bell Sys. Tech. J., July 1941.
[15] F. Bello, “The Information Theory,” Fortune, vol. 48, pp. 136-158, Dec.
1953.
[16] E.N. Gilbert, “Information Theory after 18 Years,” Science, vol. 152,
no. 3720, pp. 320-326, Apr. 1966.
[17] S. Millman, ed., A History of Engineering and Science in the Bell
System: Communications Sciences (1925-1980), AT&T Bell Laboratories: 1984, p. 46, quoting R.J. McEliece, The Theory of Information and Coding: A Mathematical Framework for Communication, Reading, MA: Addison-Wesley, 1977.
[18] John Pierce, Electrical Engineer, an oral history conducted in 1992 by
Andrew Goldstein, IEEE History Center, Rutgers University, New Brunswick, NJ, USA.
[19] J.R. Pierce, “The Early Days of Information Theory,” IEEE Trans.
Inform. Theory, vol. IT-19, no. 1, pp. 3-8, Jan. 1973.
[20] F. Bello, “The Information Theory,” Fortune, vol. 48, pp. 136-158, Dec.
1953.
[21] E.M. Rogers and T.W. Valente, “A History of Information Theory in
Communication Research,” in Between Communication and
Information, ed. J.R. Schement and B.D. Ruben, New Brunswick: Transaction Publishers: 1993, pp. 35-56.
[22] J. Campbell, Grammatical Man: Information, Entropy, Language, and
Life, New York: Simon and Schuster, 1982, p. 20.
[23] F. Bello, “The Information Theory,” Fortune, vol. 48, pp. 136-158, Dec.
1953.
[24] J. Campbell, Grammatical Man: Information, Entropy, Language, and
Life, New York: Simon and Schuster, 1982, p. 20.
[25] E.M. Rogers and T.W. Valente, “A History of Information Theory in
Communication Research,” in Between Communication and
Information, ed. J.R. Schement and B.D. Ruben, New Brunswick: Transaction Publishers: 1993, pp. 35-56.
[26] E.M. Rogers and T.W. Valente, “A History of Information Theory in
Communication Research,” in Between Communication and
Information, ed. J.R. Schement and B.D. Ruben, New Brunswick: Transaction Publishers: 1993, pp. 35-56.
[27] W. Aspray, “The Scientific Conceptualization of Information: A
Survey,” Annals of the History of Computing, vol. 7, no. 2, pp. 117-140, Apr. 1985.
[28] C. E. Shannon, “The Bandwagon,” IRE Trans. Inform. Theory, vol. 2,
no. 1, p. 3, Mar. 1956.
[29] C.B. Feldman, “Information Theory,” Bell Laboratories Record, vol.
31, pp. 326-332, Sep. 1953.
[30] J.L. Doob, rev., in C.E. Shannon, “A Mathematical Theory of
Communication,” Mathematical Review, vol. 10, p. 133, Feb. 1949.
[31] C.E. Shannon, “A Mathematical Theory of Communication,” Bell Sys.
Tech. J., vol. 27, pp. 379-423, 623-656, July and October 1948.
[32] T. Berger, Fundamental Information Theory lecture, Ithaca, NY, Spring
2003.
[33] W. Weaver, “Recent Contributions to the Mathematical Theory of
Communication,” in The Mathematical Theory of Communication, University of Illinois Press, Urbana, Ill.: 1949.
[34] C.E. Shannon, “Prediction and Entropy of Printed English,” Bell Sys.
Tech. J., Jan. 1951.
[35] F. Bello, “The Information Theory,” Fortune, vol. 48, pp. 136-158, Dec.
1953.
[36] C.E. Shannon, “Communication Theory of Secrecy Systems,” Bell Sys.
Tech. J., Oct. 1949.
[37] F. Bello, “The Information Theory,” Fortune, vol. 48, pp. 136-158, Dec.
1953.
[38] E.R. Kretzmer, “Statistics of Television Signals,” Bell Sys. Tech. J., July
1952.
[39] E.N. Gilbert, “Information Theory after 18 Years,” Science, vol. 152,
no. 3720, pp. 320-326, Apr. 1966.
[40] J. Campbell, Grammatical Man: Information, Entropy, Language, and
Life, New York: Simon and Schuster, 1982, p. 16.
[41] E.N. Gilbert, “Information Theory after 18 Years,” Science, vol. 152,
no. 3720, pp. 320-326, Apr. 1966.
[42] C.E. Shannon, “Communication in the Presence of Noise,” Proc. IRE, vol. 37, no.
1, pp. 10-22, Jan. 1949.
[43] N.M. Blachman, “A Comparison of the Informational Capacities of
Amplitude- and Phase-Modulation Communication Systems,” Proc. IRE, June 1953, pp. 748-759.
[44] E.D. Sunde, “Theoretical Fundamentals of Pulse Transmission–I,” Bell
Sys. Tech. J., vol. 33, no. 3, pp. 721-788, May 1954, and “Theoretical Fundamentals of Pulse Transmission–II,” Bell Sys. Tech. J., vol. 33, no. 4, pp. 987-1010, July 1954.
[45] S.O. Rice, “Communication in the Presence of Noise-Probability of
Error for Two Encoding Schemes,” Bell Sys. Tech. J., Jan. 1950.
[46] D. Slepian and A.D. Wyner, “S.O. Rice’s Contributions to Shannon Theory,” IEEE Trans. Inform. Theory, vol. 34, no. 6, p. 1374, Nov. 1988.
[47] D.A. Huffman, “A Method for the Construction of Minimum-Redundancy Codes,” Proc. IRE, Sep. 1952, pp. 1098-1101.
[48] “Radio Progress During 1952,” Proc. IRE, April 1952, p. 484.
[49] J.P. Costas, “Coding with Linear Systems,” Proc. IRE, Sep. 1952, pp.
1101-1104.
[50] C.E. Shannon, “Memory Requirements in a Telephone Exchange,” Bell
Sys. Tech. J., July 1950.
[51] A.W. Horton, Jr. and H.E. Vaughn, “Transmission of Digital
Information over Telephone Circuits,” Bell Sys. Tech. J., vol. 34, no. 3, pp. 511-528, May 1955.
[52] F. Bello, “The Information Theory,” Fortune, vol. 48, pp. 136-158, Dec.
1953.
[53] E.N. Gilbert, “A Comparison of Signaling Alphabets,” Bell Sys. Tech.
J., May 1952.
[54] J.R. Pierce, “The Early Days of Information Theory,” IEEE Trans.
Inform. Theory, vol. IT-19, no. 1, pp. 3-8, Jan. 1973.
[55] B. McMillan, “Basic Theorems of Information Theory,” Ann. Math.
Stat., vol. 24, pp. 196-219, June 1953.
[56] R.W. Hamming, “Error detecting and error correcting codes,” Bell Sys.
Tech. J., Apr. 1950.
[57] D. Slepian, “A Class of Binary Signaling Alphabets,” Bell Sys. Tech.
J., Jan. 1956.
[58] E.N. Gilbert, “Information Theory after 18 Years,” Science, vol. 152,
no. 3720, pp. 320-326, Apr. 1966.
[59] J. Wolfowitz, “The Coding of Messages Subject to Chance Errors,” Illinois Journal of Mathematics, vol. 1, no. 4, pp. 591-606, Dec. 1957.
[60] J. Wolfowitz, “The Coding of Messages Subject to Chance Errors,” Illinois Journal of Mathematics, vol. 1, no. 4, pp. 591-606, Dec. 1957.
[61] J. Wolfowitz, “The Coding of Messages Subject to Chance Errors,” Illinois Journal of Mathematics, vol. 1, no. 4, pp. 591-606, Dec. 1957.
[62] J. Wolfowitz, “The Coding of Messages Subject to Chance Errors,” Illinois Journal of Mathematics, vol. 1, no. 4, pp. 591-606, Dec. 1957.
[63] S. Verdu, “Fifty Years of Shannon Theory,” IEEE Trans. Inform. Theory, vol. 44, no. 6, pp. 2057-2078, Oct. 1998.
[64] C. E. Shannon, “The Bandwagon,” IRE Trans. Inform. Theory, vol. 2,
no. 1, p. 3, Mar. 1956.
[65] J.R. Pierce, “The Early Days of Information Theory,” IEEE Trans.
Inform. Theory, vol. IT-19, no. 1, pp. 3-8, Jan. 1973.