4406 entries. 94 themes. Last updated December 26, 2016.

Communication / Information Theory Timeline

1850 – 1875

Darwin Founds Ethology, Studies the Conveyance of Information, and Contributes to Psychology 1872

In 1872 Charles Darwin issued The Expression of the Emotions in Man and Animals through his publisher, John Murray. This book, which contained numerous wood-engraved text illustrations, was also illustrated with seven heliotype plates of photographs by pioneering art photographer Oscar Gustave Rejlander, and was the only book by Darwin illustrated with photographs.

“With this book Darwin founded the study of ethology (animal behavior) and conveyance of information (communication theory) and made a major contribution to psychology” (DSB). Written as a rebuttal to the idea that the facial muscles of expression in humans were a special endowment, the work contained studies of facial and other types of expression (sounds, erection of hair, etc.) in man and mammals, and their correlation with various emotions such as grief, love, anger, fear and shame. The results of Darwin’s investigations showed that in many cases expression is not learned but innate, and enabled Darwin to formulate three principles governing the expression of emotions—relief of sensation or desire, antithesis, and reflex action.

1920 – 1930

A Logarithmic Law for Communication 1924

In “Certain Factors Affecting Telegraph Speed,” Bell System Technical Journal 3 (1924) 324–346, information theorist Harry Nyquist analyzed factors affecting telegraph transmission speed, presenting the first statement of a logarithmic law for communication, and the first examination of the theoretical bounds for ideal codes for the transmission of information.
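In modern terms, Nyquist's logarithmic law says that the "intelligence" transmissible per second grows with the logarithm of the number of distinct current values, not with the number itself. A minimal sketch of the relationship in Python (the function name and the bit units are mine; Nyquist wrote two decades before the word "bit" existed):

```python
from math import log2

def telegraph_speed(symbols_per_second: float, m: int) -> float:
    """Nyquist's logarithmic law: transmissible information per second is
    proportional to the log of the number m of distinct current values."""
    return symbols_per_second * log2(m)

# Doubling the signalling rate doubles the speed, but doubling the number
# of current values adds only one extra bit per symbol.
assert telegraph_speed(100, 2) == 100.0
assert telegraph_speed(100, 4) == 200.0
assert telegraph_speed(100, 8) == 300.0
```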

Hartley's Law 1928

In 1928 information theorist Ralph V. R. Hartley of Bell Labs published “Transmission of Information,” in which he proved "that the total amount of information that can be transmitted is proportional to frequency range transmitted and the time of the transmission."

Hartley's law eventually became one of the elements of Claude Shannon's Mathematical Theory of Communication.
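Hartley's proportionality can be restated in modern units: with a frequency range B available for a time T, and s distinguishable signal levels, the total transmissible information is proportional to B · T · log s. A hedged sketch (the factor of 2 reflects Nyquist-rate signalling; the function name and bit units are a modern gloss, not Hartley's notation):

```python
from math import log2

def hartley_information(bandwidth_hz: float, seconds: float, levels: int) -> float:
    """Hartley's law in modern form: total transmissible information is
    proportional to frequency range times transmission time, with each of
    the 2*B symbols per second carrying log2(levels) bits."""
    return 2 * bandwidth_hz * seconds * log2(levels)

# Doubling either the frequency range or the transmission time doubles
# the total amount of information that can be sent.
assert hartley_information(3000, 2, 4) == 2 * hartley_information(3000, 1, 4)
assert hartley_information(6000, 1, 4) == 2 * hartley_information(3000, 1, 4)
```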

The Relationship between Information and Thermodynamics 1929

While working at Humboldt-Universität zu Berlin in 1929, Austro-Hungarian physicist and inventor Leo Szilard published "Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen," Zeitschrift für Physik 53 (1929) 840-856. The paper described a theoretical model that served both as a heat engine and an information engine, establishing the relationship between thermodynamics (manipulation and transfer of energy and entropy) and information (manipulation and transmission of bits).

Szilard was one of the first to show that "Nature seems to talk in terms of information" (Seife, Decoding the Universe, 77).

1930 – 1940

Shannon's "Symbolic Analysis of Relay and Switching Circuits," "The Most Significant Master's Thesis of the 20th Century" August 10, 1937

Claude Shannon, in his master’s thesis entitled A Symbolic Analysis of Relay and Switching Circuits, submitted to MIT on August 10, 1937, showed that the two-valued algebra developed by George Boole could be used as a basis for the design of electrical circuits. It was first published in a revised and abridged version in Transactions of the American Institute of Electrical Engineers 57 (1938) 713-23.

This thesis became the theoretical basis for the electronics and computer industries that were developed after World War II. Shannon wrote the thesis while working at Bell Telephone Laboratories in New York City. As examples of circuits that could be built using relays, Shannon appended to the thesis theoretical descriptions of "An Electric Adder to the Base Two," and "A Factor Table Machine." The "Factor Table Machine" was not included in the published version.
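The "Electric Adder to the Base Two" rested on exactly the correspondence the thesis established: series relay contacts realize AND, parallel contacts realize OR, and addition in base two can be assembled from those operations. A sketch of the logic in Python Boolean operators (this mirrors the idea, not Shannon's actual relay schematics):

```python
def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """One stage of a binary adder expressed with the Boolean operations
    that relay contacts can realize: AND (series wiring), OR (parallel
    wiring), and their combinations such as XOR."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_base_two(x: int, y: int, width: int = 8) -> int:
    """Ripple-carry addition built only from full-adder stages."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

assert add_base_two(23, 42) == 65
```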

Shannon's thesis was later characterized as "the most significant master's thesis of the 20th century."  

♦ In October 2013 I was surprised to learn that as early as 1886 the American philosopher and logician Charles Sanders Peirce recognized that logical operations could be carried out by electrical switching circuits, and that circuit diagrams for a logic machine constructed from electrical circuits were produced for one of Peirce's students, Allan Marquand. Neither Peirce nor Marquand published on an electrical logic machine, and the concept seems not to have been pursued by either Peirce or Marquand beyond the drawing stage. Nor have I seen evidence of any further development of the concept until Shannon's thesis.

1940 – 1950

Norbert Wiener Formulates Communication Theory as a Statistical Problem 1942

In 1942, having collaborated with engineer Julian Bigelow, mathematician Norbert Wiener published, as a classified document from MIT, The Extrapolation, Interpolation, and Smoothing of Stationary Time Series. According to Claude Shannon, this work contained “the first clear-cut formulation of communication theory as a statistical problem, the study of operations on time series.”

Communication Theory of Secrecy Systems 1945 – 1949

Claude Shannon's report, originally issued as a classified document entitled A Mathematical Theory of Cryptography, Memorandum MM 45-110-02, September 1, 1945, was formally published in 1949 as "Communication Theory of Secrecy Systems" in Bell System Technical Journal 28 (1949) 656–715. This paper, discussing cryptography from the viewpoint of information theory, contained a proof that all theoretically unbreakable ciphers must have the same requirements as the one-time pad.
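The requirements Shannon proved necessary for perfect secrecy, which are met by the one-time pad (a truly random key, as long as the message, never reused), can be sketched in a few lines. This is a toy illustration of the scheme, not production cryptography:

```python
import secrets

def one_time_pad_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt by XOR with a fresh, truly random key exactly as long as
    the message -- the construction Shannon proved perfectly secret."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def one_time_pad_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so decryption repeats the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

msg = b"ATTACK AT DAWN"
ct, key = one_time_pad_encrypt(msg)
assert one_time_pad_decrypt(ct, key) == msg
```

Reusing the key even once destroys the secrecy guarantee, which is why the pad must be "one-time."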

Norbert Wiener Issues "Cybernetics", the First Widely Distributed Book on Electronic Computing 1948

"Use the word 'cybernetics', Norbert, because nobody knows what it means. This will always put you at an advantage in arguments."

— Widely quoted: attributed to Claude Shannon in a letter to Norbert Wiener in the 1940s.

In 1948 mathematician Norbert Wiener at MIT published Cybernetics or Control and Communication in the Animal and the Machine, a widely circulated and influential book that applied theories of information and communication to both biological systems and machines. Computer-related words with the “cyber” prefix, including "cyberspace," originate from Wiener’s book. Cybernetics was also the first conventionally published book to discuss electronic digital computing. Writing as a mathematician rather than an engineer, Wiener kept his discussion theoretical rather than specific. Strangely, the first edition of the book was published in English in Paris at the press of Hermann et Cie. The first American edition was printed offset from the French sheets and issued by John Wiley in New York, also in 1948. I have never seen an edition printed or published in England.

Independently of Claude Shannon, Wiener conceived of communications engineering as a branch of statistical physics and applied this viewpoint to the concept of information. Wiener's chapter on "Time series, information, and communication" contained the first publication of Wiener's formula describing the probability density of continuous information. This was remarkably close to Shannon's formula dealing with discrete time published in A Mathematical Theory of Communication (1948). Cybernetics also contained a chapter on "Computing machines and the nervous system." This was a theoretical discussion, influenced by McCulloch and Pitts, of differences and similarities between information processing in the electronic computer and the human brain. It contained a discussion of the difference between human memory and the different computer memories then available. Tacked on at the end of Cybernetics were speculations by Wiener about building a chess-playing computer, predating Shannon's first paper on the topic.

Cybernetics is a peculiar, rambling blend of popular and highly technical writing, ranging from history to philosophy, to mathematics, to information and communication theory, to computer science, and to biology. Reflecting the amazingly wide range of the author's interests, it represented an interdisciplinary approach to information systems both in biology and machines. It influenced a generation of scientists working in a wide range of disciplines. In it were the roots of various elements of computer science, which by the mid-1950s had broken off from cybernetics to form their own specialties. Among these separate disciplines were information theory, computer learning, and artificial intelligence.

It is probable that Wiley had Hermann et Cie supervise the typesetting because they specialized in books on mathematics.  Hermann printed the first edition by letterpress; the American edition was printed offset from the French sheets. Perhaps because the typesetting was done in France Wiener did not have the opportunity to read proofs carefully, as the first edition contained many typographical errors which were repeated in the American edition, and which remained uncorrected through the various printings of the American edition until a second edition was finally published by John Wiley and MIT Press in 1961. 

Though the book contained a great deal of technical mathematics and was not written for a popular audience, the first American edition went through at least five printings during 1948, and several later printings, most of which were probably not read in their entirety by purchasers. Sales of Wiener's book were helped by reviews in wide-circulation magazines, such as the review in TIME Magazine on December 27, 1948, entitled "In Man's Image." The reviewer used the word calculator to describe the machines; at this time the word computer was reserved for humans.

"Some modern calculators 'remember' by means of electrical impulses circulating for long periods around closed circuits. One kind of human memory is believed to depend on a similar system: groups of neurons connected in rings. The memory impulses go round & round and are called upon when needed. Some calculators use 'scanning' as in television. So does the brain. In place of the beam of electrons which scans a television tube, many physiologists believe, the brain has 'alpha waves': electrical surges, ten per second, which question the circulating memories.

"By copying the human brain, says Professor Wiener, man is learning how to build better calculating machines. And the more he learns about calculators, the better he understands the brain. The cyberneticists are like explorers pushing into a new country and finding that nature, by constructing the human brain, pioneered there before them.

"Psychotic Calculators. If calculators are like human brains, do they ever go insane? Indeed they do, says Professor Wiener. Certain forms of insanity in the brain are believed to be caused by circulating memories which have got out of hand. Memory impulses (of worry or fear) go round & round, refusing to be suppressed. They invade other neuron circuits and eventually occupy so much nerve tissue that the brain, absorbed in its worry, can think of nothing else.

"The more complicated calculating machines, says Professor Wiener, do this too. An electrical impulse, instead of going to its proper destination and quieting down dutifully, starts circulating lawlessly. It invades distant parts of the mechanism and sets the whole mass of electronic neurons moving in wild oscillations" (http://www.time.com/time/magazine/article/0,9171,886484-2,00.html, accessed 03-05-2009).

Presumably the commercial success of Cybernetics encouraged Wiley to publish Berkeley's Giant Brains, or Machines that Think in 1949.

♦ In October 2012 I offered for sale the copy of the first American printing of Cybernetics that Wiener inscribed to Jerry Wiesner, the head of the laboratory at MIT where Wiener conducted his research. This was the first inscribed copy of the first edition (either the French or American first) that I had ever seen on the market, though the occasional signed copy of the American edition did turn up. Having read our catalogue description of that item, my colleague Arthur Freeman emailed me this story pertinent to Wiener's habit of not inscribing books:

"Norbert, whom I grew up nearby (he visited our converted barn in Belmont, Mass., constantly to play frantic theoretical blackboard math with my father, an economist/statistician at MIT, which my mother, herself a bit better at pure math, would have to explain to him later), was a notorious cheapskate. His wife once persuaded him to invite some colleagues out for a beer at the Oxford Grill in Harvard Square, which he did, and after a fifteen-minute sipping session, he got up to go, and solemnly collected one dime each from each of his guests. So when *Cybernetics* appeared on the shelves of the Harvard Coop Bookstore, my father was surprised and flattered that Norbert wanted him to have an inscribed copy, and together they went to Coop, where Norbert duly picked one out, wrote in it, and carried it to the check-out counter--where he ceremoniously handed it over to my father to pay for. This was a great topic of family folklore. I wonder if Jerry Wiesner paid for his copy too?"

Shannon's "A Mathematical Theory of Communication" July – October 1948

During July and October 1948 Claude Shannon of Bell Labs published "A Mathematical Theory of Communication" in the Bell System Technical Journal. The theory determined how much information could be sent per unit of time in a system with a given, limited amount of transmission power. In this work Shannon also introduced the term "bit" into the literature, and provided its current meaning in the context of information. Shannon attributed the origin of this word usage to John W. Tukey, who had written a Bell Labs memo on January 9, 1947 in which Tukey contracted "binary digit" to simply "bit".
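The paper's central quantity, measured in Shannon's new bits, is the entropy of an information source: the average number of bits needed per symbol, given the symbol probabilities. A minimal sketch of the formula H = −Σ p log₂ p (the function name is mine):

```python
from math import log2

def entropy_bits(probabilities: list[float]) -> float:
    """Shannon's H = -sum(p * log2(p)): the average number of bits per
    symbol needed to encode a source with the given probabilities."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit per toss...
assert entropy_bits([0.5, 0.5]) == 1.0
# ...while a biased coin carries less, which is why its output compresses.
assert entropy_bits([0.9, 0.1]) < 1.0
# Four equally likely symbols need two bits each.
assert entropy_bits([0.25, 0.25, 0.25, 0.25]) == 2.0
```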

The Nyquist-Shannon Sampling Theorem 1949

In 1949 American mathematician, electrical engineer, and cryptographer Claude Shannon published "Communication in the Presence of Noise," Proceedings of the IRE 37 (1949) 10-21.

"The sampling theorem was implied by the work of Harry Nyquist in 1928 ('Certain topics in telegraph transmission theory'), in which he showed that up to 2B independent pulse samples could be sent through a system of bandwidth B; but he did not explicitly consider the problem of sampling and reconstruction of continuous signals. About the same time, Karl Küpfmüller showed a similar result, and discussed the sinc-function impulse response of a band-limiting filter, via its integral, the step response Integralsinus; this bandlimiting and reconstruction filter that is so central to the sampling theorem is sometimes referred to as a Küpfmüller filter (but seldom so in English).

"The sampling theorem, essentially a dual of Nyquist's result, was proved by Claude E. Shannon in 1949 ('Communication in the presence of noise'). V. A. Kotelnikov published similar results in 1933 ('On the transmission capacity of the 'ether' and of cables in electrical communications', translation from the Russian), as did the mathematician E. T. Whittaker in 1915 ('Expansions of the Interpolation-Theory', 'Theorie der Kardinalfunktionen'), J. M. Whittaker in 1935 ('Interpolatory function theory'), and Gabor in 1946 ('Theory of communication')" (Wikipedia article on Nyquist-Shannon Sampling Theorem, accessed 01-04-2010).

1950 – 1960

Chomsky's Hierarchy of Syntactic Forms September 1956

In September 1956 American linguist, philosopher, cognitive scientist, and activist Noam Chomsky published "Three Models for the Description of Language" in IRE Transactions on Information Theory IT-2, 113-24. In the paper Chomsky introduced two key concepts, the first being “Chomsky’s hierarchy” of syntactic forms, which was widely applied in the construction of artificial computer languages.

“The Chomsky hierarchy places regular (or linear) languages as a subset of the context-free languages, which in turn are embedded within the set of context-sensitive languages also finally residing in the set of unrestricted or recursively enumerable languages. By defining syntax as the set of rules that define the spatial relationships between the symbols of a language, various levels of language can be also described as one-dimensional (regular or linear), two-dimensional (context-free), three-dimensional (context sensitive) and multi-dimensional (unrestricted) relationships. From these beginnings, Chomsky might well be described as the ‘father of formal languages’ ” (Lee, Computer Pioneers [1995] 164). 

The second concept Chomsky presented here was his transformational-generative grammar theory, which attempted to define rules that can generate the infinite number of grammatical (well-formed) sentences possible in a language, and sought to identify rules (transformations) that govern relations between parts of a sentence, on the assumption that beneath such aspects as word order a fundamental deep structure exists. As Chomsky expressed it in his abstract of the present paper,

"We investigate several conceptions of linguistic structure to determine whether or not they can provide simple and “revealing” grammars that generate all of the sentences of English and only these. We find that no finite-state Markov process [a random process whose future probabilities are determined by its most recent values] that produces symbols with transition from state to state can serve as an English grammar. We formalize the notion of “phrase structure” and show that this gives us a method for describing language which is essentially more powerful. We study the properties of a set of grammatical transformations, showing that the grammar of English is materially simplified if phrase-structure is limited to a kernel of simple sentences from which all other sentences are constructed by repeated transformation, and that this view of linguistic structure gives a certain insight into the use and understanding of language" (p. 113).

Minsky, "A Selected Descriptor-Indexed Bibliography to the Literature on Artificial Intelligence" in Feigenbaum & Feldman eds., Computers and Thought (1963) 453-523, no. 484. Hook & Norman, Origins of Cyberspace (2002) no. 531.
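The separation between the hierarchy's levels can be made concrete with the classic language {aⁿbⁿ}: a single counter, the simplest form of pushdown store, recognizes it, while Chomsky's argument shows that no finite-state device can check the unbounded matching it requires. A small sketch (the empty string counts as the n = 0 case):

```python
def accepts_anbn(s: str) -> bool:
    """Recognize the context-free language {a^n b^n}. One counter (a
    trivial pushdown stack) suffices; a finite-state machine, which can
    only remember boundedly many states, cannot match unbounded n."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # an 'a' after a 'b' breaks the a...b shape
                return False
            count += 1
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:       # more b's than a's so far
                return False
        else:
            return False
    return count == 0 and (seen_b or s == "")

assert accepts_anbn("aaabbb")
assert not accepts_anbn("aabbb")
assert not accepts_anbn("abab")
```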

1960 – 1970

Leonard Kleinrock Writes the First Paper on Data Networking Theory May 31, 1961

On May 31, 1961 Leonard Kleinrock submitted his MIT thesis proposal, Information Flow in Large Communication Nets. Kleinrock's thesis proposal was the first paper on what later came to be known as data communications, or data networking theory.

Mathematical Theory of Data Communications 1964

In 1964 American computer scientist Leonard Kleinrock published his 1962 PhD thesis in book form as Communication Nets: Stochastic Message Flow and Delay, providing a mathematical theory of data communications. (See Reading 13.4.)
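The flavor of "stochastic message flow and delay" can be suggested by the simplest case such analysis builds on: a single M/M/1 queue, where the average time a message spends in the system is T = 1/(μ − λ) for arrival rate λ and service rate μ. This is a textbook instance, not Kleinrock's full network model:

```python
def mm1_delay(arrival_rate: float, service_rate: float) -> float:
    """Average time a message spends in an M/M/1 queue (waiting plus
    service): T = 1 / (mu - lambda). The queue is only stable when
    messages arrive more slowly than they can be served."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

# Delay grows sharply as a link approaches saturation.
assert mm1_delay(5.0, 10.0) == 0.2
assert mm1_delay(9.0, 10.0) == 1.0
```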

Solomonoff Begins Algorithmic Information Theory March – June 1964

In March and June 1964 American mathematician and researcher in artificial intelligence Ray Solomonoff published "A Formal Theory of Inductive Inference, Part I," Information and Control 7, No. 1, 1-22, and "A Formal Theory of Inductive Inference, Part II," Information and Control 7, No. 2, 224-254. This two-part paper is considered the beginning of algorithmic information theory.

Solomonoff first described his results at a conference at Caltech in 1960, and in a report of February 1960, "A Preliminary Report on a General Theory of Inductive Inference."
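Algorithmic information theory measures a string by the length of its shortest description, a quantity that is uncomputable; in practice one bounds it from above with a real compressor. A rough sketch of that idea (the linear congruential generator below is just an illustrative stand-in for hard-to-compress data):

```python
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length of a compressed encoding of the data: an upper bound on its
    (uncomputable) algorithmic complexity."""
    return len(zlib.compress(data, 9))

# A highly regular string has a short description...
regular = b"ab" * 500

# ...while pseudo-random bytes from a simple linear congruential
# generator resist compression.
state, noisy = 1, bytearray()
for _ in range(1000):
    state = (state * 1103515245 + 12345) % 2**31
    noisy.append(state >> 23)

assert complexity_upper_bound(regular) < complexity_upper_bound(bytes(noisy))
```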

1980 – 1990

One of the First Practical Data Compression Systems 1983

In 1983 Oscar Bonello, an acoustical engineer at the University of Buenos Aires and founder of Solydyne, began developing one of the first practical data compression systems, in this case for audio compression in broadcast automation. His starting point was the psychoacoustic principle of the masking of critical bands, first published by the German acoustics scientist Eberhard Zwicker in 1967 in his Das Ohr als Nachrichtenempfänger (The Ear as Message Receiver). From that base, Bonello started developing a practical application based on the recently developed IBM PC. The problems he faced were: 1) create an audio PC card of good audio quality; 2) create a bit compression algorithm; 3) create the automation software to be run on the PC.
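The masking principle can be caricatured in a few lines: compute a spectrum, find the loudest component, and drop everything far enough below it to be inaudible, so that only the surviving coefficients need to be coded. This toy sketch is not the Audicom algorithm, just the underlying idea; the threshold is illustrative:

```python
import cmath, math

def dft(x: list) -> list:
    """Naive discrete Fourier transform (O(N^2), fine for a toy example)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def mask_coefficients(signal: list, threshold_db: float = -30.0) -> list:
    """Toy version of the masking idea behind perceptual coders: spectral
    components far below the strongest one are taken to be inaudible and
    are zeroed, so they need not be transmitted."""
    spectrum = dft(signal)
    loudest = max(abs(c) for c in spectrum)
    floor = loudest * 10 ** (threshold_db / 20)
    return [c if abs(c) >= floor else 0 for c in spectrum]

# A loud 4-cycle tone plus a faint 13-cycle tone in a 64-sample block:
n = 64
signal = [math.sin(2 * math.pi * 4 * t / n) + 0.001 * math.sin(2 * math.pi * 13 * t / n)
          for t in range(n)]
kept = sum(1 for c in mask_coefficients(signal) if c != 0)
assert kept == 2  # only the loud tone's two conjugate bins survive
```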

Bonello's broadcast automation system was launched in 1987 under the name Audicom. By 2013 the psychoacoustic masking principle on which Bonello's system was based was being used in virtually all lossy audio bit compression systems, including MP3, and many radio stations were using similar technology manufactured by a number of companies.

2012 – 2016

The First Teleportation from One Macroscopic Object to Another November 8, 2012

Xiao-Hui Bao and colleagues at the University of Science and Technology of China in Hefei teleported quantum information from one ensemble of atoms to another 150 meters away, a demonstration seen as a significant milestone towards quantum routers and a quantum Internet.

"Quantum teleportation and quantum memory are two crucial elements for large-scale quantum networks. With the help of prior distributed entanglement as a “quantum channel,” quantum teleportation provides an intriguing means to faithfully transfer quantum states among distant locations without actual transmission of the physical carriers [Bennett CH, et al. (1993) Phys Rev Lett 70(13):1895–1899]. Quantum memory enables controlled storage and retrieval of fast-flying photonic quantum bits with stationary matter systems, which is essential to achieve the scalability required for large-scale quantum networks. Combining these two capabilities, here we realize quantum teleportation between two remote atomic-ensemble quantum memory nodes, each composed of ∼10^8 rubidium atoms and connected by a 150-m optical fiber. The spin wave state of one atomic ensemble is mapped to a propagating photon and subjected to Bell state measurements with another single photon that is entangled with the spin wave state of the other ensemble. Two-photon detection events herald the success of teleportation with an average fidelity of 88(7)%. Besides its fundamental interest as a teleportation between two remote macroscopic objects, our technique may be useful for quantum information transfer between different nodes in quantum networks and distributed quantum computing" (Xiao-Hui Bao, Xiao-Fan Xu, Che-Ming Li, Zhen-Sheng Yuan, Chao-Yang Lu, and Jian-Wei Pan, "Quantum teleportation between remote atomic-ensemble quantum memories," Proc. Natl. Acad. Sci. USA, doi: 10.1073/pnas.1207329109).
