4406 entries. 94 themes. Last updated December 26, 2016.

1960 to 1970 Timeline


John Horty Pioneers Computer-Assisted Legal Research 1960

In 1960 John Horty at the Health Law Center, University of Pittsburgh, pioneered computer-assisted legal research by having the texts of relevant statutes keyed into punched cards and then transferred to computer tapes where they could be searched and retrieved by “key words in combination” (KWIC).
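Horty's "key words in combination" retrieval can be sketched as a Boolean intersection search over an inverted index. The sketch below is illustrative only, assuming invented statute texts; it is not a reconstruction of the Health Law Center's actual system.

```python
# Minimal sketch of "key words in combination" (KWIC) retrieval:
# index each statute's text by its words, then retrieve the statutes
# containing every word in a query combination.

def build_index(statutes):
    """Map each word to the set of statute ids containing it."""
    index = {}
    for sid, text in statutes.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(sid)
    return index

def search(index, *keywords):
    """Return ids of statutes containing all the given keywords."""
    sets = [index.get(k.lower(), set()) for k in keywords]
    return set.intersection(*sets) if sets else set()

statutes = {  # invented examples
    "s1": "No person shall operate a vehicle without a license",
    "s2": "A license is required to sell alcoholic beverages",
    "s3": "Every vehicle must display a registration plate",
}
index = build_index(statutes)
print(search(index, "vehicle", "license"))  # {'s1'}
```

Combining keywords narrows the result set, which is what made searching statute text by machine practical compared with reading serially.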


The First Commercially Available General Purpose Computer with Transistor Logic 1960

In 1960 IBM introduced a transistorized version of its vacuum-tube-logic 709 computer, the 7090. The 7090 was the first commercially available general purpose computer with transistor logic. It became the most popular large computer of the early 1960s.


The Monotype Monomatic Hot Type Machine is Introduced 1960

In 1960 the Lanston Monotype Machine Company of Washington, D.C. introduced the Monomatic composing machine, a typesetting system perpetuating the concept of a separate keyboard and caster interfaced by a 31-channel punched paper tape.

“The keyboard consisted of a two-alphabet layout (instead of the customary five or seven) augmented by four shift keys. In the caster, the matrix-case contained 324 characters arranged in 18 × 18 rows. There were no restrictions on unit values within the rows.”

This was, presumably, the final evolution of the Monotype hot metal typesetting system. 


William Fetter Coins the Term "Computer Graphics" 1960

William A. Fetter, while working for Boeing, made the first computer model of the human body ("Boeing Man") and coined the term computer graphics.

In 1960 William A. Fetter, an art director at The Boeing Company in Seattle, Washington, coined the term “computer graphics.” With Walter Bernhardt, assistant professor of applied mechanics from Wichita State University, Kansas, Fetter outlined a new concept of perspective which Bernhardt converted to mathematics. The same year Boeing established a formal research program to determine how computing technology could be used for design.

See also the entry on "Boeing Man."

(This entry was last revised on 10-18-2014.)


6000 Computers are Operational in the U.S., Out of 10,000 Worldwide 1960

In 1960 about six thousand computers were operational in the United States, and perhaps ten thousand were operational worldwide.


Greatbatch, Chardack & Gage Implant the First Self-Contained Internally Powered Artificial Pacemaker in a Human 1960 – December 9, 2013

In 1960 American electrical engineer Wilson Greatbatch and American physicians William Chardack and Andrew Gage of the University at Buffalo reported the first successful long-term implantation in a human patient of a self-contained, internally powered artificial pacemaker in their paper entitled A Transistorized, Self-contained, Implantable Pacemaker for the Long-term Correction of Complete Heart Block (U.S. patent no. 3,057,356).


Fifty-three years later, on December 9, 2013, doctors in Austria performed the first experimental implantation of what was then the smallest pacemaker in the world. The device, produced by Medtronic, was 24 millimeters long and 0.75 cubic centimeters in volume—a tenth the size of a conventional pacemaker. The Micra Transcatheter Pacing System (TPS) was delivered to the inside of the heart via the femoral vein, where it grabbed onto endocardial tissue and provided pacing signals through its electrode tip.


John McCarthy Introduces LISP, The Language of Choice for Artificial Intelligence 1960

In 1960 artificial intelligence pioneer John McCarthy, then at MIT (he later founded Stanford University's AI laboratory), introduced LISP (LISt Processor), the language of choice for artificial intelligence (AI) programming.

(This entry was last revised on 03-21-2014.)


PLATO 1: The First Electronic Learning System 1960

In 1960 PLATO I (Programmed Logic for Automatic Teaching Operations), the first electronic learning system, developed by Donald Bitzer, operated on the ILLIAC 1 at the University of Illinois at Urbana-Champaign. PLATO I used a television set as a display and a special keyset to navigate the system's menus, and it served a single user at a time. In 1961 PLATO II allowed two students to operate the system at one time.


"Prater-Wei," The First Software Patent 1960 – November 20, 1968

Widely considered the first software patent, "Prater-Wei" concerned calculating temperatures for petroleum fractionation. The patent, originally filed by Mobil Oil Corporation in 1960, addressed computerized spectrographic analysis. It contained many method and apparatus claims that could be performed either on an analog or digital computer, or with pencil and paper. At the time software was not patentable, so the authors described a non-computer method of choosing the temperatures, using matrix inversion. The description in the patent application, however, used linear algebra notation similar to that of textbooks published late in the 19th century, disguising the more obvious matrix notation that was invented much later. (Adapted from Henry Gladney, Digital Document Quarterly 4.2 and 7.3, accessed 01-01-2009.)
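The kind of computation at issue—solving a system of linear equations by matrix inversion—can be shown in a few lines. The coefficients below are invented for illustration and have nothing to do with the actual patent's data.

```python
# Illustrative only: solving a small linear system A x = b by matrix
# inversion, i.e. x = A^-1 b, the style of calculation the patent's
# claims could perform on a computer or with pencil and paper.

def invert_2x2(a, b, c, d):
    """Inverse of [[a, b], [c, d]] via the adjugate formula."""
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_vec(m, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

A = [[2.0, 1.0], [1.0, 3.0]]   # invented coefficients
b = [5.0, 10.0]
inv = invert_2x2(A[0][0], A[0][1], A[1][0], A[1][1])
x = mat_vec(inv, b)
print(x)  # [1.0, 3.0]
```

Since matrix inversion reduces to arithmetic like this, the claimed method genuinely could be carried out "with pencil and paper," which was central to the legal question.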

"A Court of Customs and Patent Appeals (CCPA) decision is famous because the question "whether computer programs could contain patentable subject matter" was also before the CCPA.  See Application of Charles D. Prater and James Wei, U.S. CCPA, 415 F.2d 1378, November 20, 1968." (Henry Gladney, Digital Document Quarterly 7,3, accessed 01-01-2009).


Theodore Maiman Invents the First Working Laser 1960

In 1960 American physicist Theodore Maiman, head of the Quantum Electronics Section at Hughes Aircraft Company in Malibu, California, created the first working laser.  

"Maiman initially sent a description of his device to Physical Review Letters. But it was rejected because so many manuscripts on masers had been submitted to the journal that its editors made the unusual decision to accept no more papers in the field. So Maiman sent it to Nature, where is now famous paper, "Stimulated optical radiation in ruby", appeared on 6 August 1960 (T. H. Maiman Nature 187, 493-94; 1960). It was very brief, and I have previously commented that this article was probably more important per word than any of the papers published by Nature over the past century" (Charles H. Townes, "Obituary Theodore H. Maiman [1927-2007]. Maker of the first laser," Nature Vol. 447, June 7, 2007, p. 654).

"When lasers were invented in 1960, they were called 'a solution looking for a problem'. Since then, they have become ubiquitous, finding utility in thousands of highly varied applications in every section of modern society, including consumer electronics, information technology, science, medicine, industry, law enforcement, entertainment, and the military.

"The first use of lasers in the daily lives of the general population was the supermarket barcode scanner, introduced in 1974. The laserdisc player, introduced in 1978, was the first successful consumer product to include a laser but the compact disc player was the first laser-equipped device to become common, beginning in 1982 followed shortly by laser printers. Some other uses are:

"Medicine: Bloodless surgery, laser healing, surgical treatment, kidney stone treatment, eye treatment, dentistry

"Industry: Cutting, welding, material heat treatment, marking parts, non-contact measurement of parts

"Military: Marking targets, guiding munitions, missile defence, electro-optical countermeasures (EOCM), alternative to radar, blinding troops.

"Law enforcement: used for latent fingerprint detection in the forensic identification field

"Research: Spectroscopy, laser ablation, laser annealing, laser scattering, laser interferometry, LIDAR, laser capture microdissection, fluorescence microscopy

"Product development/commercial: laser printers, optical discs (e.g. CDs and the like), barcode scanners, thermometers, laser pointers, holograms, bubblegrams. Laser lighting displays: Laser light shows

"Cosmetic skin treatments: acne treatment, cellulite and striae reduction, and hair removal" (Wikipedia article on laser, accessed 11-04-2012).

Maiman published a detailed account of his research as The Laser Odyssey (Blaine, WA: The Laser Press, 2000).


The Johns Hopkins Beast Circa 1960

Built during the 1960s at the Applied Physics Laboratory at Johns Hopkins University, the Johns Hopkins Beast was a mobile automaton. The machine had a rudimentary intelligence and the ability to survive on its own. 

"Controlled by dozens of transistors, the Johns Hopkins University Applied Physics Lab's "Beast" wandered white hallways, centering by sonar, until its batteries ran low. Then it would seek black wall outlets with special photocell optics, and plug itself in by feel with its special recharging arm. After feeding, it would resume patrolling. Much more complex than Elsie, the Beast's deliberate coordinated actions can be compared to the bacteria hunting behaviors of large nucleated cells like paramecia or amoebae."

"The robot was cybernetic. It did not use a computer. Its control circuitry consisted of dozens of transistors controlling analog voltages. It used photocell optics and sonar to navigate. The 2N404 transistors were used to create NOR logic gates that implemented the Boolean logic to tell it what to do when a specific sensor was activated. The 2N404 transistors were also used to create timing gates to tell it how long to do something. 2N1040 Power transistors were used to control the power to the motion treads, the boom, and the charging mechanism" Wikipedia article on Johns Hopkins Beast, accessed 11-13-2013).


The First and Only Use of Smell-O-Vision 1960

The 1960 mystery film, Scent of Mystery, starring Denholm Elliott, Peter Lorre and Elizabeth Taylor, was the only film to feature Smell-O-Vision, a system that timed odors to points in the film's plot. It was the first film in which aromas were integral to the story, providing important details to the audience. It was produced by Mike Todd, Jr., the stepson of Elizabeth Taylor. In 2014, when I wrote this entry, Smell-O-Vision was considered an early, kitschy step in the direction of virtual reality.

"The film opened in three specially equipped theaters in February, 1960, in New York City, Los Angeles, and Chicago. Unfortunately, the mechanism did not work properly. According to Variety, aromas were released with a distracting hissing noise and audience members in the balcony complained that the scents reached them several seconds after the action was shown on the screen. In other parts of the theater, the odors were too faint, causing audience members to sniff loudly in an attempt to catch the scent.

"Technical adjustments by the manufacturers of Smell-O-Vision solved these problems, but by then it was too late. Negative reviews, in conjunction with word of mouth, caused the film to fail miserably. Comedian Henny Youngman quipped, "I didn't understand the picture. I had a cold." Todd did not produce another film until 1979's The Bell Jar, which was also his last film.

"The film was eventually retitled as Holiday in Spain and re-released, sans odors. However, as The Daily Telegraph described it, "the film acquired a baffling, almost surreal quality, since there was no reason why, for example, a loaf of bread should be lifted from the oven and thrust into the camera for what seemed to be an unconscionably long time."

"Scent of Mystery was aired once on television by MTV in the 1980s, in conjunction with a convenience store promotion that offered scratch and sniff cards that viewers were to use to recreate the theater experience" (Wikipedia article on Scent of Mystery, accessed 04-03-2014.)


The National Library of Medicine Introduces Medical Subject Headings (MeSH) 1960

In 1960 the National Library of Medicine introduced Medical Subject Headings (MeSH), a comprehensive controlled vocabulary for the purpose of indexing journal articles and books in the life sciences. MeSH serves as a thesaurus that facilitates searching; it is used by the MEDLINE/PubMed article database and by NLM's catalog of book holdings. 
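The value of a controlled vocabulary like MeSH is that its term hierarchy lets a search on a broad heading also retrieve material indexed under narrower terms. A minimal sketch of that "explosion" of a heading, using an invented toy hierarchy rather than actual MeSH data:

```python
# Sketch of controlled-vocabulary expansion in the spirit of MeSH:
# searching a broad heading also matches its narrower descendants.
# The tiny tree below is invented for illustration.

THESAURUS = {
    "Cardiovascular Diseases": ["Heart Diseases"],
    "Heart Diseases": ["Heart Block", "Myocardial Infarction"],
}

def expand(term):
    """Return the term plus all narrower terms beneath it."""
    result = [term]
    for narrower in THESAURUS.get(term, []):
        result.extend(expand(narrower))
    return result

print(expand("Cardiovascular Diseases"))
# ['Cardiovascular Diseases', 'Heart Diseases', 'Heart Block',
#  'Myocardial Infarction']
```

Indexers assign the most specific heading available; searchers then rely on this expansion to gather everything under a broader concept.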


Licklider Describes "Man-Computer Symbiosis" March 1960

In March 1960 computer scientist J. C. R. Licklider of Bolt, Beranek and Newman published "Man-Computer Symbiosis," IRE Transactions on Human Factors in Electronics, volume HFE-1 (March 1960) 4-11, postulating that the computer should become an intimate symbiotic partner in human activity, including communication. (See Reading 10.5.)


The TIROS 1 Satellite Transmits the First Television Picture from Space April 1, 1960

On April 1, 1960 the first Television InfraRed Observation Satellite (TIROS 1), the first successful low-Earth orbital weather satellite, was launched by NASA from Cape Canaveral, Florida. That day the satellite transmitted the first television picture of the earth from space.


The First Symposium on Bionics September 13 – September 15, 1960

From September 13-15, 1960 the first symposium on bionics (biological electronics) took place at Wright-Patterson Air Force Base in Ohio. (See Reading 11.7.)


The Kennedy-Nixon Presidential Debates: The Beginning of Television's Dominance over Print as a Popular News Medium September 26, 1960

In a Chicago television studio on September 26, 1960 Senator John F. Kennedy and Vice President Richard Nixon stood before an audience of 70 million Americans—two-thirds of the nation's adult population—in the first nationally televised Presidential debate. This first of four debates held before the end of October gave a vast national audience the opportunity to see and compare the two candidates, and ushered in a new age of Presidential politics. It also proved the influence of television on social and political events, and may be considered the beginning of television's dominance over print as a popular news medium. 

The four Kennedy-Nixon television debates were a key turning point in the 1960 presidential campaign.

"Nixon insisted on campaigning until just a few hours before the first debate started. He had not completely recovered from his hospital stay and thus looked pale, sickly, underweight, and tired. He also refused makeup for the first debate, and as a result his beard stubble showed prominently on the era's black-and-white TV screens. Nixon's poor appearance on television in the first debate is reflected by the fact that his mother called him immediately following the debate to ask if he was sick. Kennedy, by contrast, rested and prepared extensively beforehand, appearing tanned, confident, and relaxed during the debate. An estimated 70 million viewers watched the first debate. It is often claimed that people who watched the debate on television overwhelmingly believed Kennedy had won, while radio listeners (a smaller audience) believed Nixon had won. A study has found that the alleged viewer/listener disagreement is unsupported. . . .

"After the first debate, polls showed Kennedy moving from a slight deficit into a slight lead over Nixon. For the remaining three debates Nixon regained his lost weight, wore television makeup, and appeared more forceful than in his initial appearance.

"However, up to 20 million fewer viewers watched the three remaining debates than the first one. Political observers at the time believed that Kennedy won the first debate, Nixon won the second and third debates, and that the fourth debate, which was seen as the strongest performance by both men, was a draw.

"The third debate is notable because it brought about a change in the debate process. This debate was a monumental step for television. For the first time ever split screen technology was used to bring two people from opposite sides of the country together so they were able to converse in real time. Nixon was in Los Angeles while Kennedy was in New York. The men appear to be in the same room, thanks to identical sets. Both candidates had monitors in their respective studios containing the feed from the opposite studio so they could respond to questions. Bill Shadel moderated the debate from a third television studio in Chicago. The main topic of this debate was Quemoy and Matsu. It was a question of the US position over whether military force should be used to prevent buffer islands between China and Taiwan from falling under Chinese control" (Wikipedia article on United States presidential election, 1960, accessed 05-30-2014).


Technical Basis for the Development of Phreaking November 1960

In November 1960 C. Breen and D. A. Dahlbaum of Bell Labs in New York published "Signaling Systems for the Control of Telephone Switching," Bell System Technical Journal, 39 (1960) 1381-1444.

"Telephone signaling is basically a matter of transferring information between machines, and between humans and machines. The techniques developed to accomplish this have evolved over the years in step with advances in the total telephone art. The history of this evolution is traced, starting from the early simple manual switchboard days to the present Direct Distance Dialing era. The effect of the increasing sophistication in automatic switching and transmission systems and their influence on signaling principles are discussed. Emphasis is given to the signaling systems used between central offices of the nationwide telephone network and the influence on such systems of the characteristics of switching systems and their information requirements, the transmission media and the compatibility problem. A review is made of the forms and characteristics of some of the interoffice signaling systems presently in use. In addition, the problem of signaling between Bell System and overseas telephone systems is reviewed with reference to delivering information requirements, signaling techniques and new transmission media. Finally, some speculation is made on the future trends of telephone signaling systems" (abstract of the paper).

According to http://www.historyofphonephreaking.org/docs.php, the Breen and Dahlbaum paper is

"often cited as the article that gave away the keys to the kingdom," leading to the development of the underground "phreaker" culture. Other papers that included the in-band trunk signaling tones, which provided the technical information needed to build Blue Boxes, are cited at http://www.lospadres.info/thorg/bstj.html (accessed 09-17-2009).

My thanks to Jeffrey Odel for this reference.


"Colossal Typewriter" : One of the Earliest Computer Text Editors December 1960

Colossal Typewriter, a program written by John McCarthy and Roland Silver running on the DEC PDP-1 at Bolt Beranek and Newman in December 1960, was one of the earliest computer text editors. 


COBOL Allows Compatibility Between Computers Made by Different Manufacturers December 6 – December 7, 1960

On December 6 and 7, 1960 essentially the same COBOL program was run on two different makes of computer, an RCA computer and a Remington-Rand Univac computer, demonstrating for the first time that compatibility between computers produced by different manufacturers could be achieved.


Arthur C. Clarke Publishes "Dial F for Frankenstein," an Inspiration for Tim Berners-Lee 1961

In 1961 British science fiction writer, inventor and futurist Arthur C. Clarke of Sri Lanka published a short story entitled "Dial F for Frankenstein."

". . . it foretold an ever-more-interconnected telephone network that spontaneously acts like a newborn baby and leads to global chaos as it takes over financial, transportation and military systems" (John Markoff, "The Coming Superbrain," New York Times, May 24, 2009).

"The father of the internet, Sir Tim Berners-Lee, credits Clarke's short story, Dial F for Frankenstein, as an inspiration" (http://www.independent.co.uk/news/science/arthur-c-clarke-science-fiction-turns-to-fact-799519.html, accessed 05-24-2009).


The QUOTRON Computerized Stock-Quotation System Is Introduced 1961

In 1961 QUOTRON, a computerized stock-quotation system using a Control Data Corporation computer, was introduced.

Quotron became popular with stockbrokers, signaling the end of traditional ticker tape.


Crick & Brenner Propose The Genetic Code 1961

At Cambridge in 1961 Francis Crick, Sydney Brenner and colleagues proposed that DNA code is written in “words” called codons formed of three DNA bases. DNA sequence is built from four different bases, so a total of 64 (4 x 4 x 4) possible codons can be produced. They also proposed that a particular set of RNA molecules subsequently called transfer RNAs (tRNAs) act to “decode” the DNA.
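The combinatorics stated above can be checked directly: four bases taken three at a time yield 4 × 4 × 4 = 64 possible codons. A short enumeration (illustrative only):

```python
# Enumerate all triplet codons from the four DNA bases: 4**3 = 64.
from itertools import product

bases = "ACGT"
codons = ["".join(p) for p in product(bases, repeat=3)]

print(len(codons))   # 64
print(codons[:4])    # ['AAA', 'AAC', 'AAG', 'AAT']
```

Sixty-four codons for only twenty amino acids is why the code turned out to be degenerate, with several codons specifying the same amino acid.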

“There was an unfortunate thing at the Cold Spring Harbor Symposium that year. I said, ‘We call this messenger RNA’ Because Mercury was the messenger of the gods, you know. And Erwin Chargaff very quickly stood up in the audience and said he wished to point out that Mercury may have been the messenger of the gods, but he was also the god of thieves. Which said a lot for Chargaff at the time! But I don’t think that we stole anything from anybody— except from nature. I think it’s right to steal from nature, however” (Brenner, My Life, 85).

Francis Crick, L. Barnett, Sydney Brenner and R. J. Watts-Tobin, “General Nature of the Genetic Code for Proteins,” Nature 192 (1961): 1227-32.

J. Norman (ed) Morton's Medical Bibliography 5th ed (1991) no. 256.8.


Compugraphic Develops a Special-Purpose Typesetting Computer 1961

In 1961 engineers at Compugraphic in Brookline, Massachusetts recognized that a computer could be programmed to handle repetitious typesetter coding automatically. The firm developed a prototype model of the Directory Tape Processor (DTP) which eliminated all operator decisions, and produced a fully coded tape used for typesetting.


"Compatible Time Sharing System," Precursor of Word Processing and Email 1961

In 1961 Fernando J. Corbató and team at MIT developed one of the first time-sharing operating systems, CTSS (Compatible Time-Sharing System).

CTSS had one of the first computerized text formatting utilities, called RUNOFF, the precursor of word processing, and one of the first inter-user messaging implementations, presaging instant messaging and electronic mail.
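What a formatter like RUNOFF did at its core—filling words into lines no wider than a fixed measure—can be sketched in a few lines. This is illustrative only; the real RUNOFF was driven by embedded commands for breaks, paragraphs, and so on.

```python
# Minimal sketch of RUNOFF-style line filling: pack words greedily
# into lines no longer than a fixed width.

def fill(text, width=30):
    """Return the text's words packed into lines of at most `width`."""
    lines, line = [], ""
    for word in text.split():
        if line and len(line) + 1 + len(word) > width:
            lines.append(line)       # current line is full; start anew
            line = word
        else:
            line = word if not line else line + " " + word
    if line:
        lines.append(line)
    return lines

for l in fill("CTSS had one of the first computerized "
              "text formatting utilities, called RUNOFF."):
    print(l)
```

Separating the stored text from its printed layout in this way is the idea that word processing later built on.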


George Forsythe Coins the Term "Computer Science" 1961

In 1961 mathematician and founder of Stanford University's Computer Science department George E. Forsythe coined the term "computer science" in his paper "Engineering Students Must Learn both Computing and Mathematics", J. Eng. Educ. 52 (1961) 177-188, quotation from p. 177.

Of this Donald Knuth wrote, "In 1961 we find him using the term 'computer science' for the first time in his writing:

[Computers] are developing so rapidly that even computer scientists cannot keep up with them. It must be bewildering to most mathematicians and engineers...In spite of the diversity of the applications, the methods of attacking the difficult problems with computers show a great unity, and the name of Computer Sciences is being attached to the discipline as it emerges. It must be understood, however, that this is still a young field whose structure is still nebulous. The student will find a great many more problems than answers. 

"He [Forsythe] identified the "computer sciences" as the theory of programming, numerical analysis, data processing, and the design of computer systems, and observed that the latter three were better understood than the theory of programming, and more available in courses" (Knuth, "George Forsythe and the Development of Computer Science," Communications of the ACM, 15 (1972) 722).


"Expensive Typewriter": The First Word Processing Program 1961 – 1962

Expensive Typewriter, a text editing program written in 1961-62 for the DEC PDP-1 at MIT by Stephen D. Piner, has been called the first word processing program. It could drive an IBM Selectric typewriter, and was called "expensive" because the DEC PDP-1, the first minicomputer, then cost $100,000.  The name was also taken in the spirit of a 1960 computer text editor called Colossal Typewriter.

♦ In December 2013 a report on the program written by Stephen Piner and dated August 1, 1972 was available from the Computer History Museum at this link.


Brenner, Jacob & Meselson Demonstrate the Existence of Messenger RNA 1961

In 1961 South African molecular biologist Sydney Brenner working at the Cavendish Laboratory in Cambridge, French molecular biologist François Jacob at the Institut Pasteur in Paris, and American molecular biologist Matthew Meselson at Caltech in Pasadena showed that short-lived RNA molecules that they called messenger RNA (mRNA) carry the genetic instructions from DNA to structures in the cell called ribosomes. They also demonstrated that ribosomes are the site of protein synthesis.

Brenner, Jacob & Meselson, "An Unstable Intermediate Carrying Information from Genes to Ribosomes for Protein Synthesis," Nature 190 (1961) 576-80.

J. Norman (ed) Morton's Medical Bibliography 5th ed (1991) no. 256.10.

In January 2014 images of Sydney Brenner's original autograph manuscript for this paper, and typed drafts were available from the Cold Spring Harbor Laboratories CSHL Archives Repository at this link.


Jacob & Monod Explain the Basic Process of Regulating Gene Expression in Bacteria 1961

In 1961 French biologists François Jacob and Jacques Monod explained the basic process of regulating gene expression in bacteria, showing that enzyme expression levels in cells is a result of regulation of transcription of DNA sequences. Their experiments and ideas gave impetus to the emerging field of molecular developmental biology, and of transcriptional regulation in particular.

Jacob & Monod, "Genetic Regulatory Mechanisms in the Synthesis of Proteins," Journal of Molecular Biology 3 (1961) 318-56.


The IBM 7094 is The First Computer to Sing 1961

A recording made at Bell Labs in Murray Hill, New Jersey on an IBM 7094 mainframe computer in 1961 is the earliest known recording of a computer-synthesized voice singing a song— Daisy Bell, also known as "Bicycle Built for Two." The recording was programmed by physicist John L. Kelly Jr. and Carol Lochbaum, and featured musical accompaniment written by computer music pioneer Max Mathews.

The science fiction novelist Arthur C. Clarke witnessed a demonstration of the piece while visiting his friend, the electrical engineer and science fiction writer John R. Pierce, who was a Bell Labs employee at the time. Clarke was so impressed that he incorporated the 7094's musical performance in the 1968 novel, and the script for the 1968 film, 2001: A Space Odyssey. One of the first things that Clarke’s fictional HAL 9000 computer had learned when it was originally programmed was the song "Daisy Bell". Near the end of the story, when the computer was being deactivated, or put to sleep by astronaut Dave Bowman, it lost its mind and degenerated to singing "Daisy Bell."

(This entry was last revised on 03-21-2015.)


President Eisenhower Warns About the Increasing Influence of the Military-Industrial Complex January 17, 1961

On January 17, 1961 President Dwight D. Eisenhower delivered his Farewell Address. In this speech Eisenhower, former General of the Army, former Supreme Commander of the Allied Forces in Europe, and the first Supreme Commander of NATO, warned the nation about the increasing influence of the "military-industrial complex," the threats this influence could pose to our democratic system of government, and the need for political leaders to balance our military requirements with the maintenance of our democracy.

From the speech I quote:

"A vital element in keeping the peace is our military establishment. Our arms must be mighty, ready for instant action, so that no potential aggressor may be tempted to risk his own destruction.

"Our military organization today bears little relation to that known by any of my predecessors in peacetime, or indeed by the fighting men of World War II or Korea.

"Until the latest of our world conflicts, the United States had no armaments industry. American makers of plowshares could, with time and as required, make swords as well. But now we can no longer risk emergency improvisation of national defense; we have been compelled to create a permanent armaments industry of vast proportions. Added to this, three and a half million men and women are directly engaged in the defense establishment. We annually spend on military security more than the net income of all United States corporations.

"This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence -- economic, political, even spiritual -- is felt in every city, every State house, every office of the Federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society.

"In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the militaryindustrial complex. The potential for the disastrous rise of misplaced power exists and will persist.

"We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted. Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together.

"Akin to, and largely responsible for the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades.

"In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

"Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

"The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present--and is gravely to be regarded.

"Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientifictechnological elite.

"It is the task of statesmanship to mold, to balance, and to integrate these and other forces, new and old, within the principles of our democratic system -- ever aiming toward the supreme goals of our free society" (http://www.eisenhower.archives.gov/research/online_documents/farewell_address/1961_01_17_Press_Release.pdf, accessed 11-02-2013).


Yuri Gagarin Becomes the First Human to Travel into Space and the First to Orbit the Earth April 12, 1961

On April 12, 1961 Russian cosmonaut Yuri Gagarin, aboard the Vostok 3KA-3 (Vostok 1) spacecraft launched from Baikonur Cosmodrome Site No. 1, became both the first human to travel into space and the first to orbit the earth. Gagarin's spaceflight aboard the Vostok 1 consisted of a single orbit of the earth lasting 108 minutes. Gagarin ejected from the spacecraft at an altitude of 7 km (23,000 feet) and parachuted to earth separately from the spacecraft.

In his secret postflight report, Gagarin described the first human experience of spaceflight, and prolonged microgravity: 

"I ate and drank normally, I could eat and drink. I noticed no physiological difficulties. The feeling of weightlessness was somewhat unfamiliar compared with Earth conditions. Here, you feel as if you were hanging in a horizontal position in straps. You feel as if you are suspended. Obviously, the tightly fitted suspension system presses upon the thorax. . . . Later I got used to it and had no unpleasant sensations. I made entries into the logbook, reported, worked with the telegraph key. When I had meals, I also had water. I let the writing pad out of my hands and it floated together with the pencil in front of me. Then, when I had to write the next report, I took the pad, but the pencil wasn't where it had been. It had flown off somewhere. The eye was secured to the pencil with a screw, but obviously they should have used glue or secured the pencil more tightly. The screw got loose and flew away. I closed up the journal and put it in my pocket. It wouldn't be any good anyway, because I had nothing to write with" (quoted by Siddiqi, Challenge to Apollo: The Soviet Union and the Space Race: 1945-1974 (2000) 278).

A minor detail mentioned in this quote is that Gagarin communicated with earth by radio, using a telegraph key, rather than by voice. His call sign was Kedr (Siberian Pine, Russian: Кедр).

View Map + Bookmark Entry

Wesley Clark Builds the LINC, Perhaps the First Mini-Computer May 1961 – 1962

In May 1961 Wesley A. Clark, a computer scientist at MIT's Lincoln Laboratory, started building the LINC (Laboratory INstrument Computer). This machine, which some later called both the first mini-computer and a forerunner of the personal computer, was first used in 1962. It was table-top size and relatively "low cost" ($43,000), and had a keyboard and display, a file system, and an interactive operating system. Its design was placed in the public domain. Eventually fifty of the machines were sold by Digital Equipment Corporation.

View Map + Bookmark Entry

Leonard Kleinrock Writes the First Paper on Data Networking Theory May 31, 1961

On May 31, 1961 Leonard Kleinrock submitted his MIT thesis proposal, Information Flow in Large Communication Nets. Kleinrock's thesis proposal was the first paper on what later came to be known as data communications, or data networking theory.

View Map + Bookmark Entry

Texas Instruments Delivers the First Integrated Circuit Computer: An Achievement in Miniaturization October 19, 1961

On October 19, 1961 Texas Instruments delivered the first integrated circuit computer to the U.S. Air Force.

"The advanced experimental equipment has a total volume of only 6.3 cubic inches and weighs only 10 ounces. It provides the identical electrical functions of a computer using conventional components which is 150 times its size and 48 times its weight and which also was demonstrated for purposes of comparison. It uses 587 digital circuits (Solid Circuit™ semiconductor networks) each formed within a minute bar of silicon material. The larger computer uses 8500 conventional components and has a volume of 1000 cubic inches and weight of 480 ounces."

View Map + Bookmark Entry

Origins of the IBM System/360 December 28, 1961

On December 28, 1961 John W. Haanstra, Chairman, Bob O. Evans, Vice Chairman, and others at IBM issued as a confidential internal document Processor Products—Final Report of SPREAD Task Group.

In the period from 1952 through 1962, IBM produced seven families of systems—the 1401, 1620, 7030 (Stretch), 7040, 7070, 7080, and 7090 groups. They were incompatible with one another, and both users and IBM staff recognized the problems caused by this incompatibility. The SPREAD report, as adopted by IBM, led to the development of the IBM System/360 family of compatible computers and peripherals, and essentially reformed the company.

"IBM's public commitment to the SPREAD plan was embodied in the System/360, announced in Poughkeepsie on April 7, 1964. Six machines were announced: the 360 Model 30, 40, 50, 60, 62 and 70. Over the next few years, a number of additional systems were added to the 360 family.

"The SPREAD plan eventually allowed IBM to direct substantial resources toward the development of the full system—peripherals, programming, communications, and new applications. The success of System/360 is perhaps best measured by IBM's financial performance. In the six years from January 1, 1966 to December 31, 1971, IBM's gross income increased 2.3 times, from $3.6 billion to $8.3 billion, and net earnings after taxes increased 2.3 times, from $477 million to $1.1 billion. In 1982 direct descendants of System/360 accounted for more than half of IBM's gross income and earnings.

"Perhaps most important, the SPREAD Report permitted IBM to focus on an excellence not possible with multiple architectures. It resulted in powerful new peripherals, programming, terminals, high-volume applications, and complementary diversifications whose future can only be imagined" (Bob O. Evans, "Introduction to SPREAD Report," Annals of the History of Computing 5 [1983] 5).  The text of the report was reprinted in the same journal issue on pp. 6-26.

Nearly all copies of this confidential report were destroyed. An original copy, donated by one of the authors, Jerome Svigals, is preserved in the Computer History Museum, Mountain View, California.

View Map + Bookmark Entry

Spacewar, the First Computer Game for a Commercially Available Computer 1962

In 1962 programmer and computer scientist Steve Russell, aka Steve "Slug" Russell, and his team at MIT, including members of the Tech Model Railroad Club, took about 200 hours to program the first computer game for a commercially available computer on a DEC PDP-1.

Inspired by the space battles in the Lensman series of science fiction space opera by E. E. "Doc" Smith, the computer game, or videogame, was called Spacewar.

View Map + Bookmark Entry

Rachel Carson Issues "Silent Spring" 1962

In 1962 American biologist, writer, and ecologist Rachel Carson published Silent Spring in Boston through Houghton Mifflin. This very carefully documented book convincingly demonstrated the disastrous effects of DDT in the environment, and generated a storm of controversy. It was later credited with founding the "environmental movement" in the United States.

View Map + Bookmark Entry

McLuhan Issues "The Gutenberg Galaxy" 1962

In 1962 Canadian professor of English literature, literary critic, rhetorician, and communication theorist at the University of Toronto Marshall McLuhan published The Gutenberg Galaxy: The Making of Typographic Man, in which he divided history into four epochs: oral tribe culture, manuscript culture, the Gutenberg galaxy, and the electronic age.

McLuhan argued that a new communications medium was responsible for the break between each of the four periods. Writing before computing was pervasive in society, he was concerned with the influence of radio, television and film on print culture, and with the impact of media, independent of content, on thinking and social organization:

"The main concept of McLuhan's argument (later elaborated upon in The Medium is the Massage) is that new technologies (like alphabets, printing presses, and even speech itself) exert a gravitational effect on cognition, which in turn affects social organization: print technology changes our perceptual habits ('visual homogenizing of experience'), which in turn impacts social interactions ('fosters a mentality that gradually resists all but a. . . specialist outlook'). According to McLuhan, the advent of print technology contributed to and made possible most of the salient trends in the Modern period in the Western world: individualism, democracy, Protestantism, capitalism, and nationalism. For McLuhan, these trends all reverberate with print technology's principle of 'segmentation of actions and functions and principle of visual quantification.' "

View Map + Bookmark Entry

Computers Drive Linotype Hot Metal Typesetters 1962

In 1962 the Los Angeles Times newspaper drove Linotype hot metal typesetters with perforated tape created by RCA computers, greatly speeding up typesetting.

The key to this advance was development of a dictionary and a method to automate hyphenation and justification of text in columns. These tasks had taken 40 percent of a manual Linotype operator's time.
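The justification half of that task can be sketched in a few lines of modern code. This is an illustrative greedy algorithm, not the RCA system's actual method, and the 28-character column measure is arbitrary:

```python
# Greedy justification sketch: fill each line up to the column width,
# then pad the gaps between words with extra spaces so every line is
# flush on both margins (the last line stays ragged, as in print).

def pad(line: list[str], width: int) -> str:
    """Pad the gaps in one line so it is exactly `width` characters."""
    if len(line) == 1:
        return line[0].ljust(width)
    gaps = len(line) - 1
    extra = width - len("".join(line))
    # Distribute spaces as evenly as possible; leftmost gaps get the remainder.
    sizes = [extra // gaps + (1 if i < extra % gaps else 0) for i in range(gaps)]
    return "".join(w + " " * s for w, s in zip(line, sizes + [0]))

def justify(words: list[str], width: int) -> list[str]:
    """Break a word list into justified lines of the given width."""
    lines, line = [], []
    for w in words:
        if line and len(" ".join(line + [w])) > width:
            lines.append(pad(line, width))
            line = []
        line.append(w)
    lines.append(" ".join(line))  # last line is not justified
    return lines

for row in justify("the quick brown fox jumps over the lazy dog".split(), 28):
    print(repr(row))
```

Automating hyphenation, the other half of the problem, required the dictionary lookup mentioned above and is not attempted in this sketch.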

View Map + Bookmark Entry

Inforonics Develops One of the First Data Publishing and Retrieval Systems 1962

Inforonics, founded in 1962 by MIT graduate Larry Buckland in Littleton, Massachusetts, developed and maintained "one of the first data publishing and retrieval systems used by organizations such as the U.S. Library of Congress and the Boston Public Library."

View Map + Bookmark Entry

Nick Holonyak, Jr. Invents the First Visible LED 1962

In 1962, while working as a consulting scientist at General Electric Company in Syracuse, New York, Nick Holonyak Jr. invented the first visible light-emitting-diode (LED). 

View Map + Bookmark Entry

ICPSR, The Largest Archive of Digital Social Science Data, is Founded at the University of Michigan 1962

In 1962 ICPSR, the Inter-university Consortium for Political and Social Research, was founded at the University of Michigan, Ann Arbor. ICPSR became the world's largest archive of digital social science data, acquiring, preserving, and distributing original research data, and providing training in its analysis.

View Map + Bookmark Entry

Fritz Machlup Introduces the Concept of "The Information Economy" 1962

In 1962 Austrian-American economist Fritz Machlup of Princeton published The Production and Distribution of Knowledge in the United States.

In this book Machlup introduced the concept of the knowledge industry.

"He distinguished five sectors of the knowledge sector: education, research and development, mass media, information technologies, information services. Based on this categorization he calculated that in 1959 29 per cent of the GNP in the USA had been produced in knowledge industries" (Wikipedia article on Information Society, accessed 04-25-2011).

View Map + Bookmark Entry

Bell Labs Develops the First Digitally Multiplexed Transmission of Voice Signals 1962

"In 1962, Bell Labs developed the first digitally multiplexed transmission of voice signals. This innovation not only created a more economical, robust and flexible network design for voice traffic, but also laid the groundwork for today's advanced network services such as 911, 800-numbers, call-waiting and caller-ID. In addition, digital networking was the foundation for the convergence of computing and communications."
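The core idea of digital time-division multiplexing can be illustrated with a toy sketch. This is a modern illustration of the principle, not Bell Labs' actual T-carrier design; the three channels of small integers stand in for 8-bit voice samples:

```python
# Time-division multiplexing sketch: one sample from each voice channel
# is interleaved into each frame of a single digital stream, and the
# receiving end de-interleaves the stream back into channels.

def multiplex(channels: list[list[int]]) -> list[int]:
    """Interleave one sample from each channel per frame."""
    return [s for frame in zip(*channels) for s in frame]

def demultiplex(stream: list[int], n: int) -> list[list[int]]:
    """Recover n channels from the interleaved stream."""
    return [stream[i::n] for i in range(n)]

chans = [[1, 2, 3], [10, 20, 30], [100, 200, 300]]
line = multiplex(chans)  # -> [1, 10, 100, 2, 20, 200, 3, 30, 300]
assert demultiplex(line, 3) == chans
```

The real 1962 system carried 24 channels per frame; the economy came from sending all of them over one digital line instead of 24 analog pairs.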

View Map + Bookmark Entry

Roger Tomlinson Develops the First True Operational Geographic Information System (GIS) 1962

In 1962 English geographer Roger F. Tomlinson, then of Spartan Air Services, and IBM began the development of the Canada Geographic Information System (CGIS) for the Federal Department of Forestry and Rural Development in Ottawa, Ontario, Canada. This was the first true operational geographic information system (GIS). The system was used to store, analyze, and manipulate data collected for the Canada Land Inventory – to determine the land capability for rural Canada by mapping information about soils, agriculture, recreation, wildlife, waterfowl, forestry and land use at a scale of 1:50,000. A rating classification factor was also added to permit analysis.

"In 1960, Roger Tomlinson was working at an aerial survey company in Ottawa— Spartan Air Services. The company was focused on producing large-scale photogrammetric and geophysical maps. In the early 1960s, Tomlinson and the company were asked to produce a map for site-location analysis in an east African nation. Tomlinson immediately recognized that the new automated computer technologies might be applicable and even necessary to complete such a detail-oriented task more effectively and efficiently than humans. Eventually, Spartan met with IBM offices in Ottawa to begin developing a relationship to bridge the previous gap between geographic data and computer services. Tomlinson brought his geographic knowledge to the table as IBM brought computer programming and data management.

"The Canadian government and Tomlinson began working towards the development of a national program after a 1962 meeting between Tomlinson and Lee Pratt, head of the Canada Land Inventory (CLI). Pratt was charged with creation of maps covering the entire region of Canada's commercially productive areas by showing agriculture, forestry, wildlife, and recreation, all with the same classification schemes. Not only was the development of such maps a formidable task, but Pratt understood that computer automation may assist in the analytical processes as well. Tomlinson was the first to produce a technical feasibility study on whether computer mapping programs would be a viable solution for the land-use inventory and management programs, such as CLI. He is also given credit for coining the term geographic information system and is recognized as the 'Modern Father of GIS' " (Wikipedia article on Canada Geographic Information System, accessed 12-07-2013).

"CGIS was an improvement over 'computer mapping' applications as it provided capabilities for overlay, measurement, and digitizing/scanning. It supported a national coordinate system that spanned the continent, coded lines as arcs having a true embedded topology and it stored the attribute and locational information in separate files. As a result of this, Tomlinson has become known as the "father of GIS", particularly for his use of overlays in promoting the spatial analysis of convergent geographic data" (Wikipedia article Geographic information system, accessed 12-07-2013).

In 1974 Tomlinson received a PhD from University College London after writing a doctoral thesis entitled The application of electronic computing methods and techniques to the storage, compilation, and assessment of mapped data. In 1976, with H. W. Calkins and Duane F. Marble, he issued through UNESCO Press Computer Handling of Geographical Data: An Examination of Selected Geographic Information Systems. Natural Resources Research Ser. XIII.

View Map + Bookmark Entry

Alvar Ellegård Makes the First Use of Computers to Study Disputed Authorship 1962

The first use of computers in the study of disputed authorship was probably Alvar Ellegård's study of the Junius letters. Ellegård, professor of English at the University of Gothenburg in Sweden, did not use a computer to make the word counts, but did use machine calculations, which helped him get an overall picture of the vocabulary from hand counts.

Ellegård, A. A Statistical Method for Determining Authorship: The Junius Letters 1769–1772. Gothenburg: Gothenburg Studies in English, 1962. 

A Companion to Digital Humanities, ed. Susan Schreibman, Ray Siemens, John Unsworth. Oxford: Blackwell, 2004.

View Map + Bookmark Entry

Kleinrock Introduces the Concept Later Known as Packet Switching April 1962

In April 1962 Leonard Kleinrock published "Information Flow in Large Communication Nets" in RLE Quarterly Progress Reports. This was the first publication to describe and analyze an algorithm for chopping messages into smaller pieces, later to be known as packets. Kleinrock's MIT doctoral thesis, Message Delay in Communication Nets with Storage, filed in December 1962, elaborated on the impact of this algorithm on data networks. (See Reading 13.3.)
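The operation Kleinrock analyzed can be sketched in a few lines of modern Python: chop a message into smaller pieces, let them travel independently, and reassemble them in order at the destination. The 8-byte packet size and the (sequence number, payload) format here are illustrative only, not drawn from Kleinrock's work:

```python
# Packetization sketch: split a message into fixed-size pieces, each
# carrying a sequence number so the receiver can restore the original
# order even if the pieces arrive out of order.

PACKET_SIZE = 8  # payload bytes per packet (arbitrary for this sketch)

def packetize(message: bytes) -> list[tuple[int, bytes]]:
    """Split a message into (sequence_number, payload) packets."""
    return [
        (seq, message[i:i + PACKET_SIZE])
        for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
    ]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Restore the message regardless of packet arrival order."""
    return b"".join(payload for _, payload in sorted(packets))

msg = b"Information Flow in Large Communication Nets"
pkts = packetize(msg)
pkts.reverse()  # simulate out-of-order arrival
assert reassemble(pkts) == msg
```

Kleinrock's contribution was the queueing-theoretic analysis of how such pieces flow through a network of store-and-forward nodes, not the mechanics of splitting shown here.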

View Map + Bookmark Entry

Licklider & Clark Publish "Online Man-Computer Communication" Circa June 1962

About June 1962 J.C.R. Licklider of Bolt Beranek and Newman and Welden E. Clark published "Online Man-Computer Communication," calling for time-sharing of computers, for graphic displays of information, and for an improved graphical interface. (See Reading 10.6.)

View Map + Bookmark Entry

TELSTAR 1: The First Satellite to Relay Signals from Earth to Satellite and Back July 10, 1962

On July 10, 1962 a Delta rocket from Cape Canaveral launched the AT&T TELSTAR 1 satellite, designed and built at Bell Labs. It was the first privately owned active communications satellite, and the first satellite to relay signals from the earth to a satellite and back.

"Belonging to AT&T, the original Telstar was part of a multi-national agreement between AT&T (US), Bell Telephone Laboratories (US), NASA (US), GPO (UK) and the National PTT (France) to develop experimental satellite communications over the Atlantic Ocean. Bell Labs held a contract with NASA, reimbursing the agency three million pounds for each launch, independent of success. The US ground station was Andover Earth Station in Andover, Maine, built by Bell Labs. The main British ground station was at Goonhilly Downs in southwestern England. This was used by the BBC, the international coordinator. The standards 525/405 conversion equipment (filling a large room) was researched and developed by the BBC and located in the BBC Television Centre, London. The French ground station was at Pleumeur-Bodou (48°47′10″N 3°31′26″W) in north-western France" (Wikipedia article on Telstar 1, accessed 10-28-2014).

View Map + Bookmark Entry

Telstar 1 Relays the First Live Trans-Atlantic TV Broadcasts July 11 – July 23, 1962

On July 11, 1962 the Telstar 1 satellite relayed its first television pictures—a flag outside Andover Earth Station—to Pleumeur-Bodou; these were not broadcast publicly. Almost two weeks later, on July 23 at 3:00 p.m. EDT, it relayed the first publicly available live transatlantic television signal. The broadcast was made possible in Europe by Eurovision and in North America by NBC, CBS, ABC, and the CBC.

"The first public broadcast featured CBS's Walter Cronkite and NBC's Chet Huntley in New York, and the BBC's Richard Dimbleby in Brussels. The first pictures were the Statue of Liberty in New York and the Eiffel Tower in Paris. The first broadcast was intended to have been remarks by President John F. Kennedy, but the signal was acquired before the president was ready, so the lead-in time was filled with a short segment of a televised game between the Philadelphia Phillies and the Chicago Cubs at Wrigley Field. The batter, Tony Taylor, was seen hitting a ball pitched by Cal Koonce to the right fielder George Altman. From there, the video switched first to Washington, DC; then to Cape Canaveral, Florida; to the Seattle World's Fair; then to Quebec and finally to Stratford, Ontario. The Washington segment included remarks by President Kennedy, talking about the price of the American dollar, which was causing concern in Europe" (Wikipedia article Telstar 1, accessed 10-28-2014).

View Map + Bookmark Entry

First of the "Ten Greatest Software Bugs of All Time" July 28, 1962

On July 28, 1962 a bug in the flight software for the Mariner I space probe caused the rocket to divert from its intended path on launch. Mission control destroyed the rocket over the Atlantic Ocean.

"The investigation into the accident discovered that a formula written on paper in pencil was improperly transcribed into computer code, causing the computer to miscalculate the rocket's trajectory."

In 2005 Wired Magazine characterized this bug as the first of the "ten greatest software bugs of all time."

View Map + Bookmark Entry

Among the Beginnings of Computer Art in the United States August 28, 1962

"During the summer of 1962, A. Michael Noll . . . had an assignment working in the research division of Bell Telephone Laboratories, where he was employed as a Member of Technical Staff. His summer project involved the programming of a new method for the determination of the pitch of human speech – the short-term cepstrum. The results of the computer calculations were plotted on the Stromberg Carlson SC-4020 microfilm plotter.

"The SC-4020 plotter had a cathode ray tube that was photographed automatically with a 35-mm camera. The SC-4020 was intended as a high-speed printer in which the electron beam was passed through a character mask and the shaped beam positioned on the screen while the shutter of the camera remained open. The staff of the computer center wrote a FORTRAN software package to interface with the SC-4020 in positioning the electron beam to draw images on the screen, mostly plots of scientific data, with a 1024-by-1024 resolution.

"A colleague (Elwyn Berlekamp) had a programming error that produced a graphic mess on the plotter, which he comically called 'computer art.' Noll decided to program the computer to create art deliberately, drawing on his past training in drawing and interests in abstract painting. He described [and illustrated] the results in an internally published Technical Memorandum 'Patterns by 7090' dated August 28, 1962.

"Noll’s early pieces combined mathematical equations with pseudo randomness. Today his work would be called programmed computer art or algorithmic art. Much art is produced today by drawing and painting directly on the screen of the computer using programs designed expressly for such purposes.

"Two early works by Noll were 'Gaussian-Quadratic' and 'Vertical Horizontal Number Three.' Stimulated by 'op art,' he created 'Ninety Parallel Sinusoids' as a computer version of Bridget Riley’s 'Currents.' Noll believed that in the computer, the artist had a new artistic partner. Noll used FORTRAN and subroutine packages he wrote using FORTRAN for all his art and animation" (A. Michael Noll, "First Hand: Early Digital Art at Bell Telephone Laboratories, Inc., accessed 01-19-2014).
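The combination of mathematical equations with pseudo-randomness that Noll describes can be loosely sketched in modern code. The following is an illustrative reconstruction of the "Gaussian-Quadratic" idea (Gaussian randomness on one axis, a quadratic law on the other, successive points joined by straight lines), not Noll's FORTRAN/SC-4020 program; every constant here is an arbitrary assumption:

```python
# Algorithmic-art sketch in the spirit of "Gaussian-Quadratic":
# horizontal positions drawn from a Gaussian distribution, vertical
# positions following a quadratic that wraps around, with successive
# points connected by line segments.
import random

random.seed(7)  # pseudo-randomness, fixed seed so the sketch is reproducible
N = 100
points = [(random.gauss(0.0, 1.0), (i * i) % 1024) for i in range(N)]
segments = list(zip(points, points[1:]))  # lines between successive points
print(len(segments), "line segments")
```

A plotting library (or, in 1962, the microfilm plotter) would then draw each segment; the aesthetic interest comes entirely from the interplay of the random and deterministic coordinates.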

View Map + Bookmark Entry

Douglas Engelbart Issues "Augmenting Human Intellect: A Conceptual Framework" October 1962

In October 1962 Douglas Engelbart of the Stanford Research Institute, Menlo Park, California, completed his report, Augmenting Human Intellect: A Conceptual Framework, for the Director of Information Sciences, Air Force Office of Scientific Research. This report led J. C. R. Licklider of DARPA to fund SRI's Augmentation Research Center.

View Map + Bookmark Entry

Licklider at the Information Processing Techniques Office, Begins Funding Research that Leads to the ARPANET October 1, 1962

On October 1, 1962 J.C.R. Licklider was appointed Director of the Pentagon's Information Processing Techniques Office (IPTO), a division of ARPA (the Advanced Research Projects Agency).

Licklider's initial budget was $10,000,000 per year. He eventually initiated the sequence of events leading to the ARPANET.

View Map + Bookmark Entry

"The potential contributions of computers depend upon their use by very human human beings." November 1962

In November 1962 electrical engineer David L. Johnson and clinical-social psychologist Arthur L. Kobler, both at the University of Washington, Seattle, published "The Man-Computer Relationship. The potential contributions of computers crucially depend upon their use by very human human beings," Science 138 (1962) 873-79. The introductory and concluding sections of the paper are quoted below:

"Recently Norbert Wiener, 13 years after publication of his Cybernetics, took stock of the man-computer relationship [Science 131, 1355 (1960).] He concluded, with genuine concern, that computers may be getting out of hand. In emphasizing the significance of the position of the computer in our world, Wiener comments on the crucial use of computers by the military: 'it is more than likely that the machine may produce a policy which would win a nominal victory on points at the cost of every interest we have at heart, even that of national survival.' 

"Computers are used by man; man must be considered a part of any system in which they are used. Increasingly in our business, scientific, and international life the results of data processing and computer application are, necessarily and properly, touching the individuals of our society significantly. Increasing application of computers is inevitable and requisite for the growth and progress of our society. The purpose of this article is to point out certain cautions which must be observed and certain paths which must be emphasized if the man-computer relationship is to develop to its full positive potential and if Wiener's prediction is to be proved false. In this article on the problem of decision making we set forth several concepts. We have chosen decision making as a suitable area of investigation because we see both man and machine, in all their behavior actions, constantly making decisions. We see the process of decision making as being always the same: within the limits of the field, possibilities exist from which choices are made. Moreover, there are many decisions of great significance being made in which machines are already playing an active part. For example, a military leader recently remarked, "At the heart of every defense system you will find a computer." In a recent speech the president of the National Machine Accountants Association stated that 80 to 90 percent of the executive decisions in U.S. industry would soon be made by machines. Such statements indicate a growing trend—a trend which need not be disadvantageous to human beings if they maintain proper perspective. In the interest of making the man-machine relationship optimally productive and satisfactory to the human being, it is necessary to examine the unique capabilities of both man and machine, giving careful attention to the resultant interaction within the mixed system."


"The levels of human knowledge of the environment and the universe are increasing, and it is obviously necessary that man's ability to cope with this knowledge should increase—necessary for his usefulness and for his very survival. The processes of automation have provided a functional agent for this purpose. Successful mechanized solution of routine problems has directed attention toward the capacity of the computer to arrive at apparent or real solutions of routine-learning and special problems. Increasing use of the computer in such problems is clearly necessary if our body of knowledge and information is to serve its ultimate function. Along with such use of the computer, however, will come restrictions and cautions which have not hitherto been necessary. We find that the computer is being given responsibilities with which it is less able to cope than man is. It is being called on to act for man in areas where man cannot define his own ability to perform and where he feels uneasy about his own performance—where he would like a neat, well-structured solution and feels that in adopting the machine's partial solution he is closer to the "right" than he is in using his own. An aura of respectability surrounds a computer output, and this, together with the time-balance factor, makes unqualified acceptance tempting. The need for caution, then, already exists and will be much greater in the future. It has little to do with the limited ability of the computer per se, much to do with the ability of man to realistically determine when and how he must use the tremendous ability which he has developed in automation. Let us continue to work with learning machines, with definitions of meaning and 'artificial intelligence.' Let us examine these processes as 'games' with expanding values, aiming toward developing improved computer techniques as well as increasing our knowledge of human functions.
Until machines can satisfy the requirements discussed, until we can more perfectly determine the functions we require of the machines, let us not call upon mechanized decision systems to act upon human systems without intervening realistic human processing. As we proceed with the inevitable development of computers and means of using them, let us be sure that careful analysis is made of all automation (either routine-direct, routine-learning, or special) that is used in systems of which man is a part—sure that man reflects upon his own reaction to, and use of, mechanization. Let us be certain that, in response to Samuel Butler's question, 'May not man himself become a sort of parasite upon the machines; an affectionate machine-tickling aphid?' we will always be able to answer 'No.' "

View Map + Bookmark Entry

General Motors and IBM Develop the First CAD Program December 1962

In December 1962 DAC-1 (Design Augmented by Computers), the first computer-assisted design (CAD) program, was demonstrated. Development of the program began in 1959 as a joint effort between General Motors in Detroit and IBM. 

View Map + Bookmark Entry

Edward Zajac Produces the First Computer-Animated Film 1963

In 1963 Edward E. Zajac at Bell Labs, Murray Hill, New Jersey, produced the first computer-animated film, a 1.25-minute film entitled Simulation of Two-Gyro Gravity-Gradient Attitude Control System, to show how a particular type of satellite would move through space. The film, narrated by Zajac, simulated the motion and autorotation of a communication satellite as a succession of single phases.

"Zajac programmed the calculations in FORTRAN, then used a program written by Zajac's colleague, Frank Sinden, called ORBIT. The original computations were fed into the computer via punch cards, then the output was printed onto microfilm using the General Dynamics Electronics Stromberg-Carlson 4020 microfilm recorder. All computer processing was done on an IBM 7090 or 7094 series computer" (http://techchannel.att.com/play-video.cfm/2012/7/18/AT&T-Archives-First-Computer-Generated-Graphics-Film, accessed 01-19-2014).

In 1964 Zajac published "Computer-Made Perspective Movies as a Scientific and Communication Tool," Communications of the ACM 7 no. 3 (March 1964) 169-170. Two years later he published "Film Animation by Computer," New Scientist 29 (1966) 346-49, which described the making of Two-Gyro Gravity-Gradient Attitude Control System. This article also incorporated illustrations from Kenneth C. Knowlton's "A Computer Technique for Producing Animated Movies," AFIPS '64 (Spring) Proceedings of the April 21-23, 1964 Spring Joint Computer Conference, 67-87. My copy of Zajac's article was reprinted as Bell Telephone System Technical Publications Monograph 5150 (April, 1966).

Franke, Computer Graphics, Computer Art (1971) 93.

View Map + Bookmark Entry

ASCII is Promulgated 1963

In 1963 the ASCII (American Standard Code for Information Interchange) standard was promulgated, specifying the pattern of seven bits to represent letters, numbers, punctuation, and control signals in computers.

"Historically, ASCII developed from telegraphic codes. Its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on ASCII formally began October 6, 1960, with the first meeting of the American Standards Association's (ASA) X3.2 subcommittee. The first edition of the standard was published during 1963, a major revision during 1967, and the most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists, and added features for devices other than teleprinters. ASCII includes definitions for 128 characters: 33 are non-printing control characters (now mostly obsolete) that affect how text and space is processed; 94 are printable characters, and the space is considered an invisible graphic. The most commonly used character encoding on the World Wide Web was US-ASCII until 2008, when it was surpassed by UTF-8" (Wikipedia article on ASCII, accessed 01-29-2010).
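The two technical points in that account—7-bit codes and an ordering chosen for convenient sorting—are easy to verify in any modern language; a Python sketch:

```python
# ASCII assigns each character a 7-bit code point (0-127). Because
# letters receive consecutive codes, ordinary numeric comparison of
# code values yields alphabetical order: the "convenient sorting"
# the standard's designers intended.

for ch in "A9a":
    code = ord(ch)
    assert code < 128  # every ASCII character fits in 7 bits
    print(ch, "=", code, "=", format(code, "07b"))

# Sorting by code value sorts letters alphabetically:
assert sorted("dcba") == list("abcd")
```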

View Map + Bookmark Entry

Ivan Sutherland Creates the First Graphical User Interface 1963

In 1963 Ivan Sutherland, a student at MIT's Lincoln Laboratory in Lexington, Massachusetts, working on the experimental TX-2 computer, created the first graphical user interface, or first interactive graphics program, in his Ph.D. thesis, Sketchpad: A Man-Machine Graphical Communication System. 

Sketchpad was an early application of vector graphics.

View Map + Bookmark Entry

Compugraphic Develops the First General Typesetting Computers 1963

In 1963 Compugraphic of Brookline, Massachusetts introduced the Linasec I and II, the first general typesetting computers. These automated tape processors produced justified tapes to drive the Linotype machines used in the newspaper industry.

Net production of the Linasec, in excess of 3,600 lines per hour compared to the manually set 600 lines per hour, enabled newspapers to carry more detailed, late-breaking news stories.

View Map + Bookmark Entry

Bob Dylan's "The Times They Are A-Changin' " 1963

On December 10, 2010 Sotheby's in New York sold a single rather worn sheet of binder paper on which Bob Dylan wrote the original lyrics of his most famous song, "The Times They Are A-Changin'," probably in October 1963. This battered piece of paper with messy writing sold for $422,500.

"Dylan's friend, Tony Glover, recalls visiting Dylan's apartment in September 1963, where he saw a number of song manuscripts and poems lying on a table. 'The Times They Are a-Changin'  had yet to be recorded, but Glover saw its early manuscript. After reading the words 'come senators, congressmen, please heed the call', Glover reportedly asked Dylan: 'What is this shit, man?', to which Dylan responded, 'Well, you know, it seems to be what the people like to hear'.

"Dylan recalled writing the song as a deliberate attempt to create an anthem of change for the moment. In 1985, he told Cameron Crowe: 'This was definitely a song with a purpose. It was influenced of course by the Irish and Scottish ballads . . .'Come All Ye Bold Highway Men', 'Come All Ye Tender Hearted Maidens'. I wanted to write a big song, with short concise verses that piled up on each other in a hypnotic way. The civil rights movement and the folk music movement were pretty close for a while and allied together at that time.'

"The climactic lines of the final verse: 'The order is rapidly fadin'/ And the first one now/ Will later be last/ For the times they are a-changin' have a Biblical ring, and several critics have connected them with lines in the Gospel of Mark, 10:31, 'But many that are first shall be last, and the last first.'

"A self-conscious protest song, it is often viewed as a reflection of the generation gap and of the political divide marking American culture in the 1960s. Dylan, however, disputed this interpretation in 1964, saying 'Those were the only words I could find to separate aliveness from deadness. It had nothing to do with age.' A year later, Dylan would say: 'I can't really say that adults don't understand young people any more than you can say big fishes don't understand little fishes. I didn't mean 'The Times They Are a-Changin' ' as a statement. . . It's a feeling" (Wikipedia article on The Times They Are a-Changin', accessed 12-11-2010).

View Map + Bookmark Entry

Foundation of Engelbart's Augmentation Research Center 1963

As a result of Engelbart's 1962 report, J. C. R. Licklider, the first director of the US Defense Department's Advanced Research Projects Agency (DARPA) Information Processing Techniques Office (IPTO), funded Douglas Engelbart's Augmentation Research Center at Stanford Research Institute in early 1963. The first experiments done there included trying to connect a display at SRI to the massive and unique AN/FSQ-32 computer at System Development Corporation in Santa Monica, California.

View Map + Bookmark Entry

Using his BEFLIX Computer Animation Language, Ken Knowlton Produces "A Computer Technique for the Production of Animated Movies" 1963 – 1966

In 1963, Kenneth C. Knowlton, working at Bell Labs in Murray Hill, NJ, developed the BEFLIX (Bell Flicks) programming language for bitmap computer-produced movies, using an IBM 7094 computer and a Stromberg-Carlson 4020 microfilm recorder. This was the first computer animation language. Each frame created with BEFLIX contained eight shades of grey and a resolution of 252 x 184. Using this technique, Knowlton in 1963 created a 10 minute 16mm silent film entitled A Computer Technique for the Production of Animated Movies.  

At the Spring Joint Computer Conference of AFIPS, on April 21-23, 1964 Knowlton delivered a paper entitled, appropriately enough, "A computer technique for producing animated movies." This was published in the Proceedings on pp. 67-87. The paper reproduced some images from Knowlton's film and indicated that the film could be borrowed from Bell Labs. Most of the paper reproduced programming code in Beflix.

The following year Knowlton published "Computer-Produced Movies. A computer-controlled display tube and camera can produce animated movies quickly and economically," Science 150 (November 16, 1965) 1116-1120. This was offprinted as Bell Telephone System Technical Publications Monograph 5112.

And, in 1966 Knowlton published "Computer-Generated Movies, Designs and Diagrams," Design Quarterly, No. 66/67, Design and the Computer (1966), 58-63.

View Map + Bookmark Entry

Feigenbaum & Feldman Issue "Computers and Thought," the First Anthology on Artificial Intelligence 1963

In 1963 Edward A. Feigenbaum and Julian Feldman, computer scientists and artificial intelligence researchers at the University of California at Berkeley, issued Computers and Thought, the first anthology on artificial intelligence. At the time there were almost no published books on AI and no textbook; the anthology became a de facto textbook. It was translated into Russian, Japanese, Polish and Spanish.

An unusual feature of the anthology was its reprinting of "A Selected Descriptor-Indexed Bibliography to the Literature on Artificial Intelligence" (1961) prepared by Marvin Minsky as a companion to his survey on the literature of the field entitled "Steps toward Artificial Intelligence," which was also republished in the anthology. In the bibliography of Minsky's selected publications that was available on his website in December 2013 Minsky indicated that this "may have been the first keyword-descriptor indexed bibliography."

Authors represented in the anthology included Paul Armer, Carol Chomsky, Geoffrey P. E. Clarkson, Edward A. Feigenbaum, Julian Feldman, H. Gelernter, Bert F. Green, Jr., John T. Gullahorn, Jeanne E. Gullahorn, J. R. Hansen, Carl I. Hovland, Earl B. Hunt, Kenneth Laughery, Robert K. Lindsay, D. W. Loveland, Marvin Minsky, Ulric Neisser, Allen Newell, A. L. Samuel, Oliver G. Selfridge, J. C. Shaw, Herbert A. Simon, James R. Slagle, Fred M. Tonge, A. M. Turing, Leonard Uhr, Charles Vossler, and Alice K. Wolf. 

Hook & Norman, Origins of Cyberspace (2002) no. 599.

View Map + Bookmark Entry

Julio Cortázar Issues the First "Hypertext" Novel, Before Hypertext 1963

In 1963 Argentine writer Julio Cortázar, writing in Paris, published Rayuela (English: Hopscotch), an introspective stream-of-consciousness novel with multiple endings that can be read in different ways. It was translated into English by Gregory Rabassa in 1966. This has been called the first hypertext novel, though the concept of hypertext hardly existed at the time.

"Written in an episodic, snapshot manner, the novel has 155 chapters, the last 99 being designated as "expendable." Some of these "expendable" chapters fill in gaps that occur in the main storyline, while others add information about the characters or record the aesthetic or literary speculations of a writer named Morelli who makes a brief appearance in the narrative. Some of the 'expendable chapters' at first glance seem like random musings, but upon closer inspection solve questions that arise during the reading of the first two parts of the book.

"An author's note suggests that the book would best be read in one of two possible ways, either progressively from chapters 1 to 56 or by "hopscotching" through the entire set of 155 chapters according to a "Table of Instructions" designated by the author. Cortázar also leaves the reader the option of choosing his/her own unique path through the narrative.

"Several narrative techniques are employed throughout the book, and frequently overlap, including first person, third person, and a kind of stream-of-consciousness. Traditional spelling and grammatical rules are often bent and sometimes broken outright. A few chapters purport to be written by other authors, and there is even a whole section taken almost verbatim from another novel that may or may not exist in actuality" (Wikipedia article on Hopskotch, accesseed 01-04-2014).

"I suppose it's unreasonable to expect the world's first so-called hypertext novel - one in which you can read the chapters sequentially, or in an order recommended by the author, or in any other order you choose - to have a compelling plot. After all, plot relies on anticipation and surprise, both of which come from authorial control over how and when information is revealed. A lot of the delight in fiction comes from this, and most of the rest from character, theme and the texture of the language. Cortazar's revolutionary novel is big on the last few, but not unexpectedly fails to be very engaging when it comes to story. It's more of a character study, or rather an elaboration of a philosophical position through the depiction of certain people in a particular place and time, i.e. left-leaning international emigres in 1950s Paris, and later the locals in Buenos Aires, who spend most of their time smoking, drinking, listening to jazz, competing for affection, philosophizing about life, and trying not to be the creative geniuses they obviously know they are. There are some wonderful set pieces: the infamous Chapter 28 involving a baby in a darkened room; the afternoon a plank bridge is erected to join two hotel rooms on opposite sides of a busy Buenos Aires street; an elaborate booby trap of water-filled basins, tangled threads and ball-bearings to thwart a vengeful lover in the night; and, obviously, the hopscotch squares of the title which are drawn in the courtyard of an insane asylum. These incidents are all engaging, comic, and wonderfully laden with a metaphorical/philosophical import which serves Cortazar's embedded theme: that is, the conundrum of consciousness; the unending desire to break through "the wall" to the other side of life in order to achieve the "unity" we intuitively feel exists but to which there is no easy path. This is the novel's engine, but it does take a while to fire up. 
If slowly savouring 500+ pages of that kind of thing interests you, then you'll enjoy "Hopscotch" immensely. If it doesn't, then reading this novel will be somewhat like being trapped at a really bad party with drunk and depressive philosophy undergraduates who think they know everything about jazz. I had the urge to leave early, but I'm glad I stayed until the end. Eventually, someone shut the music off, opened all the windows, and in the silence of dawn something clicked" (review by Steven Reynolds, December 31, 2004, on Amazon.com, accessed 01-04-2014).

View Map + Bookmark Entry

Andrew Q. Morton Applies Computing to Authorship of the Pauline Epistles 1963

In 1963 Scottish clergyman Andrew Q. Morton published an article in a British newspaper claiming that, according to his work with a computer at the University of Edinburgh, St Paul wrote only four of his epistles. Morton based his claim on word counts of common words in the Greek text, plus some elementary statistics. He continued to examine a variety of Greek texts producing more papers and books concentrating on an examination of the frequencies of common words (usually particles) and also on sentence lengths, even though the punctuation identifying sentences was added to the Greek texts by editors long after the Pauline Epistles were written.

Morton, The Authorship of the Pauline Epistles: A Scientific Solution. Saskatoon, 1965. 

Morton, A. Q. and Winspear, A. D. It's Greek to the Computer. Montreal, 1971.

A Companion to Digital Humanities, ed. Susan Schreibman, Ray Siemens, John Unsworth. Oxford: Blackwell, 2004.

View Map + Bookmark Entry

Licklider Describes the "Intergalactic Computer Network" April 25, 1963

From his office at The Pentagon on April 25, 1963 J.C.R. Licklider, Director of Behavioral Sciences Command & Control Research at ARPA, the U. S. Department of Defense Advanced Research Projects Agency, sent a memo to members and affiliates of what he jokingly called the "Intergalactic Computer Network," "outlining a key part of his strategy to connect all their individual computers and time-sharing systems into a single computer network spanning the continent" (Waldrop).

View Map + Bookmark Entry

Machine Perception of Three Dimensional Solids May 1963 – 1965

In May 1963 computer scientist Lawrence G. Roberts published Machine Perception of Three Dimensional Solids, MIT Lincoln Laboratory Report, TR 315, May 1963. This contained "the first algorithm to eliminate hidden or obscured surfaces from a perspective picture" (Carlson, A Critical History of Computer Graphics and Animation, accessed 05-30-2009).

In 1965, Roberts implemented a homogeneous coordinate scheme for transformations and perspective, publishing Homogeneous Matrix Representation and Manipulation of N-Dimensional Constructs, MIT MS-1505. Roberts's "solutions to these problems prompted attempts over the next decade to find faster algorithms for generating hidden surfaces" (Carlson, op. cit.).

View Map + Bookmark Entry

The Printing and the Mind of Man Exhibition July 16 – July 27, 1963

The Printing and the Mind of Man exhibition took place in London at the British Museum and at Earls Court Exhibition Centre during a period of only two weeks, from July 16 to July 27, 1963.

The lengthy and complex title of its catalogue, with an emblem and tailpiece designed and engraved by Reynolds Stone, read: Catalogue of a display of printing mechanisms and printed materials arranged to illustrate the history of Western civilization and the means of the multiplication of literary texts since the XV century, organised in connection with the eleventh International Printing Machinery and Allied Trades Exhibition, under the title Printing and the Mind of Man, assembled at the British Museum and at Earls Court, London, 16-27 July 1963. Illustrated with 32 black & white plates and a color plate reproducing a page from the Mainz Psalter, the catalogue described more than 656 examples of printing and printing technology documenting the influence of print on the development of Western civilization; this portion of the display was mounted at Earls Court. The catalogue also described, and illustrated with 16 black & white plates, an exhibition of 163 examples of Fine Printing mounted at the British Museum from July to September 1963. At the end of their Acknowledgements on p. 9 of the catalogue the Supervisory Committee for the exhibition– librarian Frank Francis, typographer and historian of typography Stanley Morison and writer and antiquarian bookseller John Carter– stated:

"We pay tribute to the organizers of the Gutenberg Quincentenary Exhibition of Printing, assembled at Cambridge in 1940 (and prematurely disassembled because of the risks from enemy bombing). It was our original inspiration for several sections of our display, and its invigorating catalogue has been our constant friend."

Comparison of the 641 items described in the catalogue of 1940 with those described in the catalogue of 1963 show a great deal of overlap, especially as Percy Muir and John Carter, who had been prime movers in the exhibition in 1940, were extensively involved with the exhibition of 1963. The 1963 exhibition and its catalogue were, of course, significant expansions and improvements over the early wartime effort.

The 1963 catalogue was  followed in 1967 by a further-expanded larger format cloth-bound edition with a dramatic double-page engraved title by Reynolds Stone, significantly more detailed annotations, and without discussion of "printing mechanisms," entitled Printing and the Mind of Man. A Descriptive Catalogue Illustrating the Impact of Print on the Evolution of Western Civilization, compiled and edited by antiquarian booksellers and bibliographers John Carter and Percy H. Muir, assisted by book historian and writer Nicolas Barker, antiquarian bookseller H.A. Feisenberger, bibliographer Howard Nixon and historian of printing S.H. Steinberg.

This exhibition, and especially the 1967 book based on it, was, and remains, immensely influential on both institutional and private collectors of landmark books that influenced the development of Western Civilization.   

Taking place at the dawn of online searching and the ARPANET, and roughly twenty years before the development of the personal computer, this exhibition and its catalogues may also record the peak of the print-centric view of information before the development of electronic information technology leading to the Internet. The only references to computing in the exhibition and its catalogues were to Napier on logarithms, and to Leibniz's stepped-drum calculator. The exhibition and catalogues included references to the invention of radio, telephone and films, but not to television. 

Sebastian Carter, "Printing & the Mind of Man," Matrix 20 (2000) 172-180.

View Map + Bookmark Entry

The First Geosynchronous Communications Satellite is Launched July 26, 1963

On July 26, 1963 the first geosynchronous communications satellite, Syncom 2, was launched by NASA on a Delta rocket B booster from Cape Canaveral. "Its orbit was inclined rather than geostationary. . . The satellite successfully kept stationary at the altitude calculated by Herman Potočnik Noordung in the 1920s.

"During Syncom 2's first year, NASA conducted voice, teletype, and facsimile tests, as well as 110 public demonstrations to acquaint people with Syncom's capabilities and invite their feedback. In August 1963, President John F. Kennedy in Washington, D.C., telephoned Nigerian Prime Minister Abubakar Balewa aboard USNS Kingsport docked in Lagos Harbor; the first live two-way call between heads of state by satellite. The Kingsport acted as a control station and uplink stationa' (Wikipedia article on Syncom, accessed 05-24-2009).

View Map + Bookmark Entry

Probably the First Book Typeset by Computer October 6, 1963

For the 26th Annual meeting of the American Documentation Institute, held in Chicago from October 6-11, 1963, Hans Peter Luhn of IBM, then president of the American Documentation Institute, issued Automation and Scientific Communication. Short Papers Contributed to the Theme Sessions. . . . On the verso of the title page of this quarto volume a statement reads:

"This 128 page book has been printed from type set automatically with the aid of electronic information processing equipment. It is believed that this is the first volume of technical articles ever produced in this manner."  

Further down the page it states,

"Oklahoma Publishing Co. performed keypunching from manuscripts, processing on an IBM 1620, automatic type-setting on Linotype and printing of reproduction proofs (Bill Wlliams, Chairman, Research Committee)."

A special printed slip pasted onto the front pastedown endpaper of a cloth-bound copy in my collection reads:


"This is No. 75 of 100 copies of a special edition of this book, prepared as a memento as a token of recognition to those who were involved in its creation and who are here identified by their signatures:

[manually signed by] "H. P. Luhn, S.E. Furth, B Williams, Doris Craig, S. L. Reed Jr. J P Blandean, R M Maxwell, Haribert H Luhn, John Bustin (?). 

[manually] "Countersigned Chicago, Ill, October 6, 1963, R M Hayes, President, American Documentation Institute."

The work was issued in two parts. Part one, described above, was mailed to participants before the meeting. Ordinary copies were in printed wrappers. Part two was available at the meeting which took place from October 6 to 11. After its title page and table of contents part two was paginated continuously with part one (pp. 129-382). The verso of the title page of part two stated, "International Business Machines Corp., Data Processing Division performed keypunching of bibliographic information, processing on an IBM 1401 for creating table of contents and KWIC and author index to titles of the papers published; a bibliography and citation index to the titles of all referenced papers, a KWIC and author index thereto; and furnished reproduction proofs of this material (R. M. Maxwell, Manager)."

Luhn "planned and directed the efforts that led to the first volume of technical papers produced by fully automatic typesetting techniques. These efforts, moreover, were successfuly carried out within the remarkable dealine requirements of often not more than three weeks from receipt of author manuscript to inclusion in a printed and bound volume, typeset by computer.

"In addition, within the same brief time period, the bibliographic information for the approximately 600 'short papers' accepted was keypunched and processed on a computer to produce the table of contents, a KWIC index, an author index, a citation index to the bibliographical references in the papers, a KWIC index to the titles of these cited references, a bibliography of the cited papers, and an author index to the citations" (Schultz [ed] H. P. Luhn: Pioneer of Information Science. Selected Works [1968] 29).

When I wrote this entry on June 23, 2012 I did not know of any earlier printed book on any subject typeset by computer.

View Map + Bookmark Entry

The Beinecke Rare Book & Manuscript Library Opens at Yale October 14, 1963

On October 14, 1963 the Beinecke Rare Book & Manuscript Library opened at Yale University. Designed by Pritzker Prize-winning architect Gordon Bunshaft of the firm of Skidmore, Owings & Merrill, it is the largest building in the world reserved exclusively for the preservation of rare books and manuscripts. In my opinion it is also the greatest and most dramatic "temple" devoted to the display, study and preservation of rare books and manuscripts built at a university in the twentieth century.

Like the Printing and the Mind of Man exhibition which coincidentally occurred in London in July 1963, the opening of the Beinecke Library reflected one of the historical peaks of recognition of the role of the physical book in the creation, distribution and storage of information.

View Map + Bookmark Entry

Touch-Tone Dialing is Introduced November 1963

In November 1963 touch-tone telephone dialing, developed at Bell Labs, was introduced, enabling calls to be switched digitally. The research leading to the design of the touch-tone keyboard was conducted by industrial psychologist John E. Karlin, head of Bell Labs’ Human Factors Engineering department, the first department of its kind at any American company.

"The rectangular design of the keypad, the shape of its buttons and the position of the numbers — with 1-2-3' on the top row instead of the bottom, as on a calculator — all sprang from empirical research conducted or overseen by Mr. Karlin.  

"The legacy of that research now extends far beyond the telephone: the keypad design Mr. Karlin shepherded into being has become the international standard on objects as diverse as A.T.M.’s, gas pumps, door locks, vending machines and medical equipment" (http://www.nytimes.com/2013/02/09/business/john-e-karlin-who-led-the-way-to-all-digit-dialing-dies-at-94.html, accessed 02-10-2013).

View Map + Bookmark Entry

First Use of the Term "Hacker" in the Context of Computing November 20, 1963

On November 20, 1963 the first use of the term "hacker" in the context of computing appeared in the MIT student newspaper, The Tech:

"Many telephone services have been curtailed because of so-called hackers, according to Prof. Carlton Tucker, administrator of the Institute phone system. . . .The hackers have accomplished such things as tying up all the tie-lines between Harvard and MIT, or making long-distance calls by charging them to a local radar installation. One method involved connecting the PDP-1 computer to the phone system to search the lines until a dial tone, indicating an outside line, was found. . . . Because of the 'hacking,' the majority of the MIT phones are 'trapped.' "

View Map + Bookmark Entry

The Assassination of John F. Kennedy & its Coverage on Radio & Television November 22, 1963

On November 22, 1963 at 12:30 p.m. Central Standard Time John Fitzgerald Kennedy, the 35th President of the United States, was assassinated by a sniper in Dealey Plaza, Dallas, Texas while traveling with his wife Jacqueline, Texas Governor John Connally, and Connally's wife Nellie, in a presidential motorcade. 

According to Howard Rosenberg and Charles Feldman, No Time to Think. The Menace of Media Speed and the 24-Hour News Cycle, p. 18, the story was first reported by United Press International White House reporter Merriman Smith, who "outfought rival Associated Press reporter Jack Bell for a radiophone in their wire services limo so that he could be first to report that the president's motorcade had taken fire. In fact, 68 percent of Americans learned that Kennedy had been shot within 30 minutes of the attack." The story was arguably the biggest spot news, or breaking news story since Japan's attack on Pearl Harbor on December 7, 1941.

"No radio or television stations broadcast the assassination live because the area through which the motorcade was traveling was not considered important enough for a live broadcast. Most media crews were not even with the motorcade but were waiting instead at the Dallas Trade Mart in anticipation of President Kennedy's arrival. Those members of the media who were with the motorcade were riding at the rear of the procession.

"The Dallas police were recording their radio transmissions over two channels. A frequency designated as Channel One was used for routine police communications; Channel Two was an auxiliary channel dedicated to the President's motorcade. Up until the time of the assassination, most of the broadcasts on the second channel consisted of Police Chief Jesse Curry's announcements of the location of the motorcade as it wound through the city.

"President Kennedy's last seconds traveling through Dealey Plaza were recorded on silent 8 mm film for the 26.6 seconds before, during, and immediately following the assassination. This famous film footage was taken by garment manufacturer and amateur cameraman Abraham Zapruder, in what became known as the Zapruder film. Frame enlargements from the Zapruder film were published by Life magazine shortly after the assassination. The footage was first shown publicly as a film at the trial of Clay Shaw in 1969, and on television in 1975. According to the Guinness Book of World Records, an arbitration panel ordered the U.S. government to pay $615,384 per second of film to Zapruder's heirs for giving the film to the National Archives. The complete film, which lasts for 26 seconds, was valued at $16 million.

"Zapruder was not the only person who photographed at least part of the assassination; a total of 32 photographers were in Dealey Plaza. Amateur movies taken by Orville NixMarie Muchmore (shown on television in New York on November 26, 1963), and photographer Charles Bronson captured the fatal shot, although at a greater distance than Zapruder. Other motion picture films were taken in Dealey Plaza at or around the time of the shooting by Robert Hughes, F. Mark Bell, Elsie Dorman, John Martin Jr., Patsy Paschall, Tina Towner, James Underwood, Dave Wiegman, Mal Couch, Thomas Atkins, and an unknown woman in a blue dress on the south side of Elm Street.

"Still photos were taken by Phillip WillisMary Moorman, Hugh W. Betzner Jr., Wilma Bond, Robert Croft, and many others. The lone professional photographer in Dealey Plaza who was not in the press cars was Ike Altgens, photo editor for the Associated Press in Dallas.

"An unidentified woman, nicknamed the Babushka Lady by researchers, might have been filming the Presidential motorcade during the assassination. She was seen apparently doing so on film and in photographs taken by the others.

"Previously unknown color footage filmed on the assassination day by George Jefferies was released on February 19, 2007 by the Sixth Floor Museum, Dallas, Texas. The film does not include the shooting, having been taken roughly 90 seconds beforehand and a couple of blocks away. The only detail relevant to the investigation of the assassination is a clear view of President Kennedy's bunched suit jacket, just below the collar, which has led to different calculations about how low in the back President Kennedy was first shot. . . ." (Wikipedia article on Assassination of John F. Kenneday, accessed 10-18-2014.)

View Map + Bookmark Entry

Mathematical Theory of Data Communications 1964

In 1964 American computer scientist Leonard Kleinrock published his 1962 PhD thesis in book form as Communication Nets: Stochastic Message Flow and Delay, providing a technology and mathematical theory of data communications. (See Reading 13.4.)

View Map + Bookmark Entry

On Distributed Communications 1964

In 1964 Paul Baran of the Rand Corporation, Santa Monica, California, wrote On Distributed Communications, describing the use of redundant routing and message blocks to send information across a decentralized network topology.

View Map + Bookmark Entry

The First Commercial Computers to Use Integrated Circuits 1964

In 1964 RCA announced the Spectra series of computers, which could run the same software as IBM’s 360 machines. The Spectra computers were also the first commercial computers to use integrated circuits.

View Map + Bookmark Entry

The First Online Reservation System 1964

SABRE (Semi-Automated Business Research Environment), an online airline reservation system developed by American Airlines and IBM, and based on two IBM mainframes in Briarcliff Manor, New York, became operational in 1964. SABRE worked over telephone lines in “real time” to handle seat inventory and passenger records from terminals in more than 50 cities.

View Map + Bookmark Entry

Social Security Numbers as Identifiers 1964

In 1964 the Internal Revenue Service (IRS) began using social security numbers as tax ID numbers.

View Map + Bookmark Entry

Thomas Kurtz & John Kemeny Invent BASIC 1964

In 1964 nearly all computer use required writing custom software, which was something only scientists and mathematicians tended to do. To make programming accessible to a wider range of people, Dartmouth College mathematicians and computer scientists Thomas E. Kurtz and  John G. Kemeny invented BASIC (Beginner’s All-Purpose Symbolic Instruction Code), a general-purpose, high-level programming language designed for ease of use.

Kurtz and Kemeny designed BASIC to allow students to write mainframe computer programs for the Dartmouth Time-Sharing System, the first large-scale time-sharing system to be implemented successfully, which was operational on May 1, 1964. Developed by Dartmouth students under Kurtz and Kemeny's supervision, BASIC was intended specifically for less technical users who did not have or want the mathematical background previously expected. 

In the mid 1970s and 1980s versions of BASIC became widespread on microcomputers. Microcomputers usually shipped with BASIC, often in the machine's firmware. Having an easy-to-learn language on these early personal computers allowed small business owners, professionals, hobbyists, and consultants to develop custom software on computers they could afford.

View Map + Bookmark Entry

The First Computerized Encyclopedia 1964

In 1964 Systems Development Corporation, Santa Monica, California, developed the first computerized encyclopedia.

View Map + Bookmark Entry

Science Citation Index 1964

In 1964 Eugene Garfield and the Institute for Scientific Information published the first Science Citation Index in five printed volumes, indexing 613 journals and 1.4 million citations, using the method of citation analysis.

Two years later Science Citation Index became available on magnetic tape.

View Map + Bookmark Entry

The First to Create Three-Dimensional Images of the Human Body Using a Computer 1964

"Boeing Man" or "Human Figure," a wireframe drawing printed on a Gerber Plotter.  It was used as a standard figure of a pilot.

In 1964 William A. Fetter, an art director at The Boeing Company in Seattle, Washington, supervised development of a  computer program that allowed him to create the first three-dimensional images of the human body through computer graphics. Using this program Fetter and his team produced the first computer model of a human figure for use in the study of aircraft cockpit design. It was called the “First Man” or "Boeing Man." Though Fetter's wire frame drawings could be called commercial art, they were of a high aesthetic standard.

Herzogenrath & Nierhoff-Wielk, Ex Machina – Frühe Computergrafik bis 1979. Die Sammlung Franke / Ex Machina – Early Computer Graphics up to 1979 (2007) 239.

View Map + Bookmark Entry

IBM's Magnetic Tape/Selectric Typewriter Begins "Word Processing" 1964

In 1964 IBM introduced the Magnetic Tape/Selectric Typewriter (MT/ST).

"With this, for the first time, typed material could be edited without having to retype the whole text or chop up a coded copy. On the tape, information could be stored, replayed (that is, retyped automatically from the stored information), corrected, reprinted as many times as needed, and then erased and reused for other projects.

"This development marked the beginning of word processing as it is known today. It also introduced word processing as a definite idea and concept. The term was first used in IBM's marketing of the MT/ST as a 'word processing' machine. It was a translation of the German word Textverarbeitung, coined in the late 1950s by Ulrich Steinhilper, an IBM engineer. He used it as a more precise term for what was done by the act of typing. IBM redefined it 'to describe electronic ways of handling a standard set of office activities -- composing, revising, printing, and filing written documents.' "

View Map + Bookmark Entry

Bitzer & Willson Invent the First Plasma Video Display (Neon Orange) 1964

In 1964 Donald Bitzer, H. Gene Slottow, and Robert Willson at the University of Illinois at Urbana-Champaign invented the first plasma video display for the PLATO Computer System.

The display was monochrome neon orange and incorporated both memory and bitmapped graphics. Built by the Owens-Illinois glass company, the flat panels were marketed under the name "Digivue."

View Map + Bookmark Entry

Woodrow Bledsoe Originates Automated Facial Recognition 1964 – 1966

From 1964 to 1966 Woodrow W. Bledsoe, along with Helen Chan and Charles Bisson of Panoramic Research, Palo Alto, California, researched programming computers to recognize human faces (Bledsoe 1966a, 1966b; Bledsoe and Chan 1965). "Because the funding was provided by an unnamed intelligence agency, little of the work was published. Given a large database of images—in effect, a book of mug shots—and a photograph, the problem was to select from the database a small set of records such that one of the image records matched the photograph. The success of the program could be measured in terms of the ratio of the answer list to the number of records in the database. Bledsoe (1966a) described the following difficulties:

" 'This recognition problem is made difficult by the great variability in head rotation and tilt, lighting intensity and angle, facial expression, aging, etc. Some other attempts at facial recognition by machine have allowed for little or no variability in these quantities. Yet the method of correlation (or pattern matching) of unprocessed optical data, which is often used by some researchers, is certain to fail in cases where the variability is great. In particular, the correlation is very low between two pictures of the same person with two different head rotations.'

"This project was labeled man-machine because the human extracted the coordinates of a set of features from the photographs, which were then used by the computer for recognition. Using a GRAFACON, or RAND TABLET, the operator would extract the coordinates of features such as the center of pupils, the inside corner of eyes, the outside corner of eyes, point of widow's peak, and so on. From these coordinates, a list of 20 distances, such as width of mouth and width of eyes, pupil to pupil, were computed. These operators could process about 40 pictures an hour. When building the database, the name of the person in the photograph was associated with the list of computed distances and stored in the computer. In the recognition phase, the set of distances was compared with the corresponding distance for each photograph, yielding a distance between the photograph and the database record. The closest records are returned.

"This brief description is an oversimplification that fails in general because it is unlikely that any two pictures would match in head rotation, lean, tilt, and scale (distance from the camera). Thus, each set of distances is normalized to represent the face in a frontal orientation. To accomplish this normalization, the program first tries to determine the tilt, the lean, and the rotation. Then, using these angles, the computer undoes the effect of these transformations on the computed distances. To compute these angles, the computer must know the three-dimensional geometry of the head. Because the actual heads were unavailable, Bledsoe (1964) used a standard head derived from measurements on seven heads.

"After Bledsoe left PRI [Panoramic Research, Inc.] in 1966, this work was continued at the Stanford Research Institute, primarily by Peter Hart. In experiments performed on a database of over 2000 photographs, the computer consistently outperformed humans when presented with the same recognition tasks (Bledsoe 1968). Peter Hart (1996) enthusiastically recalled the project with the exclamation, 'It really worked!' " (Faculty Council, University of Texas at Austin, In Memoriam Woodrow W. Bledsoe, accessed 05-15-2009).
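The matching step described above, a list of normalized feature distances compared against every stored record with the closest records returned, can be sketched roughly as follows. The names and numbers are invented for illustration; Bledsoe's system used about 20 distances per face.

```python
import math

# Hypothetical database: person -> list of normalized feature distances
# (e.g. pupil-to-pupil, width of mouth, width of eyes), as extracted by
# a human operator with a RAND Tablet in Bledsoe's man-machine setup.
database = {
    "Alice": [4.2, 6.1, 3.3],
    "Bob":   [5.0, 5.5, 2.9],
    "Carol": [4.1, 6.0, 3.5],
}

def euclidean(a, b):
    """Overall distance between two feature-distance profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_records(photo_distances, db, k=1):
    """Return the k database records nearest the measured photograph."""
    return sorted(db, key=lambda name: euclidean(photo_distances, db[name]))[:k]

print(closest_records([4.18, 6.08, 3.35], database))  # ['Alice']
```

In the real system the measured distances were first normalized to a frontal head orientation before comparison, using a standard three-dimensional head model.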

Bledsoe, W. W. 1964. The Model Method in Facial Recognition, Technical Report PRI 15, Panoramic Research, Inc., Palo Alto, California.

Bledsoe, W. W., and Chan, H. 1965. A Man-Machine Facial Recognition System-Some Preliminary Results, Technical Report PRI 19A, Panoramic Research, Inc., Palo Alto, California.

Bledsoe, W. W. 1966a. Man-Machine Facial Recognition: Report on a Large-Scale Experiment, Technical Report PRI 22, Panoramic Research, Inc., Palo Alto, California.

Bledsoe, W. W. 1966b. Some Results on Multicategory Pattern Recognition. Journal of the Association for Computing Machinery 13(2):304-316.

Bledsoe, W. W. 1968. Semiautomatic Facial Recognition, Technical Report SRI Project 6693, Stanford Research Institute, Menlo Park, California.

View Map + Bookmark Entry

Marshall McLuhan's "The Medium is the Message" 1964

In 1964 Canadian educator, philosopher, and media theorist at the University of Toronto Marshall McLuhan published Understanding Media: The Extensions of Man.

"In it McLuhan proposed that media themselves, not the content they carry, should be the focus of study — popularly quoted as 'the medium is the message'. McLuhan's insight was that a medium affects the society in which it plays a role not by the content delivered over the medium, but by the characteristics of the medium itself. McLuhan pointed to the light bulb as a clear demonstration of this concept. A light bulb does not have content in the way that a newspaper has articles or a television has programs, yet it is a medium that has a social effect; that is, a light bulb enables people to create spaces during nighttime that would otherwise be enveloped by darkness. He describes the light bulb as a medium without any content. McLuhan states that 'a light bulb creates an environment by its mere presence.' More controversially, he postulated that content had little effect on society — in other words, it did not matter if television broadcasts children's shows or violent programming, to illustrate one example — the effect of television on society would be identical. He noted that all media have characteristics that engage the viewer in different ways; for instance, a passage in a book could be reread at will, but a movie had to be screened again in its entirety to study any individual part of it.

"The book is the source of the well-known phrase 'The medium is the message'. It was a leading indicator of the upheaval of local cultures by increasingly globalized values. The book greatly influenced academics, writers, and social theorists" (Wikipedia article on Understanding Media, accessed 11-14-2009).

View Map + Bookmark Entry

Bertram Gross Coins the Term "Information Overload" 1964

In 1964 American social scientist Bertram Myron Gross coined the expression "information overload" in his book, The Managing of Organizations: the Administrative Struggle.

View Map + Bookmark Entry

IBM Receives the Fundamental Patent for Disk Drives 1964 – March 24, 1970

On March 24, 1970 IBM received US Patent 3,503,060, the fundamental patent for disk drives.

"The invention of the hard disk drive was clearly led and inspired by Rey Johnson, the IBM San Jose Laboratory Director, with the day to day management of the project led by Lou Stevens beginning November 1, 1953. However, 5,600 miles away and more or less simultaneously, a prolific German inventor Gerhard Dirks was inventing the same thing.

"During World War II, Dirks as a Russian prisoner of war in occupied Germany was incarcerated for several years in a building containing a technical library. Dirks, with little else to do, spent his time reading and studying in the library during which he conceived versions of magnetic drum and core storage. When the war ended, he returned to his former employer, the Krupp Company, but failed to interest it in taking a license to his German patent application. A small German company, Siemag Fein Mechanische Werke GmbH, that manufactured bookkeeping machines did show interest and, in return for an exclusive German license, paid Dirks a modest sum, enough to enable him to file his patent application worldwide.

"On December 14, 1954 Lou Stevens, Ray Bowdle, Jim Davis, Dave Kean, Bill Goddard and John Lynott of IBM San Jose filed a US Patent application as the co-inventors of the several potential inventions disclosed therein. Because under US law a patent can claim only one invention, in 1962 the application was split into two applications, one relating to the RAMAC 305 System itself and one relating to a Magnetic Transducer Mounting Apparatus. The two applications were identical as to disclosure and drawings and when issued were substantially identical, differing only in the claimed invention.
  • US Patent 3,134,097 was the first to issue on May 19, 1964; invented by Stevens, Goddard & Lynott it claimed the 305 RAMAC system.
  • US Patent 3,503,060 issued March 24, 1970; invented by Goddard & Lynott, it claimed both a Magnetic Transducer Mounting Apparatus and Disk Drives in general.
"The broad claims on Disk Drives were not originally in the IBM application that became the '060 patent but were added by IBM during the application's review by the US Patent Office" (http://chmhdd.wikifoundry.com/page/Disk+Drive+Patent, accessed 10-22-2013).
View Map + Bookmark Entry

Joseph Weizenbaum Writes ELIZA: A Pioneering Experiment in Artificial Intelligence Programming 1964 – 1966

Between 1964 and 1966 German and American computer scientist Joseph Weizenbaum at MIT wrote the computer program ELIZA. This program, named after the ingenue in George Bernard Shaw's play Pygmalion, was an early example of primitive natural language processing. The program operated by processing users' responses to scripts, the most famous of which was DOCTOR, which was capable of engaging humans in a conversation which bore a striking resemblance to one with an empathic psychologist. Weizenbaum modeled its conversational style after Carl Rogers, who introduced the use of open-ended questions to encourage patients to communicate more effectively with therapists. The program applied pattern matching rules to statements to figure out its replies. Using almost no information about human thought or emotion, DOCTOR sometimes provided a startlingly human-like interaction.

"When the "patient" exceeded the very small knowledge base, DOCTOR might provide a generic response, for example, responding to "My head hurts" with "Why do you say your head hurts?" A possible response to "My mother hates me" would be "Who else in your family hates you?" ELIZA was implemented using simple pattern matching techniques, but was taken seriously by several of its users, even after Weizenbaum explained to them how it worked. It was one of the first chatterbots in existence" (Wikipedia article on ELIZA, accessed 06-15-2014).
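The decomposition-and-reassembly technique ELIZA used can be sketched in a few lines. The rules below are invented for illustration; Weizenbaum's actual DOCTOR script was much larger, but worked on the same principle of matching a pattern, extracting a fragment, and reflecting it back.

```python
import re

def reflect(fragment):
    """Swap first- and second-person words so replies read naturally."""
    swaps = {"me": "you", "my": "your", "i": "you", "am": "are"}
    return " ".join(swaps.get(w.lower(), w) for w in fragment.split())

# A few illustrative DOCTOR-style rules: pattern + reassembly template.
rules = [
    (re.compile(r"my (.+) hurts", re.I), "Why do you say your {0} hurts?"),
    (re.compile(r"my (mother|father|brother|sister) (.+)", re.I),
     "Who else in your family {1}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
]

def respond(statement):
    for pattern, template in rules:
        match = pattern.search(statement)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please tell me more."  # generic fallback when no rule fires

print(respond("My head hurts"))       # Why do you say your head hurts?
print(respond("My mother hates me"))  # Who else in your family hates you?
```

The striking point, then as now, is how little machinery is behind the apparent understanding: there is no model of thought or emotion, only pattern matching and substitution.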

"Weizenbaum was shocked that his program was taken seriously by many users, who would open their hearts to it. He started to think philosophically about the implications of artificial intelligence and later became one of its leading critics.

"His influential 1976 book Computer Power and Human Reason displays his ambivalence towards computer technology and lays out his case: while Artificial Intelligence may be possible, we should never allow computers to make important decisions because computers will always lack human qualities such as compassion and wisdom. Weizenbaum makes the crucial distinction between deciding and choosing. Deciding is a computational activity, something that can ultimately be programmed. Choice, however, is the product of judgment, not calculation. It is the capacity to choose that ultimately makes us human. Comprehensive human judgment is able to include non-mathematical factors, such as emotions. Judgment can compare apples and oranges, and can do so without quantifying each fruit type and then reductively quantifying each to factors necessary for comparison" (Wikipedia article on Joseph Weizenbaum, accessed 06-15-2014).

View Map + Bookmark Entry

Mosteller & Wallace Apply Computing in Disputed Authorship Investigation of The Federalist Papers 1964

In the early 1960s American statistician Frederick Mosteller and David Wallace conducted what was probably the most influential early computer-based authorship investigation in an attempt to identify the authorship of the twelve disputed papers in The Federalist Papers by Alexander Hamilton, James Madison, and John Jay. With so much material to work with on the same subject matter by the authorship candidates, this study was an ideal situation for comparative analysis. Mosteller and Wallace were primarily interested in the statistical methods they employed, but they were able to show that Madison was very likely the author of the disputed papers. Their conclusions were generally accepted, and The Federalist Papers have been used to test new methods of authorship discrimination. 
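The idea can be caricatured in a few lines. The texts and the simple distance measure below are invented for illustration; Mosteller and Wallace actually fitted Bayesian models to the rates of function words such as "upon" and "whilst", which discriminate between authors because they are used unconsciously and independently of subject matter.

```python
# Marker function words, chosen here purely for illustration.
MARKERS = ["upon", "whilst", "enough", "while"]

def rates(text):
    """Relative frequency of each marker word in a text."""
    words = text.lower().split()
    return [words.count(m) / len(words) for m in MARKERS]

def attribute(disputed, candidates):
    """Assign the disputed text to the candidate whose marker-word
    profile is closest (sum of absolute rate differences)."""
    d = rates(disputed)
    def distance(profile):
        return sum(abs(a - b) for a, b in zip(d, profile))
    return min(candidates, key=lambda name: distance(rates(candidates[name])))

print(attribute(
    "upon this question upon deliberation",
    {"Hamilton": "upon the whole upon reflection while we act",
     "Madison": "whilst the people whilst enough remains while"},
))  # Hamilton
```

Real corpora would of course be the full known papers of each candidate, not toy sentences, and the statistical machinery far more careful.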

Mosteller, F. and D. L. Wallace. Inference and Disputed Authorship: The Federalist. Reading, MA, 1964.

Holmes, D. I. and R. S. Forsyth. "The Federalist Revisited: New Directions in Authorship Attribution," Literary and Linguistic Computing 10 (1995) 111–27.

A Companion to Digital Humanities, ed. Susan Schreibman, Ray Siemens, John Unsworth. Oxford: Blackwell, 2004.

View Map + Bookmark Entry

MEDLARS: The First Large Scale Computer-Based Retrospective Search Service Available to the General Public January 1964

In January 1964 the Medical Literature Analysis and Retrieval System (MEDLARS) became operational at the National Library of Medicine, Bethesda, Maryland.

MEDLARS was the first large scale, computer-based, retrospective search service available to the general public.  It was also the first major machine-readable database and batch-retrieval system.

View Map + Bookmark Entry

Luther Terry Issues The Surgeon General's Report on Smoking and Health January 11, 1964

On January 11, 1964 Surgeon General of the United States Luther L. Terry issued Smoking and Health. Report of the Advisory Committee to the Surgeon General of the Public Health Service. This 387 page report was published on a Saturday to minimize the negative effect on the American stock markets, while maximizing the coverage in Sunday newspapers. It was issued by the U.S. Government Printing Office for $1.25.

"The report concluded that lung cancer and chronic bronchitis are causally related to cigarette smoking. The report also noted that there was suggestive evidence, if not definite proof, for a causative role of smoking in other illnesses such as emphysema, cardiovascular disease, and various types of cancer. The committee concluded that cigarette smoking was a health hazard of sufficient importance to warrant appropriate remedial action.

"In June 1964, the Federal Trade Commission voted by a margin of 3-1 to require that cigarette manufacturers "clearly and prominently" place a warning on packages of cigarettes effective January 1, 1965, stating that smoking was dangerous to health, in line with the warning issued by the Surgeon General's special committee. The same warning would be required in all cigarette advertising effective July 1, 1965.

"The landmark Surgeon General's report on smoking and health stimulated a greatly increased concern about tobacco on the part of the American public and government policymakers and led to a broad-based anti-smoking campaign. It also motivated the tobacco industry to intensify its efforts to question the scientific evidence linking smoking and disease. The report was also responsible for the passage of the Cigarette Labeling and Advertising Act of 1965, which, among other things, mandated Surgeon General's health warnings on cigarette packages" (Wikipedia article on Luther Terry, accessed 11-11-2012).

View Map + Bookmark Entry

Texas Instruments & Zenith Radio Introduce a Hearing Aid, the First Consumer Product Containing an Integrated Circuit February 14, 1964

On February 14, 1964 Texas Instruments in partnership with Zenith Radio introduced the first consumer product containing an integrated circuit—a hearing aid.

View Map + Bookmark Entry

Solomonoff Begins Algorithmic Information Theory March – June 1964

In March and June 1964 American mathematician and researcher in artificial intelligence Ray Solomonoff published "A Formal Theory of Inductive Inference, Part I," Information and Control 7, No. 1, 1-22, and "A Formal Theory of Inductive Inference, Part II," Information and Control 7, No. 2, 224-254. This two-part paper is considered the beginning of algorithmic information theory.

Solomonoff first described his results at a conference at Caltech in 1960, and in a February 1960 report, "A Preliminary Report on a General Theory of Inductive Inference."

View Map + Bookmark Entry

The IBM System/360 Family is Introduced April 7, 1964

On April 7, 1964 IBM announced the System/360 family of compatible machines.  All IBM System/360 products ran the same operating system—OS/360. Previously products developed by different divisions of IBM were incompatible.

IBM System/360 products were the first IBM computers capable of both commercial and scientific applications that were offered at what was then considered a “reasonable price.” Their architecture incorporated microprogramming.

View Map + Bookmark Entry

720 Million Copies Printed and Distributed in Under Four Years May 1964

In May 1964 the Central Intelligence Bureau of the Chinese People's Liberation Army issued in Beijing or Tianjin Mao Zedong's Mao Zhuxi Yulu (Quotations of Chairman Mao). This "probably still holds the world record for most copies printed of a single work in under four years (720 million books by the end of 1967)."

See Oliver Lei Han, "Sources and Early Printing History of Chairman Mao's 'Quotations'," The Bibliographical Society of America, BibSite, accessed 11-30-2010.

Here is a description of the first edition adapted from Michael R. Thompson Autumn Miscellany, List 96, accessed 11-30-2010:

"MAO TSE TUNG [MAO ZEDONG]. Mao zhuxi yulu [Chinese, i.e., Quotations of Chairman Mao]. [n.p., probably Beijing: Central Intelligence Bureau of the Chinese People's Liberation Army, May, 1964]. Sixteenmo, with the page size measuring 5 3/8" x 4". [2, half-title printed in red with blank verso], [2, title printed in green and red with blank verso], [2, portrait of Mao in brown tones, with blank verso], [2, endorsement leaf by Lin Biao with blank verso], 2 (letterpress introduction), 2 (table of contents, listing 30 chapters), 250 pp. The endorsement leaf is in the earliest state, with the misprint in the second character down of the second vertical row from the right. (See Oliver Lei Han, "How Read is the Little Red Book," in the Antiquarian Book Review, November 2003.) In the earliest binding of off-white paper wrappers with front cover printed in black and red, and spine printed in red.

"First edition, distinguishable from other editions by its slightly larger paper size, by containing thirty chapters and ending at page 250. Contains the Lin Biao’s endorsement leaf, with three sentences for the diary of Lei Feng, printed letterpress in calligraphic script. The endorsement leaf is lacking in most copies because of political circumstances. Lin Biao, head of National Defense, had risen in power within the Mao hierarchy and was designated to become Mao’s successor. However, rumors surfaced that Lin was planning to assassinate Mao. While never completely proven, they caused Lin to leave suddenly on a military transport for an undisclosed location when their plane was shot down in Mongolia on the evening of September 12, 1971. Subsequently, Mao attempted to eradicate his name from modern history, and the endorsement leaf was ordered to be torn out or defaced in all copies as a sign of loyalty to Mao. Therefore, copies with the endorsement leaf are uncommon.

"The first state was printed in an edition of 50,000-60,000 copies. It was never intended for sale, but was issued to members of the military as inspirational reading. It was only in the second state that the well-known vinyl cover first appeared. By 1967, the book had been translated into more than thirty-six languages and an estimated 720 million copies had been printed. . . ."

View Map + Bookmark Entry

The Rand Tablet: One of the Earliest Tablet Computers and the First Reference to Electronic Ink August 1964

In August 1964 M. R. Davis and T. O. Ellis of The Rand Corporation, Santa Monica, California, published The RAND Tablet: A Machine Graphical Communication Device. They indicated that the device had been in use since 1963.

"The RAND tablet is believed to be the first such graphic device that is digital, is relatively low-cost, possesses excellent linearity, and is able to uniquely describe 10 [to the 6th power] locations in the 10" x 10" active table area. . . . the tablet has great potential not only in such applications as digitizing map information, but also as a working tool in the study of more esoteric applications of graphical languages for man-machine interaction. . . . " (p. iv)

"The RAND tablet device generates 10-bit x and 10-bit y stylus position information. It is connected to an input channel of a general-purpose computer and also to an oscilloscope display. The display control multiplexes the stylus position information with computer-generated information in such a way that the oscilloscope display contains a composite of the current pen position (represented as a dot) and the computer output. In addition, the computer may regenerate meaningful track history on the CRT, so that while the user is writing, it appears that the pen has "ink." This displayed "ink" is visualized from the oscilloscope display while hand-directing the stylus position on the tablet. Users normally adjust within a few minutes to the conceptual superposition of the displayed ink and the actual off-screen pen movement. There is no apparent loss of ease or speed in writing, printing, constructing arbitrary figures, or even in penning one's signature" (pp. 2-3).

J. W. Ward, History of Pen Computing: Annotated Bibliography in On-line Character Recognition and Pen Computing, http://rwservices.no-ip.info:81/pens/biblio70.html#DavisMR64, accessed 12-30-2009.

View Map + Bookmark Entry

SYNCOM 3, The First Geostationary Communication Satellite, Is Launched August 19, 1964

On August 19, 1964 the first geostationary communication satellite, Syncom 3, was launched by NASA with a Delta D #25 launch vehicle from Cape Canaveral.

"The satellite, in orbit near the International Date Line, was used to telecast the 1964 Summer Olympics in Tokyo to the United States. It was the first television program to cross the Pacific ocean" (Wikipedia article on Syncom, accessed 05-24-2009).

View Map + Bookmark Entry

Arader, Parrish & Bessinger Organize the First Humanities Computing or Digital Humanities Conference September 9 – September 11, 1964

From September 9-11, 1964, the first Literary Data Processing Conference took place. It was organized by Harry F. Arader of IBM and chaired by Stephen M. Parrish of Cornell and Jess B. Bessinger of NYU. This was the first conference on what came to be called humanities computing or digital humanities.

"Among the other speakers, Roberto Busa expatiated on the problems of managing 15 million words for his magnum opus on Thomas Aquinas. Parrish and Bessinger, along with the majority of other speakers, reported on their efforts to generate concordances with the primitive data processing machines available at that time. In light of the current number of projects to digitize literary works it is ironic to recall Martin Kay’s plea to the audience not to abandon their punch cards and magnetic tapes after their concordances were printed and (hopefully) published" (Joseph Raben, "Introducing Issues in Humanities Computing", Digital Humanities Quarterly, Vol. 1, No. 1 [2007])

On March 20, 2014 Joseph Raben posted information relevant to the conference on the Humanist Discussion Group, Vol. 27, No. 908, from which I quote:

In September 1964 IBM organized at the same laboratory what it called a Literary Data Processing conference, primarily, I believe now, to publicize the project of Fr. Roberto Busa to generate a huge verbal index to the writings of  Saint Thomas Aquinas and writers associated with him. IBM had underwritten this  project and Fr. Busa, an Italian Jesuit professor of linguistics, had been able to  recruit a staff of junior clergy to operate his key punches. The paper he read at this conference was devoted to the problems of managing the huge database he had created. IBM had persuaded The New York Times to send a reporter to the conference, and in the story he filed he chose to describe in some detail my paper on the Milton-Shelley project. The report of the eccentric professor who was trying to use a computer to analyze poetry caught the fancy of the news services, and the story popped up in The [London] Times and a  few other major newspapers around the world.

What impressed me most at that conference, however, was the number of American academics who had been invited to speak about their use of the computer, often to generate concordances. Such reference works had, of course, long  antedated the computer, having originated in the Renaissance, when the first efforts  to reconcile the disparities among the four Gospels produced these alphabetized lists of  keywords and their immediate contexts, from which scholars hoped to  extract the "core" of biblical truth. The utility of such reference works  for non-biblical literature soon became obvious, and for centuries,  dedicated students of literature, often isolated in outposts of Empire,  whiled away their hours of enforced leisure by copying headwords, lines  and citations onto slips which then had to be manually alphabetized for  the printer. Such concordances already existed for a small number of major poets, like Milton, Shelley and Shakespeare.

Apparently unrecognized by the earlier compilers of concordances was the concept that by restructuring the texts they were concording into a new order – here, alphabetical, but potentially into many others – they were creating a perspective radically different from the linear organization into which the texts had originally been organized.  A major benefit to the scholar of this new structure is the ability to examine all the  occurrences of individual words out of their larger contexts but in  association with other words almost immediately adjacent. Nascent in  this effort was the root of what we now conceive as a text database.

Some of this vision was becoming visible to the members of the avant garde represented at the Literary Data Processing conference, who had generally taken up a program called KWIC (keyword in context) that IBM had "bundled" with its early computers, a program designed to facilitate control over scientific information. Because it selected keywords from article titles, it was recognized as a crude but acceptable mechanism for literary concordances, to the extent that Stephen M. Parrish had begun publishing a series for Victorian poets, and others at the conference reported on their work on Chaucer, Old English and other areas of literary interest. In hindsight it is evident that the greater significance of these initiatives was twofold: first, they made clear that even in their primitive state in the 1960s, computers could perform functions beyond arithmetic, and second, that another dimension of language study was available. From the beginning signaled by this small event would come a growing academic discipline covering such topics as corpus linguistics, machine translation, text analysis and literary databases.

Beyond the activity reported at that early conference, it became increasingly evident that computer-generated concordances could not only serve immediate scholarly needs but could also imply future applications of expanding value. Texts could be read non-linearly, in a variety of dimensions, with the entire vocabulary alphabetized, with the most common words listed first, with the least common words listed first, or with all the words spelled backwards (so their endings could be associated), and in almost any other manner that a scholar's imagination could conjure. Concordances could be constructed for non-poetic works, such as Melville's Moby-Dick or Freud's translated writings. Many poets of lesser rank than Shakespeare, Milton, and Chaucer could now be accorded the stature of being concorded, and even political statements could be made, as when the anti-Stalinist Russian Osip Mandelstam was exalted by having his poetry concorded. David W. Packard even constructed a concordance to Minoan Linear A, the undeciphered writing system of prehistoric Crete.

Looking beyond that group's accomplishment in creating the concordances and other tools they were reporting on, I had a vision of a newer scholarship, based on a melding of the approaches that had served humanities scholars for generations with the newer ones generated by the computer scientists who were struggling at that  time to understand their new tool, to enlarge its capacities. Sensing that the group  of humanists gathering for this pioneering conference could benefit from maintaining communication with each other beyond this meeting, I devoted  some energy and persistence to persuading IBM to finance what I  conceived first as a newsletter. Through the agency of Edmond A. Bowles, a musicologist who had decided he could support his family more successfully as an IBM executive than as a college instructor, I received a grant of $5000 (as well as a renewal in the same amount), a huge award at that time for an assistant professor of English and enough  to impress my dean, who allowed me a course reduction so I could teach myself to be an editor. . . ."
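The restructuring Raben describes, turning a linear text into an alphabetized keyword-in-context list, takes only a few lines to sketch; the output format here is illustrative, not that of any 1960s program.

```python
def kwic(text, width=3):
    """Build a keyword-in-context concordance: every word becomes a
    headword shown with up to `width` words of context on each side,
    sorted alphabetically rather than in the text's linear order."""
    words = text.split()
    entries = []
    for i, word in enumerate(words):
        left = " ".join(words[max(0, i - width):i])
        right = " ".join(words[i + 1:i + 1 + width])
        entries.append((word.lower().strip(".,;:"), left, right))
    return sorted(entries)

for headword, left, right in kwic("In the beginning was the Word"):
    print(f"{headword:>10}  {left} [{headword}] {right}")
```

Sorting the entries is precisely the "restructuring into a new order" described above: the same text, re-presented so that all occurrences of a word can be examined together.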

View Map + Bookmark Entry

Introduction of the Moog Synthesizer October 1964

In October 1964 Robert Moog demonstrated the prototype Moog Synthesizer at the Audio Engineering Society convention held in New York City. It was the first subtractive synthesizer to use a keyboard as a controller. Moog discussed his innovations in a paper presented at the convention, "Voltage-Controlled Electronic Music Modules," published in Journal of the Audio Engineering Society, Vol. 13, No. 3 (July 1965) 200–206.

Moog began his career in 1954, manufacturing Theremins.

"There were two key features in Moog's new system: he analyzed and systematized the production of electronically generated sounds, breaking down the process into a number of basic functional blocks, which could be carried out by standardized modules. He proposed the use of a standardized scale of voltages for the electrical signals that controlled the various functions of these modules—the Moog oscillators and keyboard, for example, used a standard progression of 1 volt per octave for pitch control."

"The Moog synthesizer gained wider attention in the music industry after it was demonstrated at the Monterey International Pop Festival in 1967. The commercial breakthrough of a Moog recording was made by Wendy Carlos in the 1968 record Switched-On Bach, which became one of the highest-selling classical music recordings of its era" (Wikipedia article on Robert Moog, accessed 12-01-2013). 

View Map + Bookmark Entry

A Meeting Between Licklider and Lawrence G. Roberts Leads to the Original Planning for What Would Eventually Become ARPANET November 1964

The Homestead Meeting between J.C.R. Licklider and Lawrence G. Roberts of MIT's Lincoln Laboratory in November 1964 inspired Roberts to develop the concept of a computer-to-computer network that could communicate via data packets. This became the basis of the ARPANET.

View Map + Bookmark Entry

TYPESET and RUNOFF: Text Formatting Program and Forerunner of Word Processors November 6, 1964

In November 1964 computer scientist Jerome H. Saltzer of MIT wrote TYPESET and RUNOFF, memorandum editing and type-out commands. RUNOFF was the first computer text formatting program to see significant use. Its formatting commands derived from the commands used by typesetters to manually format documents.

"It actually consisted of a pair of programs, TYPSET (which was basically a document editor), and RUNOFF (the output processor). RUNOFF had support for pagination and headers, as well as text justification (TJ-2 appears to have been the earliest text justification system, but it did not have the other capabilities).

"RUNOFF is a direct predecessor of the runoff document formatting program of Multics, which in turn was the ancestor of the roff and nroff document formatting programs of Unix, and their descendants. It was also the ancestor of FORMAT for the IBM System/360, and of course indirectly for every computerized word processing system.

"Likewise, RUNOFF for CTSS was the predecessor of the various RUNOFFs for DEC's operating systems, via the RUNOFF developed by the University of California, Berkeley's Project Genie for the SDS 940 system.

"The name is alleged to have come from the phrase at the time, I'll run off a copy" (Wikipedia article on TYPESET and RUNOFF, accessed 01-31-2010).

View Map + Bookmark Entry

William Fetter Issues the First Book on Computer Graphics 1965

Example of an image from Computer Graphics in Communication.

Detail of the cover of Computer Graphics in Communication.

Detail of the title page of Computer Graphics in Communication.

William Fetter, photographed while he worked at Boeing.

In 1965 William A. Fetter of The Boeing Company in Seattle, Washington, issued the first book on computer graphics: Computer Graphics in Communication. This 110-page work with nearly 100 illustrations may be the first monograph illustrated with computer graphics. Fetter had coined the term "computer graphics" in 1960. The book was "Written for the Course Content Development Study in Engineering Graphics Supported by the National Science Foundation September, 1964." As a reflection of the novelty of its topic, the book contained a bibliography of seven references but stated that none were used directly in its development.

View Map + Bookmark Entry

Licklider Issues "Libraries of the Future" 1965

In 1965 J.C.R. Licklider, Director of Project MAC (Machine-Aided Cognition and Multiple-Access Computers) at MIT and Professor of Electrical Engineering at MIT, published Libraries of the Future, a study of what libraries might be at the end of the twentieth century. Licklider's book reviewed systems for information storage, organization, and retrieval, use of computers in libraries, and library question-answering systems. In his discussion he was probably the first to raise general questions concerning the transition of the book from exclusively printing on paper to electronic form.

View Map + Bookmark Entry

Honeywell Produces an Early Home Computer? 1965

In 1965 Honeywell attempted to open the home computer market with its Kitchen Computer. The H316 was the first under-$10,000 16-bit machine from a major computer manufacturer. It was the smallest addition to the Honeywell "Series 16" line, and was available in three versions: table-top, rack-mountable, and self-standing pedestal. The pedestal version, complete with cutting board, was marketed by Neiman Marcus as "The Kitchen Computer.” It came with some built-in recipes, two weeks' worth of programming, a cook book, and an apron.

There is no evidence that any examples were sold.

View Map + Bookmark Entry

Tom Van Vleck & Noel Morris Write One of the First Email Programs 1965

Though its exact history is murky, email (e-mail) began as a way for users of time-sharing mainframe computers to communicate. Among the first systems with an email facility were the AN/FSQ-32 (Q32), built by IBM for the United States Air Force Strategic Air Command (SAC) and programmed by the System Development Corporation of Santa Monica, and MIT's Compatible Time-Sharing System (CTSS). The authors of the first email program for CTSS were American software engineer Tom Van Vleck and American computer scientist Noel Morris, who created the program in the summer of 1965.

"A proposed CTSS MAIL command was described in an undated Programming Staff Note 39 by Louis Pouzin, Glenda Schroeder, and Pat Crisman. Numerical sequence places the note in either Dec 64 or Jan 65. PSN 39 proposed a facility that would allow any CTSS user to send a message to any other. The proposed uses were communication from "the system" to users informing them that files had been backed up, and communication to the authors of commands with criticisms, and communication from command authors to the CTSS manual editor.

"I was a new member of the MIT programming staff in spring 1965. When I read the PSN document about the proposed CTSS MAIL command, I asked "where is it?" and was told there was nobody available to write it. My colleague Noel Morris and I wrote a version of MAIL for CTSS in the summer of 1965. Noel was the one who saw how to use the features of the new CTSS file system to send the messages, and I wrote the actual code that interfaced with the user. The CTSS manual writeup and the source code of MAIL are available online. (We made a few changes from the proposal during the course of implementation: e.g. to read one's mail, users just used the PRINT command instead of a special argument to MAIL.)  

"The idea of sending "letters' using CTSS was resisted by management, as a waste of resources. However, CTSS Operations did need a faclility to inform users when a request to retrieve a file from tape had been completed, and we proposed MAIL as a solution for this need. (Users who had lost a file due to system or user error, or had it deleted for inactivity, had to submit a request form to Operations, who ran the RETRIEVE program to reload them from tape.) Since the blue 7094 installation in Building 26 had no CTSS terminal available for the operators, one proposal for sending such messages was to invoke MAIL from the 7094 console switches, inputting a code followed by the problem number and programmer number in BCD. I argued that this was much too complex and error prone, and that a facility that let any user send arbitrary messages to any other would have more general uses, which we would discover after it was implemented" (http://www.multicians.org/thvv/mail-history.html, accessed 06-20-2011).

♦ On June 19, 2011 writer and filmmaker Errol Morris published a series of five illustrated articles in The New York Times concerning the roles of his brother Noel and Tom Van Vleck in the invention of email. The first of these was entitled "Did My Brother Invent E-Mail with Tom Van Vleck? (Part One)". The articles, in an unusual dialog form, captured some of the experience of programming time-sharing mainframes, and what it was like to send and receive emails at this early date.

View Map + Bookmark Entry

U.S. House Hearings on the Invasion of Privacy by Computers 1965

In 1965 the House of Representatives Special Subcommittee on Invasion of Privacy held hearings on the privacy threats posed by computers.

View Map + Bookmark Entry

Ted Nelson Coins the Terms Hypertext, Hypermedia, and Hyperlink 1965

In 1965 self-styled "systems humanist" Ted Nelson (Theodor Holm Nelson) published "Complex Information Processing: A File Structure for the Complex, the Changing, and the Indeterminate," ACM '65 Proceedings of the 1965 20th national conference, 84-100. In this paper Nelson coined the terms hypertext and hypermedia to refer to features of a computerized information system. He used the word "link" to refer to the logical connections that came to be associated with the word "hyperlink."

Nelson is also credited with inventing the word hyperlink, though its published origin is less specific:

"The term "hyperlink" was coined in 1965 (or possibly 1964) by Ted Nelson and his assistant Calvin Curtin at the start of Project Xanadu. Nelson had been inspired by "As We May Think", a popular essay by Vannevar Bush. In the essay, Bush described a microfilm-based machine (the Memex) in which one could link any two pages of information into a "trail" of related information, and then scroll back and forth among pages in a trail as if they were on a single microfilm reel. The closest contemporary analogy would be to build a list of bookmarks to topically related Web pages and then allow the user to scroll forward and backward through the list.

"In a series of books and articles published from 1964 through 1980, Nelson transposed Bush's concept of automated cross-referencing into the computer context, made it applicable to specific text strings rather than whole pages, generalized it from a local desk-sized machine to a theoretical worldwide computer network, and advocated the creation of such a network. Meanwhile, working independently, a team led by Douglas Engelbart (with Jeff Rulifson as chief programmer) was the first to implement the hyperlink concept for scrolling within a single document (1966), and soon after for connecting between paragraphs within separate documents (1968)" (Wikipedia article on Hyperlink, accessed 08-29-2010). 

Wardrip-Fruin and Montfort, The New Media Reader (2003) 133-45.

View Map + Bookmark Entry

The U.S. Postal Service Introduces OCR 1965

In 1965 the U.S. Postal Service introduced OCR technology to sort mail.

View Map + Bookmark Entry

Charles Kao Proposes Optical Fibers as a Medium for Communication 1965 – 2009

In 1965 Chinese-British-American electrical engineer and physicist Charles K. Kao of STC's Standard Telecommunications Laboratories in Harlow, Essex, England, and George A. Hockham promoted the idea that the attenuation in optical fibers could be reduced below 20 dB per kilometer, allowing fibers to be a practical medium for communication. Kao and Hockham proposed that the attenuation in fibers available at the time was caused by removable impurities rather than by fundamental physical effects such as scattering. Eventually fiber optic communication became the technology enabling the Internet backbone.
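The 20 dB/km figure translates directly into power ratios, since decibels are a logarithmic measure. A sketch of the arithmetic (the function name and sample values are illustrative assumptions):

```python
def received_power(p_in_mw, loss_db_per_km, km):
    """Optical power remaining after `km` of fiber with the given attenuation."""
    return p_in_mw * 10 ** (-loss_db_per_km * km / 10)

# At the 20 dB/km threshold, a single kilometer cuts power by a factor of 100;
# fibers available in 1965 lost far more than that.
after_1km = received_power(100.0, 20.0, 1.0)  # about 1 mW left of 100 mW
```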

In 2009 Charles Kao received half of the Nobel Prize in Physics "for groundbreaking achievements concerning the transmission of light in fibers for optical communication." A more detailed account of Kao's work, placing it in historical perspective, was prepared by the Nobel Prize Committee and is available on the Nobel Prize website.

View Map + Bookmark Entry

The TUTOR Programming Language for Education and Games 1965 – 1969

In 1965 Paul Tenczar developed the TUTOR programming language for use in developing electronic learning programs called "lessons" for the PLATO system at the University of Illinois at Urbana-Champaign. It had "powerful answer-parsing and answer-judging commands, graphics and features to simplify handling student records and statistics by instructors." This also made it suitable for the creation of many non-educational lessons—that is, games—including flight simulators, war games, role-playing games such as Dungeons and Dragons (dnd), card games, word games, and medical lesson games.

The first documentation of the TUTOR language, under this name, appears to be The TUTOR Manual, CERL Report X-4, by R. A. Avner and P. Tenczar, January 1969.

View Map + Bookmark Entry

Henriette Avram Develops the MARC Cataloguing Standard 1965 – 1968

From 1965 to 1968 programmer and systems analyst Henriette Avram completed the Library of Congress MARC (Machine Readable Cataloging) Pilot Project, creating the foundation for the national and international data standard for bibliographic and holdings information in libraries.

The MARC standards consist of the MARC formats, which are standards for the representation and communication of bibliographic and related information in machine-readable form, and related documentation.... Its data elements make up the foundation of most library catalogs.

View Map + Bookmark Entry

Irving John Good Originates the Concept of the Technological Singularity 1965

In 1965 British mathematician Irving John Good, originally named Isidore Jacob Gudak, published "Speculations Concerning the First Ultraintelligent Machine," Advances in Computers, vol. 6 (1965) 31ff. This paper, published while Good held research positions at Trinity College, Oxford and at Atlas Computer Laboratory, originated the concept later known as "technological singularity," which anticipates the eventual existence of superhuman intelligence:

"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make." 

Stanley Kubrick consulted Good regarding aspects of computing and artificial intelligence when filming 2001: A Space Odyssey (1968), one of whose principal characters was the paranoid HAL 9000 supercomputer.

View Map + Bookmark Entry

Theodore Schellenberg Issues "The Management of Archives" 1965

In 1965 American historian and Assistant Archivist of the United States Theodore R. Schellenberg published The Management of Archives.

"In this book Dr. Schellenberg successfully undertakes to define the archival methodology that heretofore has been available only in isolated books and journals. . . An indispensable manual, which anticipates and answers most questions that arise in the handling of nonpublic records." --The American Archivist

(This entry was last revised on 03-21-2014.)

View Map + Bookmark Entry

Filed under: Archives, Libraries

Feigenbaum, Djerassi & Lederberg Develop DENDRAL the First Expert System 1965

In 1965 artificial intelligence researcher Edward Feigenbaum, chemist Carl Djerassi, and molecular biologist Joshua Lederberg began their collaboration at Stanford University on Dendral, a long-term pioneering project in artificial intelligence that is considered the first computer software expert system.

"In the early 1960s, Joshua Lederberg started working with computers and quickly became tremendously interested in creating interactive computers to help him in his exobiology research. Specifically, he was interested in designing computing systems that to help him study alien organic compounds. As he was not an expert in either chemistry or computer programming, he collaborated with Stanford chemist Carl Djerassi to help him with chemistry, and Edward Feigenbaum with programming, to automate the process of determining chemical structures from raw mass spectrometry data. Feigenbaum was an expert in programming languages and heuristics, and helped Lederberg design a system that replicated the way Carl Djerassi solved structure elucidation problems. They devised a system called Dendritic Algorithm (Dendral) that was able to generate possible chemical structures corresponding to the mass spectrometry data as an output" (Wikipedia article on Dendral, accessed 12-22-2013).

Lindsay, Buchanan, Feigenbaum, Lederberg, Applications of Artificial Intelligence for Organic Chemistry. The DENDRAL Project (1980).

View Map + Bookmark Entry

Carver Mead Builds the First Schottky-Barrier Gate Field Effect Transistor 1965 – 1966

In 1965 American electrical engineer and computer scientist Carver Mead of Caltech built the first working Schottky-barrier gate field-effect transistor: GaAs (gallium arsenide) MESFET (metal-semiconductor field effect transistor). This key amplifying device became a mainstay of high-frequency wireless electronics, used in microwave communication systems from radio telescopes to home satellite dishes and cellular phones. Using band-gap-engineered materials, the device evolved into the HEMT (High-electron-mobility transistor).

Mead, "Schottky Barrier Gate Field Effect Transistor," Proceedings of IEEE 54 (1966) 307−308.

View Map + Bookmark Entry

Michael Noll's "Human or Machine" : Comparing Computer-Generated Art with Human Created Art 1965 – 1966

In 1965 A. Michael Noll, American electrical engineer and pioneer computer artist at Bell Labs in Murray Hill, New Jersey, created Computer Composition With Lines. He generated the art work algorithmically with pseudo-random processes to mimic Piet Mondrian’s Composition With Lines (1917). In what became a classic experiment in aesthetics, copies of both works were shown to people, a majority of whom expressed a preference for the computer work and thought it was by Mondrian. The work won first prize in August 1965 in the contest held by Computers and Automation magazine.
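Noll's general approach — seeded pseudo-random placement of short horizontal and vertical bars in imitation of Mondrian's composition — can be suggested with a toy sketch (the parameters, distributions, and function name here are invented for illustration and are not Noll's actual procedure):

```python
import random

def composition_with_lines(n_bars=40, size=100.0, seed=1917):
    """Return pseudo-randomly placed horizontal and vertical line segments."""
    rng = random.Random(seed)  # fixed seed makes the "artwork" reproducible
    segments = []
    for _ in range(n_bars):
        x, y = rng.uniform(0, size), rng.uniform(0, size)
        length = rng.uniform(2, 10)
        if rng.random() < 0.5:
            segments.append(((x, y), (x + length, y)))  # horizontal bar
        else:
            segments.append(((x, y), (x, y + length)))  # vertical bar
    return segments
```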

The following year Noll published an illustrated account of the production of this pioneering work of computer art and its perception: "Human or Machine: A Subjective Comparison of Piet Mondrian's 'Composition with Lines' (1917) and a Computer-Generated Picture," The Psychological Record 16 (1966) 1-10.

In January 2014 Noll published an authoritative, illustrated, and thoroughly documented historical paper on computer art done at Bell Labs from 1962 to 1968 entitled "First-Hand: Early Digital Art at Bell Telephone Laboratories, Inc." on the website of the IEEE Global History Network.

View Map + Bookmark Entry

John Alan Robinson Introduces the Resolution Principle January 1965

In 1965 philosopher, mathematician and computer scientist John Alan Robinson, while at Rice University, published "A Machine-Oriented Logic Based on the Resolution Principle," Journal of the ACM 12 (1965) 23–41. This paper introduced the resolution principle, which became a standard method of logical deduction in AI applications.
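The rule derives a new clause from two clauses that contain complementary literals. A propositional toy version shows its shape (the clause representation and names are illustrative; Robinson's paper treats the full first-order case with unification):

```python
def resolve(c1, c2):
    """All resolvents of two clauses; literals are strings, with '~p' the negation of 'p'."""
    resolvents = []
    for lit in c1:
        complement = lit[1:] if lit.startswith("~") else "~" + lit
        if complement in c2:
            # Drop the complementary pair and merge the remaining literals.
            resolvents.append((c1 - {lit}) | (c2 - {complement}))
    return resolvents

# From (p or q) and (not-p or r), resolution derives (q or r).
derived = resolve(frozenset({"p", "q"}), frozenset({"~p", "r"}))
assert derived == [frozenset({"q", "r"})]
```

Deriving the empty clause by repeated resolution establishes that a clause set is unsatisfiable, which is how the rule supports automated theorem proving.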

View Map + Bookmark Entry

The Earliest Public Exhibitions of Computer Art February 5 – November 26, 1965

The first public exhibitions of computer art were:

Feb 5-19, 1965:

Generative Computergrafik. Studien-Galerie des Studium Generale, Technische Hochschule Stuttgart. Held by Frieder Nake and Georg Nees. Opened by Max Bense.

Apr 6-24, 1965:

Computer-generated pictures. Howard Wise Gallery, New York. Held by A. Michael Noll and Bela Julesz, both of whom worked at Bell Labs. The announcement for the show was a small deck of colored IBM punch cards.

"The agreement was that any profits from the sale of the works would be split between the Wise Gallery and either Julesz or Noll. In the end, not a single work was sold" (Noll, First-Hand: Early Digital Art at Bell Telephone Laboratories, Inc., accessed 01-19-2014).

Nov 5-26, 1965:

Computergrafik. Galerie Wendelin Niedlich, Stuttgart. Held by Frieder Nake and Georg Nees. Opened by Max Bense.

View Map + Bookmark Entry

The Cooley-Tukey FFT Algorithm April 1965

In April 1965 American mathematician James W. Cooley of IBM Watson Research Center, Yorktown Heights, New York, and American statistician John W. Tukey published "An algorithm for the machine calculation of complex Fourier series", Mathematics of Computation 19, 297–301. This paper enunciated the Cooley-Tukey FFT algorithm, the most common fast Fourier transform algorithm.
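The algorithm's divide-and-conquer idea — splitting a length-N transform into two length-N/2 transforms over the even- and odd-indexed samples — can be sketched recursively (a textbook radix-2 rendering for illustration, not the authors' original formulation):

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    # Combine the half-size transforms using "twiddle factors" e^(-2*pi*i*k/n).
    t = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + t[k] for k in range(n // 2)] +
            [even[k] - t[k] for k in range(n // 2)])
```

The recursion reduces the direct DFT's roughly N² operations to on the order of N log N, which is what made the applications described below practical.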

"The motivation for it [FFT algorithm] was provided by Dr. Richard L. Garwin at IBM Watson Research who was concerned about verifying a Nuclear arms treaty with the Soviet Union for the SALT talks. Garwin thought that if he had a very much faster Fourier Transform he could plant sensors in the ground in countries surrounding the Soviet Union. He suggested the idea of how Fourier transforms could be programmed to be much faster to both Cooley and Tukey. They did the work, the sensors were planted, and he was able to locate nuclear explosions to within 15 kilometers of where they were occurring" (Wikipedia article on James Cooley, accessed 03-06-2012).

View Map + Bookmark Entry

Maurice Wilkes Introduces Memory Caching April 1965

In April 1965 British computer scientist Maurice Wilkes introduced the concept of memory caching, which he called "slave memory," in his paper "Slave Memories and Dynamic Storage Allocation."

View Map + Bookmark Entry

INTELSAT 1: The First Commercial Communications Satellite to be Placed in Geosynchronous Orbit April 6, 1965

On April 6, 1965, Intelsat I (nicknamed Early Bird), was placed in geosynchronous orbit above the Atlantic Ocean by a Thrust Augmented Delta D rocket launched from Cape Canaveral, Florida.  Built by the Space and Communications Group of Hughes Aircraft Company (later Hughes Space and Communications Company, and now Boeing Satellite Systems) for COMSAT, Intelsat I was the first commercial communications satellite to be placed in geosynchronous orbit, and the first satellite to provide direct and near instantaneous contact between Europe and North America. It handled television, telephone, and facsimile transmissions. It measured nearly 76 x 61 cm and weighed 34.5 kg.

"It [Intelsat I] helped provide the first live TV coverage of a spacecraft splashdown, that of Gemini 6 in December 1965. Originally slated to operate for 18 months, Early Bird was in active service for four years, being deactivated in January 1969, although it was briefly activated in June of that year to serve the Apollo 11 flight when the Atlantic Intelsat satellite failed. It was deactivated again in August 1969 and has been inactive since that time (except for a brief reactivation in 1990 to commemorate its 25th launch anniversary), although it remains in orbit. . . .Early Bird was one of the satellites used in the then record-breaking broadcast of Our World" (Wikipedia article on Intelsat I, accessed 03-23-2012).

View Map + Bookmark Entry

Gordon Moore Promulgates "Moore's Law" April 19, 1965

On April 19, 1965, while Director of the Research and Development Laboratory at Fairchild Semiconductor in Palo Alto, California, physical chemist Gordon Moore published "Cramming More Components onto Integrated Circuits" in Electronics Magazine. In this article Moore observed that the number of transistors that could be placed inexpensively on an integrated circuit doubled approximately every two years, and predicted that this trend would continue. In 1970, after Moore had left Fairchild Semiconductor to co-found Intel Corporation, the press called this observation “Moore’s Law.”

"The term "Moore's law" was coined around 1970 by the Caltech professor, VLSI pioneer, and entrepreneur Carver Mead. Predictions of similar increases in computer power had existed years prior. Alan Turing in his 1950 paper "Computing Machinery and Intelligence" had predicted that by the turn of the millennium, we would have "computers with a storage capacity of about 10^9", what today we would call "128 megabytes." Moore may have heard Douglas Engelbart, a co-inventor of today's mechanical computer mouse, discuss the projected downscaling of integrated circuit size in a 1960 lecture. A New York Times article published August 31, 2009, credits Engelbart as having made the prediction in 1959. . . .

"Moore slightly altered the formulation of the law over time, in retrospect bolstering the perceived accuracy of his law. Most notably, in 1975, Moore altered his projection to a doubling every two years. Despite popular misconception, he is adamant that he did not predict a doubling "every 18 months". However, David House, an Intel colleague, had factored in the increasing performance of transistors to conclude that integrated circuits would double in performance every 18 months." (Wikipedia article on Moore' Law, accessed 11-19-2011).

View Map + Bookmark Entry

Walter Allner Designs the First Magazine Cover Using Computer Graphics July 1965

Detail of cover of the July 1965 issue of Fortune.  Please click to see entire image.

The color cover of the July 1965 issue of Fortune magazine was the first magazine cover designed using computer graphics, though the editor and designer may not have been aware of that at the time. The cover reproduced a photograph of graphics displayed on a computer screen. Two color filters made the computer image appear in color. On p. 2 of the issue the magazine explained the cover as follows:

"This cover is the first in Fortune's thirty-five years to have been executed wholly by machine— a PDP-1 computer manufactured by Digital Equipment Corp., and loaned to Fortune by Bolt Beranek & Newman Inc. of Cambridge, Massachusetts. The myriad arrows photographed in upward flight across the machine's oscilloscope symbolize the predominant direction of corporate statistics in 1964, while the large, glowing numeral [500] represents the number of companies catalogued in the Directory of the 500 Largest Industrial Corporations. . . ."

On p. 97 editor Duncan Norton-Taylor devoted his monthly column to the cover, writing:

"In the course of events, Fortune's art director, Walter Allner, might have frowned on filling the column at left with an array of abbreviations and figures, for Allner is no man to waste space on uninspired graphics. But these figures are his special brain children. They are the instructions that told a PDP-1 computer how to generate the design on this month's cover. This program was 'written' to Allner's specifications and punched into an eight-channel paper tape by Sanford Libman and John Price, whose interest in art and electronics developed at M.I.T.

"Generating the design on an oscilloscope and photographing required about three hours of computer time and occupied Price, Allner, and Libman until four one morning. Multiple exposure through two filters added color to the electron tube's glow. . . . 

"Walter Allner was born in Dessau, Germany. He studied at the Bauhaus-Dessau under Josef Albers, Vasily Kandinsky, and Paul Klee. . . . 

"Allner confesses to certain misgivings about teaching the PDP-1 computer too much about Fortune cover design, but adds, philosophically: 'If the computer puts art directors out of work, I'll at least have had some on-the-job training as a design-machine programer [sic]."

Herzogenrath & Nierhoff-Wielk, Ex Machina—Frühe Computergrafik bis 1979. Ex Machina—Early Computer Graphics up to 1978 (2007) 243.

View Map + Bookmark Entry

Lawrence G. Roberts Does the First "Actual Network Experiment" October 1965

In October 1965 Lawrence G. Roberts conducted the first actual network experiment, tying MIT Lincoln Laboratory's TX-2 in Lexington, Massachusetts to System Development Corporation's Q32 in Santa Monica, California.

This was the first time that two computers talked to each other, and the first time that packets were used to communicate between computers.

View Map + Bookmark Entry

The NY Stock Exchange Completes Automation of Trading 1966

In 1966 The New York Stock Exchange completed automation of its basic trading functions.

View Map + Bookmark Entry

Semiconductor Memory Begins to Replace Magnetic-Core Memory 1966

In 1966 semiconductor memory began to replace magnetic-core memory.

View Map + Bookmark Entry

The IRS Completes Computerization of Income-Tax Processing 1966

In 1966 the IRS completed computerization of income-tax processing, with a central facility in Martinsburg, West Virginia, and satellite locations around the United States.

View Map + Bookmark Entry

Robert H. Dennard of IBM Invents DRAM 1966

In 1966 American electrical engineer and inventor Robert H. Dennard of IBM invented Dynamic Random Access Memory (DRAM) cells— one-transistor memory cells that stored each single bit of information as an electrical charge in an electronic circuit. DRAM technology permitted major increases in memory density.

"The idea for DRAM came to Dennard in 1966, in an epiphany on his living room couch in Westchester County, New York, as he enjoyed the waning daylight over the Croton River Gorge. That morning, he had attended an all-day meeting of IBM researchers, where they shared projects with one another in an attempt to stir ideas and foster collaboration. At the time, Dennard was working on metal-oxide semiconductor (MOS) transistor memories for computers. Earlier in the day, he had listened to the group trying to improve magnetic core memory. Something about his own work and what he saw at the review troubled Dennard. The magnetic memory being developed by his competing researchers had drawbacks, but it was extremely simple. His MOS project had promise, on the other hand, but it was quite complicated, using six transistors for each bit of information.

“ 'I thought, ‘What could I do that would be really simple,’' Dennard recalled. There on his couch, he thought through the characteristics of MOS technology—it was capable of building capacitors, and storing a charge or no charge on the capacitor could represent the 1 and 0 of a bit of information. A transistor could control writing the charge to the capacitor. The more Dennard thought, the more he knew he could make a simple memory out of this.

“ 'I called my boss that night around 10 p.m.,' Dennard said. 'It’s a rare event that I’d call him. He listened to me, then suggested we talk about it tomorrow. I joke that he basically told me to take two aspirin and call him in the morning.' 

"Dennard still had to work on the six-transistor memory, so he worked on his new idea in his spare time, eventually figuring out the subtleties of writing a charge to the capacitor by way of an access transistor, and then reading it back through the same transistor. In 1967, Dennard and IBM filed a patent application for his single-transistor dynamic random access memory, or DRAM, and the patent was issued in 1968.

"In 1970, Intel ® built a very successful 1-kilobit DRAM chip using a three-transistor cell design, while several manufacturers produced 4-kilobit chips using Dennard’s single-transistor cell by the mid-1970s. Wave after wave of innovation followed, driven by Moore’s Law and scaling principles pioneered by Dennard and coworkers at IBM in the early 1970s. This progress continued through the years, resulting in the DRAM chips of today with capacities of up to 4,000,000,000 bits. Dennard said he could not foresee how important DRAM would become when he invented it: 'I knew it was going to be a big thing, but I didn’t know it would grow to have the wide impact it has today' " (http://www-943.ibm.com/ibm100/us/en/icons/dram/, accessed 07-021-2011).

View Map + Bookmark Entry

Aaron Klug Invents Digital Image Processing 1966

In 1966 English molecular biologist Aaron Klug at the University of Cambridge formulated a method for digital image processing of two-dimensional images.

A. Klug and D. J. de Rosier, “Optical filtering of electron micrographs: Reconstruction of one-sided images,” Nature 212 (1966): 29-32.

View Map + Bookmark Entry

The Amateur Computer Society, Possibly the First Personal Computer Club, is Founded 1966

In 1966 Stephen B. Gray, computers editor for Electronics magazine, founded The Amateur Computer Society, possibly the first personal computer club.

View Map + Bookmark Entry

Data Corporation Develops a Computer-Assisted Full-Text Inventory System 1966

In 1966 Richard Gering's Data Corporation of Beavercreek, Ohio, contracted with the U.S. Air Force to develop a computer-assisted, full-text system to keep track of procurement contracts and equipment inventory.

View Map + Bookmark Entry

Roger K. Summit's DIALOG Information Retrieval System is Operational at Lockheed Aircraft 1966

In 1966 Roger K. Summit, "the father of online search," had the DIALOG online information retrieval system operational for Lockheed Aircraft in Burbank, California.
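At the heart of an online retrieval service such as DIALOG is the inverted index: a mapping from each term to the documents containing it, so a query never has to scan the full text. The sketch below is a generic illustration of that idea, not Lockheed's implementation; the sample documents and the `search` helper are invented.

```python
from collections import defaultdict

# Generic sketch of the core idea behind online full-text retrieval:
# an inverted index maps each term to the set of documents containing
# it, so queries are answered by set intersection, not text scanning.

documents = {
    1: "computer assisted legal research",
    2: "online information retrieval system",
    3: "computer graphics research program",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.split():
        index[term].add(doc_id)

def search(*terms):
    """Return ids of documents containing ALL the given terms."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

print(search("computer", "research"))  # {1, 3}
```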

View Map + Bookmark Entry

The Second Vatican Council Abolishes the Index Librorum Prohibitorum 1966

The Second Vatican Council in 1966 under Pope Paul VI abolished the Index Librorum Prohibitorum, founded in 1557.

View Map + Bookmark Entry

Cyrus Levinthal Builds the First System for Interactive Display of Molecular Structures 1966

In 1966, using Project MAC, an early time-sharing system at MIT, Cyrus Levinthal built the first system for the interactive display of molecular structures.

"This program allowed the study of short-range interaction between atoms and the "online manipulation" of molecular structures. The display terminal (nicknamed Kluge) was a monochrome oscilloscope (figures 1 and 2), showing the structures in wireframe fashion (figures 3 and 4). Three-dimensional effect was achieved by having the structure rotate constantly on the screen. To compensate for any ambiguity as to the actual sense of the rotation, the rate of rotation could be controlled by globe-shaped device on which the user rested his/her hand (an ancestor of today's trackball). Technical details of this system were published in 1968 (Levinthal et al.). What could be the full potential of such a set-up was not completely settled at the time, but there was no doubt that it was paving the way for the future. Thus, this is the conclusion of Cyrus Levinthal's description of the system in Scientific American (p. 52):

It is too early to evaluate the usefulness of the man-computer combination in solving real problems of molecular biology. It does seem likely, however, that only with this combination can the investigator use his "chemical insight" in an effective way. We already know that we can use the computer to build and display models of large molecules and that this procedure can be very useful in helping us to understand how such molecules function. But it may still be a few years before we have learned just how useful it is for the investigator to be able to interact with the computer while the molecular model is being constructed.

"Shortly before his death in 1990, Cyrus Levinthal penned a short biographical account of his early work in molecular graphics. The text of this account can be found here."

In January 2014 two short films produced with the interactive molecular graphics and modeling system devised by Cyrus Levinthal and his collaborators in the mid-1960s were available online.
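The rotation-based depth cue described above is easy to reproduce. The sketch below is a simplified stand-in for how such a display could generate successive wireframe frames, not Levinthal's actual code: each point of the model is rotated about a vertical axis and projected onto the screen plane, and the resulting motion lets the eye infer depth. The point coordinates and function names are invented.

```python
import math

# Simplified stand-in for the depth cue of a constantly rotating
# wireframe display: rotate each model point about a vertical axis,
# then project it onto the 2D screen plane.

def rotate_y(point, angle):
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def project(point):
    # Orthographic projection: drop the depth coordinate.
    x, y, _ = point
    return (x, y)

atom = (1.0, 0.0, 0.0)
# One frame per rotation step; the on-screen x sweeps as the model spins.
frames = [project(rotate_y(atom, math.radians(a))) for a in (0, 90, 180)]
# frames is approximately [(1.0, 0.0), (0.0, 0.0), (-1.0, 0.0)]
```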

View Map + Bookmark Entry

NCR Issues the Smallest Published Edition of the Bible, and the First to Reach the Moon 1966

In 1966 the Research and Development department of National Cash Register (NCR) of Dayton, Ohio produced an edition of all 1245 pages of the World Publishing Company's No. 715 Bible on a single 2" x 1-3/4" photochromatic microform (PCMI). The microform contained both the Old Testament on 773 pages and the New Testament on 746 pages, and was issued in a paper sleeve with the title on the cover and information about the process inside and on the back.

On the microform each page of double-column Bible text was about 0.5 mm wide and 1 mm high. Each text character was 8 µm high (i.e., 8/1000ths of a millimeter). NCR noted on the paper wallet provided with the microform that this represented a linear reduction of about 250:1 or an area reduction of 62,500:1. This would correspond to the original text being circa 2 mm high. To put this into perspective, NCR also noted that if this reduction was used on the millions of books on the 270+ miles of shelving in the Library of Congress, the entire Library of Congress as it existed in 1966 could be stored in six standard filing cabinets.
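NCR's stated figures are internally consistent, as a quick calculation confirms:

```python
# Quick check that NCR's stated reduction figures are internally
# consistent: an 8-micrometer character at a 250:1 linear reduction
# implies an original character about 2 mm high, and a 250:1 linear
# reduction is a 62,500:1 reduction in area.

linear_reduction = 250
char_on_film_mm = 0.008  # 8 micrometers, expressed in millimeters

original_char_mm = char_on_film_mm * linear_reduction
area_reduction = linear_reduction ** 2

assert abs(original_char_mm - 2.0) < 1e-9   # circa 2 mm, as stated
assert area_reduction == 62_500             # 250 squared
```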

♦ In 1971 Apollo 14 lunar module pilot Edgar D. Mitchell carried 100 of the microform Bibles aboard the lunar module Antares, as confirmed by NASA's official manifest. Launched January 31, 1971, Mitchell and the Bibles reached the Fra Mauro formation of the Moon on February 5 aboard the Antares before returning to the command module for the voyage back to Earth. This was the first edition of the Bible to reach the Moon, and probably the first book of any kind to reach the Moon and return. A second parcel containing 200 microform Bibles flew in Edgar Mitchell's command module "PPK" bag in lunar orbit, and did not land. These 200 copies were extra Bibles to be used if something happened to the lunar module copies.

View Map + Bookmark Entry

Douglas Parkhill Issues a Predictive Discussion of the Features of Cloud Computing 1966

In 1966 Canadian technologist Douglas Parkhill issued a book entitled The Challenge of the Computer Utility. In this work Parkhill predicted and explored features of cloud computing that became widely established by the second decade of the twenty-first century. These features included elastic provision, provision as a utility, online, illusion of infinite supply, the comparison to the electricity industry and the use of public, private, government, and community forms.

View Map + Bookmark Entry

Stanford Research Institute Develops Shakey, the First Intelligent Mobile Robot 1966 – 1972

Developed from approximately 1966 through 1972, Shakey the robot was the first general-purpose mobile robot that could "reason" about its own actions. While other robots at the time had to be instructed step by step in order to complete a larger task, Shakey could analyze commands and break them down into basic steps by itself. "Shakey could perceive its surroundings, create plans, recover from errors that occurred while executing a plan, and communicate with people using ordinary English." 

Shakey was developed by the Artificial Intelligence Center at Stanford Research Institute (now SRI International) in a project funded by DARPA intended to create “intelligent automata” for “reconnaissance" applications. Because the project combined research in robotics, computer vision, and natural language processing, it was the first successful project that combined logical reasoning with physical action. 

"Shakey's overall software design has influenced the design of everything from driverless cars to undersea exploration robots. 

"Shakey's planning methodology has been used in applications ranging from planning beer production at breweries to planning the actions of characters in video games. 

"Variants of Shakey's route-finding software compute your driving directions here on earth, as well as driving directions for the Mars Curiosity rover. (Note that Curiosity is quite a “reconnaissance application”!) 

"Image analysis techniques that enabled Shakey to perceive its world are similarly used to alert today's drivers of cars that may be drifting out of lane" (C[computer]H[istory]M[useum] News, June 1, 2015). 

(This entry was written on the Oceania Riviera off the coast of Sicily in June 2015.) 

View Map + Bookmark Entry

"Bobb Goldsteinn" Coins the Term Multimedia July 1966

American showman, songwriter, and artist Bobb Goldsteinn (Bob Goldstein) coined the term multimedia to promote the July 1966 opening of his "LightWorks at L'Oursin" show at Southampton, Long Island, New York.

"On August 10, 1966, Richard Albarino of Variety borrowed the terminology, reporting: 'Brainchild of songscribe-comic Bob (‘Washington Square’) Goldstein, the ‘Lightworks’ is the latest multi-media music-cum-visuals to debut as discothèque fare' " (Wikipedia article on Multimedia, accessed 08-29-2010).

The evolving concept of multimedia involves combinations of text, still images, video, animation, sound, and interactivity. Thus, technically an illustrated book could be considered a multimedia object with a combination of texts and images; however, multimedia primarily implies combinations of electronic media.

View Map + Bookmark Entry

President Lyndon B. Johnson Signs the Freedom of Information Act (FOIA) July 4, 1966

On July 4, 1966 President Lyndon B. Johnson signed the Freedom of Information Act (FOIA), a federal freedom of information law that allows for the full or partial disclosure of previously unreleased information and documents controlled by the U.S. government. It went into effect the following year. The Act defines agency records subject to disclosure, outlines mandatory disclosure procedures, and grants nine exemptions to the statute.

"With the ongoing stress on both constitutional and inherent rights of American citizens and the added assertion of government subservience to the individual, some thought it was necessary for government information to be available to the public.

"However, due to the sensitivity of some government information and private interests, others believed that certain types of government information should remain secret. Therefore, Congress attempted to enact a Freedom of Information Act in 1966 that would effectively deal with requests for government records, consistent with the belief that the people have the “right to know” about them. The Privacy Act of 1974 additionally covered government documents charting individuals.

"However, it is in the exemptions to solicitation of information under these acts that problems and discrepancies arise. The nine exemptions to the FOIA address issues of sensitivity and personal rights. They are (as listed in Title 5 of the United States Code, section 552):

  1. (A) specifically authorized under criteria established by an Executive order to be kept secret in the interest of national defense or foreign policy and (B) are in fact properly classified pursuant to such Executive order;
  2. related solely to the internal personnel rules and practices of an agency;
  3. specifically exempted from disclosure by statute (other than section 552b of this title), provided that such statute (A) requires that the matters be withheld from the public in such a manner as to leave no discretion on the issue, or (B) establishes particular criteria for withholding or refers to particular types of matters to be withheld;
  4. trade secrets and commercial or financial information obtained from a person and privileged or confidential;
  5. inter-agency or intra-agency memoranda or letters which would not be available by law to a party other than an agency in litigation with the agency;
  6. personnel and medical files and similar files the disclosure of which would constitute a clearly unwarranted invasion of personal privacy;
  7. records or information compiled for law enforcement purposes, but only to the extent that the production of such law enforcement records or information (A) could reasonably be expected to interfere with enforcement proceedings, (B) would deprive a person of a right to a fair trial or an impartial adjudication, (C) could reasonably be expected to constitute an unwarranted invasion of personal privacy, (D) could reasonably be expected to disclose the identity of a confidential source, including a State, local, or foreign agency or authority or any private institution which furnished information on a confidential basis, and, in the case of a record or information compiled by a criminal law enforcement authority in the course of a criminal investigation or by an agency conducting a lawful national security intelligence investigation, information furnished by a confidential source, (E) would disclose techniques and procedures for law enforcement investigations or prosecutions, or would disclose guidelines for law enforcement investigations or prosecutions if such disclosure could reasonably be expected to risk circumvention of the law, or (F) could reasonably be expected to endanger the life or physical safety of any individual;
  8. contained in or related to examination, operating, or condition reports prepared by, on behalf of, or for the use of an agency responsible for the regulation or supervision of financial institutions; or
  9. geological and geophysical information and data, including maps, concerning wells" (Wikipedia article on Freedom of Information Act, accessed 11-03-2013).
View Map + Bookmark Entry

Joseph Raben Founds "Computers and the Humanities", the First Humanities Computing Journal September 1966

In 1966 Joseph Raben, professor of English at Queens College in the City University of New York, founded Computers and the Humanities to report on significant new research concerning the application of computer methods to humanities scholarship. This was the first periodical in the nascent field later known as digital humanities, or humanities computing. The "Prospect" of the first issue of the journal, published in September 1966 (p. 1), placed the field in the context of traditional humanities:

"We define humanities as broadly as possible. our interests include literature of all times and countries, music, the visual arts, folklore, the non-mathematical aspects of linguistics, and all phases of the social sciences that stress the humane. when, for example, the archaeologist is concerned with fine arts of the past, when the sociologist studies the non-material facets of culture, when the linguist analyzes poetry, we may define their intentions as humanistic; if they employ computers, we wish to encourage them and to learn from them. (Prospect, 1966, p. 1)" quoted in Terras, Nyhan & Vanhoutte eds. Defining Digital Humanities: A Reader (2013) Introduction p. 3.

On March 20, 2014 Joseph Raben posted relevant comments on the Humanist Discussion Group, Vol. 27, No. 908, from which I quote:

". . . . The first issue of Computers and the Humanities: A Newsletter (CHum) appeared in September 1966, and immediately began to outgrow its original conception. In an illustration of the paradox of success following an unplanned initiative, people of began to submit articles, and university libraries began to  subscribe. Within a few years, what started as a sixteen-page pamphlet  became the standard journal in its field, with a circulation of about 2000 in all parts of the globe, equal to that of the scholarly journals  of major universities. Among our contributors was J.M. Coetzee, who had worked as a computer programmer while building his reputation as a  novelist and who later won the Nobel Prize in Literature. Throughout the more than two decades that it served the scholarly community, CHum's policy was to present as comprehensive as possible a depiction of the computer's role in expanding the resources of the humanist scholar. Articles covered a wide spectrum of disciplines: literary and linguistic subjects, of course, but also also archaeology, musicology, history, art history, and machine translation. . . ."

View Map + Bookmark Entry

Filed under: Digital Humanities

Lawrence G. Roberts Describes Networking Research at MIT October 1966

In October 1966 Lawrence G. Roberts wrote Toward a Cooperative Network of Time-Shared Computers, describing networking research at MIT.

View Map + Bookmark Entry

Lawrence G. Roberts Begins the Design of the ARPANET December 1966

In December 1966 electrical engineer Lawrence G. Roberts became Chief Scientist at the ARPA IPTO (Advanced Research Projects Agency Information Processing Techniques Office), and began the design of the ARPANET. The ARPANET program as proposed to Congress by Roberts explored computer resource sharing and packet switching communications to ensure reliability.

View Map + Bookmark Entry

Jacques Bertin's "Sémiologie graphique" is Published 1967 – 1983

In 1967 French cartographer and theorist of information graphics Jacques Bertin published Sémiologie graphique. Les diagrammes, les réseaux, les cartes in Paris at the press of Gauthier-Villars. This book provided the first theoretical foundation for information graphics: a systematic classification of the use of visual elements to display data and relationships, primarily in static graphics. Bertin's system consisted of seven visual variables: position, form (shape), orientation, color (hue), texture, value (lightness or darkness of color), and size, combined with a visual semantics for linking data attributes to visual elements.

Bertin revised his book for a second edition published in 1973. It was translated into German in 1974. In 1983 an English translation of the second French edition by William J. Berg, with a foreword by Howard Wainer, and a new preface by Bertin, was published by the University of Wisconsin Press as Semiology of Graphics: Diagrams, Networks, Maps. In 2010 the English translation was reissued with a new Foreword by Wainer. The 438 page book contained over 1000 images, a few in color.

A widely referenced quotation from Bertin's book is:

"And now, at the end of the twentieth century, with the pressure of modern information and the advances of data processing, graphics is passing through a new and fundamental stage. The great difference between the graphic representation of yesterday, which was poorly dissociated from the figurative image, and the graphics of tomorrow, is the disappearance of the congential fixity of the image.

"When one can superimpose, juxtapose, transpose, and permute graphic images in ways that lead to groupings and classings, the graphic image passes from the dead image, the 'illustration,' to the living image, the widely accessible research instrument it is now becoming. The graphic is no longer only the 'representation' of a final simplification, it is a point of departure for the discovery of these simplifications and the means for their justification. The graphic has become, by its manageability, an instrument for information processing. . . . " [I added the bold face. JN]

In his foreword to the 1983 English translation Wainer called Bertin's work, "the most important work on graphics since the publication of Playfair's Atlas [The Commercial and Political Atlas (1785-86)]."

View Map + Bookmark Entry

Donald Watts Davies Performs an Experiment in Packet Switching 1967

In 1967 Donald Watts Davies developed the NPL Data Network, an experiment in packet switching, at the National Physical Laboratory (NPL) in Teddington, England.

View Map + Bookmark Entry

Data Corporation Develops a Full-Text Interactive Search Service 1967

In 1967 Data Corporation of Beavercreek, Ohio, contracted with the Ohio Bar Automated Research Corporation to create a full-text, interactive research service for Ohio statutes.

View Map + Bookmark Entry

Ted Nelson & Andries van Dam Develop the First Hypertext Editing System 1967

In 1967 Ted Nelson (Theodor Holm Nelson), Andries van Dam, and students at Brown University collaborated on the first hypertext editing system (HES) based on Nelson's concept of hypertext.

"HES organized data into two main types: links and branching text. The branching text could automatically be arranged into menus and a point within a given area could also have an assigned name, called a label, and be accessed later by that name from the screen. Although HES pioneered many modern hypertext concepts, its emphasis was on text formatting and printing.

"HES ran on an IBM System/360/50 mainframe computer, which was inefficient for the processing power required by the system. The program was used by NASA's Houston Manned Spacecraft Center for documentation on the Apollo space program. The project's research was funded by IBM but the program was stopped around 1969, and replaced by the FRESS (File Retrieval and Editing System) project" (Wikipedia article on Hypertext Editing System, accessed 11-08-2013).

View Map + Bookmark Entry

Jack Kilby and Texas Instruments Invent the First Hand-Held Electronic Calculator 1967 – June 25, 1974

In 1967 Texas Instruments filed the patent for the first hand-held electronic calculator, invented by Jack S. Kilby, Jerry Merryman, and Jim Van Tassel. The patent (Number 3,819,921) was awarded on June 25, 1974. This miniature calculator employed a large-scale integrated semiconductor array containing the equivalent of thousands of discrete semiconductor devices.

View Map + Bookmark Entry

The Museum Computer Network is Founded in New York 1967

In 1967 directors of fifteen New York-area museums formed the Museum Computer Network to create a prototype system for a shared museum data bank. The project recruited curators and registrars to develop a data dictionary that  accommodated the diverse methods used to describe museum collections. The resulting tagged record format allowed for the description of individual objects with separate records for artist biographical information and reference citations. Jack Heller's GRIPHOS (General Retrieval and Information Processor for Humanities Oriented Studies) system provided the information storage, search, and retrieval infrastructures for the records.

View Map + Bookmark Entry

Robert MacArthur & E.O. Wilson Issue "The Theory of Island Biogeography" 1967

In 1967 ecologist Robert MacArthur of Princeton and biologist E. O. Wilson of Harvard published The Theory of Island Biogeography through Princeton University Press. In this work they showed that the species richness of an area could be predicted in terms of such factors as habitat area, immigration rate and extinction rate.

"Island biogeography is a field within biogeography that attempts to establish and explain the factors that affect the species richness of natural communities. The theory was developed to explain species richness of actual islands. It has since been extended to mountains surrounded by deserts, lakes surrounded by dry land, forest fragments surrounded by human-altered landscapes. Now it is used in reference to any ecosystem surrounded by unlike ecosystems. The field was started in the 1960s by the ecologists Robert MacArthur and E.O. Wilson, who coined the term theory of island biogeography, as this theory attempted to predict the number of species that would exist on a newly created island.

"For biogeographical purposes, an 'island' is any area of suitable habitat surrounded by an expanse of unsuitable habitat. While this may be a traditional island—a mass of land surrounded by water—the term may also be applied to many untraditional 'islands', such as the peaks of mountains, isolated springs in the desert, or expanses of grassland surrounded by highways or housing tracts. Additionally, what is an island for one organism may not be an island for another: some organisms located on mountaintops may also be found in the valleys, while others may be restricted to the peaks" (Wikipedia article on Island biogeography, accessed 05-08-2009).

View Map + Bookmark Entry

Andrew Viterbi Develops the Viterbi Algorithm 1967

While a professor at UCLA in 1967, Italian-American electrical engineer and businessman Andrew Viterbi developed the Viterbi algorithm,

 "as an error-correction scheme for noisy digital communication links, finding universal application in decoding the convolutional codes used in both CDMA and GSM digital cellular, dial-up modems, satellite, deep-space communications, and 802.11 wireless LANs. It is now also commonly used in speech recognition, keyword spotting, computational linguistics, and bioinformatics. For example, in speech-to-text (speech recognition), the acoustic signal is treated as the observed sequence of events, and a string of text is considered to be the "hidden cause" of the acoustic signal. The Viterbi algorithm finds the most likely string of text given the acoustic signal" (Wikipedia article on Viterbi algorithm, accessed 12-29-2009).

View Map + Bookmark Entry

Henry Kucera and Nelson Francis Issue "Computational Analysis of Present-Day American English" 1967

In 1967 Henry Kucera (born Jindřich Kučera) of Brown University and Nelson Francis published Computational Analysis of Present-Day American English. A founding work on corpus linguistics, this book "provided basic statistics on what is known today simply as the Brown Corpus. The Brown Corpus was a carefully compiled selection of current American English, totaling about a million words drawn from a wide variety of sources. Kucera and Francis subjected it to a variety of computational analyses, from which they compiled a rich and variegated opus, combining elements of linguistics, psychology, statistics, and sociology" (Wikipedia article on Brown Corpus, accessed 06-07-2010).
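A toy version of the kind of basic statistic Kucera and Francis compiled, here a ranked word-frequency table over a few sentences rather than a million words (the sample text is simply this entry's own description):

```python
import re
from collections import Counter

# Toy word-frequency analysis in the manner of corpus statistics:
# tokenize, count, and rank. Real corpus work adds part-of-speech
# tagging, genre balancing, and much more.

text = """The Brown Corpus was a carefully compiled selection of current
American English, totaling about a million words drawn from a wide
variety of sources."""

words = re.findall(r"[a-z']+", text.lower())
freq = Counter(words)

print(freq.most_common(2))  # [('a', 3), ('of', 2)]
```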

View Map + Bookmark Entry

Edmund Bowles Issues The First Anthology of Research on Humanities Computing 1967

In 1967 musicologist Edmund A. Bowles, in his capacity as manager of Professional Activities in the Department of University Relations at IBM, edited Computers in Humanistic Research. Readings and Perspectives. This was the first anthology of research on humanities computing.

View Map + Bookmark Entry

35,000 Computers Are Operational in the United States 1967

In 1967 there were 35,000 computers operating in the United States.

Bowles (ed.), Computers in Humanistic Research (1967), p. v.

View Map + Bookmark Entry

Ellis Batten Page Begins Automated Essay Scoring 1967

In 1964 American educational psychologist at the University of Connecticut (Storrs) Ellis Batten Page, inspired by developments in computational linguistics and artificial intelligence, began research on automated essay scoring. Page published his initial research in 1967 as "Statistical and linguistic strategies in the computer grading of essays," Coling 1967: Conférence Internationale sur le Traitement Automatique des Langues, Grenoble, France, August 1967. The same year he also published "The imminence of grading essays by computer," Phi Delta Kappan, 47 (1967) 238-243. The following year he published, with Dieter H. Paulus, The Analysis of Essays by Computer (Final report, Project No. 6-1318). Washington, D.C.: Department of Health, Education, and Welfare; Office of Education; Bureau of Research. That year he also published his successful work with a program he called Project Essay Grade (PEG) in "The Use of the Computer in Analyzing Student Essays," International Review of Education, 14(3), 253-263. Page's work is considered the beginning of automated essay scoring, the development of which could not become cost-effective until computing became far cheaper and more pervasive in the 1990s.

Later at Duke University, Page renewed his development and research in automated scoring and, in 1993, formed Tru-Judge, Inc., anticipating the potential for commercial applications of the software. In 2002, and in declining health, Page sold the intellectual property assets of Tru-Judge to Measurement Incorporated, an educational company that provides achievement tests and scoring services for state governments, other testing companies, and various organizations and institutions.

View Map + Bookmark Entry

U.S. Senate Hearings on Computer Privacy Occur March 1967

In 1967 the United States Senate held hearings on computer privacy.

View Map + Bookmark Entry

Wesley Clark Suggests the Use of Interface Message Processors on ARPANET April 1967

At the ARPANET Design Session held by Lawrence G. Roberts at the ARPA IPTO PI meeting in Ann Arbor, Michigan, in April 1967, Wesley Clark suggested the use of mini-computers for network packet switches instead of using the mainframe host computers on the ARPANET for switching. These machines were called Interface Message Processors.

View Map + Bookmark Entry

Protecting Security in a Networked Environment Circa May – September 1967

Between May and September 1967 the Department of Defense requested the Director of the Advanced Research Projects Agency (ARPA) to form a Task Force “to study and recommend hardware and software safeguards that would satisfactorily protect classified information in multi-access, resource-sharing computer systems.” The report was published in February 1970 as Security Controls for Computer Systems: Report of Defense Science Board Task Force on Computer Security - RAND Report R-609-1, edited by Willis H. Ware.

View Map + Bookmark Entry

Steven A. Coons Develops the "Coons Patch" in Computer Graphics June 1967

In June 1967 Steven A. Coons, professor of mechanical engineering and researcher in interactive computer graphics at MIT's Electronic Systems Laboratory, published Surfaces for Computer-aided Design of Space Forms, Project MAC Report MAC-TR-41, MIT.

Known as the "The Little Red Book,

" the paper described what became known as the "Coons Patch"— "a formulation that presented the notation, mathematical foundation, and intuitive interpretation of an idea that would ultimately become the foundation for surface descriptions that are commonly used today, such as b-spline surfaces, NURB surfaces, etc. His technique for describing a surface was to construct it out of collections of adjacent patches, which had continuity constraints that would allow surfaces to have curvature which was expected by the designer. Each patch was defined by four boundary curves, and a set of "blending functions" that defined how the interior was constructed out of interpolated values of the boundaries" (Carlson, A Critical History of Computer Graphics and Animation, accessed 05-30-2009).

View Map + Bookmark Entry

"Our World," the First Live, International Satellite Television Production June 25, 1967

The Our World TV special, the first live, international satellite television production, was broadcast on June 25, 1967 from the BBC control room in London, using the Intelsat I (Early Bird), Intelsat II and ATS-1 satellites.

 "Creative artists, including opera singer Maria Callas, The Beatles and painter Pablo Picasso, representing nineteen different nations were invited to perform or appear in separate segments featuring their respective countries. The two-and-half-hour event had the largest television audience ever up to that date: an estimated 400 million people around the globe watched the broadcast. Today, it is most famous for the segment from the United Kingdom starring The Beatles. They sang their specially composed song "All You Need Is Love" to close the broadcast.

"The project was conceived by BBC producer Aubrey Singer. It was transferred to the European Broadcasting Union, but the master control room for the broadcast was still at the BBC in London. . . .

"It took ten months to bring everything together. One hitch was the sudden pull-out of the Eastern Bloc countries headed by the Soviet Union in the week leading up to the broadcast. Apparently it was a protest at the Western nations' response to the Six-Day War.

"The ground rules included that no politicians or heads of state could participate in the broadcast. In addition, everything had to be 'live', so no use of videotape or film was permitted. Ten thousand technicians, producers, and interpreters took part in this massive broadcast. Each country would have its own announcers, due to language issues, and interpreters would voice-over the original sound when not in a country's native language. In the end 14 countries participated in the production that was transmitted to 31 countries with an estimated audience of between 400 and 700 million people" (Wikipedia article on Our World [TV special] accessed 10-26-2014).

Douglas Engelbart Invents the Computer Mouse June 27, 1967 – November 17, 1970

On June 27, 1967 electrical engineer and inventor Douglas C. Engelbart of the Augmentation Research Center at SRI filed a patent application for an X-Y Position Indicator for a Display System. The device was covered by patent 3,541,541, granted on November 17, 1970. It eventually became known as the mouse.

Frederick G. Kilgour Begins Development of OCLC July 5, 1967

On July 5, 1967 three university presidents, three university vice-presidents, and four university library directors from the Ohio College Association met at Ohio State University in Columbus to found the non-profit Ohio College Library Center (OCLC).

"The group hired Frederick G. Kilgour to build a ‘cooperative, computerized network in which most, if not all, Ohio libraries would participate.’ Fred’s idea was to merge the newest information storage and retrieval system, the computer, with the oldest, the library. His vision was that this new computerized library would be active rather than passive, that people would no longer go to the library, but that the library would go to the people. Back in 1967, this was a rather revolutionary idea.

"The first step in this vision would be to merge the catalogs of Ohio libraries electronically through a computer network and database. The network and database would streamline operations and control rising costs. It also would bring libraries together to work cooperatively to keep track of the world’s information for the benefit of researchers and scholars" (http://www.oclc.org/about/history/beginning.htm, accessed 03-07-2012).

After the bibliographical database expanded far beyond the state of Ohio it was renamed Online Computer Library Center, retaining the same initials.

Donald W. Davies Introduces the Term "Packet" to Describe Discrete Blocks of Data October 1967

In October 1967 Welsh computer scientist Donald W. Davies of the National Physical Laboratory, Teddington, England, introduced the use of the term “packet” to describe discrete blocks of data sent over networks in his paper “A Digital Communications Network for Computers Giving Rapid Response at Remote Terminals.” Davies presented the paper at the ACM Symposium on Operating System Principles held October 1–4, 1967 in Gatlinburg, Tennessee.
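The idea Davies named can be sketched in a few lines: a message is split into small numbered blocks that travel independently and are reassembled in order at the destination. The following toy Python illustration is ours, not taken from Davies's paper; all function names are invented for the example.

```python
def packetize(data: bytes, size: int):
    """Split a message into (sequence number, payload) packets of at most `size` bytes."""
    return [(i, data[i * size:(i + 1) * size])
            for i in range((len(data) + size - 1) // size)]

def reassemble(packets):
    """Restore the message regardless of the order in which packets arrive."""
    return b"".join(payload for _, payload in sorted(packets))

msg = b"A Digital Communications Network for Computers"
pkts = packetize(msg, 8)
pkts.reverse()                     # simulate out-of-order delivery
assert reassemble(pkts) == msg
```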

Lawrence G. Roberts Issues the First Paper on the Design of the ARPANET October 1967

In October 1967 Lawrence G. Roberts presented the first paper on the design of the ARPANET, “Multiple computer networks and intercomputer communication,” at the ACM Symposium on Operating System Principles in Gatlinburg, Tennessee. (See Reading 13.5.)

The Computer Arts Society, the First Society for Computer Art, is Founded in London 1968

In the months following the groundbreaking London exhibition Cybernetic Serendipity, which showcased computer-based and technologically influenced works in graphics, music, film, and interactivity, Alan Sutcliffe, George Mallen, and John Lansdown founded the Computer Arts Society in London. The Society enabled relatively isolated artists working with computers in a variety of fields to meet and exchange information. It also ran practical courses, conferences and exhibitions.

"In March 1969, CAS organised an exhibition entitled Event One, which was held at the Royal College of Art. The exhibition showcased innovative work with computers across a broad range of disciplines, including sculpture, graphics, music, film, architecture, poetry, theatre and dance. CAS founder John Lansdown, for example, designed and organised a dance performance that was choreographed entirely by the computer and performed by members of the Royal Ballet School. The multi-media approach of exhibitions such as Event One greatly influenced younger artists and designers emerging at this time. Many of these artists were rebelling against the traditional fine art hierarchies of the time, and went on to work in the new fields of computer, digital, and video art as a result.

"CAS established links with educational establishments, journalists and industry, ensuring greater coverage of their activities and more importantly helping to provide access to computing technology at a time when this was difficult. CAS members were remarkably ahead of their time in recognising the long term impact that the computer would have on society, and in providing services to those already working creatively with the computer. By 1970 CAS had 377 members in 17 countries. Its journal 'PAGE' was first edited by auto-destructive artist Gustav Metzger, and is still being produced today. The Computer Arts Society is a specialist group of the British Computer Society" (http://www.vam.ac.uk/content/articles/t/v-and-a-computer-art-collections/, accessed 01-19-2014).

In January 2014 all of the early issues of Page, beginning with "Page 1" (April 1969), were available from the website of the Computer Arts Society Specialist Group of the BCS.

In 2007 the Computer Arts Society donated its collection of original computer art to the Victoria and Albert Museum in London, which maintains one of the world's largest and most significant collections of computer art. The V&A's holdings in this field were the subject of an article by Honor Beddard entitled "Computer Art at the V&A," V&A Online Journal, Issue No. 2 (2009), accessed 01-19-2014.

Ivan Sutherland and Bob Sproull Create the First Virtual Reality Head Mounted Display System 1968

In 1968 Ivan Sutherland at the University of Utah, with the help of his student Bob Sproull, created the first Virtual Reality (VR) and Augmented Reality (AR) head mounted display system.

Sutherland's head mounted display was so heavy that it had to be suspended from the ceiling, and the formidable appearance of the device inspired its name—the Sword of Damocles. The system was primitive both in terms of user interface and realism, and the graphics comprising the virtual environment were simple wireframe rooms.

Evans & Sutherland Commercialize the Use of Computers as Simulators 1968

In 1968 Ivan Sutherland and David Evans, both professors at the University of Utah, founded Evans & Sutherland to commercialize the use of computers as simulators for training purposes.

The HP 9100A, the First Marketed, Mass-Produced Programmable Calculator, or Personal Computer 1968

In 1968 Hewlett Packard, Palo Alto, California, introduced the programmable desk calculator, the HP 9100A.

"HP called it a desktop calculator, because, as Bill Hewlett said, 'If we had called it a computer, it would have been rejected by our customers' computer gurus because it didn't look like an IBM. We therefore decided to call it a calculator, and all such nonsense disappeared.' An engineering triumph at the time, the logic circuit was produced without any integrated circuits; the assembly of the CPU having been entirely executed in discrete components. With CRT display, magnetic-card storage, and printer, the price was around $5000. The machine's keyboard was a cross between that of a scientific calculator and an adding machine. There was no alphabetic keyboard" (Wikipedia article on Hewlett-Packard, accessed 03-10-2010).

NUC: The Largest Printed Bibliography, Complete in 754 Folio Volumes 1968 – 1981

In 1968 Mansell, in London, began publication of The National Union Catalog, Pre-1956 Imprints: a Cumulative Author List Representing Library of Congress Printed Cards and Titles Reported by other American Libraries (NUC). One of the largest sets of printed volumes ever published, and the largest printed bibliography, it was completed in 1981 in 754 folio volumes, containing a total of over 12,000,000 entries on 528,000 pages, and occupying approximately 130 feet of shelf space. It was produced manually by photocopying library catalogue cards.

In 1984, before OCLC was widely available outside of institutional libraries, Breslauer & Folter, in their survey of historical bibliographical classics entitled Bibliography, its History and Development, characterized NUC as "The most extensive general bibliographical compilation of all time." In 2013, when I last revised this entry, their assertion remained valid if the scope of the comparison was limited to bibliographies printed on paper.

Around 1995 NUC was superseded in certain respects (especially ease of access, ease of updating, and shelf space) by bibliographical databases such as OCLC WorldCat on the Internet. Experts were quick to point out that bibliographical nuances and details present in some NUC entries were lost, or sometimes confused, in the electronic conversion. Whether any of those details would eventually be added or mistakes corrected in the database remained unknown.

♦ As a reflection of changing times and a shortage of shelf space, in November 2013 antiquarian bookseller Ian Jackson of Berkeley offered for sale in his Cedules from a Berkeley Bookshop, No. 28, a full set of NUC, "one of the few sets without library markings", for $754 or one dollar per volume, with the caveat that "Buyer removes."  This last element was crucial as the set weighed perhaps 10 pounds per volume. 

In conclusion, I cannot resist quoting Jackson's final paragraph of his description:

"Other attractions: the four volumes on the Bible remain the largest inventory in print, and in volume 671 (p. 565, s.v. Wolveridge) a presumably disgruntled employee has written 'Anyone paying good bucks for the crap in this catalogue has been royally screwed by us.' "

Stanley Kubrick & Arthur C. Clarke Create "2001: A Space Odyssey" 1968

In 1968 the film 2001: A Space Odyssey, written by American film director Stanley Kubrick in collaboration with science fiction writer and futurist Arthur C. Clarke, captured imaginations with the idea of a computer that could see, speak, hear, and “think.” 

Perhaps the star of the film was the HAL 9000 computer. "HAL (Heuristically programmed ALgorithmic Computer) is an artificial intelligence, the sentient on-board computer of the spaceship Discovery. HAL is usually represented only as his television camera "eyes" that can be seen throughout the Discovery spaceship.... HAL is depicted as being capable not only of speech recognition, facial recognition, and natural language processing, but also lip reading, art appreciation, interpreting emotions, expressing emotions, reasoning, and chess, in addition to maintaining all systems on an interplanetary voyage.

"HAL is never visualized as a single entity. He is, however, portrayed with a soft voice and a conversational manner. This is in contrast to the human astronauts, who speak in terse monotone, as do all other actors in the film" (Wikipedia article on HAL 9000, accessed 05-24-2009).

"Kubrick and Clarke had met in New York City in 1964 to discuss the possibility of a collaborative film project. As the idea developed, it was decided that the story for the film was to be loosely based on Clarke's short story "The Sentinel", written in 1948 as an entry in a BBC short story competition. Originally, Clarke was going to write the screenplay for the film, but Kubrick suggested during one of their brainstorming meetings that before beginning on the actual script, they should let their imaginations soar free by writing a novel first, which the film would be based on upon its completion. 'This is more or less the way it worked out, though toward the end, novel and screenplay were being written simultaneously, with feedback in both directions. Thus I rewrote some sections after seeing the movie rushes -- a rather expensive method of literary creation, which few other authors can have enjoyed.' The novel ended up being published a few months after the release of the movie" (Wikipedia article on Arthur C. Clarke, accessed 05-24-2009).

Philip K. Dick Imagines Replicants 1968

In 1968 American writer Philip K. Dick published his science fiction novel, Do Androids Dream of Electric Sheep? It told of the moral crisis of Rick Deckard, a bounty hunter who stalked androids—robots visually identical to people—in a fall-out-clouded, dystopian, partially deserted San Francisco.

In 1982 the novel was brought to the screen as Blade Runner, with its location changed to Los Angeles. 

Unbundling at IBM Gives Rise to the Software and Services Industry 1968

In 1968 IBM adopted a new marketing policy of charging separately for most systems engineering activities, future computer programs, and customer education courses. This “unbundling” gave rise to the software and services industry.

Helmut Gröttrup & Jürgen Dethloff Invent the "Smart Card" 1968 – 1984

In 1968 German electrical engineers Helmut Gröttrup of Stuttgart and Jürgen Dethloff, of Hamburg, invented the smart card (chip card, or integrated circuit card [ICC]) and applied for the patent. The patent for the smart card was finally granted to both inventors in 1982. The first wide use of the cards was for payment in French pay phones—France Telecom Télécarte—starting in 1983-84.

Stewart Brand Issues "The Whole Earth Catalog": Google and Blogging before the Internet 1968

In Fall 1968 American writer and founder of organizations Stewart Brand of the Portola Institute, Menlo Park, California, published the first edition of the Whole Earth Catalog. Access to tools. Its stated goal was to provide education and "access to tools" so a reader could "find his own inspiration, shape his own environment, and share his adventure with whoever is interested."

In his June 2005 Stanford University commencement address Steve Jobs compared The Whole Earth Catalog to Google: "When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation.... It was sort of like Google in paperback form, 35 years before Google came along. It was idealistic and overflowing with neat tools and great notions." In the same speech Jobs quoted the farewell message placed on the back cover of the 1974 edition of the catalog: "Stay hungry. Stay foolish."

A similar comparison was made by Kevin Kelly in 2008:

"For this new countercultural movement, information was a precious commodity. In the ’60s, there was no Internet; no 500 cable channels. [The Whole Earth Catalog] was a great example of user-generated content, without advertising, before the Internet. Basically, Brand invented the blogosphere long before there was any such thing as a blog. ... No topic was too esoteric, no degree of enthusiasm too ardent, no amateur expertise too uncertified to be included.... This I am sure about: it is no coincidence that the Whole Earth Catalogs disappeared as soon as the web and blogs arrived. Everything the Whole Earth Catalogs did, the web does better" (Wikipedia article on Whole Earth Catalog, accessed 12-06-2013).


Stephen A. Benton Invents the Rainbow Hologram or Benton Hologram 1968

In 1968 Stephen A. Benton, then of Polaroid Corporation, and later at MIT's Media Lab, invented the Benton hologram or rainbow hologram, a hologram designed to be viewed under white light illumination rather than the laser light that was required to view holograms before this invention.

"The rainbow holography recording process uses a horizontal slit to eliminate vertical parallax in the output image, greatly reducing spectral blur while preserving three-dimensionality for most observers. A viewer moving up or down in front of a rainbow hologram sees changing spectral colors rather than different vertical perspectives. Stereopsis and horizontal motion parallax, two relatively powerful cues to depth, are preserved. The holograms found on credit cards are examples of rainbow holograms" (Wikipedia article on rainbow hologram, accessed 11-23-2012).

Andries van Dam Develops Probably the First Hypertext System Used in Education 1968

In 1968 Andries van Dam and his students at Brown University, including Bob Wallace, introduced The File Retrieval and Editing SyStem, or FRESS. FRESS was a continuation of work done on van Dam's previous hypertext system, HES, developed in 1967.

"FRESS ran on an IBM 360-series mainframe running VM/CMS. It implemented one of the first virtual-terminal interfaces, and could run on various terminals from dumb typewriters up to the Imlac PDS-1 graphical minicomputer. On the PDS-1, it supported multi-window WYSIWYG editing and graphics display. Notably, the PDS-1 used a lightpen rather than a mouse, and the lightpen could be "clicked" using a cathartic foot-pedal.

"FRESS improved on HES's capabilities in many ways. FRESS documents could be of arbitrary size, and (unlike prior systems) were not laid out in lines until the moment of display. FRESS users could insert a marker at any location within a text document and link the marked selection to any other point either in the same document or a different document.

"FRESS had two types of links: tags and "jumps". Tags were links to information such as references or footnotes, while "jumps" were links that could take the user through many separate but related documents, much like the World Wide Web of today. FRESS also had the ability to assign keywords to links or text blocks to assist with navigation. Keywords could be used to select which sections to display or print, which links would be available to the user, and so on. Multiple "spaces" were also automatically maintained, including an automatic table of contents and indexes to keywords, document structures, and so on.

"FRESS is also possibly the first computer-based system to have had an "undo" feature for quickly correcting small editing or navigational mistakes.

"FRESS was essentially a text-based system and editing links was a fairly complex task unless you had access to the PDS-1 terminal, in which case you could select each end with the lightpen and create a link with a couple of keystrokes. FRESS provided no method for knowing where the user was within a collection of documents.

"FRESS was used for instructional computing (probably being the first hypertext system used in education), particularly for teaching poetry, as well as typesetting many books, notably by philosopher Roderick Chisholm. For example, in the Preface to Person and Object Chisholm writes 'The book would not have been completed without the epoch-making File Retrieval and Editing System...'  FRESS was for many years the word processor of choice at Brown and a small number of other sites " (Wikipedia article on File Retrieval and Editing System, accessed 11-08-2013).
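The two link types described above—tags pointing to annotations and jumps connecting points across documents, filterable by keyword—can be modeled in a few lines. This sketch is our own illustration of the data model, not FRESS code (FRESS itself ran on IBM 360 mainframes); all names are invented for the example.

```python
class Link:
    """A FRESS-style link: a 'tag' (reference/footnote) or a 'jump' (cross-document)."""
    def __init__(self, kind, source, destination, keywords=()):
        assert kind in ("tag", "jump")
        self.kind = kind
        self.source = source            # (document, position) pair
        self.destination = destination  # (document, position) pair
        self.keywords = set(keywords)

def available_links(links, selected=None):
    """Keywords could select which links would be available to the user."""
    if not selected:
        return list(links)
    return [link for link in links if link.keywords & set(selected)]

footnote = Link("tag", ("essay", 120), ("notes", 3), keywords={"poetry"})
cross_ref = Link("jump", ("essay", 450), ("anthology", 0), keywords={"criticism"})
assert [l.kind for l in available_links([footnote, cross_ref], {"poetry"})] == ["tag"]
```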

Max Perutz Solves the Molecular Structure of Hemoglobin at High Resolution 1968

Thirty years after beginning his research on hemoglobin, Austrian-born British molecular biologist Max Perutz at Cambridge solved the Fourier synthesis of hemoglobin at 2.8Å resolution, and built an atomic model of the molecule.

Perutz et al., "Three-dimensional Fourier Synthesis of Horse Oxyhaemoglobin at 2.8Å Resolution: The Atomic Model," Nature 219 (1968) 131-39.

"Incredible Machine" State of the Art in Computer Generated Film, Graphics and Music in 1968 1968

In 1968 scientists at Bell Labs in Murray Hill, New Jersey, created "Incredible Machine," a color film that represented the state of the art in computer-generated film, graphics, and music at the time.

The film featured artwork and computer graphics by Ken Knowlton and computer-generated music by Max Mathews. The title sequence was programmed by A. Michael Noll using his four-dimensional animation technique and is perhaps the first use of computer animation for title sequences. The computer ballet during the end credits was by A. Michael Noll. The basilar membrane animation was done by Robert C. Lummis, Man Mohan Sondhi, and A. Michael Noll.

Lee Harrison's Scanimate: The First Widely Applied Analog Computer Animation System Circa 1968 – 1985

In 1962 Lee Harrison III built ANIMAC, a hybrid graphic animation computer. It was the precursor to Scanimate, an analog computer animation system designed and built by Harrison and the Computer Image Corporation in Denver. From around 1969 to the mid-1980s Scanimates were used to produce a significant portion of the video-based animation seen on television in commercials, show titles, and other graphics. Scanimates could create animations in real time, which helped the system supersede film-based animation techniques for television graphics; from the early 1970s to the early 1980s systems were in operation in Japan, Australia, Luxembourg, London, New York, Hollywood and Denver. Altogether eight Scanimate systems were built. The systems were also used in films. By the mid-1980s, however, Scanimate was itself superseded by digital computer animation, which produced sharper images and more sophisticated 3D imagery.

"Animations created on Scanimate and similar analog computer animation systems have a number of characteristic features that distinguish them from film-based animation: The motion is extremely fluid, using all 60 fields per second (in NTSC format video) or 50 fields (in PAL format video) rather than the 24 frames per second that film uses; the colors are much brighter and more saturated; and the images have a very "electronic" look that results from the direct manipulation of video signals through which the Scanimate produces the images.

"A special high-resolution (around 800 lines) monochrome camera records high-contrast artwork. The image is then displayed on a high-resolution screen. Unlike a normal monitor, its deflection signals are passed through a special analog computer that enables the operator to bend the image in a variety of ways. The image is then shot from the screen by either a film camera or a video camera. In the case of a video camera this signal is then fed into a colorizer, a device that takes certain shades of grey and turns it into color as well as transparency. The idea behind this is that the output of the Scanimate itself is always monochrome. Another advantage of the colorizer is that it gives the operator the ability to continuously add layers of graphics. This makes possible the creation of very complex graphics. This is done by using two video recorders. The background is played by one recorder and then recorded by another one. This process is repeated for every layer. This requires very high-quality video recorders (such as both the Ampex VR-2000 or IVC's IVC-9000 of Scanimate's era, the IVC-9000 being used quite frequently for Scanimate composition due to its very high generational quality between re-recordings) " (Wikipedia article on Scanimate, accessed 02-01-2014).
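The layering process quoted above—re-recording the composite with one new colorized layer keyed on top per pass—amounts to a simple loop. This toy Python sketch is our own illustration of that compositing idea, not Scanimate hardware; a "frame" here is just a list of pixel labels, and `None` marks transparent pixels in a layer.

```python
def key_over(background, layer):
    """Opaque layer pixels replace the background; transparent ones let it through."""
    return [fg if fg is not None else bg for bg, fg in zip(background, layer)]

frame = ["sky", "sky", "sky", "sky"]      # background tape
layers = [
    [None, "title", "title", None],        # first graphics pass
    ["logo", None, None, None],            # second pass, recorded over the first
]
for layer in layers:                       # each pass = one re-recording
    frame = key_over(frame, layer)
assert frame == ["logo", "title", "title", "sky"]
```

In the real system each iteration of this loop meant playing the composite from one video recorder while a second recorded it with the next layer added, which is why very high-quality recorders mattered: every layer cost a generation of re-recording.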

In January 2014 a major archive of Scanimate information and videos was available from Dave Sieg's scanimate.sfx.com.

Aaron Klug Invents Three-Dimensional Image Processing January 1968

In January 1968 English molecular biologist Aaron Klug described techniques for the reconstruction of three-dimensional structures from electron micrographs, thus founding the processing of three-dimensional digital images.

D. J. de Rosier and A. Klug, “Reconstruction of three dimensional structures from electron micrographs,” Nature 217 (1968) 130-34.
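The principle of the paper cited above—recovering a structure from its projections—can be illustrated with a toy 2D analogue. This hedged sketch is our own construction, not the authors' method: back-projecting just two 1D projections (row and column sums) already peaks at the true position, though real reconstruction combines many views, in Fourier space, to remove the blur.

```python
# A 4x4 "specimen" with a single dense point at row 1, column 2.
obj = [[0.0] * 4 for _ in range(4)]
obj[1][2] = 1.0

# Two projections, as an electron micrograph projects density along the beam.
proj_rows = [sum(row) for row in obj]        # view along one axis
proj_cols = [sum(col) for col in zip(*obj)]  # view along the perpendicular axis

# Back-project: smear each projection across the image and sum.
backproj = [[r + c for c in proj_cols] for r in proj_rows]

# The reconstruction peaks at the true position (blurred along the two axes).
peak = max((value, i, j)
           for i, row in enumerate(backproj)
           for j, value in enumerate(row))
assert (peak[1], peak[2]) == (1, 2)
```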

Federico Faggin and Colleagues Invent Silicon Gate Technology at Fairchild Semiconductor February 1968

Silicon Gate Technology, invented in 1968 by Federico Faggin and colleagues at Fairchild Semiconductor in Palo Alto, California, was the first process technology used to fabricate commercial MOS (Metal Oxide Semiconductor) integrated circuits, and it was later adopted by the entire industry. Faggin also designed the first integrated circuit using a silicon gate, the Fairchild 3708. From the founding of Intel in July 1968 Robert Noyce and Gordon Moore adopted silicon gate technology, and within a few years it became the core technology for the fabrication of MOS integrated circuits worldwide.

"In February 1968, Federico Faggin joined Les Vadasz’s group and was put in charge of the development of a low-threshold-voltage, self-aligned gate MOS process technology. Federico Faggin's first task was to develop the precision etching solution for the amorphous silicon gate, and then he created the process architecture and the detailed processing steps to fabricate MOS ICs with silicon gate. He also invented the ‘buried contacts,’ a method to make direct contact between amorphous silicon and silicon junctions, without the use of metal, a technique that allowed a much higher circuit density, particularly for random logic circuits.

"After validating and characterizing the process using a test pattern he designed, Federico Faggin made the first working MOS silicon gate transistors and test structures by April 1968. He then designed the first integrated circuit using silicon gate, the Fairchild 3708, an 8-bit analog multiplexer with decoding logic, that had the same functionality of the Fairchild 3705, a metal-gate production IC that Fairchild Semiconductor had difficulty making on account of its rather stringent specifications...." (Wikipedia article on Self-Aligned gate, accessed 12-02-2013).

Licklider & Taylor Describe Features of the Future ARPANET; Description of a Computerized Personal Assistant April 1968

In April 1968 American psychologist and computer scientist J.C.R. Licklider of MIT and Robert W. Taylor, then director of ARPA's Information Processing Techniques Office, published "The Computer as a Communication Device" in Science and Technology. In this paper, extensively illustrated with whimsical cartoons, they described features of the future ARPANET and other aspects of anticipated human-computer interaction.

Honoring the artificial intelligence pioneer Oliver Selfridge, on pp. 38-39 of the paper they proposed a device they referred to as OLIVER (On-Line Interactive Vicarious Expediter and Responder). OLIVER was one of the clearest early descriptions of a computerized personal assistant:

"A very important part of each man's interaction with his on-line community will be mediated by his OLIVER. The acronym OLIVER honors Oliver Selfridge, originator of the concept. An OLIVER is, or will be when there is one, an 'on-line interactive vicarious expediter and responder,' a complex of computer programs and data that resides within the network and acts on behalf of its principal, taking care of many minor matters that do not require his personal attention and buffering him from the demanding world. 'You are describing a secretary,' you will say. But no! secretaries will have OLIVERS.

"At your command, your OLIVER will take notes (or refrain from taking notes) on what you do, what you read, what you buy and where you buy it. It will know who your friends are, your mere acquaintances. It will know your value structure, who is prestigious in your eyes, for whom you will do what with what priority, and who can have access to which of your personal files. It will know your organization's rules pertaining to proprietary information and the government's rules relating to security classification.

"Some parts of your OLIVER program will be common with parts of other people's OLIVERS; other parts will be custom-made for you, or by you, or will have developed idiosyncrasies through 'learning' based on its experience at your service."

The First U.S. Conference on Museum Computing Occurs at the Metropolitan Museum of Art April 1968

In April 1968 the Museum Computer Network and the Metropolitan Museum of Art, with funding from IBM, organized the first U.S. conference on museum computing.

Landmark Products from the Early Years of Intel Corporation July 18, 1968 – 1985

On July 18, 1968 Robert Noyce and Gordon Moore left Fairchild Semiconductor to found NM Electronics, later known as Intel; Andrew Grove, also from Fairchild, joined them shortly thereafter. The company's first property was purchased in Santa Clara, California.

1970: The Intel 1103

In 1970 Intel announced the Intel 1103, the world's first commercially available Dynamic Random Access Memory (DRAM) chip (a 1K-bit pMOS dynamic RAM IC).

1971: The Intel 4004

In November 1971 Intel announced the first microprocessor: the Intel 4004 four-bit central processor logic chip (U.S. Patent #3,821,715). Invented by Intel engineers Federico Faggin, Marcian Edward "Ted" Hoff, Stanley Mazor and Masatoshi Shima, the 4004 was the size of "a little fingernail," contained 2,300 transistors, and delivered more computing power than the ENIAC, which had occupied a large room.

Quoting from:

"The Crucial Role Of Silicon Design In The Invention Of The Microprocessor (A Testimonial from Federico Faggin, designer of the 4004 and developer of its enabling technology)

"Every time there is a new and important invention, there are many people who claim to be their inventor. This is also the case for the microprocessor. What are then the criteria to determine what the invention is and who invented it? What is exactly the microprocessor and what is novel about it? 

"The microprocessor is the central processing unit (CPU) of a general-purpose electronic computer implemented in a single integrated circuit. The Intel 4004 was unquestionably the world’s first commercial microprocessor. No one had commercialized a single-chip CPU prior to Intel. There are people, however, who claim to have built CPUs in more than one chip before the 4004, although they were never commercialized as chip-sets but were used only in proprietary equipment. For example, Raymond Holt claims to have built with his team a three-chip microprocessor in 1969 for the US Navy’s F-14A; Lee Boysel of Four Phase Systems Inc., claims that he and his team created the first microprocessor, which was incorporated as part of a system, in 1969. Although their contributions were remarkable, their CPU implementation, not being a single chip, was not a microprocessor. 

"Why is one chip so much different or better than two or three chips? If we accept to call a microprocessor a three-chip implementation of a CPU, then why shouldn’t a four or five-chip implementation be also called a microprocessor? Pretty soon it would be impossible to distinguish a microprocessor from a CPU board built with conventional components! A single chip is important not only because of its simplicity and elegance, but because a one-chip CPU is the irreducible minimum for a CPU, thus optimizing all the critical requirements of size, speed, cost and energy consumption. The microprocessor changed the world of computing exactly because it reduced to an absolute minimum the size, cost and energy consumption of a CPU while maximizing its speed. 

"The existence of multiple-chip CPU realizations predating the 4004 indicates that the critical contribution of the 4004 in the industry was its implementation in a single chip rather than in multiple chips. This fact places much emphasis on the fundamental role played by the chip design that enabled the integration of the 4004 in a single chip, more than on its architecture. Simple CPU architectures requiring two to three thousand transistors – the same number of transistors used in the 4004 -- were generally known in 1968-1969, however it was not possible to integrate all those transistors in a single chip with the MOS technology available at that time.

"The primary reason for the appearance of the microprocessor in 1971, rather than a few years later and possibly by other companies, was the existence of the MOS Silicon Gate Technology (SGT). With the silicon gate technology, twice as many transistors could be integrated in the same chip size than with conventional metal gate MOS technology, using the same amount of energy, and with a speed advantage of about 4:1. This technology, originally developed by Federico Faggin at Fairchild Semiconductor in 1968, had also been adopted by Intel. In 1970, only Fairchild and Intel had been able to master the SGT. The 4004 could be integrated and made to function in a single chip not only because of Faggin’s intimate understanding of the silicon gate technology and his skills as a chip designer, but also because of all the additional technological and circuit innovations he created to make it possible (a new methodology for random logic using silicon gate, bootstrap load, buried contact, the power-resettable flip-flop - US patent 3,753,011 - and a new flip-flop design used in a novel static MOS shift register). 

"There is a very specific and quite striking example showing that the chip design, more than its architecture, was the key to the creation of the microprocessor -- it is the CPU used in the Datapoint 2200 terminal. This CPU was conceived in 1969 by Computer Terminal Corporation (CTC), and Texas Instruments attempted to integrate it in a single chip in 1971, as a custom project commissioned by CTC. Described in the press in mid-1971, only a few months after the 4004 completion, this chip never functioned and it was never commercialized. In early 1972, exactly the same CPU that Texas Instruments failed to make viable was integrated at Intel (assigned to Hal Feeney, under Faggin’s supervision) using the silicon gate technology and the CPU design methodology created by Federico Faggin. This CPU became the Intel 8008 microprocessor, and was first commercialized in April 1972. The 8008 chip size was about half the size of Texas Instruments' chip and it worked perfectly" (http://www.intel4004.com/hyatt.htm, accessed 12-02-2013). 

1972:  The Intel 8008

In April 1972 Intel introduced the 8008 microprocessor, the first 8-bit microprocessor. With an external 14-bit address bus that could address 16 KB of memory, it became the CPU for the first commercial, non-calculator personal computers: the US SCELBI kit and the pre-built French Micral N and Canadian MCM/70. (The Datapoint 2200, for which the chip was originally designed, ultimately shipped with a discrete TTL processor instead.)

"Originally known as the 1201, the chip was commissioned by Computer Terminal Corporation (CTC) to implement an instruction set of their design for their Datapoint 2200 programmable terminal. As the chip was delayed and did not meet CTC's performance goals, the 2200 ended up using CTC's own TTL based CPU instead. An agreement permitted Intel to market the chip to other customers after Seiko expressed an interest in using it for a calculator" (Wikipedia article on Intel 8008, accessed 12-02-2013). 

1974:  The Intel 8080

In April 1974 Intel released the 8080 eight-bit microprocessor, considered by many to be the first general-purpose microprocessor. It featured 4,500 transistors and about ten times the performance of its predecessors. Within a year the 8080 was designed into hundreds of different products, including the MITS Altair 8800 designed by H. Edward Roberts. 

"The 8080 also changed how computers were created. When the 8080 was introduced, computer systems were usually created by computer manufacturers such as Digital Equipment Corporation, Hewlett-Packard, or IBM. A manufacturer would produce the entire computer, including processor, terminals, and system software such as compilers and operating system. The 8080 was actually designed for just about any application except a complete computer system. Hewlett Packard developed the HP 2640 series of smart terminals around the 8080. The HP 2647 was a terminal which ran BASIC on the 8080. Microsoft would market as its founding product the first popular programming language for the 8080, and would later acquire DOS for the IBM-PC" (Wikipedia article on Intel 8080, accessed 12-02-2013).

1978:  The Intel 8086

In 1978 Intel introduced the 8086 sixteen-bit microprocessor. The 8086 gave rise to the x86 architecture, which eventually became Intel's most successful line of processors.

1979: The Intel 8088

On July 1, 1979 Intel introduced the 8088 microprocessor, a low-cost version of the 8086 using an eight-bit external bus instead of the 16-bit bus of the 8086, allowing the use of cheaper and fewer supporting logic chips. It was the processor used in the original IBM PC.

1985:  The Intel 386

In 1985 Intel introduced the 32-bit 386 microprocessor. It featured 275,000 transistors— more than 100 times as many as the first Intel microprocessor, the 4004, developed in 1971.

(This entry was last revised on 01-18-2015.)

View Map + Bookmark Entry

Max Perutz Opens Up the Field of Molecular Pathology August 1968

In August 1968 Max Perutz opened up "the field of 'molecular pathology,' relating a structural abnormality to a disease" (Aaron Klug, "Max Perutz 1914-2002," Science 295 (2002) 2383). Specifically Perutz showed that hemoglobin molecules collapse into a sickle shape in the blood disorder sickle-cell anemia.

Perutz, M. F. & Lehmann, H., "Molecular Pathology of Human Haemoglobin," Nature 219 (1968) 902-09.

View Map + Bookmark Entry

Cybernetic Serendipity: The First Widely-Attended International Exhibition of Computer Art August 2 – October 20, 1968

From August 2 to October 20, 1968 Cybernetic Serendipity: The Computer and the Arts was exhibited at the Institute of Contemporary Arts (ICA) in London. It was curated, at the suggestion of Max Bense, by Jasia Reichardt, British art critic, editor, and Assistant Director of the ICA. This was the first widely attended international exhibition of computer art, and the first exhibition to attempt to demonstrate all aspects of computer-aided creative activity: art, music, poetry, dance, sculpture, animation.

"It drew together 325 participants from many countries; attendance figures reached somewhere between 45,000 and 60,000 (accounts differ) and it received wide and generally positive press coverage ranging from the Daily Mirror newspaper to the fashion magazine Vogue. A scaled-down version toured to the Corcoran Gallery in Washington DC and then the Exploratorium, the museum of science, art and human perception in San Francisco. It took Reichardt three years of fundraising, travelling and planning" (Mason, A Computer in the Art Room: The Origins of British Computer Arts 1950-80 [2008] 101-102).

For the catalogue of the show Reichardt edited a special issue of Studio International magazine, consisting of 100 pages with 300 images, publication of which coincided with the exhibition in 1968. The color frontispiece reproduced a color computer graphic by the American John C. Mott-Smith "made by time-lapse photography successively exposed through coloured filters, of an oscilloscope connected to a computer." The cover of the special issue was designed by the Polish-British painter, illustrator, film-maker, and stage designer Franciszka Themerson, incorporating computer graphics from the exhibition. Laid into copies of the special issue were 4 leaves entitled "Cybernetic Serendipity Music," each page providing a program for one of eight tapes of music played during the show. This information presumably was not available in time to be printed in the issue of Studio International.

Reichardt's Introduction  (p. 5) included the following:

"The exhibition is divided into three sections, and these sections are represented in the catalogue in a different order:

"1. Computer-generated graphics, computer-animated films, computer-composed and -played music, and computer poems and texts.

"2. Cybernetic devices as works of art, cybernetic environments, remote-controlled robots and painting machines.

"3. Machines demonstrating the uses of computers and an environment dealing with the history of cybernetics.

"Cybernetic Serendipity deals with possibilities rather than achievements, and in this sense it is prematurely optimistic. There are no heroic claims to be made because computers have so far neither revolutionized music, nor art, nor poetry, in the same way that they have revolutionized science.

"There are two main points which make this exhibition and this catalogue unusual in the contexts in which art exhibitions and catalogues are normally seen. The first is that no visitor to the exhibition, unless he reads all the notes relating to all the works, will know whether he is looking at something made by an artist, engineer, mathematician, or architect. Nor is it particularly important to know the background of all the makers of the various robots, machines and graphics; it will not alter their impact, although it might make us see them differently.

"The other point is more significant.

"New media, such as plastics, or new systems such as visual music notation and the parameters of concrete poetry, inevitably alter the shape of art, the characteristics of music, and content of poetry. New possibilities extend the range of expression of those creative people whom we identify as painters, film makers, composers and poets. It is very rare, however, that new media and new systems should bring in their wake new people to become involved in creative activity, be it composing music, drawing, constructing or writing.

"This has happened with the advent of computers. The engineers for whom the graphic plotter driven by a computer represented nothing more than a means of solving certain problems visually, have occasionally become so interested in the possibilities of this visual output, that they have started to make drawings which bear no practical application, and for which the only real motives are the desire to explore, and the sheer pleasure of seeing a drawing materialize. Thus people who would never have put pencil to paper, or brush to canvas, have started making images, both still and animated, which approximate and often look identical to what we call 'art' and put in public galleries.

"This is the most important single revelation of this exhibition." 

Some copies of the special issue were purchased by Motif Editions of London. Those copies do not include the ICA logo on the upper cover and do not print the price of 25s. They also substitute two blanks for the two leaves of ads printed in the back of the regular issue. They do not include the separate 4 leaves of programs of computer music. These special copies were sold by Motif Editions with a large (75 x 52 cm) portfolio containing seven 30 x 20 inch color lithographs with a descriptive table of contents. The artists included Masao Komura/Makoto Ohtake/Koji Fujino (Computer Technique Group); Masao Komura/Kunio Yamanaka (Computer Technique Group); Maughan S. Mason; Boeing Computer Graphics; Kerry Strand; Charles "Chuck" Csuri/James Shaffer; and Donald K. Robbins. The art works were titled respectively 'Running Cola is Africa', 'Return to Square', 'Maughanogram', 'Human Figure', 'The Snail', 'Random War' and '3D Checkerboard Pattern'. Copies of the regular edition contained a full-page ad for the Motif Editions portfolio for sale at £5 plus postage or £1 plus postage for individual prints.

In 1969 Frederick A. Praeger Publishers of New York and Washington, DC issued a cloth-bound second edition of the Cybernetic Serendipity catalogue with a dust jacket design adapted from the original Studio International cover. It was priced $8.95. The American edition probably coincided with the exhibition of the material at the Corcoran Gallery in Washington. The Praeger edition included an index on p. 101, and no ads. Comparison of the text of the 1968 and 1969 editions shows that the 1969 edition contains numerous revisions and changes.

In 2005 Jasia Reichardt looked back on the exhibition with these comments:

"One of the journals dealing with the Computer and the Arts in the mid-sixties, was Computers and the Humanities. In September 1967, Leslie Mezei of the University of Toronto, opened his article on 'Computers and the Visual Arts' in the September issue, as follows: 'Although there is much interest in applying the computer to various areas of the visual arts, few real accomplishments have been recorded so far. Two of the causes for this lack of progress are technical difficulty of processing two-dimensional images and the complexity and expense of the equipment and the software. Still the current explosive growth in computer graphics and automatic picture processing technology are likely to have dramatic effects in this area in the next few years.' The development of picture processing technology took longer than Mezei had anticipated, partly because both the hardware and the software continued to be expensive. He also pointed out that most of the pictures in existence in 1967 were produced mainly as a hobby and he discussed the work of Michael Noll, Charles Csuri, Jack Citron, Frieder Nake, Georg Nees, and H.P. Paterson. All these names are familiar to us today as the pioneers of computer art history. Mezei himself too was a computer artist and produced series of images using maple leaf design and other national Canadian themes. Most of the computer art in 1967 was made with mechanical computer plotters, on CRT displays with a light pen or from scanned photographs. Mathematical equations that produced curves, lines or dots, and techniques to introduce randomness, all played their part in those early pictures. Art made with these techniques was instantaneously recognisable as having been produced either by mechanical means or with a program. It didn't actually look as if it had been done by hand. Then, and even now, most art made with the computer carries an indelible computer signature. The possibility of computer poetry and art was first mentioned in 1949. 
By the beginning of the 1950s it was a topic of conversation at universities and scientific establishments, and by the time computer graphics arrived on the scene, the artists were scientists, engineers, architects. Computer graphics were exhibited for the first time in 1965 in Germany and in America. 1965 was also the year when plans were laid for a show that later came to be called 'Cybernetic Serendipity' and presented at the ICA in London in 1968. It was the first exhibition to attempt to demonstrate all aspects of computer-aided creative activity: art, music, poetry, dance, sculpture, animation. The principal idea was to examine the role of cybernetics in contemporary arts. The exhibition included robots, poetry, music and painting machines, as well as all sorts of works where chance was an important ingredient. It was an intellectual exercise that became a spectacular exhibition in the summer of 1968" (http://www.medienkunstnetz.de/exhibitions/serendipity/images/1/, accessed 06-16-2012). This website reproduces photographs of the actual exhibition and a poster printed for the show.

View Map + Bookmark Entry

The Vanderbilt Television News Archive is Founded August 5, 1968

On August 5, 1968 the Vanderbilt Television News Archive was founded as a unit of the Jean and Alexander Heard Library of Vanderbilt University.

In October 2014 the website of the Vanderbilt archive described its contents as follows:

"The collection spans the presidential administrations of Lyndon Baines Johnson, Richard Nixon, Gerald Ford, Jimmy Carter, Ronald Reagan, George H.W. Bush, Bill Clinton, George W. Bush and Barack Obama. The core collection includes evening news broadcasts from ABC, CBS, and NBC (since 1968), an hour per day of CNN (since 1995) and Fox News (since 2004). Special news broadcasts found in the Archive include political conventions, presidential speeches and press conferences, Watergate hearings, coverage of the Persian Gulf War, the events of September 11, 2001, the War in Afghanistan, and the War in Iraq."

Also in October 2014 the Wikipedia article on the archive summarized some of its holdings as follows:

"The Archive’s collection consists of more than 40,000 hours of video content, including:

  • The daily news broadcasts of ABC, CBS and NBC from August 5, 1968 to the present
  • A daily one-hour CNN news program beginning in 1995
  • A daily one-hour Fox News program beginning in 2004
  • The weeknight broadcasts of Nightline by ABC, beginning in 1988
  • The networks’ televised coverage of live presidential speeches, press conferences, summit meetings, and other events
  • The networks’ televised coverage of live presidential election-related events, including debates, political conventions and election night coverage."
View Map + Bookmark Entry

The Term Software Engineering is Coined October 7 – October 11, 1968

The term “software engineering” was coined at a NATO conference held from October 7-11, 1968 in Garmisch, Germany. The conference was held in response to the perception that computer programming had not kept up with advances in computer hardware.

View Map + Bookmark Entry

Philip Bagley Coins the Term "Metadata" November 1968

In November 1968 American computer scientist Philip Bagley coined the term metadata in his technical report Extension of Programming Language Concepts.

"... it is clear that he uses the term in the ISO 11179 'traditional' sense, which is 'structural metadata' i.e. 'data about the containers of data'; rather than the alternate sense 'content about individual instances of data content' or metacontent, the type of data usually found in library catalogues. Since then the fields of information management, information science, information technology, librarianship and GIS have widely adopted the term. In these fields the word metadata is defined as 'data about data'. While this is the generally accepted definition, various disciplines have adopted their own more specific explanation and uses of the term" (Wikipedia article on Metadata, accessed 12-08-2013).

View Map + Bookmark Entry

K. G. Pontius Hultén Curates "The Machine as Seen at the End of the Mechanical Age" : An Art Exhibition November 27, 1968 – February 9, 1969

Swedish art collector and curator K. G. Pontius Hultén curated and wrote the catalogue for The Machine as Seen at the End of the Mechanical Age, an exhibition at The Museum of Modern Art, New York, from November 27, 1968 to February 9, 1969. This was a landmark exhibition on the history of the machine in its relationship to art from the Renaissance to 1968; or as the editor stated, it was "a collection of comments on technology by artists of the Western world" (p. 3). The art reproduced and described in the catalogue— including much that was radical for its time—was mainly in traditional media such as prints or paintings, sculptural or mechanical, with a few electro-mechanical items, and one example of laser art.

Only the last two items in the exhibition were examples of computer graphics, the first of which was a digitized and pixelated image of a reclining nude, entitled "The Nude," executed in 1966 by Leon D. Harmon and Kenneth C. Knowlton, researchers at Bell Labs. "Knowlton relates how they tossed a coin to determine who would be listed in the museum catalogue as the 'artist' (Harmon) and as the 'engineer' (Knowlton)" (Noll, First Hand: Early Digital Art at Bell Telephone Laboratories, Inc., accessed 01-19-2014). Creation of "The Nude," as shown in a five-by-twelve-foot enlargement, was discussed in a New York Times article by Henry R. Lieberman entitled "Art and Science Proclaim Alliance in Avant-Garde Loft," October 11, 1967.

That the show took place only a month after the pioneering computer art show, Cybernetic Serendipity, closed in London, was probably a coincidence.

The design and production of the catalogue were unusually excellent, including a very striking binding of aluminum sheeting with a stamped enamel-painted design of the MOMA building on the upper cover.

In January 2014 all the press release documents, including detailed information about the art exhibited, were available from the Museum of Modern Art website.

View Map + Bookmark Entry

Lloyd Sumner Issues The First Monograph by a Computer Artist December 1968

Cover of Computer Art and Human Response by Lloyd Sumner.

In 1968 print dealer and book collector Paul B. Victorius of Charlottesville, Virginia published Computer Art and Human Response by computer artist Lloyd Sumner. This oblong 8vo of 96 pages, dedicated "To my good friends the Burroughs B5500 and the Calcomp 565," appears to be the first monograph by a computer artist, and because it explains techniques, it is probably the first book on how to produce computer art, since William Fetter's 1965 book, Computer Graphics in Communication, was focused on computer graphics used in engineering.

Sumner's book is extensively illustrated with numerous plotter images output on the Calcomp 565, several of which are reproduced in color, making it one of the earliest books exclusively illustrated with computer graphics. It was probably published in December 1968. The text refers to Sumner's participation in the Cybernetic Serendipity show held in London from August to October 1968, noting that "over 50,000 people" attended that show, and the introduction to the book by the president of the University of Virginia is dated August 1968. Two presentation copies in my collection are dated December 12 and 13 respectively.

View Map + Bookmark Entry

Douglas Engelbart Demonstrates Hypertext, Text Editing, Windows, Email and a Mouse: "The Mother of All Demos" December 9, 1968

On December 9, 1968 Douglas Engelbart of the Stanford Research Institute, Menlo Park, California, presented a 100-minute demonstration at the Fall Joint Computer Conference in San Francisco of an "oNLine System" (NLS), the features of which included hypertext, text editing, screen windowing, and email. To make this system operate, Engelbart used the mouse, which he had invented several years earlier.

In December 2013 numerous still images, a complete video stream of the demo, and 35 brief flash streaming video clips of different segments were available from the Engelbart Collection at Stanford University.

View Map + Bookmark Entry

The First Manned Apollo Flights Occur December 24, 1968

The first manned Apollo flights occurred in late 1968: Apollo 7 in October, and Apollo 8, launched from the Kennedy Space Center on December 21, which orbited the moon on Christmas Eve.

View Map + Bookmark Entry

The First ATM is Installed at Chemical Bank in New York Circa 1969 – 1970

In 1969 or 1970 the first automatic teller machine (ATM) was installed. Dates conflict as to whether this was in 1969 or slightly later. The first machine installed at Chemical Bank in New York may have been only a cash dispenser.

View Map + Bookmark Entry

Kenneth Thompson & Dennis Ritchie Develop UNIX, Making Open Systems Possible 1969

In 1969 Kenneth Thompson and Dennis Ritchie developed the UNIX operating system at Bell Labs. This was the first operating system designed to run on computers of all sizes, making open systems possible. UNIX became the foundation for the Internet.

View Map + Bookmark Entry

32,393 New Books Are Published in the U.K. 1969

In 1969 32,393 new books were published in the United Kingdom.

View Map + Bookmark Entry

Filed under: Book History, Publishing

CompuServe, the First Commercial Online Service, is Founded 1969

In 1969 CompuServe was founded in Columbus, Ohio, as a way to generate income from Golden United Life Insurance's mainframe computers during non-business hours. CompuServe became the first commercial online service in the United States.

View Map + Bookmark Entry

Willard Boyle & George Smith Develop the CCD, a Sensor for Recording Images 1969

Working at Bell Labs, in 1969 Willard Boyle and George E. Smith invented the charge-coupled device (CCD), a sensor for recording images.

Forty years later, in 2009, Boyle and Smith shared half of the Nobel Prize in Physics "for the invention of an imaging semiconductor circuit – the CCD sensor." The Nobel Prize Committee prepared a report putting the discovery of the CCD in perspective. It may be accessed at http://nobelprize.org/nobel_prizes/physics/laureates/2009/phyadv09.pdf

"The lab [Bell Labs] was working on the picture phone and on the development of semiconductor bubble memory. Merging these two initiatives, Boyle and Smith conceived of the design of what they termed 'Charge "Bubble" Devices'. The essence of the design was the ability to transfer charge along the surface of a semiconductor. As the CCD started its life as a memory device, one could only "inject" charge into the device at an input register. However, it was immediately clear that the CCD could receive charge via the photoelectric effect and electronic images could be created. By 1969, Bell researchers were able to capture images with simple linear devices; thus the CCD was born. Several companies, including Fairchild Semiconductor, RCA and Texas Instruments, picked up on the invention and began development programs. Fairchild was the first with commercial devices and by 1974 had a linear 500 element device and a 2-D 100 x 100 pixel device. Under the leadership of Kazuo Iwama, Sony also started a big development effort on CCDs involving a significant investment. Eventually, Sony managed to mass produce CCDs for their camcorders. Before this happened, Iwama died in August 1982. Subsequently, a CCD chip was placed on his tombstone to acknowledge his contribution" (Wikipedia article on Charge-coupled device, accessed 10-06-2009).

View Map + Bookmark Entry

Gary Starkweather at Xerox Invents the Laser Printer 1969 – 1971

In 1969-1971 American engineer Gary Starkweather, working at a Xerox research facility in Webster, New York, and later at Xerox PARC, invented the laser printer by combining a laser, a xerographic copier, and a Research Character Generator (RCG) that converted digital information to a form readable by a laser. By 1971 or 1973 (sources vary on this point), the first perfected version of Starkweather's invention could print two pages per second at a resolution of 300 dpi. However, Xerox did not market a laser printer until 1977, when it offered the Xerox 9700, the first commercially available stand-alone laser printer. Prior to this, in 1976, IBM produced the IBM 3800 as a peripheral to computer systems.

Reilly, Milestones in Computer Science and Information Technology, p. 152.

View Map + Bookmark Entry

IBM Introduces the Generalized Markup Language (GML) Circa 1969

Around 1969 IBM introduced the Generalized Markup Language, GML, developed by Charles Goldfarb, Edward Mosher and Raymond Lorie, whose surname initials were used by Goldfarb to make up the term GML. GML was "a set of macros that implemented intent-based markup tags for the IBM text formatter, SCRIPT. SCRIPT was the main component of IBM's Document Composition Facility (DCF). A starter set of tags in GML was provided with the DCF product.

"GML simplifies the description of a document in terms of its format, organization structure, content parts and their relationship, and other properties. GML markup (or tags) describes such parts as chapters, important sections, and less important sections (by specifying heading levels), paragraphs, lists, tables, and so forth." (Wikipedia article on IBM Generalized Markup Language, accessed 12-21-2008).
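The tagging style described above can be suggested with a short fragment written in the spirit of the GML starter set. This is a reconstruction for illustration only, not a verbatim historical sample; the tag names follow the starter set's convention of colon-prefixed, period-terminated tags such as :h1. and :p., with the formatter, not the author, deciding presentation:

```
:h1.The Shape of a GML Document
:p.Tags name what a part of the document is,
not how it should be rendered; the SCRIPT
formatter decides the presentation.
:ul.
:li.chapters and sections (heading levels)
:li.paragraphs
:li.lists and tables
:eul.
```

This intent-based approach, describing structure rather than appearance, is the idea GML passed on to its descendants SGML, HTML, and XML.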

View Map + Bookmark Entry

Konrad Zuse Issues "Rechnender Raum," the First Book on Digital Physics 1969

In 1969 German engineer and computer designer Konrad Zuse published Rechnender Raum, translated into English in 1970 under the title Calculating Space. It was the first book on digital physics.

"Zuse proposed that the universe is being computed in real time on some sort of cellular automata or other discrete computing machinery, challenging the long-held view that some physical laws are continuous by nature. He focused on cellular automata as a possible substrate of the computation, and pointed out (among other things) that the classical notions of entropy and its growth do not make sense in deterministically computed universes.

"Bell's theorem is sometimes thought to contradict Zuse's hypothesis, but it is not applicable to deterministic universes, as Bell himself pointed out. Similarly, while Heisenberg's uncertainty principle limits in a fundamental way what an observer can observe, when the observer is himself a part of the universe he is trying to observe, that principle does not rule out Zuse's hypothesis, which views any observer as a part of the hypothesized deterministic process. So far there is no unambiguous physical evidence against the possibility that "everything is just a computation," and a fair bit has been written about digital physics since Zuse's book appeared" (Wikipedia article on Calculating Space, accessed 05-16-2009).
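Zuse's central idea, that physical law could be a deterministic, local update rule running on discrete cells, can be illustrated with a minimal one-dimensional cellular automaton. This is a generic sketch, not Zuse's own formalism; the rule numbering follows the later Wolfram convention for elementary automata:

```python
# A one-dimensional cellular automaton: each cell's next state is a
# deterministic function of its local neighborhood -- the kind of
# discrete, local update rule Zuse proposed as a substrate for physics.

def step(cells, rule=110):
    """Apply an elementary CA rule (Wolfram numbering) to a ring of 0/1 cells."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[(i - 1) % n]          # wrap around at the edges
        center = cells[i]
        right = cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right  # neighborhood as 3-bit index
        out.append((rule >> idx) & 1)              # look up next state in rule byte
    return out

# Evolve a single live cell deterministically for a few generations.
row = [0] * 10 + [1] + [0] * 10
for _ in range(5):
    row = step(row)
```

The point of the sketch is Zuse's determinism: given the same initial row and rule, every rerun computes exactly the same history, with no continuous quantities anywhere.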

View Map + Bookmark Entry

EMS Produces the First Digital Sampler in the First Digital Music Studio Circa 1969

"The first digital sampler was the EMS (Electronic Music Studios) Musys system developed by Peter Grogono (software), David Cockerell (hardware and interfacing) and Peter Zinovieff (system design and operation) at their London (Putney) Studio c. 1969. The system ran on two mini-computers, a pair of Digital Equipment’s PDP-8s. These had the tiny memory of 12,000 (12k) bytes, backed up by a hard drive of 32k and by tape storage (DecTape)—all of this absolutely minuscule by today’s standards. Nevertheless, the EMS equipment was used as the world’s first music sampler and the computers were used to control the world's first digital studio" (Wikipedia article on Sampler (musical instrument), with hyperlinks that I added, accessed 08-29-2009).

View Map + Bookmark Entry

Houghton Mifflin Issues the First Dictionary Based on Corpus Linguistics 1969

In 1969 Houghton Mifflin of Boston published The American Heritage Dictionary of the English Language.

"The AHD broke ground among dictionaries by using corpus linguistics for compiling word-frequencies and other information. It took the innovative step of combining prescriptive information (how language should be used) and descriptive information (how it actually is used). The descriptive information was derived from actual texts. Citations were based on a million-word, three-line citation database [the Brown Corpus] prepared by Brown University linguist Henry Kucera" (Wikipedia article on The American Heritage Dictionary of the English Language, accessed 06-07-2010).
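The corpus-based method, deriving word frequencies from actual running text rather than from editorial judgment alone, can be sketched in a few lines. This is an illustrative toy, not the Brown Corpus methodology, and the tokenizer regex is a deliberate simplification:

```python
# Toy illustration of corpus-linguistics word-frequency compilation:
# count case-folded word tokens in a text sample.
from collections import Counter
import re

def word_frequencies(text):
    """Return a Counter of lowercase word tokens (apostrophes allowed)."""
    tokens = re.findall(r"[a-z]+(?:'[a-z]+)?", text.lower())
    return Counter(tokens)

sample = "The dictionary describes how language is used; usage shapes the dictionary."
freqs = word_frequencies(sample)
```

Scaled up to a million-word corpus, counts like these are what let the AHD editors ground usage notes in how the language is actually written.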

View Map + Bookmark Entry

Heinz von Foerster Issues the First Book on Computer Music to Include Recordings of Compositions 1969

In 1969 Austrian American physicist and philosopher Heinz von Foerster (born Heinz von Förster), director of the Biological Computer Laboratory at the University of Illinois at Urbana-Champaign, and James W. Beauchamp published Music by Computers. This was probably the first book on computer music to include recordings of actual compositions: four analog sound recordings (33-1/3 RPM) on thin flexible vinyl were included in a pocket on the inside back cover.

Hook & Norman, Origins of Cyberspace (2001) no. 608.

View Map + Bookmark Entry

David Gregg & James Russell Invent the Laserdisc 1969 – December 15, 1978

The Laserdisc (videodisc) was invented by David Paul Gregg, who conceived the transparent-disc "Videodisk" in 1958, and independently by James Russell in 1965. Philips Electronics enhanced the technology in 1969 by using a videodisc in reflective mode. Music Corporation of America (MCA), the purchaser of Gregg's patents, and Philips first publicly demonstrated the videodisc in 1972. They first brought the technology to market in Atlanta, Georgia on December 15, 1978, with the MCA DiscoVision release of the 1975 thriller film Jaws.

Laserdisc technology later became the basis for compact discs (CDs).

View Map + Bookmark Entry

The Datapoint 2200: Precursor of the Personal Computer and the Microprocessor 1969 – 1971

In 1971 Phil Ray and Gus Roche of Computer Terminal Corporation of San Antonio, Texas, later known as Datapoint Corporation, began shipping the Datapoint 2200, a mass-produced programmable terminal, which could be used as a simple stand-alone personal computer.

"It was intended by its designers simply to be a versatile, cost-efficient terminal for connecting to a wide variety of mainframes by loading various terminal emulations from tape rather than being hardwired as most terminals were. However, enterprising users in the business sector (including Pillsbury Foods) realized that this so-called 'programmable terminal' was equipped to perform any task a simple computer could, and exploited this fact by using their 2200s as standalone computer systems. Equally significant is the fact that the terminal's multi-chip CPU (processor) became the embryo of the x86 architecture upon which the original IBM PC and its descendants are based.

"Aside from being one of the first personal computers, the Datapoint 2200 has another connection to computer history. Its original design called for a single-chip 8-bit microprocessor for the CPU, rather than a conventional processor built from discrete TTL modules. In 1969, CTC contracted two companies, Intel and Texas Instruments, to make the chip. TI was unable to make a reliable part and dropped out. Intel was unable to make CTC's deadline. Intel and CTC renegotiated their contract, ending up with CTC keeping its money and Intel keeping the eventually completed processor.

"CTC released the Datapoint 2200 using about 100 discrete TTL components (SSI/MSI chips) instead of a microprocessor, while Intel's single-chip design, eventually designated the Intel 8008, was finally released in April 1972. The 8008's seminal importance lies in its becoming the ancestor of Intel's other 8-bit CPUs, which were followed by their assembly language compatible 16-bit CPU's—the first members of the x86-family, as the instruction set was later to be known. Thus, CTC's engineers may be said to have fathered the world's most commonly used and emulated instruction set architecture from the mid-1980s to date" (Wikipedia article on Datapoint 2200, accessed 09-12-2012).

View Map + Bookmark Entry

The First Full-Length Audio Recording of a Book 1969

The first full-length audio recording of a book was a 1969 reading of The Autobiography of Benjamin Franklin by American voice actor Michael Rye. It included excerpts from the autobiography, Poor Richard's Almanack, The Dogood Papers, and other shorter works. 

View Map + Bookmark Entry

Derek Austin Develops the PRECIS Preserved Context Index System 1969 – 1984

In 1969 British librarians Derek Austin and Peter Butcher issued PRECIS: A rotated subject index system, published by the Council of the British National Bibliography. This appears to be the first published report on an innovative method for adding subject data in the form of descriptors to the computerized MARC record. The 87-page report, containing a theoretical discussion and many specific examples, was followed by an undated 17-page "Supplement to PRECIS - A Rotated Subject Index System." The new system was applied to the British National Bibliography. 

Austin followed the 1969 report with an expanded book entitled PRECIS: A manual of concept analysis and subject indexing (1974). An expanded version of this was issued by the British Library Bibliographic Services Division in 1984. I have reviewed copies of the 1969 and 1984 publications.

According to Austin's obituary, quoted in the Wikipedia article on Derek Austin, which I accessed in October 2016, Austin's "aim was to create an indexing system that would liberate indexers from the constraints of 'relative significance' (main entries). ...As by-products of his indexing theories he worked out drafts that in the mid-1980s were accepted as British and International Standards for examining documents, and for establishing multilingual and monolingual thesauri". PRECIS was an example of the application of syntactical devices in indexing. It was replaced at the British National Bibliography by COMPASS in 1996, which was later replaced by Library of Congress Subject Headings.
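The central idea of a "rotated" subject index, which PRECIS refined with its context-preserving rules, can be illustrated with a minimal sketch: each significant term of a subject string takes a turn in the lead position while the remaining terms supply its context. The subject string and stop-word list below are purely illustrative examples, not drawn from PRECIS itself, and this sketch omits the role-operator syntax that distinguished PRECIS from simpler rotation schemes.

```python
# Minimal sketch of a rotated (KWIC-style) subject index.
# Illustrative only: PRECIS used a richer system of role operators
# to preserve context on each rotation.

STOP_WORDS = {"of", "in", "the", "for", "and"}

def rotated_entries(subject):
    """Yield one index entry per significant term, rotated to the lead position."""
    words = subject.split()
    for i, word in enumerate(words):
        if word.lower() in STOP_WORDS:
            continue  # function words never lead an entry
        # Lead term first, then the remaining words as context.
        context = words[:i] + words[i + 1:]
        yield (word, " ".join(context))

for lead, context in rotated_entries("Indexing of library catalogues"):
    print(f"{lead}: {context}")
# Produces three entries, led in turn by "Indexing", "library", and "catalogues".
```

A reader can thus locate the subject under any of its significant terms, which is what made rotation attractive for machine-produced indexes such as the British National Bibliography's.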

View Map + Bookmark Entry

Steve Crocker Embodies Peer to Peer Architecture (P2P) as One of the Key Concepts of the ARPANET April 7, 1969

In Network Working Group Request for Comment: 1 issued on April 7, 1969 Steve Crocker at UCLA embodied peer to peer architecture (P2P) as one of the key concepts of the ARPANET.

View Map + Bookmark Entry

Jerry Sanders and Colleagues from Fairchild Semiconductor Found AMD May 1, 1969

Advanced Micro Devices (AMD) was founded by Jerry Sanders and seven others from Fairchild Semiconductor on May 1, 1969. It began operations as a producer of logic chips.

View Map + Bookmark Entry

A Problem with the Apollo 11 Guidance Computer Nearly Prevents the First Moon Walk July 21, 1969

On July 21, 1969 Neil Armstrong, commander of the Apollo 11 lunar landing mission, and Edwin "Buzz" Aldrin, lunar module pilot, became the first human beings to walk on the moon. A Saturn V rocket launched the Command Module, Service Module ("Columbia") and Lunar Module ("Eagle") from the Kennedy Space Center Launch Complex 39 in Merritt Island, Florida.

The moon landing was almost canceled in the final seconds because of an overload of the Apollo Guidance Computer’s memory, but on advice from Earth, Armstrong and Aldrin ignored the warnings and landed safely. The Apollo Guidance Computer was the first recognizably modern embedded system used in real-time by astronaut pilots.

View Map + Bookmark Entry

Leonard Kleinrock Establishes the First ARPANET Node and the First Network Connection August 30, 1969

On August 30, 1969 the first ARPANET node was installed at the UCLA Network Measurement Center, where Leonard Kleinrock established the first network connection between a network packet switch called an Interface Message Processor, ancestor of today's routers, and a time-shared host computer. (See Reading 13.7.)

View Map + Bookmark Entry

Charley Kline Sends the First Message Over the ARPANET October 29, 1969

The first message sent over the ARPANET was from Leonard Kleinrock’s UCLA computer by student programmer Charley Kline at 10:30 pm on October 29, 1969, to the second node at Stanford Research Institute’s computer in Menlo Park, California.

The message was simply “Lo” instead of the intended word, “login.”

"The message text was the word login; the l and the o letters were transmitted, but the system then crashed. Hence, the literal first message over the ARPANET was lo. About an hour later, having recovered from the crash, the SDS Sigma 7 computer effected a full login" (Wikipedia article on Arpanet, accessed 12-26-2012).

View Map + Bookmark Entry

The First Four Nodes on the ARPANET December 5, 1969

By December 5, 1969 the ARPANET consisted of four nodes:

1. University of California, Los Angeles (UCLA), where Leonard Kleinrock had established a Network Measurement Center. 

2. The Stanford Research Institute's Augmentation Research Center, where Douglas Engelbart had created the ground-breaking NLS system, a very important early hypertext system.

3. University of California, Santa Barbara (UCSB), Culler-Fried Interactive Mathematics Center. 

4. The University of Utah's Computer Science Department, where Ivan Sutherland had moved.

View Map + Bookmark Entry