4406 entries. 94 themes. Last updated December 26, 2016.

1950 to 1960 Timeline


Jule Charney, Ragnar Fjørtoft & John von Neumann Report the First Weather Forecast by Electronic Computer 1950

In 1950 meteorologist Jule Charney, Norwegian meteorologist Ragnar Fjørtoft, and mathematician John von Neumann of Princeton's Institute for Advanced Study published “Numerical Integration of the Barotropic Vorticity Equation,” Tellus 2 (1950) 237-254. The paper reported the first weather forecast by electronic computer. It took twenty-four hours of processing time on the ENIAC to calculate a twenty-four hour forecast.

"As a committed opponent of Communism and a key member of the WWII-era national security establishment, von Neumann hoped that weather modeling might lead to weather control, which might be used as a weapon of war. Soviet harvests, for example, might be ruined by a US-induced drought.

"Under grants from the Weather Bureau, the Navy, and the Air Force, he assembled a group of theoretical meteorologists at Princeton's Institute for Advanced Study (IAS). If regional weather prediction proved feasible, von Neumann planned to move on to the extremely ambitious problem of simulating the entire atmosphere. This, in turn, would allow the modeling of climate. Jule Charney, an energetic and visionary meteorologist who had worked with Carl-Gustaf Rossby at the University of Chicago and with Arnt Eliassen at the University of Oslo, was invited to head the new Meteorology Group.

"The Meteorology Project ran its first computerized weather forecast on the ENIAC in 1950. The group's model, like [Lewis Fry] Richardson's, divided the atmosphere into a set of grid cells and employed finite difference methods to solve differential equations numerically. The 1950 forecasts, covering North America, used a two-dimensional grid with 270 points about 700 km apart. The time step was three hours. Results, while far from perfect, justified further work" (Paul N. Edwards [ed], Atmospheric General Circulation Modeling: A Participatory History, accessed 04-26-2009).

As Charney, Fjørtoft, and von Neumann reported:

"It may be of interest to remark that the computation time for a 24-hour forecast was about 24 hours, that is, we were just able to keep pace with the weather. However, much of this time was consumed by manual and I.B.M. operations, namely by the reading, printing, reproducing, sorting and interfiling of punch cards. In the course of the four 24 hour forecasts about 100,000 standard I.B.M. punch cards were produced and 1,000,000 multiplications and divisions were performed. (These figures double if one takes account of the preliminary experimentation that was carried out.) With a larger capacity and higher speed machine, such as is now being built at the Institute for Advanced Study, the non-arithmetical operations will be eliminated and the arithmetical operations performed more quickly. It is estimated that the total computation time with a grid of twice the Eniac-grids density, will be about 1/2 hour, so that one has reason to hope that RICHARDSON'S dream (1922) of advancing the computation faster than the weather may soon be realized, at least for a two-dimensional model. Actually we estimate on the basis of the experiences acquired in the course of the Eniac calculations, that if a renewed systematic effort with the Eniac were to be made, and with a thorough routinization of the operations, a 24-hour prediction could be made on the Eniac in as little as 12 hours." (pp. 274-75).
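
The scheme described above (grid cells plus finite differences marched forward in discrete time steps) can be illustrated with a toy one-dimensional advection model. This is a sketch of the general technique only, not the barotropic vorticity equation the 1950 forecasts actually solved; the grid size, wind speed, and time step are arbitrary values chosen so the scheme is stable.

```python
# Toy 1D advection by finite differences (first-order upwind scheme).
# Illustrates the grid-plus-time-step idea behind the 1950 forecasts;
# the real model solved the 2D barotropic vorticity equation.

def advect(field, wind, dx, dt, steps):
    """Advance a 1D periodic field carried by a constant positive wind."""
    c = wind * dt / dx          # Courant number; must be <= 1 for stability
    assert 0 <= c <= 1
    f = list(field)
    n = len(f)
    for _ in range(steps):
        prev = f[:]
        for i in range(n):
            # upwind difference: information arrives from the left
            f[i] = prev[i] - c * (prev[i] - prev[i - 1])
    return f

# A bump of 'vorticity' drifting to the right on a periodic grid.
initial = [0.0] * 10
initial[2] = 1.0
result = advect(initial, wind=1.0, dx=1.0, dt=1.0, steps=3)
```

With a Courant number of exactly 1 the upwind scheme shifts the field one cell per step, so the bump ends up three cells to the right after three steps.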


"High-Speed Computing Devices," the First Textbook on How to Build an Electronic Computer 1950

In 1950 Engineering Research Associates of St. Paul, Minnesota, published High-Speed Computing Devices, the first textbook on how to build an electronic digital computer. Written in the form of a “cookbook,” the book described available computer components and how they worked. It included extensive bibliographies of the American computing literature and some of the English literature as well.


Wilkes, Wheeler & Gill: the First Treatise on Software for an Operational Stored-Program Computer 1950 – 1951

In 1950 Maurice Wilkes, David Wheeler, and Stanley Gill of Cambridge University issued Report on the Preparation of Programmes for the EDSAC and the Use of the Library of Subroutines. This dittoed document, published for private distribution in a very small number of copies, was the first treatise on software written for an operational stored-program computer. The book described “assemblers” and “subroutines”—segments of programs that are frequently used, so they can be kept in “libraries” and reused as needed in many software applications. The Cambridge group thus introduced the concept of reusable code, one of the principal tools for reducing software bugs and improving the productivity of programmers.

In 1951 this work was published as a conventional hard-cover book, with some changes and a new title by the American publishers Addison-Wesley, coincidentally in Cambridge, Massachusetts. The Preparation of Programs for an Electronic Digital Computer, with special reference to the EDSAC and the use of a library of subroutines was the first conventionally published book on software. (See Reading 9.4.)



Compiling a Bibliography by Electric Punched Card Tabulating 1950

In 1950 the Library of Congress announced plans to compile the Union List of Serials using electric punched card tabulating.


The Hamming Codes 1950

In 1950 Richard W. Hamming of Bell Labs and the City College of New York published "Error Detecting and Error Correcting Codes," Bell System Technical Journal 29 (1950) 147-160.
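
Hamming's codes let a receiver not merely detect a single-bit error but locate and correct it. A minimal sketch of the classic Hamming(7,4) construction, with parity bits at the power-of-two positions:

```python
# Hamming(7,4): 4 data bits -> 7-bit codeword; corrects any single-bit error.
# Parity bits sit at positions 1, 2, 4 (1-indexed); each covers the
# positions whose binary index contains that power of two.

def encode(d):                      # d: list of 4 data bits
    c = [0] * 8                     # index 0 unused; positions 1..7
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]
    c[2] = c[3] ^ c[6] ^ c[7]
    c[4] = c[5] ^ c[6] ^ c[7]
    return c[1:]

def correct(w):                     # w: 7-bit received word
    c = [0] + list(w)
    s = ((c[4] ^ c[5] ^ c[6] ^ c[7]) * 4
         + (c[2] ^ c[3] ^ c[6] ^ c[7]) * 2
         + (c[1] ^ c[3] ^ c[5] ^ c[7]))
    if s:                           # syndrome = position of the flipped bit
        c[s] ^= 1
    return c[1:]

word = encode([1, 0, 1, 1])
word[2] ^= 1                        # flip one bit in transit
assert correct(word) == encode([1, 0, 1, 1])
```

The syndrome bits spell out, in binary, the index of the corrupted position, which is why the parity checks are arranged on powers of two.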


11,638 New Books Are Published in the U.K. 1950

In 1950 11,638 new books were published in the United Kingdom.


After 1954 More News Was Distributed Electronically than on Paper 1950

According to Asa Briggs's The History of Broadcasting in the United Kingdom, Vol. 4, p. 524, newspaper circulation in Britain as a distribution medium for news reached its peak between 1950 and 1954. Thereafter more news was distributed over radio and television than through print.


The IBM NORC, the First Supercomputer 1950 – 1954

Between 1950 and 1954 IBM developed and built the Naval Ordnance Research Calculator (NORC) for the U.S. Navy Bureau of Ordnance at Columbia University's Watson Scientific Computing Laboratory, 612 West 115th Street. The NORC was the "first supercomputer," and "the most powerful computer on earth from 1954 to about 1963." The NORC's multiplication unit remains the fastest ever built with vacuum tube technology.

IBM introduced the input-output channel as a feature on the NORC. This innovation synchronized the flow of data into and out of the computer while computation was in progress, relieving the central processor of that task.
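
The idea of the channel (data moving in or out of memory while the processor keeps computing) can be sketched with a background thread standing in for the channel hardware. This is a loose modern analogy, not a model of NORC's actual circuitry:

```python
# Sketch of I/O overlapped with computation; a thread plays the role of
# NORC's input-output channel (an analogy, not the real hardware).
import threading, queue

buffer = queue.Queue()

def channel(records):
    """The 'channel' moves data into memory independently of the CPU."""
    for r in records:
        buffer.put(r)
    buffer.put(None)                # end-of-data marker

io = threading.Thread(target=channel, args=(range(5),))
io.start()

total = 0
while True:                         # the 'CPU' computes as data arrives
    r = buffer.get()
    if r is None:
        break
    total += r * r
io.join()
```

The point of the design is that the processor never idles waiting for a full input pass to finish; it consumes records as the channel delivers them.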


The Bic Pen 1950

After purchasing the patent for the ballpoint pen from László Bíró (who had been producing ballpoints in Argentina since 1943), in 1950 Marcel Bich produced the very inexpensive Bic Cristal in Clichy, Hauts-de-Seine, France.

"A Bic Cristal ballpoint pen contains enough ink to draw a continuous line up to two miles (3.2 km) long. In 2005, Bic sold its hundred billionth ballpoint pen - enough ink to draw a line to Pluto and back more than 20 times."


Archival Records Include "Machine-Readable Materials" 1950

The Federal Records Act of 1950 expanded the definition of "record" to include "machine-readable materials." At this time machine-readable records consisted primarily of punched cards.


Coining the Expression, Information Retrieval 1950

In 1950 American computer scientist Calvin Mooers coined the expression information retrieval in Zator Technical Bulletin No. 48 (1950), a publication of the Cambridge, Massachusetts-based Zator Co., which Mooers founded in 1947. Mooers defined the phrase: "The requirements of information retrieval, of finding information whose location or very existence is a priori unknown...." (http://www.garfield.library.upenn.edu/commentaries/tsv11(06)p09y19970317.pdf, accessed 01-16-2010).

Mooers's Zator Co. was probably the first information retrieval service company.

Schmieder's Bach-Werke-Verzeichnis 1950

In 1950 German musicologist Wolfgang Schmieder, Special Advisor for Music for the City and University library at Johann Wolfgang Goethe University of Frankfurt am Main, published the Thematisch-systematisches Verzeichnis der musikalischen Werke von Johann Sebastian Bach (Thematic-systematic catalogue of musical works of Johann Sebastian Bach). The numbering system by which Schmieder organized Johann Sebastian Bach's compositions became known as the Bach-Werke-Verzeichnis, with the numbers Schmieder assigned to each work taking on the prefix BWV.


The Earliest Pioneer in Electronic Art 1950 – 1953

In 1950 American draftsman, graphic artist and mathematician Benjamin (Ben) F. Laposky of Cherokee, Iowa, first used a cathode ray oscilloscope with sine wave generators and various other electrical and electronic circuits to create abstract art, which he called "electrical compositions." He recorded the electrical vibrations shown on the oscilloscope screen, which included Lissajous figures, by still photography. Some of Laposky's images were published in Scripta Mathematica in 1952.

In 1953 Laposky exhibited fifty images that he called "Oscillons" (or oscillogram designs) at the Sanford Museum in Cherokee, Iowa. To record this exhibition and Laposky's statements of his artistic philosophy the museum published an exhibition catalogue entitled electronic abstractions. Because of this exhibition Laposky is credited as the earliest pioneer in electronic art, more specifically in the analog vector medium. He never programmed computers to create images.

A version of Laposky's electronic abstractions show was exhibited across the United States, in France at Le Mans, and in other places by the Cultural Relations Section of the United States from 1953 to 1961.

In later work Laposky incorporated motorized rotating filters of variable speed to color the patterns, recording the images by color photography.

Herzogenrath & Nierhoff-Wielk, Ex Machina–Frühe Computergrafik bis 1979. Ex Machina-Early Computer Graphics up to 1979 (2007) 229.


"Chargaff's Rules" 1950

In 1950 Austrian-American biochemist Erwin Chargaff of Columbia University reported his observation, from analyses of different DNAs, that DNA from any cell of any organism has a 1:1 ratio of pyrimidine and purine bases and, more specifically, that the amount of guanine is equal to that of cytosine and the amount of adenine is equal to that of thymine (base-pair equality). Watson and Crick's model of the structure of DNA confirmed Chargaff's rules.

Chargaff, "Chemical Specificity of Nucleic Acids and the Mechanism of their Enzymatic Degradation," Experientia (Basel) 6 (1950) 201-9.
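
Base-pair equality follows mechanically from complementary pairing: every A on one strand faces a T on the other, and every G a C, so counts taken across both strands of a duplex must balance. A quick check with an arbitrary example strand:

```python
# Chargaff's rules hold for any duplex: count bases across BOTH strands.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def duplex_counts(strand):
    """Base counts over a strand plus its complementary strand."""
    other = "".join(PAIR[b] for b in strand)
    both = strand + other
    return {b: both.count(b) for b in "ATGC"}

counts = duplex_counts("ATTGCCGAA")   # arbitrary example strand
assert counts["A"] == counts["T"] and counts["G"] == counts["C"]
```

Whatever strand is chosen, the duplex totals of A and T, and of G and C, come out equal, which is exactly what Chargaff observed chemically.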


"Newspaper Story": a Film About Newspaper Production 1950

In 1950 Encyclopaedia Britannica Films produced Newspaper Story, a film about newspaper production from news gathering to the finished product.


"Can Man Build a Superman?" January 23, 1950

The cover by Boris Artzybasheff on the January 23, 1950 issue of TIME Magazine depicted the Harvard Mark III, a partly electronic and partly electromechanical computer, as a Naval officer in Artzybasheff's "bizarrely anthropomorphic" style. The caption under the image read, "Mark III. Can Man Build a Superman?" The cover story of the magazine was entitled "The Thinking Machine."

The Mark III, delivered to the U.S. Naval Proving Ground at Dahlgren, Virginia, in March 1950, operated at 250 times the speed of the Harvard Mark I (1944).

Among its interesting elements, the TIME article included an early use of the word "computer" for machines rather than people. The review of Wiener's Cybernetics published in TIME in December 1948 had referred to the machines as calculators.

"What Is Thinking? Do computers think? Some experts say yes, some say no. Both sides are vehement; but all agree that the answer to the question depends on what you mean by thinking.

"The human brain, some computermen explain, thinks by judging present information in the light of past experience. That is roughly what the machines do. They consider figures fed into them (just as information is fed to the human brain by the senses), and measure the figures against information that is "remembered." The machine-radicals ask: 'Isn't this thinking?'

"Their opponents retort that computers are mere tools that do only what they are told. Professor [Howard] Aiken, a leader of the conservatives, admits that the machines show, in rudimentary form at least, all the attributes of human thinking except one: imagination. Aiken cannot define imagination, but he is sure that it exists and that no machine, however clever, is likely to have any."

"Nearly all the computermen are worried about the effect the machines will have on society. But most of them are not so pessimistic as [Norbert] Wiener. Professor Aiken thinks that computers will take over intellectual drudgery as power-driven tools took over spading and reaping. Already the telephone people are installing machines of the computer type that watch the operations of dial exchanges and tot up the bills of subscribers.

"Psychotic Robots. In the larger, "biological" sense, there is room for nervous speculation. Some philosophical worriers suggest that the computers, growing superhumanly intelligent in more & more ways, will develop wills, desires and unpleasant foibles of their own, as did the famous robots in Capek's R.U.R.

"Professor Wiener says that some computers are already "human" enough to suffer from typical psychiatric troubles. Unruly memories, he says, sometimes spread through a machine as fears and fixations spread through a psychotic human brain. Such psychoses may be cured, says Wiener, by rest (shutting down the machine), by electric shock treatment (increasing the voltage in the tubes), or by lobotomy (disconnecting part of the machine).

"Some practical computermen scoff at such picturesque talk, but others recall odd behavior in their own machines. Robert Seeber of I.B.M. says that his big computer has a very human foible: it hates to wake up in the morning. The operators turn it on, the tubes light up and reach a proper temperature, but the machine is not really awake. A problem sent through its sleepy wits does not get far. Red lights flash, indicating that the machine has made an error. The patient operators try the problem again. This time the machine thinks a little more clearly. At last, after several tries, it is fully awake and willing to think straight.

"Neurotic Exchange. Bell Laboratories' Dr. [Claude] Shannon has a similar story. During World War II, he says, one of the Manhattan dial exchanges (very similar to computers) was overloaded with work. It began to behave queerly, acting with an irrationality that disturbed the company. Flocks of engineers, sent to treat the patient, could find nothing organically wrong. After the war was over, the work load decreased. The ailing exchange recovered and is now entirely normal. Its trouble had been 'functional': like other hard-driven war workers, it had suffered a nervous breakdown" (quotations from http://www.time.com/time/magazine/article/0,9171,858601-7,00.html, accessed 03-05-2009).


"Diners Club", the First Credit Card February 1950

Early Diners' Club card.

Early advertisement for Diners' Club card.

In February 1950 the Diners Club issued the first "general purpose" credit card, invented by Diners Club founder Frank X. McNamara. At first the card allowed members to charge restaurant bills only.

"The first credit card charge was made on February 8, 1950, by Frank McNamara, Ralph Schneider and Matty Simmons at Major's Cabin Grill, a restaurant adjacent to their offices in the Empire State Building" (Wikipedia article on Diners Club International, accessed 02-28-2012).


Eckert-Mauchly is Sold to Remington Rand February 6, 1950

On February 6, 1950 Eckert-Mauchly Computer Corporation, the world's first electronic computer company, was sold to Remington Rand.


Shannon Issues the First Technical Paper on Computer Chess March 1950

In March 1950 Claude Shannon of Bell Labs, Murray Hill, New Jersey, published "Programming a Computer for Playing Chess," Philosophical Magazine, Ser. 7, 41, No. 314 (1950) 256-275. This was the first technical paper on computer chess; however, the paper was entirely theoretical, containing no reference to Shannon programming an actual computer to play a game.


Simon, the First Personal Computer May – November 1950

Edmund Berkeley's "Simon," which has been called the first personal computer, developed out of his book, Giant Brains, or Machines That Think, published in November 1949, in which he wrote,

“We shall now consider how we can design a very simple machine that will think... Let us call it Simon, because of its predecessor, Simple Simon... Simon is so simple and so small in fact that it could be built to fill up less space than a grocery-store box; about four cubic feet. . . . It may seem that a simple model of a mechanical brain like Simon is of no great practical use. On the contrary, Simon has the same use in instruction as a set of simple chemical experiments has: to stimulate thinking and understanding, and to produce training and skill. A training course on mechanical brains could very well include the construction of a simple model mechanical brain, as an exercise.”

One year later, in a November 1950 article about Simon in Scientific American, Berkeley predicted that “some day we may even have small computers in our homes, drawing energy from electric power lines like refrigerators or radios.”

"Who built "Simon"? The machine represents the combined efforts of a skilled mechanic, William A. Porter, of West Medford, Mass., and two Columbia University graduate students of electrical engineering, Robert A. Jensen . . . and Andrew Vall . . . . Porter did the basic construction, while Jensen and Vall took the machine when it was still not in working order and engineered it so that it functioned. Specifically, they designed a switching system that made possible the follow-through of a given problem; set up an automatic synchronizing system; installed a system for indicating errors due to loss of synchronization; re-designed completely the power supply of the machine" (Fact Sheet on "Simon." Public Information Office, Columbia University, May 18, 1950).

"The Simon's architecture was based on relays. The programs were run from a standard paper tape with five rows of holes for data. The registers and ALU could store only 2 bits. The data entry was made through the punched paper or by five keys on the front panel of the machine. The output was provided by five lamps. The punched tape served not only for data entry, but also as a memory for the machine. The instructions were carried out in sequence, as they were read from the tape. The machine was able to perform four operations: addition, negation, greater than, and selection" (Wikipedia article on Simon (computer), accessed 10-10-2011).
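
The four operations on Simon's 2-bit values can be sketched as follows. The precise semantics of Simon's negation and selection are not specified in the description above, so the choices here (two's-complement negation modulo 4, and selection as a flag-controlled choice between two inputs) are assumptions for illustration, not a faithful emulation:

```python
# Loose sketch of a 2-bit ALU with Simon's four operations.
# Negation and selection semantics are assumptions, not documented behavior.
MASK = 0b11                               # registers hold only 2 bits

def add(a, b):          return (a + b) & MASK   # wraps modulo 4
def negation(a):        return (-a) & MASK      # assumed two's complement
def greater_than(a, b): return 1 if a > b else 0
def selection(flag, a, b): return a if flag else b

assert add(0b11, 0b01) == 0b00            # 3 + 1 wraps to 0 in 2 bits
assert greater_than(0b10, 0b01) == 1
```

The wraparound in `add` is the direct consequence of the 2-bit register width the quotation mentions.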

In his 1956 article, "Small Robots-Report," Berkeley stated that he had spent $4000 developing Simon. The single machine that was constructed is preserved at the Computer History Museum, Mountain View, California. Berkeley also marketed engineering plans for Simon, of which 400 copies were sold.


MESM, the First Russian Stored-Program Computer November 6, 1950 – 1951

In 1951 Russian mathematician and computer scientist Sergei Lebedev had MESM, the first Russian stored-program computer, operational in Feofaniya (Ukrainian: Феофанія), Theophania, a suburb of Kiev.

"Work on MESM got going properly at the end of 1948 and, considering the challenges, the rate of progress was remarkable. Ukraine was still struggling to recover from the devastation of its occupation during WWII, and many of Kyiv’s buildings lay in ruins. The monastery in Feofania was among the buildings destroyed during the war, so the MESM team had to build their working quarters from scratch—the laboratory, metalworking shop, even the power station that would provide electricity. Although small—just 20 people—the team was extraordinarily committed. They worked in shifts 24 hours a day, and many lived in rooms above the laboratory. (You can listen to a lively account of this time in programme 3 of the BBC’s ”Electronic brains” series.) 

"MESM ran its first program on November 6, 1950, and went into full-time operation in 1951. In 1952, MESM was used for top-secret calculations relating to rocketry and nuclear bombs, and continued to aid the Institute’s research right up to 1957. By then, Lebedev had moved to Moscow to lead the construction of the next generation of Soviet supercomputers, cementing his place as a giant of European computing. As for MESM, it met a more prosaic fate—broken into parts and studied by engineering students in the labs at Kyiv’s Polytechnic Institute" (http://googleblog.blogspot.com/2011/12/remembering-remarkable-soviet-computing.html, accessed 12-25-2011)


The First Public Demonstration of Machine Translation Occurs 1951 – January 7, 1954

On January 7, 1954 the first public demonstration of a Russian-English machine translation system occurred in New York—a collaboration between IBM and Georgetown University. Brief statements about politics, law, mathematics, chemistry, metallurgy, communications and military affairs were submitted in Russian by Léon Dostert and linguists of the Georgetown University Institute of Languages and Linguistics, and within a few seconds a computer translated the sentences into English. This project, which began in 1951, was also probably the first non-numerical application of a digital computer. Programming and the demonstration were done on an IBM 701, the first stored-program digital computer that IBM put into production. IBM began producing the machine in December 1952.

Although the demonstration was only a small-scale experiment of just 250 words and six grammar rules, it raised expectations, which later proved quite unrealistic, of automatic systems capable of high-quality translation in the near future. The day after the demonstration the front pages of The New York Times and other major American newspapers carried reports of the first public demonstration of a computer for translating languages. Reports were syndicated in many provincial newspapers, and in the following months articles about it appeared in popular magazines. Some of the participants in the project claimed that within three to five years machine translation would be a solved problem. This encouraged governments to invest in computational linguistics.
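
At that scale the 1954 system was essentially dictionary lookup plus a handful of rules. The following toy in the same spirit uses an invented lexicon and a single invented reordering rule; it is not the Georgetown-IBM system's actual vocabulary or grammar:

```python
# Toy dictionary-based translator in the spirit of the 1954 demonstration.
# The lexicon, tags, and the single reordering rule are invented examples.
LEXICON = {"ugol": ("coal", "NOUN"), "tverdyi": ("hard", "ADJ")}

def translate(sentence):
    pairs = [LEXICON[w] for w in sentence.split()]
    # one 'grammar rule': English adjectives precede their noun
    for i in range(len(pairs) - 1):
        if pairs[i][1] == "NOUN" and pairs[i + 1][1] == "ADJ":
            pairs[i], pairs[i + 1] = pairs[i + 1], pairs[i]
    return " ".join(word for word, _ in pairs)

out = translate("ugol tverdyi")      # noun-adjective order in the source
```

With only word-for-word substitution plus a few such rules, fluent output is limited to carefully chosen sentences, which is why the demo's 250-word scope raised expectations it could not sustain.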

John Hutchins, "The First Public Demonstration of Machine Translation: the Georgetown-IBM System, 7th January 1954" (2006).

In December 2013 IBM's January 8, 1954 press release concerning the demonstration was available from IBM's online archive.


IBM's First Electronic Computer, the 701, is Designed 1951

In 1951 IBM decided to produce its first electronic computer, the 701, a machine for scientific applications based on the Princeton IAS design.


The First OCR System: "GISMO" 1951

In 1951 American inventor David Hammond Shepard, a cryptanalyst at AFSA, the forerunner of the U.S. National Security Agency (NSA), built "Gismo" in his spare time.

Gismo was a machine to convert printed messages into machine language for processing by computer— the first optical character recognition (OCR) system.

"IBM licensed the [OCR] machine, but never put it into production. Shepard designed the Farrington B numeric font now used on most credit cards. Recognition was more reliable on a simple and open font, to avoid the effects of smearing at gasoline station pumps. Reading credit cards was the first major industry use of OCR, although today the information is read magnetically from the back of the cards.

"In 1962 Shepard founded Cognitronics Corporation. In 1964 his patented 'Conversation Machine' was the first to provide telephone Interactive voice response access to computer stored data using speech recognition. The first words recognized were 'yes' and 'no' " (Wikipedia article on David H. Shepard, accessed 02-29-2012).


The Coordinate and Uniterm Indexing Systems 1951

The tremendous explosion of scientific literature during and after World War II overwhelmed manual indexing and retrieval methods. To meet the need for greater control over the mushrooming expansion of information American librarian Mortimer Taube developed the system (and later the theory) of Coordinate Indexing, and helped to establish its use as a major tool in automated library searches and documentation. Taube defined coordinate indexing as “the analysis of any field of information into a set of terms and the combination of these terms in any order to achieve any desired degree of detail in either indexing or selection."

Taube, "Coordinate Indexing of Scientific Fields." Paper delivered at the Symposium on Mechanical Aids to Chemical Documentation, Division of Chemical Literature, American Chemical Society, New York, Sept. 4, 1951. 

"Coordinate Indexing used 'uniterms' to make storing and retrieving information easier and faster. Uniterms 'constitute a special set of rules and requirements which makes both the analysis into terms and the combination of the terms in order to specify items of information a remarkably simple and efficient process.' Taube had split coordinate indexing into two categories, item and term indexing. It used punch cards and a machine reader to search for specific items or documents by terms or keywords...." (Wikipedia article on Mortimer Taube, accessed 03-02-2012).
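
In modern terms, uniterm coordinate indexing is an inverted index searched by set intersection: each uniterm card lists the numbers of the documents posted under that term, and a query "coordinates" terms by intersecting their lists. A minimal sketch over an invented three-document collection:

```python
# Uniterm coordinate indexing as set intersection over an inverted index.
# The tiny document collection here is invented for illustration.
docs = {
    1: "rocket fuel combustion",
    2: "rocket guidance systems",
    3: "fuel storage tanks",
}

# 'Uniterm cards': term -> set of document numbers posted under it
index = {}
for num, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(num)

def search(*terms):
    """Coordinate the terms: documents posted under ALL of them."""
    results = index.get(terms[0], set()).copy()
    for t in terms[1:]:
        results &= index.get(t, set())
    return sorted(results)

hits = search("rocket", "fuel")
```

Because any terms can be combined in any order at search time, the indexer never has to anticipate compound subject headings, which is the simplification Taube was selling.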

In 1952, with Gerald J. Sophar, Taube founded Documentation, Inc. There he developed automated data processing centers for handling scientific and technical information. In December 2013, when I wrote this entry, Documentation, Inc. was the earliest company of which I was aware that was formed specifically for the purpose of automating library and documentation searches. After the foundation of NASA in 1958 Documentation, Inc. operated a technical center for NASA which provided NASA with a combination abstract and citation journal with a machine-produced index covering NASA reports and acquisitions, regular bibliographies in certain areas of the aeronautic and space sciences, microfilm reproductions of NASA reports, a high-speed reference service, and magnetic copies of the index to the total NASA collection. The company, which eventually employed 700 people, also produced Computexts, a series on magnetic tape designed for computer print-out of selected materials, offered for sale.

To automate the search process Taube developed the IBM 9900 Special Index Analyzer called COMAC (Continuous Multiple Access Collator). This was a punched card collator that tracked logical relationships among subject terms. Used with the IBM 9900, the IBM 305 RAMAC printed out results from searches of subject terms. Documentation Inc. eventually switched from the RAMAC to the lower cost IBM 1401.


Bishop Fulton J. Sheen, Pioneer Televangelist 1951 – 1968

In 1951 Roman Catholic Bishop of Rochester, New York, and former radio broadcaster Fulton J. Sheen became one of the first televangelists. From 1951 to 1957 Sheen hosted Life Is Worth Living, first on the DuMont Television Network and later on ABC, winning an Emmy in 1952 for "Most Outstanding Personality". He later hosted The Fulton Sheen Program in syndication, with a virtually identical format, from 1961 to 1968.


"The First of the True Robots" 1951

In 1951 American computer scientist and popular writer Edmund Berkeley developed Squee, the Electronic Robot Squirrel. Squee has been called "the first of the true robots" because it was the first robot able to carry out a defined task, as opposed to just steering toward light. The task was collecting "nuts," which in the robot's case meant tennis balls. Squee was also the first robot to have a manipulator under automatic control.

"Squee (named after 'squirrel') is an electronic robot squirrel. It contains four sense organs (two phototubes, two contact switches), three acting organs (a drive motor, a steering motor, and a motor which opens and closes the scoop or 'hands'), and a small brain of half a dozen relays. It will hunt for a 'nut'. The 'nut' is a tennis ball designated by a member of the audience who steadily holds a flashlight above the ball, pointing the light at Squee. Then Squee approaches, picks up the 'nut' in its 'hands' (the scoop), stops paying attention to the steady light, sees instead a light that goes on and off 120 times a second shining over its 'nest', takes the 'nut' to its 'nest', there leaves the nuts, and then returns to hunting more 'nuts'. When Squee is operating, it is a dramatic and exciting example of a robot. It has been exhibited in New York, Pittsburgh, and Minneapolis, and has always entertained and excited the audience. The machine however is sensitive to the surrounding light level, and usually has to be shown in a room about 8 by 10 ft. with only a small amount of overhead light and black curtained walls. Data: completed; rather well finished but not professionally; 75% reliable; maintenance, difficult; our costs, about $3,000" (Berkeley, Small Robots--Report [1956]).
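
Squee's relay brain amounts to a small state machine driven by its sense organs: hunt the steady light, grab the ball, home on the 120-per-second nest light, repeat. The states and transitions below are inferred from Berkeley's prose description, not from his actual relay circuit:

```python
# Squee's behavior as a three-state machine (inferred from the prose
# description; Berkeley's actual relay logic is not reproduced here).
def step(state, at_ball, at_nest):
    if state == "HUNT":                # steering toward the steady light
        return "GRAB" if at_ball else "HUNT"
    if state == "GRAB":                # scoop closes; attend to nest light
        return "RETURN"
    if state == "RETURN":              # homing on the 120 Hz nest light
        return "HUNT" if at_nest else "RETURN"

# One hunt-grab-return cycle:
s = "HUNT"
for at_ball, at_nest in [(False, False), (True, False),
                         (False, False), (False, True)]:
    s = step(s, at_ball, at_nest)
# after one full cycle Squee is hunting the next 'nut'
```

That half a dozen relays suffice for this loop is the point: the "defined task" is just a few states and the sensor-driven transitions between them.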

Berkeley constructed only one example of Squee. It is preserved at the Computer History Museum in Mountain View, California, the gift of Gordon Bell.



Miller's "Language and Communication" 1951

In 1951 American cognitive psychologist George Armitage Miller, then teaching at Harvard, published Language and Communication. Influenced by Claude Shannon's A Mathematical Theory of Communication (1948), this book

"used a probabilistic model imposed on a learning-by-association scheme borrowed from behaviorism, with Miller not yet attached to a pure cognitive perspective. The first part of the book reviewed information theory, the physiology and acoustics of phonetics, speech recognition and comprehension, and statistical techniques to analyze language. The focus was more on speech generation than recognition. The second part had the psychology: idiosyncratic differences across people in language use; developmental linguistics; the structure of word associations in people; use of symbolism in language; and social aspects of language use" (Wikipedia article on George Armitage Miller, accessed 12-30-2012).


Information Retrieval vs. Information Warehousing 1951

In 1951 Calvin N. Mooers, whose Zator Company was the first information retrieval company, and who had coined the term "information retrieval" in 1950, issued a report entitled Making Information Retrieval Pay, Zator Technical Bulletin 55.

From this report I quote:


"Information retrieval must be distinguished from another operation performed on information. This is the 'information warehousing' operation, which is the orderly receipt, cataloguing and storage of information. Almost every library does a highly efficient and satisfactory job of information warehousing. This is fortunate, since successful operation of information retrieval--discovery and use of information--depends upon competent information warehousing. On the other hand, merely to warehouse a large collection of information does little to aid the User to discover the information he needs. Here we have a prevalent fallacy of the libraries" (p. 3).

View Map + Bookmark Entry

The "DOKEN" (Documentary Engine): a Hypothetical Machine to Search the World's Literature 1951

In 1951 Calvin N. Mooers, whose Zator Company was the first information retrieval company, and who had coined the term "information retrieval" in 1950, issued a report entitled Making Information Retrieval Pay, Zator Technical Bulletin 55.

From this report I quote:


"Can the world-wide torrent of scientific information--from an estimated 30,000 periodicals containing an estimated 1,000,000 papers per annum--be met by any conceivable retrieval machine? The answer is yes, and the back-log (estimated roughly at 100,000,000 pieces) can be handled too.

"No existing machine is capable of doing a reasonable job of information retrieval on such a collection. The fastest electronic tabulating machinery would seem to require about 2,600 hours, or about 3 1/2 months, to scan a collection of 100 million pieces in answer to one request for information. The Microfilm Rapid Selector, according to published speeds, would take about 170 hours of steady running time or about a week to make the same search. Both these are too slow to meet a reasonable requirement that a central agency having such a machine should be able to make a number of searches each day, and to send out the bibliographies the same day the request was received.

"A machine that can do this job is actually possible--and it can be constructed within the limitations of our present technology. I will describe some of the features of such a machine in order that you will know what such a machine will be like when it is built. On the other hand, I can't tell you the date that his machine will actually be constructed because I cannot forecast when anyone will be able to afford it. The great expense is not in the machine. The machine will cost less than one of the enormous computing machines that we have been hearing so much about, and which some organizations seem to be able to afford. The real cost is handling and analyzing the magnltitude of information in setting up the systeme. We should figure on a cost of at least $2 per item. Thus the annual cost of processing the world's information--$2,000,000--would be several times the cost of the machine itself. But, to get back to the details of our hypothetical machine:

"We will call the machine the D O K E N, which is short for "documentary engine". The DOKEN is capable of making a complete multi-subject search of 100 million items in about 2 minutes, and having scanned the record, it reproduces or prints a bibliography of the selected abstracts at a rate of about 10 per minute by a dry printing process. Many searches are conducted each hour, steadily, throughout the day. After the first DOKEN is operating, film records for other DOKENS can be inexpensively copied at a fraction of the original cost. A DOKEN is a most appropriate instruction for national or regional research centers. It would be the information retrieval auxilliary instrument at a large library center for the local collection plus the entire world's literature. For instance, it could scan the Library of Congress collection (10 million catalogued items) in 10 seconds.

"The DOKEN can achieve the stated performance goal only by recourse to the most efficient techniques. That means that the job must be broken down into the different functional operations, and highly efficient specialized structures and methods be used to accomplish each. There are three separate functional organs that we must consider. They are: 1) the code storage and scanning engine, 2) the abstract record and reading engine, and 3) the abstract printing stations. These organs, unlike the corresponding elements in the Rapid Selector, are physically separate structures. We will consider them in turn.

"The Code storage and scanning engine contains the coded subjects of 100 million documents. Therefore, at least from considerations of sheer bulk, the most efficient possible subject coding must be used. The choice here is Zatocoding--the method of superimposition of random codes in each subject field--since this method seems to be considerably more efficient than any other coding scheme now known. We let each document be described by as many as 25 different cross-referenced subjects. The coded record is micro-photographed on photographic film, and this film strip is helically wound on a metal drum 10 feet in diameter and 7 feet long. This drum is driven at about 300 rpm, and the scanning head, following the helically-wound film, passes from one end to the other in less than a minute. The codes for more than one million documents are scanned in each second. This is about 5,000 times Rapid Selector speed. The basic principles of such a scanning head, able to do this with standard equipment, have been worked out. Selections, when made, are temporaily recorded as document or abstract numbers in an electronic or magnetic memory. The selections are made according to any simple or complex configuration of subject ideas, which can be chosen arbiitrarily to suit the needs of the request at hand.

"The abstract storage and reading engine is the organ which stores micrographic copies of 200-word abtracts and the citations for the documents. A single, large, square, semi-transparent sheet carries from a quarter million to a million of such abstracts. These sheets are stored in a stack, and by a mechanism like that of an automatic juke box record changer, the different sheets are pulled out of the stack to be read by an optical copying television head. This read head, using the two coordinate positions of the wanted abtract, finds the abstract, magnifies it, and electrically copies it into a wire circuit. Many such optical heads are working at the same time in the abstract storage engine. This abstract storage and reading engine fits nicely in a ordinary large-sized room, since the stack is only about 20 feet long.

"The abstract printing stations are placed remote from the rest of the engine--at the request desk or in the mailing room for mail service. The process used is a fast dry-printing, employing either ultra-violet sensitive diazo paper, or an electro-sensitive facsimile paper. Photography (silver) and Xerography do not meet nearly as well the requirements for a fast, simple and cheap process for giving a single-copy. Presently available equipment, about the size of table radio and now on the mrket, can produce about ten 200-word abstracts per minute at each station. There are as many stations in the operation as there are reading heads in the reading engine. The abstracts produced are reasonably clear, and are full sized and readable without any optical aid.

"Such is the DOKEN. It can built if there is a need for it. Part of the world's intellectual output is already being abstracted. With cooperation, and less than 10% additional effort, this same information could be put into a DOKEN system. Perhaps this cooperative endeavour will take the pattern so well worked out by Chemical Abstracts with its large corps of volunteer abstractors, and smaller staff of central editors. If so, the cost of the world-wide documentary project could be whittled down to manageable proportions. Support could be on a subscription basis. Bibliographic searches to any request would be funished by return airmail, giving an overnight service to information users.

"Smaller verions of the same instrument have a possible use in other situations, such as the whole chemical literature, the U.S. Patent Office, or the files of insurance companies. In such smaller collections, a much more complete subject coding is possible and would certainly be desirable int the case of patents.

"With regional DOKENs available, company collections of information on punched cards can be enriched by the inclusion of specially selected items from DOKEN bibliographies. But these bibliogrpahies of abstracts would generally have to be pruned, recoded, and 'slanted' into the particular company's technical viewpoint in order to raise their utility up to the company's retreival system threshhold value." (pp. 10-12) 

View Map + Bookmark Entry

The Paris symposium, "Les Machines à calculer et la pensée humaine," Occurs January 8 – January 13, 1951

From January 8-13, 1951 the Paris symposium, Les Machines à calculer et la pensée humaine (Calculating Machines and Human Thought), took place at l'Institut Blaise Pascal. Unlike the other early computer conferences, no demonstration of a stored-program electronic computer occurred. Louis Couffignal demonstrated the prototype of his non-stored-program machine.

Hook & Norman, Origins of Cyberspace (2002) no. 526.

View Map + Bookmark Entry

The First Ferranti Mark I is Delivered February 1951

In February 1951 the first Ferranti Mark I version of the Manchester University machine was delivered to the University of Manchester in England.

With the exception of the unique BINAC delivered to Northrop Aircraft in the United States, the Ferranti Mark I was the first commercially produced electronic digital computer delivered to a customer.

View Map + Bookmark Entry

One of the Earliest Computer Games February – October 1951

In February 1951 British computer scientist Christopher Strachey finished a program for the game of draughts, or checkers. The game ran for the first time on the Pilot ACE at the National Physical Laboratory, Teddington, on July 30, 1951, but completely exhausted the machine's memory.

"When Strachey heard about the Manchester Mark 1, which had a much bigger memory, he asked his former fellow-student Alan Turing for the manual and transcribed his program into the operation codes of that machine by around October 1951. The program could 'play a complete game of draughts at a reasonable speed' " (Wikipedia article on Christopher Strachey, accessed 09-12-2012).

View Map + Bookmark Entry

The Origins of NORAD February 16, 1951

On February 16, 1951 the Joint Chiefs of Staff (JCS) approved a U.S. - Canadian Permanent Joint Board on Defense (PJBD) recommendation (51/1) for an extension of the Permanent Radar Net. The recommendation called for the extension and consolidation of the control and warning system of Canada and the U.S. into one operational system to meet air defense needs of both countries. 

On March 10, 1951 the U.S. Army Antiaircraft Command assumed command for the first time of all antiaircraft forces assigned to air defense for both countries.

These developments are considered the origin of NORAD (North American Air Defense Command; now North American Aerospace Defense Command). The agency was not officially founded until May 12, 1958. NORAD headquarters are located at Peterson AFB, Colorado Springs, Colorado. NORAD command and control is exercised through the Cheyenne Mountain Operations Center, located a short distance away.

"The Operations Center itself lies along one side of a main tunnel bored almost a mile through the solid granite heart of the mountain. The tunnel is designed to route the worst of a blast's shock wave out the other end, past the two 25-ton blast doors that mark one wall. The center was designed to withstand up to a 30 megaton blast within 1-nautical-mile (1.9 km).

"The underground Combat Operations Center (COC) was originally intended to provide a 70% probability of continuing to function if a five-megaton nuclear weapon detonated three miles (5.6 km) away, but was ultimately built to withstand a multimegaton blast within 1.5 nautical miles (2.8 km; 1.7 mi). It was also designed to be self-sufficient for brief periods, have backup communications and television intercom with related commands, house personnel during an emergency, and protect staff against fallout and biological and chemical warfare.

"The main entrance to the complex is about one-third of a mile (540 m) from the North Portal via a tunnel which leads to a pair of 25-ton steel blast doors. Behind them is a steel building complex built within a 4.5 acres (18,000 m2) grid of excavated chambers and tunnels and surrounded by 2,000 feet (600 m) of granite. The main excavation consists of three chambers 45 feet (15 m) wide, 60 feet (20 m) high, and 588 feet (180 m) long, intersected by four chambers 32 feet (10 m) wide, 56 feet (17 m) high and 335 feet (100 m) long. Fifteen buildings, freestanding without contact with the rock walls or roofs and joined by flexible vestibule connections, make up the inner complex. Twelve of these buildings are three stories tall; the others are one and two stories.

"The outer shells of the buildings are made of three-eighths-inch (9.5 mm) continuously welded low carbon steel plates which are supported by structural steel frames. Metal walls and tunnels serve to attenuate electromagnetic pulse (EMP). Metal doors at each building entrance serve as fire doors to help contain fire and smoke. Emphasis on the design of the structure is predicated on the effects of nuclear weapons; however, building design also makes it possible for the complex to absorb the shock of earthquakes. During a nuclear explosion, powerful springs that support the complex can absorb much of the energy. North Portal

"Blast valves, installed in reinforced concrete bulkheads, have been placed in the exhaust and air intake supply, as well as water, fuel, and sewer lines. Sensors at the North and South Portal entrances will detect overpressure waves from a nuclear explosion, causing the valves to close and protect the complex. The buildings in the complex are mounted on 1,319 steel springs, each weighing about 1,000 pounds (450 kg). The springs allow the complex to move 12 inches (30 cm) in any one direction. To make the complex self-sufficient, adequate space in the complex is devoted to support functions. A dining facility, medical facility with dental office, pharmacy and a two-bed ward; two physical fitness centers with exercise equipment and sauna; a small base exchange and barber shop are all located within the complex.

"Electricity comes primarily from the city of Colorado Springs, with six 1,750 kilowatt diesel generators for backup. Water for the complex comes from an underground supply inside Cheyenne Mountain, deposited into four excavated reservoirs with a capacity of 1.5 million U.S. gallons (6,000 m³) of water. Three serve as industrial reservoirs and the remaining one is the complex's primary domestic water source. They are so large that workers sometimes cross them in rowboats. About 30,000 to 120,000 U.S. gallons (110 to 450 m³) are actually retained at any given time.

"Incoming air may be filtered through a system of chemical, biological, radiological, and nuclear filters to remove harmful pathogens and/or radioactive and chemical particles.

"The fresh air intake is mainly from the south portal access which is 17.5 feet (5.3 m) high and 15 feet (4.6 m) wide and linked to the north portal access which is 22.5 feet (7 m) high and 29 feet (9 m) wide. The entire tunnel from north to south entry portals is nine-tenths of a mile (1.5 km) long. The NORAD command center has been modernized several times over the years. The original equipment resembled Mission Control for NASA's Project Apollo in the 1960s-1970s and used similar Philco-Ford consoles and display systems. The current (2005) version, with ordinary desks and flat-screen displays, looks rather ordinary by comparison and resembles NASA's current (2000s) mission control" (Wikipedia article on Cheyenne Mountain, accessed 02-29-2012).

View Map + Bookmark Entry

Linus Pauling Reports the First Discovery of a Helical Structure for a Protein February 28, 1951

On his fiftieth birthday, February 28, 1951, American physical chemist Linus Pauling reported with his co-workers at Caltech, the American biochemist Robert Corey and the African-American physicist and chemist Herman Branson, the discovery of the alpha helix (α-helix). This was the first discovery of a helical structure for a protein. Their discovery built upon and confirmed the research of William Astbury reported in 1931.

"Although incorrect in their details, Astbury's models of these forms were correct in essence and correspond to modern elements of secondary structure, the α-helix and the β-strand (Astbury's nomenclature was kept), which were developed by Linus Pauling, Robert Corey and Herman Branson in 1951; that paper showed both right- and left-handed helixes, although in 1960 the crystal structure of myoglobin showed that the right-handed form is the common one. . . .

"Two key developments in the modeling of the modern α-helix were (1) the correct bond geometry, thanks to the crystal structure determinations of amino acids and peptides and Pauling's prediction of planar peptide bonds; and (2) his relinquishing of the assumption of an integral number of residues per turn of the helix. The pivotal moment came in the early spring of 1948, when Pauling caught a cold and went to bed. Being bored, he drew a polypeptide chain of roughly correct dimensions on a strip of paper and folded it into a helix, being careful to maintain the planar peptide bonds. After a few attempts, he produced a model with physically plausible hydrogen bonds. Pauling then worked with Corey and Branson to confirm his model before publication. In 1954 Pauling was awarded his first Nobel Prize "for his research into the nature of the chemical bond and its application to the elucidation of the structure of complex substances" (such as proteins), prominently including the structure of the α-helix" (Wikipedia article on Alpha helix, accessed 01-17-2014).

Pauling, Corey, and Branson, “The Structure of Proteins: Two Hydrogen-Bonded Configurations of the Polypeptide Chain," Proceedings National Academy of Sciences 37 (1951) 205-11.

Judson, The Eighth Day of Creation, 88-89.

View Map + Bookmark Entry

The First Rock and Roll Recording, Named After First American Muscle Car? March 3 – March 5, 1951

On March 3-5, 1951 American musician, bandleader, talent scout, and record producer Ike Turner and his band, the Kings of Rhythm, recorded the rhythm and blues song Rocket 88 in Memphis, Tennessee. This "hymn of praise" for the first American muscle car, the Oldsmobile Rocket 88, which had been introduced in 1949, has been called "the first rock and roll song." 


"Rock 'n' roll was an evolutionary process – we just looked around and it was here. . . . To name any one record as the first would make any of us look a fool." 

—Billy Vera, Foreword to What Was the First Rock 'n' Roll Record by Jim Dawson and Steve Propes, 1992 (Wikipedia article on First rock and roll recording, accessed 06-01-2009).

View Map + Bookmark Entry

NIMROD: The First Special Purpose Digital Computer Designed to Play a Game May 5, 1951

For the Festival of Britain Exhibition of Science in South Kensington, London, which opened on May 5, 1951 in commemoration of the hundredth anniversary of the Great Exhibition of 1851, Ferranti built a special purpose computer called NIMROD that played the ancient game of Nim, a mathematical game of strategy. 

NIMROD was the first digital computer designed specifically to play a game, though its actual purpose was to illustrate the principles of the digital computer to the public when almost no one had seen or interacted with a computer. Because of the number of vacuum tubes involved, the machine was 12 feet wide, 5 feet tall and 9 feet deep. When the Festival of Britain ended, in October 1951, the computer was displayed at the Berlin Industrial Show. According to the Wikipedia article on Nimrod (computing), so significant was the computer considered when it was exhibited there that "famous German politicians were present including Konrad Adenauer, the Federal Chancellor of the Federal Republic of Germany (FRG) and Ludwig Erhard, the Federal Minister for Economic Affairs." 

In Berlin, the NIMROD

"was so popular that people ignored the free beer (in Berlin!!!...though the beer was English beer, I suppose). The beer was at the other end of the same room but people instead watched the 'electronic brain' beat its human competitors. In part the excitement was caused because on the first day Nimrod had beaten Ludwig Erhard, the German Federal Minister for Economic Affairs, three times in a row. The age of computers outwitting humans had started" (http://www.cs4fn.org/binary/nim/nim.php, accessed 02-01-2014).

To help explain the NIMROD computer to the British public Ferranti published a pamphlet priced 1s 6d entitled Faster than Thought. The Ferranti Nimrod Digital Computer. Discovery magazine published an artist's watercolor impression of the NIMROD in their March 1951 issue. NIMROD was further discussed in Bertram Bowden's book, Faster than Thought (1953), chapter 25. 

NIMROD was conceived by Ferranti employee John M. Bennett, who received his PhD in computing at Cambridge under Maurice Wilkes, and later became the first professor of computer science in Australia. Bennett got the idea of a Nim-playing computer from the Nimatron, an electro-mechanical machine exhibited at the 1939-1940 World’s Fair in New York City.

In 1994 Bennett reminisced:

"Ferranti had undertaken to display a computer at the 1951 Festival of Britain, and late in 1950 it became evident that this promise could not be fulfilled. I suggested that a machine to play the game of NIM against all comers should be constructed with a versatile display to illustrate the algorithm and programming principles involved. The design was implemented by a Ferranti engineer, Raymond Stuart-Williams, who later joined RCA.

"In its simplest form, two players with several piles of, say, matches play the game of Nim. The players move alternately, each removing one or more of the matches from any one pile. Whoever removes the last match wins.

" The machine was a great success but not quite in the way intended, as I discovered during my time as spruiker on the Festival stand. Most of the public were quite happy to gawk at the flashing lights and be impressed. A few took an interest in the algorithm and even persisted to the point of beating the machine at the game. Only occasionally did we receive any evidence that our real message about the basics of programming had been understood" (http://www.goodeveca.net/nimrod/bennett.html, accessed 02-01-2014).

In February 2014 a 55-second sound recording of radio columnist Paul Jennings giving his impressions of the NIMROD in 1951 was available online.

A reduced size replica of Nimrod was later built for the Computerspielemuseum Berlin.

View Map + Bookmark Entry

The First Commercial Color Television Broadcast in the United States June 25, 1951

The first commercial network television broadcast in color in the United States occurred over the CBS field-sequential color television system on June 25, 1951, when a musical variety special titled Premiere was shown over a network of five East Coast CBS affiliates. Viewing was highly restricted as the program could not be seen on black-and-white sets, and Variety estimated that only thirty prototype color receivers were available in the New York area. Regular CBS color broadcasts began the next day with the daytime series The World Is Yours, with Modern Homemakers starting on June 27.

View Map + Bookmark Entry

Filed under: Television

Maurice Wilkes Introduces Microprogramming July 9 – July 12, 1951

From July 9-12, 1951 the second English electronic computer conference was held at the University of Manchester to inaugurate the first Ferranti Mark 1. There Maurice Wilkes introduced the term microprogramming, referring to the design of control circuits. The idea was not widely accepted until the following decade. (See Reading 8.8.)

View Map + Bookmark Entry

Bertram V. Bowden, the First Computer Salesman in England July 9 – July 12, 1951

Bertram V. Bowden, the first computer salesman in England, discussed “The application of calculating machines to business and commerce” at the second English electronic computer conference held at the University of Manchester from July 9-12, 1951. (See Reading 10.2.)

View Map + Bookmark Entry

The First Application of an Electronic Computer to Molecular or Structural Biology July 9 – July 12, 1951

At the second English computer conference held in Manchester from July 9-12, 1951 computer scientist John Makepeace Bennett and biochemist and crystallographer John Kendrew described their use of the Cambridge EDSAC for the computation of Fourier syntheses in the calculation of structure factors of the protein molecule myoglobin. This was the first application of an electronic computer to computational biology or structural biology. The first published account of this research appeared in the very scarce Manchester University Computer Conference Proceedings (1951). 

Kendrew and Bennett formally published an extended version of their paper as "The Computation of Fourier Syntheses with a Digital Electric Calculating Machine," Acta Crystallographica 5 (1952) 109-116. 

In 1962 Kendrew received the Nobel Prize in chemistry for his discovery of the 3-dimensional molecular structure of myoglobin, the first protein molecule to be "solved."

Hook & Norman, Origins of Cyberspace (2002) nos. 744 & 745.

View Map + Bookmark Entry

The First Demonstration of Computer Music August 7 – August 9, 1951

During August 7 to 9, 1951 Geoff Hill, a computer programmer with perfect pitch, programmed the University of Melbourne CSIR Mk1, the first stored-program computer in Australia, to play a melody, and ran the program at the inaugural Conference of Automatic Computing Machines in Sydney. This was the first demonstration of computer music.

An interview with Trevor Pearcey, one of the designers of CSIRAC:

"The CSIR Mk1 operated in Sydney Australia from about November 1949 to June 1955. Geoff Hill was the main programmer at that time and he used the machine to play musical melodies. These melodies, mostly from popular songs, were; 'Colonel Bogey', 'Bonnie Banks', 'Girl with Flaxen Hair' and so on.

"The CSIR Mk1 was dismantled in mid 1955 and moved to The University of Melbourne, where it was renamed CSIRAC. Professor of Mathematics, Thomas Cherry, later Sir Thomas Cherry FRS, had a great interest in programming and music and he created music with CSIRAC. In Melbourne the practice of how CSIRAC was programmed for music was altered and refined somewhat. The program tapes for a couple of test scales still exist, along with the popular melodies 'So early in the Morning' and 'In Cellar Cool', which was a popular drinking song - it appears that the pursuit of computer music and social drinking have been intimately linked since the earliest years. There was also other music on the tape. In about 1957 Cherry wrote a music performance program that would allow a computer user who understood simple standard music notation to enter it easily into CSIRAC for performance, without negotiating all of the timing problems such as was normally required. The music itself may now seem very crude unless it is understood in the context of its creation. It was created by engineers who were not knowledgeable of the latest in musical composition practice and at a time when there was little thought of digital sound. The idea of using a computer, the world's most flexible machine, to create music was a leap of imagination at the time. It is a pity that composers were not invited to use CSIRAC, as they were with the Bell Labs developments, to discover how it could have solved several compositional problems."

View Map + Bookmark Entry

The First Telephone Call Transmitted by Microwave August 17, 1951

On August 17, 1951 the first telephone call was placed on AT&T's microwave radio-relay skyway, the first facilities to transmit telephone conversations across the United States by radio rather than wire or cable. The new backbone telephone route, at the time the longest microwave system in the world, relayed calls along a chain of 107 microwave towers, spaced about 30 miles apart. AT&T spent about three years building it at a cost of $40 million.

The system was designed to carry television signals as well as telephone messages, and less than three weeks after the first phone call, on September 4, 1951, more than 30 million people watched President Harry S. Truman deliver the opening speech at the Japanese Peace Treaty conference held in San Francisco—the first transcontinental television broadcast.

View Map + Bookmark Entry

President Truman Makes the First Transcontinental Television Broadcast September 4, 1951

A speech by President Harry S. Truman delivered in San Francisco on September 4, 1951 and broadcast on television from coast to coast, was the first transcontinental television broadcast. Truman's speech, delivered at the opening of the Japanese Peace Treaty Conference, also known as the Treaty of San Francisco, discussed the acceptance by the U.S. of a treaty that officially ended America's post-war occupation of Japan.

The broadcast, the first test of new microwave radio-relay skyway technology developed by AT&T, was carried by 87 stations in 47 American cities. It was estimated that more than 30 million people saw and heard the broadcast—the largest single television audience to date.

View Map + Bookmark Entry

The Second Oldest Known Recordings of Computer Music Circa November 1951

In November 1951 the Ferranti Mark 1 performed Baa Baa Black Sheep and a truncated version of In the Mood at the University of Manchester. The program for Baa Baa Black Sheep was written by Christopher Strachey. The recording of these brief performances may be the second oldest known recording of computer-generated music.  

View Map + Bookmark Entry

First Stored-Program Computer to Run Business Programs on a Routine Basis November 17, 1951

On November 17, 1951 LEO I (Lyons Electronic Office) ran a program to "evaluate costs, prices and margins of that week's baked output" at tea shop operator J. Lyons and Company in England. The LEO adaptation of the EDSAC was the first stored-program electronic computer to run business programs on a routine basis. “LEO’s early success owed less to its hardware than to its highly innovative systems-oriented approach to programming, devised and led by David Caminer.”

View Map + Bookmark Entry

Edward R. Murrow Demonstrates the First Live Coast-to-Coast Simultaneous TV Transmission November 18, 1951

On November 18, 1951 Edward R. Murrow hosted the first episode of the American newsmagazine and documentary series, See It Now, broadcast on the CBS television network. The show opened with the first live simultaneous coast-to-coast TV transmission from both the East Coast (the Brooklyn Bridge and New York Harbor) and the West Coast (the San Francisco-Oakland Bay Bridge and San Francisco Bay), as reporters on both sides of the North American continent gave live reports to Murrow, who was sitting in the control room of CBS' Studio 41 with director Don Hewitt.

View Map + Bookmark Entry

Once Finally Operational, the EDVAC is Obsolete 1952

In 1952 the EDVAC binary stored-program computer, planning for which had started in 1944, with development starting in 1947-48, was finally operational at the Moore School in Philadelphia. By this time it was essentially obsolete.

View Map + Bookmark Entry

Vacuum Tubes Especially Designed for Digital Circuits 1952

In 1952 manufacturers began producing vacuum tubes especially designed for use in digital circuits.

View Map + Bookmark Entry

The First Electronic Computer Produced in France: Not a Stored-Program Computer 1952

In 1952 Compagnie des Machines Bull, the first French electronic computer manufacturer, produced its Gamma 3 electronic calculator. It was not a stored-program computer.

View Map + Bookmark Entry

The First Graphical Computer Game 1952

In 1952 A. S. Douglas wrote Noughts and Crosses, the first graphical computer game, on the cathode ray tube (CRT) screen of the EDSAC at Cambridge University.

"The Education of a Computer" 1952

In 1952 Grace Hopper published “The Education of a Computer,” in which she described fundamental principles in programming and anticipated future developments. (See Reading 9.5.)

Filed under: Software

National Educational Television is Founded 1952

In 1952 National Educational Television (NET) was founded by a grant from the Ford Foundation.

The First Trackball 1952 – 1953

In 1952 British electrical engineer Kenyon Taylor and his team, working on the Royal Canadian Navy's DATAR project (a pioneering computerized battlefield information system), invented the first trackball, a precursor of the computer mouse. It used a standard Canadian five-pin bowling ball. The DATAR system was first successfully tested on Lake Ontario in the autumn of 1953.

Probably the Best "Book Store" Film Noir 1952

Man Bait, a film noir directed by Terence Fisher and starring George Brent and Marguerite Chapman, was originally released in England by Hammer Film Productions in 1952 under the title The Last Page. It also marked the screen debut, as the femme fatale, of the sexy Diana Dors, a Marilyn Monroe lookalike who was actually classically trained in acting.

In the film the married manager (Brent) of a bookstore, which sells both new and rare books, is attracted to his sexy blonde clerk (Dors). He attempts to resist temptation but finally kisses her in his office, though the romance does not proceed beyond that one kiss. Dors's character, who has become infatuated with an ex-convict (played by Peter Reynolds) whom she witnessed stealing a rare book in the store, remarkably blackmails the manager over the kiss, sending a letter to his wife. The wife, a bed-ridden invalid, unbelievably dies as she gets out of bed to burn the letter. Dors is then murdered by the ex-convict, her body stuffed into a shipping crate intended for a book shipment, and the manager is framed for the murder.

As unlikely as the plot is, in my opinion and the opinion of many of my colleagues, Man Bait is the best bookstore mystery film, and perhaps the most interesting film set in an antiquarian bookstore. The main area in which the film deviates from authenticity in book trade practice is the seemingly enormous bookstore staff (perhaps 10 people) working in a store which appears to do relatively insignificant business.

The original title, The Last Page, is much more in keeping with the film's subdued, sultry sexuality than the lurid revised title Man Bait, or than the posters advertising the film under that title, which strongly emphasize the busty aspect of Ms. Dors.

The First Compiled Programming Language 1952

In 1952 British computer scientist Alick Glennie developed Autocode for the Manchester Mark 1 computer at the University of Manchester. This was the first compiled programming language.

Knuth, Donald E.; Pardo, Luis Trabb. "Early development of programming languages". Encyclopedia of Computer Science and Technology (Marcel Dekker) 7: 419–493.

Filed under: Software

The First Programmed Chess Game, Played Using a Human Computer 1952

In 1952, two years after Claude Shannon published his theoretical paper on programming a computer to play chess, Alan Turing at Manchester wrote a chess-playing program he called the "paper machine," and actually used it in a game. Following the algorithm with paper and pencil, acting as a human CPU, so to speak, Turing played an actual game against British computer scientist Alick Glennie. In this case the "computer" lost: the program hung a queen and resigned.

In January 2014 the game was available on chessgames.com.

The Hershey-Chase "Waring Blender Experiment" 1952

In the early twentieth century biologists thought that proteins carried genetic information, a belief based on the assumption that proteins were more complex than DNA. In 1928 Frederick Griffith's research suggested that bacteria are capable of transferring genetic information through a process known as transformation. Research by Avery, MacLeod, and McCarty, communicated in 1944, identified DNA as the material that carried this genetic information.

The Hershey–Chase experiment, often called the "Waring Blender experiment," was conducted in 1952 by American bacteriologist and geneticist Alfred D. Hershey and his research partner American geneticist Martha Chase at Cold Spring Harbor Laboratory, New York. The experiment showed that when bacteriophages, which are composed of DNA and protein, infect bacteria, their DNA enters the host bacterial cell, but most of their protein does not, confirming that DNA is the hereditary material.

Hershey & Chase, "Independent Functions of Viral Protein and Nucleic Acid in Growth of Bacteriophage," J. Gen. Physiol. 36 (1952) 39-56.

Judson, The Eighth Day of Creation, 108. J. Norman (ed) Morton's Medical Bibliography 5th edition (1991) no. 256.

Possibly the First Artificial Self-Learning Machine January 1952

In January 1952 Marvin Minsky, a graduate student at the Harvard University Psychological Laboratories, implemented the SNARC (Stochastic Neural Analog Reinforcement Calculator). This randomly connected network of Hebb synapses was the first connectionist neural-network learning machine: when "rewarded," it reinforced recently used pathways. The SNARC, implemented using vacuum tubes, was possibly the first artificial self-learning machine.

Minsky, "A Neural-Analogue Calculator Based upon a Probability Model of Reinforcement," Harvard University Psychological Laboratories, Cambridge, Massachusetts, January 8, 1952. This reference came from the bibliography of selected publications on Minsky's website in December 2013. He did not include it in his bibliography on AI in Computers and Thought (1963), leading me to believe that some or all of the information may have been incorporated in his Princeton Ph.D. dissertation, Neural Nets and the Brain Model Problem (1954), which was likewise unpublished.

A First Step by Crick & Watson on the Road to the Discovery of the Double Helix February 1952

In February 1952 Scottish physicist William Cochran at the Cavendish Laboratory, Cambridge, and physicist and molecular biologist Francis Crick also at the Cavendish Laboratory, and Czech-English physical chemist and crystallographer Vladimir Vand of the University of Glasgow submitted their paper on the theory of helical diffraction to Acta Crystallographica. The paper represented “Crick’s full mathematical treatment . . . of the x-ray patterns produced by helical molecules generally” (Judson, The Eighth Day of Creation, 129). It provided the formulae for the Fourier transforms of a number of helical structures, and presented evidence that the structure of a synthetic polypeptide was based on the alpha helix of Pauling and Corey.

“It was, I believe, the first fairly conclusive experimental evidence for the existence of a helical structure at the molecular level. . . . The main value of this work, seen in retrospect, is that it was a first step on the road to the discovery of the structure of DNA by Jim Watson and Crick” (William Cochran, “This week’s citation classic,” Current Contents [May 18, 1987] 16).

Cochran, Crick & Vand, "The Structure of Synthetic Polypeptides. I. The Transform of Atoms on a Helix,” Acta Crystallographica 5 (1952) 109-116. 

First West Coast Computer Meeting April 30 – May 2, 1952

From April 30 to May 2, 1952 the first electronic computer symposium on the west coast of the United States was held at UCLA. The proceedings appeared later that year as Proceedings of the Electronic Computer Symposium . . . at University of California, Los Angeles.

Hook & Norman, Origins of Cyberspace (2002) no. 842.

Rosalind Franklin's Photo #51 of Crystalline DNA May 2 – May 6, 1952

Between May 2 and May 6, 1952 English molecular biologist Rosalind Franklin, working at King's College London, took photograph no. 51 of the B-form of crystalline DNA. This was her finest photograph of the substance, showing the characteristic X-shaped "Maltese cross" more clearly than ever before.

About eight months later, on January 26, 1953, Franklin showed this photograph to physicist and molecular biologist Maurice Wilkins. Four days later, on January 30, 1953 Wilkins showed the photograph to James Watson. 

The following day Watson asked laboratory director Lawrence Bragg if he could order model components from the Cavendish Laboratory machine shop. Bragg agreed. Watson's account of Franklin's photo 51 to Francis Crick confirmed that they had the vital statistics to build a B-form model: the photo confirmed the 20Å diameter, with a 3.4Å distance between bases. This, plus the repeat distance of 34Å, a helix slope of about 40°, and the likelihood of two chains rather than three, seemed sufficient to build a model.
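The figures quoted above already determine the model's geometry: a quick arithmetic check (a sketch using only the numbers in this entry) shows why a 34Å repeat with a 3.4Å rise per base implies ten bases per helical turn, the figure Watson and Crick built into their B-form model.

```python
# Helix parameters as read from Photo 51 (angstroms)
diameter = 20.0     # overall diameter of the helix
base_rise = 3.4     # vertical distance between successive bases
repeat = 34.0       # distance covered by one full helical turn

# Bases per turn: one full turn divided by the rise per base
bases_per_turn = repeat / base_rise
print(round(bases_per_turn))  # → 10
```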

Franklin's file copy of Photograph 51, labeled in her handwriting, is preserved at the J. Craig Venter Institute.

The IAS Machine is Fully Operational June 10, 1952

The IAS computer was fully operational at Princeton on June 10, 1952.

Applying Computer Methods to Library Cataloguing and Research June 24 – June 27, 1952

At a meeting of the Medical Library Association that took place from June 24-27, 1952, physician and librarian Sanford Larkey reported on progress in the Welch Medical Library Indexing Project, which had begun in 1949. This project was probably the earliest attempt to apply punched-card tabulating to library cataloguing and information retrieval.

"The goal of the project, of which I was a member until its termination in 1953, was to develop computer-derived indexes to the scientific and medical literature. This mechanization of bibliographic information involved the use of IBM tabulating equipment designed for statistical analysis. The Welch project used standard punched-card machines to prepare subject-heading lists for the Armed Forces Medical Library, the precursor to the National Library of Medicine (E. Garfield, "The preparation of subject-heading lists by automatic punched-card techniques," Journal of Documentation 10 [1954] 1-10)" (Garfield, "Tribute to Calvin N. Mooers, A Pioneer of Information Retrieval," The Scientist 11, no. 6 [March 17, 1997] 9).

In Larkey's 1952 report there is a very interesting section which he called the "Psychology of Machines", which I quote:

"I think I should say something about 'machines' themselves at this point. Since we are using machines in all the major phases of our work, I should like to describe the machines we are using and just how we are using them. I will discuss the present status of each phase of our work primarily on the basis of the machine operations involved. Another reason for this approach is that, as we have found in discussing our program with others, our use of machines seems either to interest or worry people more than any other feature.

"This brings me to what might be called the 'psychology of machines.' The very word 'machines' seems to do things to people. We hear talk of 'electronic robots,' as though they were some sort of 'men from Mars' who could take over all intellectual activities by merely pushing buttons. This sort of talk leads to excessive hopes or to inordinate fears and precludes objective thinking about the possible uses of machines. One should consider machines as practical adjuncts, as we do typewriters, 3 x 5 cards, and visible indexes. Machines are only doing very rapidly what one could do with his own eyes and brain if he had all the time in the world to do it and wanted to do it. There is no magic about it.

"There is, however, a more valid psychological aspect to machines. Since machines operate on a strict yes-or-no principle, we must be rigidly exact in presenting a problem. Each step must be in the most precise logical form, since one rarely can stop to correct as one goes along. Each step must be gone over and over in relation to every other one. One has to think not once, but many times. Programming often takes almost as long as the machine operation itself, but the end result is still reached much more quickly than by manual operations.

"These strict limitations of machines have been very useful to us. They not only have tightened up our own thinking processes, but their application has emphasized many semantic inconsistencies in our terminology and classifications. So, perhaps there may be a good psychological side to machines." (pp. 33-34).

"A Sound of Thunder": Famous Science Fiction Story; Dubious Film June 28, 1952

"A Sound of Thunder," a science fiction short story by Ray Bradbury, was first published in Collier's magazine on June 28, 1952, and was very widely reprinted for decades. The story turned on what was later called the butterfly effect, in which a very small event causes a major change in the outcome of later events. Bradbury's story, set in 2055, concerned the use of a time machine to travel into the very distant past. In the story the killing of a butterfly during the time of the dinosaurs changes the future in subtle but meaningful ways. For those of us who sometimes wonder what might have happened had this or that event been a little different, the story may have special interest.

In 2004 Bradbury's story was made into a feature film with the same title. Why the distinguished actor Ben Kingsley accepted a leading role in this questionable film remains unclear. The film was widely panned by critics and viewers, and bombed at the box office, but I found it amusing enough to include in the database.

The First Electronic Computer in Germany September 1952

In September 1952 Heinz Billing's G1 was in full operation at the Max Planck Institute in Göttingen, directed by Werner Heisenberg. This was the first electronic computer in Germany. It used drum memory, but it was not a stored-program machine.

The First Electronic Computer in Canada September 8 – September 10, 1952

On September 8, 1952 the ACM held a special meeting in Toronto in honor of the first electronic digital computer in Canada, installed at the University of Toronto. It was a Ferranti Mark I, known as the FERUT computer.

The First Journal on Electronic Computing October 1952

In October 1952 Edmund Berkeley began publication of Computing Machinery Field, the first journal on electronic computing, and the ancestor of all commercially published periodical publications on computing. The first three quarterly issues were mimeographed. By the March 1953 issue the title was changed to Computers and Automation.

The National Security Agency is Founded November 4, 1952

The National Security Agency/Central Security Service (NSA/CSS), a cryptologic intelligence agency of the United States Department of Defense responsible for the collection and analysis of foreign communications and foreign signals intelligence, as well as protecting U.S. government communications and information systems, officially came into existence on November 4, 1952. 

"The National Security Agency's predecessor was the Armed Forces Security Agency (AFSA), created on May 20, 1949. This organization was originally established within the U.S. Department of Defense under the command of the Joint Chiefs of Staff. The AFSA was to direct the communications and electronic intelligence activities of the U.S. military intelligence units: the Army Security Agency, the Naval Security Group, and the Air Force Security Service. However, that agency had little power and lacked a centralized coordination mechanism. . . . As the change in the security agency's name indicated, the role of NSA was extended beyond the armed forces" (Wikipedia article on National Security Agency, accessed 01-14-2012).

IBM Produces an "Electronic Data Processing Machine" December 1952

In December 1952 IBM introduced the 701, their first stored-program electronic computer for commercial production. Designed by Nathaniel Rochester, and based on the IAS machine at Princeton, the IBM 701 was intended for scientific use. Feeling that the word "computer" was too closely associated with UNIVAC, IBM called the 701 an “electronic data processing machine.” IBM eventually sold nineteen of these machines. (See Reading 8.9.)

The First Widely Read English Book on Electronic Computing 1953 – 1968

In 1953 English scientist and educationist Bertram V. Bowden, who for a time worked as a computer salesman for Ferranti Limited, and was later made a life peer as Baron Bowden, edited Faster than Thought, the first widely read English book on electronic digital computing.

Reflective of the slow speed of advances in computing at this time, the book remained in print without change until 1968.

Invention of the MASER 1953

In 1953 Charles H. Townes, while a professor at Columbia University, invented the MASER (Microwave Amplification by Stimulated Emission of Radiation). It was the precursor of the LASER, which amplifies light rather than microwaves.

Filed under: Science

The Idea of a Genetic Code 1953 – 1954

In 1953 and 1954 Russian-American theoretical physicist, cosmologist and science writer George Gamow, while at George Washington University, came up with the idea of a genetic code in his paper “Possible Mathematical Relation between Deoxyribonucleic Acids and Proteins” (Det. Kongelige Danske Videnskabernes Selskab: Biologiske Meddeleiser 22, no. 3 [1954] 1-13).

In the fall of 1953 Gamow gave Crick an earlier draft of this paper entitled "Protein synthesis by DNA molecules."

“Gamow’s scheme was decisive, Crick has often said since, because it forced him, and soon others, to begin to think hard and from a particular slant—that of the coding problem—about the next stage, now that the structure of DNA was known” (Judson, The Eighth Day of Creation, 236).

Ray Bradbury's Early Dystopian View of Books: "Fahrenheit 451" 1953 – November 2011

Having written the entire book on a pay typewriter in the basement of UCLA's Powell Library, in 1953 Ray Bradbury published the dystopian science fiction novel Fahrenheit 451, named after the temperature at which books are supposed to combust spontaneously. Besides the regular trade edition, the publisher, Ballantine Books, issued a limited edition of 200 copies signed by Bradbury and bound in white boards made of "Johns-Manville Quinterra," a fire-proof asbestos material.

"The novel presents a future American society in which the masses are hedonistic, and critical thought through reading is outlawed. The central character, Guy Montag, is employed as a 'fireman' (which, in this future, means 'book burner'). The number '451' refers to the temperature (in Fahrenheit) at which the books burn when the 'Firemen' burn them 'For the good of humanity'. Written in the early years of the Cold War, the novel is a critique of what Bradbury saw as an increasingly dysfunctional American society.

Bradbury's original intention in writing Fahrenheit 451 was to show his great love for books and libraries. "He has often referred to Montag as an allusion to himself" (Wikipedia article on Fahrenheit 451).

François Truffaut and Jean-Louis Richard wrote a screenplay based on the novel, and Truffaut directed a film, released in 1966, entitled Fahrenheit 451, starring Julie Christie and Oskar Werner. The film was re-issued on DVD by Universal Studios in 2003.

♦ After publicly opposing ebooks for several years, telling The New York Times in 2009 that "the Internet is a big distraction," in November 2011, at the age of 91, Bradbury authorized an ebook edition of Fahrenheit 451 and several other of his best-selling books. By this date Fahrenheit 451 had sold more than 10 million copies in print and had been translated into many languages. Also by this date, ebooks comprised 20% of the fiction book market in the U.S.

The Beginning of Positron Emission Tomography (PET) 1953

In 1953 William H. Sweet and Gordon L. Brownell at Massachusetts General Hospital, Boston, described the first positron imaging device and the first attempt to record three-dimensional data in positron detection in their paper "Localization of brain tumors with positron emitters," Nucleonics XI (1953) 40-45. This was the beginning of positron emission tomography (PET).

"Despite the relatively crude nature of this imaging instrument, the brain images were markedly better than those obtained by other imaging devices. It also contained several features that were incorporated into future positron imaging devices. Data were obtained by translation of two opposed detectors using coincidence detection with mechanical motion in two dimensions and a printing mechanism to form a two-dimensional image of the positron source. This was our first attempt to record three-dimensional data in positron detection" (Brownell, A History of Positron Imaging [1999], accessed 12-25-2008).

Probably the First Computer-Controlled Aesthetic System 1953 – 1957

Between 1953 and 1957 English cybernetician and psychologist Gordon Pask, in collaboration with Robin McKinnon-Wood, created Musicolour, a reactive, computer-controlled aesthetic system for theatre productions that "drove an array of lights that adapted to a musician's performance" (Mason, A Computer in the Art Room: The Origins of British Computer Arts 1950-1980 [2008] 6). This was one of the earliest examples of "computer art." The system's analog computer was transported from performance to performance.

Pask discussed and explained Musicolour in A comment, a case history and a plan (1968), written before the Cybernetic Serendipity exhibition (1968) in which Musicolour was demonstrated. However, the text was not published in the catalogue of that exhibition; it was first published in Reichardt (ed), Cybernetics: Art and Ideas (1971) 76-99.

Pickering, The Cybernetic Brain. Sketches of Another Future (2010) 313-324.

(This entry was last revised on 08-14-2014.)

The IBM 650: The First Mass-Produced Computer 1953

In 1953 IBM announced the 650 Magnetic Drum Data Processing Machine, developed at its Endicott, New York, laboratory. This was the first mass-produced computer: between 1953 and 1962 almost 2,000 systems were produced.

"the IBM 650 Magnetic Drum Data Processing Machine brought a new level of reliability to the young field of electronic computing. For example, whenever a random processing error occurred, the 650 could automatically repeat portions of the processing by restarting the program at one of a number of breaking points and then go on to complete the processing if the error did not reoccur. That was a big improvement over the previous procedure requiring the user to direct the machine to repeat the process.

"At the time the 650 was announced, IBM said it would be "a vital factor in familiarizing business and industry with the stored program principles." And it certainly did just that.

"The original market forecast for the 650 envisioned that a mere 50 machines would be sold or installed. But by mid-1955, there already were more than 75 installed and operating, and the company expected to deliver "more than 700" additional 650s in the next few years. Just one year later, there were 300 machines installed -- many more times than all of the IBM 700 series large-scale computers combined -- and new 650s were coming off the production line at the rate of one every day. In all, nearly 2,000 were produced before manufacturing was completed in 1962. No other electronic computer had been produced in such quantity.

"In net terms, the development requirement underlying the 650 was for a small, reliable machine offering the versatility of a stored-program computer that could operate within the traditional punched card environment. IBM -- and the industry -- wanted a machine capable of performing arithmetic, storing data, processing instructions and providing suitable read-write speeds at reasonable cost. The magnetic drum concept was seen as the answer to the speed and storage problems.

"Data and instructions were stored in the form of magnetized spots on the surface of a drum four inches in diameter and 16 inches long, which rotated 12,500 times a minute. The drum memory could hold 20,000 digits at 2,000 separate "addresses" (http://www-03.ibm.com/ibm/history/exhibits/650/650_intro2.html, accessed 10-22-2013).
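The quoted drum speed fixes the machine's characteristic access latency: on average a program waits half a revolution for a given address to pass under the read-write heads. A back-of-the-envelope sketch, using only the 12,500 rpm figure quoted above:

```python
rpm = 12_500                       # drum revolutions per minute
rev_time_ms = 60_000 / rpm         # milliseconds per revolution: 4.8 ms
avg_access_ms = rev_time_ms / 2    # average wait is half a revolution
print(avg_access_ms)  # → 2.4
```

This roughly 2.4 ms average access time is why 650 programmers practiced "optimum programming," placing each instruction on the drum so that it arrived under the heads just as the previous one finished executing.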

On September 14, 1956 IBM announced the 355 disk memory unit for the IBM 650.  Systems incorporating the 355 were known as the 650 RAMAC.

To What Extent Can Human Mental Processes be Duplicated by Switching Circuits? February 1953

In 1953 Bell Laboratories engineer John Meszar published "Switching Systems as Mechanical Brains," Bell Laboratories Record XXXI (1953) 63-69.

This paper, written in the earliest days of automatic switching systems, when few electronic computers existed, and, for the most part, human telephone operators served as "highly intelligent and versatile switching systems," raised the question of whether certain aspects of human thought are computable and others are not. Meszar argued for "the necessity of divorcing certain mental operations from the concept of thinking," in order to "pave the way for ready acceptance of the viewpoint that automatic systems can accomplish many of the functions of the human brain." 

"We are faced with a basic dilemma; we are forced either to admit the possibility of mechanized thinking, or to restrict increasingly our concept of thinking. However, as is apparent from this article, many of us do not find it hard to make the choice. The choice is to reject the possibility of mechanized thinking but to admit readily the necessity for an orderly declassification of many areas of mental effort from the high level of thinking. Machines will take over such areas, whether we like it or not.

"This declassification of wide areas of mental effort should not dismay any one of us. It is not an important gain for those who are sure that even as machines have displaced muscles, they will also take over the functions of the 'brain.' Neither is it a real loss for those who feel that there is something hallowed about all functions of the human mind. What we are giving up to the machines— some of us gladly, others reluctantly— are the uninteresting flat lands of routine mental chores, tasks that have to be performed according to rigorous rules. The areas we are holding unchallenged are the dominating heights of creative mental effort, which comprise the ability to speculate, to invent, to imagine, to philosophize, to dream better ways for tomorrow than exist today. These are the mental activities for which rigorous rules cannot be formulated— they constitute real thinking, whose mechanization most of us cannot conceive" (p. 69).

"Once the government can demand of a publisher the names of the purchasers of his publications, the free press as we know it disappears." March 9, 1953

In United States v. [Edward] Rumely 345 U.S. 41 (73 S.Ct. 543, 97 L.Ed. 770), decided on March 9, 1953, Justice William O. Douglas, in his concurrence, included the following: 

“If the present inquiry were sanctioned the press would be subjected to harassment that in practical effect might be as serious as censorship. A publisher, compelled to register with the federal government, would be subjected to vexatious inquiries. A requirement that a publisher disclose the identity of those who buy his books, pamphlets, or papers is indeed the beginning of surveillance of the press. True, no legal sanction is involved here. Congress has imposed no tax, established no board of censors, instituted no licensing system. But the potential restraint is equally severe. The finger of government leveled against the press is ominous. Once the government can demand of a publisher the names of the purchasers of his publications, the free press as we know it disappears. Then the specter of a government agent will look over the shoulder of everyone who reads. The purchase of a book or pamphlet today may result in a subpoena tomorrow. Fear of criticism goes with every person into the bookstall. The subtle, imponderable pressures of the orthodox lay hold. Some will fear to read what is unpopular, what the powers-that-be dislike. When the light of publicity may reach any student, any teacher, inquiry will be discouraged. The books and pamphlets that are critical of the administration, that preach an unpopular policy in domestic or foreign affairs, that are in disrepute in the orthodox school of thought will be suspect and subject to investigation. The press and its readers will pay a heavy price in harassment. But that will be minor in comparison with the menace of the shadow which government will cast over literature that does not follow the dominant party line. If the lady from Toledo can be required to disclose what she read yesterday and what she will read tomorrow, fear will take the place of freedom in the libraries, bookstores, and homes of the land. 
Through the harassment of hearings, investigations, reports, and subpoenas government will hold a club over speech and over the press. Congress could not do this by law. The power of investigation is also limited.”

IBM Installs its First Stored Program Electronic Computer, the 701, but They Don't Call it a Computer March 27, 1953

"The 701 has at least 25 times the over-all speed but is less than one-quarter the size of IBM's Selective Sequence Electronic Calculator, which was dismantled to make room for its speedier successor."

"During its five-year reign as one of the world's best-known "electronic brains," the SSEC solved a wide variety of scientific and engineering problems, some involving many millions of sequential calculations. Such other projects as computing the positions of the moon for several hundred years and plotting the courses of the five outer planets -- with resulting corrections in astronomical tables which had been considered standard for many years -- won such popular acclaim for the SSEC that it stimulated the imaginations of pseudo-scientific fiction writers and served as an authentic setting for such motion pictures as "Walk East on Beacon," a spy-thriller with an FBI background.

"Though the 701 occupies the same quarters as the SSEC, which it rendered obsolete, it is not "built in" to the room as was its predecessor. Instead, it is smartly housed between serrated walls of soft-finished aluminum. A balconied conference room, overlooking the calculator and, separated from it by sloping plate glass, provides a vantage point for observing operations and discussing computations. Ample space is provided for writing the complex and abstract equations that are the stock in trade of engineers and scientists in an age of atomic energy and supersonic flight.

"The 701 uses all three of the most advanced electronic storage, or "memory" devices -- cathode ray tubes, magnetic drums and magnetic tapes. The computing unit uses small versions of the familiar electronic tubes, which are able to count at millions of pulses a second. In addition, several thousand germanium diodes are used in place of vacuum tubes, with resultant savings in space and power requirements.

"The 701 was designed for scientific and research purposes, and similar components are adaptable to the requirements of accounting and record-keeping. Research on commercial, data processing machines is under way.

"The 701 is capable of performing more than 16,000 addition or subtraction operations a second, and more than 2,000 multiplication or division operations a second. In solving a typical problem, the 701 performs an average of 14,000 mathematical operations a second."

(quotations from IBM's original May 27, 1953 press release from the IBM Archives website).

View Map + Bookmark Entry

Discovery of The Double Helix April 25, 1953

At the Cavendish Laboratory, University of Cambridge, in 1953 James D. Watson and Francis Crick discovered the self-complementary double-helical structure of the DNA molecule. In their paper, “Molecular Structure of Nucleic Acids. A Structure for Deoxyribose Nucleic Acid,” Nature 171 (1953) 737-38, they stated that, “It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.”

View Map + Bookmark Entry

Proposal of DNA's Method of Replication May 30, 1953

On May 30, 1953 James D. Watson and Francis Crick published “Genetical Implications of the Structure of Deoxyribonucleic Acid,” Nature 171 (1953) 964-67. In this paper Watson and Crick proposed the method of replication of DNA. This discovery has been called as significant as, or possibly even more significant than, their discovery of the double-helical structure of DNA published in April 1953.

View Map + Bookmark Entry

The First Report on the Application of Electronic Computers to Business June 1953

In June 1953 Richard W. Appel and other students at Harvard Business School issued Electronic Business Machines: A New Tool for Management.

This was the first report on the application of electronic computers to business. The report was issued before any electronic computer was delivered to an American corporation. (See Reading 10.4.)

View Map + Bookmark Entry

IBM 702 September 1953

In September 1953 IBM announced the development of the 702, a version of the 701 designed for business rather than scientific applications.

View Map + Bookmark Entry

The Beginning of Medical Ultrasonography October 29, 1953

On October 29, 1953 Inge Edler and Carl Hellmuth Hertz at Lund University in Sweden obtained the first recording of the ultrasound echo from the heart. This was the beginning of echocardiography from which diagnostic sonography, or medical ultrasonography, evolved.

"The principle for echocardiography is as follows. The vibrations in a piezoelectric crystal create a beam of high frequency sound waves that are transmitted into the chest. When the waves pass an interface, such as between the heart wall and the surrounding area or the surface of a cardiac valve, some of the sound is reflected, creating an echo. The crystal is reset, enabling it to receive the echo. The longer it took for the echo to return to the crystal, the longer the distance between the crystal and the surface that was the source of the echo. The principle was the same as for sonar, used to measure the depth of water under a vessel, only in this case you measure the distance from the structure that is the source of the echo to the chest wall."

Edler, Inge & Hertz, Carl Hellmuth. The Use of the Ultrasonic Reflectoscope for Continuous Recording of the Movements of Heart Walls. K. Fysiogr. Sellsk. Lund. Foresch., 24 (1954) 1-19.

View Map + Bookmark Entry

The First Transistor Computer November 1953

In November 1953 the University of Manchester's experimental Transistor Computer became operational for the first time. This appears to be the first stored-program computer to use mainly transistors as switches rather than vacuum tubes. The transition from vacuum tubes to transistors in computer design was generally delayed because of reliability problems in early transistor manufacturing.

"There were two versions of the Transistor Computer, the prototype, operational in 1953, and the full-size version, commissioned in April 1955. The 1953 machine had 92 point-contact transistors and 550 diodes, manufactured by STC. It had a 48-bit machine word. The 1955 machine had a total of 200 point-contact transistors and 1300 point diodes, which resulted in a power consumption of 150 watts. There were considerable reliability problems with the early batches of transistors and the average error free run in 1955 was only 1.5 hours. The Computer also used a small number of tubes in its clock generator, so it was not the first fully transistorized machine" (Wikipedia article on Transistor Computer, accessed 09-19-2013).

View Map + Bookmark Entry

The Deuce Computer (After the Pilot ACE, of Course) 1954

In 1954 English Electric constructed a commercial version of Alan Turing’s Pilot ACE called DEUCE.

Thirty-three of the DEUCE machines were sold, the last in 1962.

View Map + Bookmark Entry

Early Library Information Retrieval System 1954

In 1954 Harley Tillet built perhaps the first operating library information retrieval system on a general purpose computer (IBM 701) at the Naval Ordnance Test Station (NOTS) at Inyokern, California, later called China Lake.

"Searching started with a file of about 15,000 bibliographic records, indexed only by the Uniterms, and search output was limited to report accession numbers. The task was made even more difficult by the fact that the IBM 701, a scientific calculator, did not have any built-in character representation" (Bourne).

View Map + Bookmark Entry

Coining the Phrase "Social Network" 1954

In 1954 Australian sociologist John A. Barnes coined the phrase "social network" in "Class and Committees in a Norwegian Island Parish," Human Relations VII (1954) 39-58, in which he presented the results of nearly two years of fieldwork in Bremnes on Bømlo Island, Norway.

View Map + Bookmark Entry

First Computer to Incorporate Indexing & Floating Point Arithmetic 1954

In 1954 IBM announced the 704. It was the first commercially available computer to incorporate indexing and floating point arithmetic as standard features. The 704 also featured a magnetic core memory, far more reliable than its predecessors’ cathode ray tube memories. A commercial success, IBM produced one hundred twenty-three 704s between 1955 and 1960.

View Map + Bookmark Entry

The First Light Pen 1954 – 1963

In 1954 development began for NORAD on the SAGE Air Defense System, using a computer built by IBM after a design based on the Whirlwind. The system included the first light pen.

The full SAGE (Semi-Automatic Ground Environment) automated control system for tracking and intercepting enemy bomber aircraft was completed by 1963.

View Map + Bookmark Entry

Probably the First Widely-Accepted Controlled Vocabulary 1954 – 1960

Probably the first widely used controlled vocabulary for searching information was the Subject Heading Authority List issued by the National Library of Medicine from 1954 to 1960.

"The first official list of subject headings published by the National Library of Medicine appeared in 1954 under the title Subject Heading Authority List. It was based on the internal authority list that had been used for publication of Current List of Medical Literature which in turn had incorporated headings from the Library's Index-Catalogue and from the 1940 Quarterly Cumulative Index Medicus Subject Headings. With the inception of Index Medicus in 1960, a new and thoroughly revised Medical Subject Headings [MeSH] appeared.

"With the 1954 Subject Heading Authority List, there appeared a 'Categorical Listing' of standard subheadings. 'Abnormalities,' for instance, was listed as a standard subheading for use with terms for organs, tissues, and regions, and 'anesthesia and analgesia' was to be used under surgical procedure headings. But such subheadings could be used only for subject headings which fell within the category of headings to which they were to be applied. There were over 100 such subheadings, some of which varied only slightly according to the category of main heading with which they were used. For instance, 'therapeutic use' was used under physical agents and drugs and chemicals, and 'therapy' was used with diseases. In the 1960 Medical Subject Headings, the number of subheadings was reduced to sixty-seven. They could be used under any kind of main heading if the combination was not patently foolish or impossible. These sixty-seven subheadings were applied with more generalized meanings. For instance, the subheading "therapy" was used to mean 'therapy of,' 'therapeutic use of' or just 'therapeutic aspects.' Though this solution was simpler, many problems still remained. The use of one subheading might prevent the use of another. For instance, if a paper covered the etiology, pathology, and therapy of a disease, it might occur without further subdivision, or it might occur under the subheading which seemed most appropriate to the indexer. If 'therapy' was chosen, the article would be lost to the searcher looking for the etiology of the disease under the subheading 'etiology.' In addition, if the subheading 'diseases' had been appended to the term for an anatomic part, it would not be possible to subdivide further for the therapy or complications of such diseases. A related problem was the overlap in meaning of the subheadings themselves. It was difficult, for example, to decide whether a paper on chemical biosynthesis fit best under 'chemistry' or 'metabolism.'

"Categorized lists of terms were printed for the first time in the 1963 Medical Subject Headings and contained thirteen main categories and a total of fifty-eight separate groups in subcategories and main categories. These categorized lists made it possible for the user to find many more related terms than were in the former cross-reference structure. In 1963, the second edition of Medical Subject Headings contained 5,700 descriptors, compared with 4,400 in the 1960 edition. Of the headings used in the 1960 list, 113 were withdrawn in favor of newer terms. In contrast, the 2009 edition of MeSH contains 25,186 descriptors.

"In 1960, medical librarianship was on the cusp of a revolution. The first issue of the new Index Medicus series was published. On the horizon was a computerization project undertaken by the National Library of Medicine (NLM) to store and retrieve information. The Medical Literature Analysis and Retrieval System (MEDLARS) would speed the publication process for bibliographies such as Index Medicus, facilitate the expansion of coverage of the literature, and permit searches for individuals upon demand. The new list of subject headings introduced in 1960 was the underpinning of the analysis and retrieval operation. MeSH was a new and thoroughly revised version of lists of subject headings compiled by NLM for its bibliographies and cataloging. Frank B. Rogers, then NLM director, announced several innovations as he introduced MeSH in 1960" (http://www.nlm.nih.gov/mesh/2009/introduction/intro_preface.html#pref_hist. accessed 05-04-2009).

View Map + Bookmark Entry

Publication of Chartae Latinae Antiquiores Begins 1954

Under the editorship of Albert Bruckner and Robert Marichal publication of the Chartae Latinae Antiquiores by Urs Graf Verlag in Dietikon, Switzerland, began in 1954.

Intended as a supplement to the Codices Latini Antiquiores of E.A. Lowe, the ChLA constitutes a photographic catalogue of all Latin documents (chartae), as distinct from literary codices, written on papyrus or parchment, which antedate 800 CE. Each document is given a codicological description and a photograph of at least some of its lines. The 49th and last volume in this series appeared in 1997.

The second series (ChLA2), founded by Guglielmo Cavallo and Giovanna Nicolaj, and beginning with volume 50, intends to publish all the documents surviving from 800 to 900 CE preserved in European archives and libraries. The first phase of the project, dedicated to Italy, reached Volume 75; including the Appendix it will be completed with Volume 99. Volume 100 marks the beginning of the series dedicated to St. Gall, with 13 volumes the largest collection of charters besides Italy. The publication of the charters should eventually be extended to the other European countries.

♦ The Chartae Latinae Antiquiores database can be searched online at: http://www.urs-graf-verlag.com/index.php?funktion=chla_suche.

View Map + Bookmark Entry

The First Industrial Robot 1954 – 1961

In 1961 the first industrial robot, Unimate, created by American inventor George Devol, was in operation on a General Motors assembly line at the Inland Fisher Guide Plant in Ewing Township, New Jersey. The 4000 pound robotic arm transported die castings from an assembly line and welded these parts on auto bodies, a dangerous task for workers, who could be poisoned by exhaust gas or lose a limb if they were not careful.

Unimate was based on Devol's 1954 patent application for "Programmed Article Transfer," which introduced the concept of "Universal Automation," or Unimation. The patent, which cited no prior art, was granted in 1961 as U.S. Patent 2,988,237. In his patent application Devol wrote, "the present invention makes available for the first time a more or less general purpose machine that has universal application to a vast diversity of applications where cyclic digital control is desired." The robot was built by Unimation, the world's first robot manufacturing company, founded in 1956 by Devol and Joseph Engelberger in Danbury, Connecticut.

"The first Unimate prototypes were controlled by vacuum tubes used as digital switches though later versions used transistors. Further, the "off-the-shelf" parts available in the late 1950s, such as digital encoders, were not adequate for the Unimate's purpose, and as a result, with Devol's guidance and a team of skilled engineers, Unimation designed and machined practically every part in the first Unimates. They also invented a variety of new technologies, including a unique rotating drum memory system with data parity controls.

"In 1960, Devol personally sold the first Unimate robot, which was shipped in 1961 from Danbury, Connecticut to General Motors. . . . In 1966, after many years of market surveys and field tests, full scale production began in Connecticut. Unimation's first production robot was a materials handling robot and was soon followed by robots for welding and other applications. In 1975, Unimation showed its first profit" (Wikipedia article on George Devol, accessed 09-17-2013).

View Map + Bookmark Entry

Journal of the ACM January 1954

The Journal of the Association for Computing Machinery (Journal of the ACM) began publication in January 1954. At this time the ACM had twelve hundred members.

View Map + Bookmark Entry

The First National Color Television Broadcast in the United States January 1, 1954

All-electronic color television was introduced in the United States in 1953. The first national color television broadcast was the Tournament of Roses Parade on January 1, 1954.

View Map + Bookmark Entry

Filed under: Television

The FCC Approves the RCA System of Color Television Broadcasting in the United States January 22, 1954

On January 22, 1954 the Federal Communications Commission (FCC) approved the National Television Committee’s recommendation for a system of color television broadcasting based on the RCA Dot Sequential Color System.

View Map + Bookmark Entry

RCA Begins Manufacturing Color Televisions March 24, 1954

On March 24, 1954 RCA began manufacture of its twelve-inch model CT100 color television set, which used phosphor dots deposited on an internal glass plate. Five thousand of these sets were produced and sold at the then very high retail price of $1000.

View Map + Bookmark Entry

Filed under: Television

Texas Instruments Manufactures the First Silicon Transistor May 10, 1954

In 1954 Texas Instruments manufactured the first silicon transistor, the 900-905 series.

View Map + Bookmark Entry

Grace Hopper Organizes the First Symposium on Software May 13 – May 14, 1954

Programmer Grace Hopper organized the first symposium strictly on software for the Office of Naval Research in Washington, D.C. It took place on May 13 and 14, 1954, and was attended by over 200 people. The published proceedings were entitled Symposium on Automatic Programming for Digital Computers (1954). (See Reading 9.6.)

View Map + Bookmark Entry

Filed under: Software

Alan Turing's Ambiguous Suicide June 8, 1954

On June 7, 1954 Turing probably committed suicide in Wilmslow, a town in Cheshire, England, by eating an apple laced with cyanide. He was only 41 years old. Had Turing lived a normal life span he would have contributed profoundly to further developments in computing, and it is even possible that, as a result of those contributions, the English computing industry might have been more competitive with that of the United States. Some ambiguity remains about Turing's cause of death. In December 2013 the best summary of the issues was in the Wikipedia article on Alan Turing, from which I quote:

"On 8 June 1954, Turing's cleaner found him dead. He had died the previous day. A post-mortem examination established that the cause of death was cyanide poisoning. When his body was discovered, an apple lay half-eaten beside his bed, and although the apple was not tested for cyanide, it was speculated that this was the means by which a fatal dose was consumed. This suspicion was strengthened when his fascination with Snow White and the Seven Dwarfs was revealed, especially the transformation of the Queen into the Witch and the ambiguity of the poisoned apple. An inquest determined that he had committed suicide, and he was cremated at Woking Crematorium on 12 June 1954. Turing's ashes were scattered there, just as his father's had been.

"Hodges and David Leavitt have suggested that Turing was re-enacting a scene from the 1937 Walt Disney film Snow White, his favourite fairy tale, both noting that (in Leavitt's words) he took "an especially keen pleasure in the scene where the Wicked Queen immerses her apple in the poisonous brew". This interpretation was supported in an article in The Guardian written by Turing's friend, the author Alan Garner, in 2011.

"Professor Jack Copeland (philosophy) has questioned various aspects of the coroner's historical verdict, suggesting the alternative explanation of the accidental inhalation of cyanide fumes from an apparatus for gold electroplating spoons, using potassium cyanide to dissolve the gold, which Turing had set up in his tiny spare room. Copeland notes that the autopsy findings were more consistent with inhalation than with ingestion of the poison. Turing also habitually ate an apple before bed, and it was not unusual for it to be discarded half-eaten. In addition, Turing had reportedly borne his legal setbacks and hormone treatment (which had been discontinued a year previously) 'with good humour' and had shown no sign of despondency prior to his death, in fact, setting down a list of tasks he intended to complete upon return to his office after the holiday weekend. At the time, Turing's mother believed that the ingestion was accidental, caused by her son's careless storage of laboratory chemicals. Biographer Andrew Hodges suggests that Turing may have arranged the cyanide experiment deliberately, to give his mother some plausible deniability." 

View Map + Bookmark Entry

The First Use of a Computer to Write Literary Texts October 1954

In the October 1954 issue of the journal Encounter (pp. 25-31) British computer scientist Christopher Strachey published "The 'Thinking' Machine." Strachey's paper included two love letters written by the Ferranti Mark I computer at the University of Manchester running a program which he had written. This represented the first use of a computer to write literary texts.

Herzogenrath & Nierhoff-Wielk, Ex Machina-Frühe Computergrafik bis 1979. . . . Ex Machina- Early Computer Graphics to 1979 (2007) 229.

View Map + Bookmark Entry

The First Commercial Transistor Radio: The First Widely Sold Transistorized Product November 1954

In November 1954 the first pocket-sized commercial transistor radio, the Regency TR-1, was offered for sale. Designed by Texas Instruments of Dallas, Texas, it was built and marketed by Industrial Development Engineering Associates (I.D.E.A.) of Indianapolis, Indiana. This was the first widely sold transistorized product, and because of its novelty and small size about 150,000 units were sold despite mediocre performance.

"In May 1954, Texas Instruments had designed and built a prototype transistor radio and was looking for an established radio manufacturer to develop and market a radio using their transistors. No major radio maker, including RCAPhilco, and Emerson, was interested. The President of I.D.E.A. at the time, Ed Tudor, jumped at the opportunity to manufacture the TR-1, predicting sales of the transistor radios would be '20 million radios in three years.' The Regency Division of I.D.E.A announced the TR-1 on October 18, 1954, and put it on sale in November 1954" (Wikipedia article on Regency TR-1, accessed 12-1-2013).

View Map + Bookmark Entry

The First Routine Real-Time Numerical Weather Forecasting December 1954

Starting in December 1954, the Royal Swedish Air Force Weather Service in Stockholm made weather forecasts for the North Atlantic region three times a week using the Swedish BESK computer running a barotropic model developed by the Institute of Meteorology at the University of Stockholm, associated with the eminent meteorologist Carl-Gustaf Rossby. These were the first routine real-time numerical weather forecasts.

Staff Members, Institute of Meteorology, University of Stockholm. "Results of Forecasting with the Barotropic Model on an Electronic Computer (BESK)," Tellus 6 (1954): 139-149.

View Map + Bookmark Entry

One of the Earliest Surviving British Television Dramas December 12 – December 14, 1954

From December 12-14, 1954 the BBC presented a television production of George Orwell's Nineteen Eighty-Four, adapted for television by Nigel Kneale.

"Kneale's script was a largely faithful adaptation of the novel as far as was practical with the limitations of the medium. The writer did, however, make some small additions of his own, the most notable being the creation of a sequence in which O'Brien observes Julia at work in PornoSec, and reads a small segment from one of the erotic novels being written by the machines there."

"When it had become clear what an important production Nineteen Eighty-Four was, it was arranged for the second performance [December 14, 1954] to be telerecorded onto 35mm film – the first performance having simply disappeared off into the ether, as it was shown live, seen only by those who were watching on the Sunday evening. At this stage, Videotape recording was still at the development stage and television images could only be preserved on film by using a special recording apparatus (known as "telerecording" in the UK and "kinescoping" in the USA), but was only used sparingly, then in Britain for historic preservation reasons and not for pre-recording. It is thus the second performance that survives in the archives, one of the earliest surviving British television dramas" (Wikipedia article on Nineteen Eight-Four (TV Programme), accessed 07-26-2009).

View Map + Bookmark Entry

Magnetic Core Storage Units 1955

In 1955 IBM developed magnetic core storage units, a dramatic improvement over cathode ray tube memory technology. By successfully adapting pill-making machines for production, IBM greatly improved the manufacture of these tiny, “doughnut” shaped, iron oxide cores, making the cores reliable and cost effective enough to serve as the basic technology behind every computer’s main memory until the early 1970s.

View Map + Bookmark Entry

The First Amino Acid Sequence of a Protein 1955

In 1955 English biochemist Frederick Sanger sequenced the amino acids of insulin, the first of any protein.

Sanger's work “revealed that a protein has a definite constant, genetically determined sequence—and yet a sequence with no general rule for its assembly. Therefore it had to have a code” (Judson, Eighth Day of Creation, 188).

"Sanger's first triumph was to determine the complete amino acid sequence of the two polypeptide chains of bovine insulin, A and B, in 1952 and 1951, respectively. Prior to this it was widely assumed that proteins were somewhat amorphous. In determining these sequences, Sanger proved that proteins have a defined chemical composition. For this purpose he used the "Sanger Reagent", fluorodinitrobenzene (FDNB), to react with the exposed amino groups in the protein and in particular with the N-terminal amino group at one end of the polypeptide chain. He then partially hydrolysed the insulin into short peptides, either with hydrochloric acid or using an enzyme such as trypsin. The mixture of peptides was fractionated in two dimensions on a sheet of filter paper, first by electrophoresis in one dimension and then, perpendicular to that, by chromatography in the other. The different peptide fragments of insulin, detected with ninhydrin, moved to different positions on the paper, creating a distinct pattern that Sanger called 'fingerprints'. The peptide from the N-terminus could be recognised by the yellow colour imparted by the FDNB label and the identity of the labelled amino acid at the end of the peptide determined by complete acid hydrolysis and discovering which dinitrophenyl-amino acid was there. By repeating this type of procedure Sanger was able to determine the sequences of the many peptides generated using different methods for the initial partial hydrolysis. These could then be assembled into the longer sequences to deduce the complete structure of insulin. Finally, because the A and B chains are physiologically inactive without the three linking disulfide bonds (two interchain, one intrachain on A), Sanger and coworkers determined their assignments in 1955. Sanger's principal conclusion was that the two polypeptide chains of the protein insulin had precise amino acid sequences and, by extension, that every protein had a unique sequence. 
It was this achievement that earned him his first Nobel prize in Chemistry in 1958. This discovery was crucial for the later sequence hypothesis of Crick for developing ideas of how DNA codes for proteins" (Wikipedia article on Frederick Sanger, accessed 11-20-2013).

View Map + Bookmark Entry

The First Stored-Program Computer Produced for Sale in France 1955

In 1955 Compagnie des Machines Bull launched the first stored-program electronic computer produced for commercial sale in France, the Gamma ET.

View Map + Bookmark Entry

Machine Methods for Information Searching 1955

On the completion of the Welch Medical Library Indexing Project, five authors, including Eugene Garfield, issued the Final Report on Machine Methods for Information Searching.

View Map + Bookmark Entry

The First Transatlantic Telephone Cable is Operational 1955 – September 25, 1956

On September 25, 1956 the first transatlantic telephone cable, TAT-1, became operational, carrying 36 telephone channels. It was laid between Gallanach Bay, near Oban, Scotland and Clarenville, Newfoundland between 1955 and 1956. 

Prior to this development, transatlantic telephone service had been available since 1927 only via very expensive radio links, which carried only around 2,000 calls per year.

View Map + Bookmark Entry

The First Artificial Intelligence Program 1955 – July 1956

During 1955 and 1956 computer scientist and cognitive psychologist Allen Newell, political scientist, economist and sociologist Herbert A. Simon, and systems programmer John Clifford Shaw, all working at the Rand Corporation in Santa Monica, California, developed the Logic Theorist, the first program deliberately engineered to mimic the problem solving skills of a human being. They decided to write a program that could prove theorems in the propositional calculus like those in Principia Mathematica by Alfred North Whitehead and Bertrand Russell. As Simon later wrote,

"LT was based on the system of Principia mathematica, largely because a copy of that work happened to sit in my bookshelf. There was no intention of making a contribution to symbolic logic, and the system of Principia was sufficiently outmoded by that time as to be inappropriate for that purpose. For us, the important consideration was not the precise task, but its suitability for demonstrating that a computer could discover problem solutions in a complex nonnumerical domain by heuristic search that used humanoid heuristics" (Simon,"Allen Newell: 1927-1992," Annals of the History of Computing 20 [1998] 68).

The collaborators wrote the first version of the program by hand on 3 x 5 inch cards. As Simon recalled:

"In January 1956, we assembled my wife and three children together with some graduate students. To each member of the group, we gave one of the cards, so that each one became, in effect, a component of the computer program ... Here was nature imitating art imitating nature" (quoted in the Wikipedia article Logic Theorist, accessed 01-02-2013). 

The team showed that the program could prove theorems as well as a talented mathematician. Eventually Shaw was able to run the program on the computer at RAND's Santa Monica facility. It proved 38 of the first 52 theorems in Principia Mathematica. For Theorem 2.85 the Logic Theorist surpassed its inventors’ expectations by finding a new and better proof. This was the “the first foray by artificial intelligence research into high-order intellectual processes” (Feigenbaum and Feldman, Computers and Thought [1963]).
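The kind of search the Logic Theorist performed can be suggested with a toy sketch: starting from a set of known formulas, repeatedly derive new ones by a rule of inference until the goal is reached. This is an illustration of the general idea only; it uses plain modus ponens over string-labeled propositions, not LT's actual heuristics or the notation of Principia Mathematica:

```python
from collections import deque

def prove(axioms, goal, max_steps=1000):
    """Toy propositional prover. Atomic formulas are strings; an
    implication A -> B is the tuple (A, B). Only implications given as
    axioms are used as rules; derived facts are explored breadth-first."""
    known = set(axioms)
    implications = {f for f in known if isinstance(f, tuple)}
    queue = deque(known)
    steps = 0
    while queue and steps < max_steps:
        steps += 1
        fact = queue.popleft()
        if fact == goal:
            return True
        # Modus ponens: from A and (A -> B), derive B.
        for ante, cons in implications:
            if ante == fact and cons not in known:
                known.add(cons)
                queue.append(cons)
    return goal in known

# From P and the chain P -> Q, Q -> R, the search derives R.
result = prove({"P", ("P", "Q"), ("Q", "R")}, "R")
```

LT's real contribution lay in pruning such a search with humanlike heuristics rather than exploring it exhaustively, but the state space being searched looks essentially like this.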

Newell and Simon first described the Logic Theorist in Rand Corporation report P-868 issued on June 15, 1956, entitled The Logic Theory Machine. A Complex Information Processing System. (For some reason the only online version of this report available in January 2014 began on p. 25; however, the text available included the complete program.) The report was first officially published in September, 1956 under the same title in IRE Transactions on Information Theory IT-2, 61-79.

Newell and Simon demonstrated the program at the Dartmouth Summer Session on Artificial Intelligence held during the summer of 1956. 

Hook & Norman, Origins of Cyberspace (2002) no. 815.

View Map + Bookmark Entry

The Courier Monospaced Typeface Debuts 1955

Howard "Bud" Kettler designed the monospaced, or fixed-width or non-proportional, slab serif typeface to resemble the output from a strike-on typewriter.

"The design of the original Courier typeface was commissioned in the 1950s by IBM for use in typewriters, but they did not secure legal exclusivity to the typeface and it soon became a standard font used throughout the typewriter industry. As a monospaced font, it has recently found renewed use in the electronic world in situations where columns of characters must be consistently aligned. . . .

"Kettler was once quoted about how the name was chosen. The font was nearly released with the name "Messenger." After giving it some thought, Kettler said, 'A letter can be just an ordinary messenger, or it can be the courier, which radiates dignity, prestige, and stability' " (Wikipedia article on Courier [typeface], accessed 04-26-2009).

View Map + Bookmark Entry

The Sensorama: One of the First Functioning Efforts in Virtual Reality 1955 – 1962

American cinematographer and inventor Morton Heilig described his vision of a multi-sensory theater in a 1955 paper entitled "The Cinema of the Future."

In 1962 Heilig built a prototype of his immersive, multi-sensory, mechanical multimodal theater called the Sensorama, and created five short films to be displayed in it.  On August 28, 1962 Heilig was granted U.S. Patent 3,050,870 for a "Sensorama Simulator."  This invention is considered one of the earliest functioning efforts in virtual reality.

View Map + Bookmark Entry

The First Independent Software Company 1955

Elmer C. Kubie and John W. Sheldon founded Computer Usage Company, the first independent company to specialize in software.  It declared bankruptcy in 1986.

View Map + Bookmark Entry

Pioneer Program in Pattern Recognition 1955

In "Self Pattern recognition and modern computers," Proceedings of the Western Joint Computer Conference (1955) 91–93, English artificial intelligence researcher Oliver Selfridge described one of the first attempts to devise an optical character-reading program by “teaching” the computer to extract the significant features of a given letter-pattern from a background of irrelevant detail.

“This involved getting the machine to accept slightly different versions of the same typed symbol as exactly that—different versions of the same symbol. In attacking this problem Selfridge launched a project that continues to absorb energy, the project of making machines recognize certain slightly different configurations of elements as constituting the same pattern (or, looking at it in another way, getting the machine to recognize the same identities as the human being). Visual pattern recognition was Selfridge’s particular concern, but, in its general form, pattern recognition is a fundamental topic in almost all AI projects” (Pratt, Thinking Machines. The Evolution of Artificial Intelligence [1987] 204).

Selfridge, a native of England, matriculated at MIT at the age of fourteen. He published a paper on neural nets in 1948 (Archives of the Institute of Cardiology of Mexico [1948]: 177–87) and in 1955 organized with Marvin Minsky the first summer conference on AI.

In an interview, "Oliver Selfridge—in from the start, IEEE Expert 11, no. 5 (1996) 15-17, Selfridge discussed his early involvement with artificial intelligence:

"Q: How did you become interested in AI?

Oliver Selfridge: It was at MIT, a long time before the Dartmouth Conference, and I was studying mathematics under Norbert Wiener. By luck, of which I’ve had a great deal in my life, I was introduced to Walter Pitts, who was working with Warren McCulloch on a topic they called theoretical neurophysiology. I had studied logic, and through Walter, Warren, and Norbert got introduced to neural nets at that time. I went to the Pacific at the end of World War II with the United States Navy and came back to graduate school, again at MIT. Norbert was then writing Cybernetics, and Walter and I were helping him with various aspects of it. As I studied mathematics (my original field) and interacted with Norbert, Warren, and Walter, I began to be interested in the specific processing that neural nets could do and even more interested in the general properties of learning.

At this point McCulloch and Pitts had written the first two AI papers (although it wasn’t called that). The first showed that a neural net could work out certain kinds of problems, such as pattern recognition in the general cognitive sense, and the second discussed acquisition of patterns (how we know “universals”). These two works followed all the glorious mathematics that Turing and Gödel had done in the twenties and thirties about computability and Turing machines. This mathematics was, of course, the beginning of a formal description of what computability meant. Johnny Von Neumann visited us at MIT occasionally, so again by pure luck, before the age of twenty, I had been introduced to McCulloch, Pitts, Wiener, and Von Neumann." 

Hook & Norman, Origins of Cyberspace (2002) No. 877.

(This entry was last revised on 04-19-2014.)

View Map + Bookmark Entry

The First Book on Machine Translation 1955

In 1955 William N. Locke of the Department of Modern Languages at MIT and English electrical engineer, computer scientist, and machine translation pioneer Andrew Donald Booth issued Machine Translation of Languages, the first book on machine translation. This was an anthology of essays by fourteen of the earliest pioneers in the field. The foreword to the book was by Warren Weaver, who largely set research on machine translation in motion with his July 1949 memorandum Translation, republished as the first chapter in the volume. The editors began the book with an historical introduction that they wrote jointly, and ended it with an annotated bibliography of 46 references that represented virtually the entire literature on the subject at the time. The history, as the authors saw it, began with discussions between Booth and Weaver in 1946, in which Weaver thought that cryptanalysis techniques developed in WWII could be adapted for translation, while Booth thought that, given the extremely limited memory capacity of the earliest machines, some kind of electronic dictionary could be created.

A review of the book by Martin Joos published in Language in 1956 summarized the primitive state of the art at the time, pointing out that in 1956 human translation remained cheaper and faster— not to say more accurate— than machine translation. I quote its first paragraph:

"M [achine] T[ranslation] is today both a dream and a reality. The dream is that some day electronic computing machines will do our translation for us. The reality is that MT is being done currently, experimentally and with low-grade results and that dozens of earnest workers are also trying, of course, to expand and sharpen their methods. Nowadays it is not usually a  computer that performs the MT work it is a person (or crew) duplicating with paper and pencil the very procedures that the computer would use. The procedures are rigidly controlled, and it is known that a computer could be 'programmed' to follow them. But in the experimental and development stage of MT it is not only cheaper to do the work by hand; it is also faster."

View Map + Bookmark Entry

SHARE, The First Computer Users' Group, is Founded 1955

In 1955 the SHARE volunteer-run user group for IBM mainframe computers was founded in the Los Angeles area by users of the IBM 701, IBM's first commercial stored-program computer. The group evolved into a forum for exchanging technical information about programming languages, operating systems, database systems, and user experiences for enterprise users of IBM computers. Because IBM originally distributed its operating systems in source form, systems programmers commonly made small local additions or modifications and exchanged them with other users. These shared exchanges became the SHARE library, and the process of distributed and collaborative development it fostered was one of the major origins of open source software.

Before the Internet and email SHARE distributions were made by mail, including correspondence, print-outs of programs, decks of punched cards, programs on magnetic tape, etc. The first distribution from SHARE, as documented in the Paul Pierce collection at the Computer History Museum, was issued on October 17, 1955. The group initially paginated its many and long distributions consecutively, apparently giving up this practice when it reached page 13,853 with distribution No. 625 in February, 1959. In 2015 SHARE.org celebrated its 60th anniversary.

View Map + Bookmark Entry

"The Design of Machines to Simulate the Behavior of the Human Brain" March 1955 – December 1956

At the 1955 Institute of Radio Engineers (IRE) Convention held in New York in March the Professional Group on Electronic Computers (PGEC) sponsored a symposium on "The Design of Machines to Simulate the Behavior of the Human Brain." The four panel members were Warren McCulloch of MIT, Anthony G. Oettinger of Harvard, Otto H. Schmitt of the University of Minnesota, and Nathaniel Rochester of IBM. The moderator was Howard E. Tompkins, then of Burroughs Corporation.

After the panel members read prepared statements, and a brief discussion, a group of invited questioners cross-examined the panel members. The invited questioners were Marvin Minsky, then of Harvard, Morris Rubinoff of the University of Pennsylvania, Elliot L. Gruenberg of the W. L. Maxson Corporation, John Mauchly, of what was then Remington Rand, M. E. Maron of IBM, and Walter Pitts of MIT. The transcript of the symposium was edited by the speakers with the help of Howard Tompkins, and published in the IRE Transactions on Electronic Computers, December 1956, 240-255.

From the transcript of the symposium, which was available online when I wrote this entry in April 2014, we see that many of the issues of current interest in 2014 were being discussed in 1955-56. McCulloch began the symposium with the following very quotable statement:

"Since nature has given us the working model, we need not ask, theoretically, whether machines can be built to do what brains can do with information. But it will be a long time before we can match this three-pint, three-pound, twenty-five-watt computer, with its memory storing 10¹³ or 10 [to the 15th power] bits with a mean half-life of half a day and successful regeneration of 5 per cent of its traces for sixty years, operating continuously wih its 10 [to the 10th power] dynamically stable and unreplaceable relays to preserve itself by governing its own activity and stabilizing the state of the whole body and its relation to its world by reflexive and appetitive negative feedback."

As I read through this discussion, I concluded that it was perhaps the best summary of ideas on the computer and the human brain in 1955-1956. As quoting it in its entirety would have been totally impractical, I instead listed the section headings and refer those interested to the original text:

McCulloch: "Brain," A Computer With Negative Feedback

Oettinger: Contrasts and Similarities

Rochester: Simulation of Brain Action on Computers

Schmitt: The Brain as a Different Computer


Chemical Action, Too

Cell Assemblies

Why Build a Machine "Brain"?

Is Systematicness Undesirable?

Growth as a Type of Learning

What Does Simulation Prove?

The Semantics of Reproduction

Where is the Memory?

"Distributed Memories"

"Memory Half-Life"

Analog vs. Digital

Speed vs. Equipment

The Neurophysiologists' Contribution

Pattern Recognition

Creative Thinking by Machines?

What Model Do We Want?

View Map + Bookmark Entry

The First Solid State Computer April 1955 – December 1957

In April 1955 IBM announced the development of the IBM 608 calculator, the first all solid-state (fully transistorized) computer commercially marketed. The machine was first shipped to customers in December 1957. Development of the 608 was preceded by prototyping an experimental all-transistor version of the 604. This was built and demonstrated in October 1954, but was not commercialized.

View Map + Bookmark Entry

"The Magical Number Seven, Plus or Minus Two. . . " April 15, 1955 – 1956

In 1956 American cognitive psychologist George Armitage Miller, then teaching at Harvard, published "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," Psychological Review, Vol. 63, No. 2, 81-97. He had read the paper before the Eastern Psychological Association on April 15, 1955. 

"From the days of William James, psychologists had the idea memory consisted of short-term and long-term memory. While short-term memory was expected to be limited, its exact limits were not known. In 1956, Miller would quantify its capacity limit in the paper 'The magical number seven, plus or minus two'. He tested immediate memory via tasks such as asking a person to repeat a set of digits presented; absolute judgment by presenting a stimulus and a label, and asking them to recall the label later; and span of attention by asking them to count things in a group of more than a few items quickly. For all three cases, Miller found the average limit to be seven items. He had mixed feelings about the focus on his work on the exact number seven for quantifying short-term memory, and felt it had been misquoted often. He stated, introducing the paper on the research for the first time, that he was being persecuted by an integer. Miller also found humans remembered chunks of information, interrelating bits using some scheme, and the limit applied to chunks. Miller himself saw no relationship among the disparate tasks of immediate memory and absolute judgment, but lumped them to fill a one-hour presentation" (Wikipedia article on George Armitage Miller, accessed 12-30-2012). 

"The word ‘'chunking’' comes from a famous 1956 paper by George A. Miller, The Magical Number Seven, Plus or Minus Two: Some Limits on our Capacity for Processing Information. At a time when information theory was beginning to be applied in psychology, Miller observed that some human cognitive tasks fit the model of a 'channel capacity,' characterized by a roughly constant capacity in bits, but short-term memory did not. A variety of studies could be summarized by saying that short-term memory had a capacity of about "seven plus-or-minus two" chunks. Miller wrote that 'With binary items the span is about nine and, although it drops to about five with monosyllabic English words, the difference is far less than the hypothesis of constant information would require (see also, memory span ). The span of immediate memory seems to be almost independent of the number of bits per chunk, at least over the range that has been examined to date.' Miller acknowledged that 'we are not very definite about what constitutes a chunk of information.' Miller noted that according to this theory, it should be possible to effectively increase short-term memory for low-information-content items by mentally recoding them into a smaller number of high-information-content items. 'A man just beginning to learn radio-telegraphic code hears each dit and dah as a separate chunk. Soon he is able to organize these sounds into letters and then he can deal with the letters as chunks. Then the letters organize themselves as words, which are still larger chunks, and he begins to hear whole phrases.' Thus, a telegrapher can effectively 'remember' several dozen dits and dahs as a single phrase. Naive subjects can only remember about nine binary items, but Miller reports a 1954 experiment in which people were trained to listen to a string of binary digits and (in one case) mentally group them into groups of five, recode each group into a name (e.g. "twenty-one" for 10101), and remember the names. 
With sufficient drill, people found it possible to remember as many as forty binary digits. Miller wrote: 'It is a little dramatic to watch a person get 40 binary digits in a row and then repeat them back without error. However, if you think of this merely as a mnemonic trick for extending the memory span, you will miss the more important point that is implicit in nearly all such mnemonic devices. The point is that recoding is an extremely powerful weapon for increasing the amount of information that we can deal with " (Wikipedia article on Chunking (pschology), accessed 12-30-2012).

View Map + Bookmark Entry

The Foundation of Citation Analysis July 15, 1955

Eugene Garfield published "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas," Science, Vol. 122, No. 3159, 108-11. This paper may be the foundation of "bibliometrics" or citation analysis.

"Eugene Garfield . . . was deeply involved in the research relating to machine generated indexes in the mid-1950's and early 1960's. One of his earliest points of involvement was a project sponsored by the Armed Forces Medical Library (predecessor to our current National Library of Medicine). The Welch Medical Library Indexing project, as it was called, was to investigate the role of automation in the organization and retrieval of medical literature. The hope was that the problems associated with subjective human judgement in selection of descriptors and indexing terms could be eliminated. By removing the human element, one might thereby increase the speed with which information was incorporated in to the indexes. It might also increase the cost-effectiveness of the indexes. Garfield grasped early on that review articles in the journal literature were heavily reliant on the bibliographic citations that referred the reader to the original published source for the notable idea or concept. By capturing those citations, Garfield believed, the researcher could immediately get a view of the approach taken by another scientist to support an idea or methodology based on the sources that the published writer had consulted and cited as pertinent in the bibliography. As retrieval terms, citations could function as well as keywords and descriptors that were thoughtfully assigned by a professional indexer."

View Map + Bookmark Entry

Coining the Term, Artificial Intelligence August 31, 1955

On August 31, 1955 John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon invited participants to a summer session at Dartmouth College to conduct research on what they called Artificial Intelligence (AI), thereby coining the term. (See Reading 11.5.)

View Map + Bookmark Entry

The Beginning of Computerization of Banking September 1955

Stanford Research Institute in Menlo Park, California, began the computerization of the banking industry by demonstrating a prototype electronic accounting machine using its ERMA (Electronic Recording Method of Accounting) system.

View Map + Bookmark Entry

The First Full-Scale Programmable Japanese Computer October 1955

ETL-Mark-2, the first full-scale programmable computer in Japan, was produced by the Electrotechnical Laboratory in Roppongi, Tokyo. It was built from 21,000 relays and did not store its program internally.

View Map + Bookmark Entry

One of the First Fictional Depictions of Autonomous Machine Self-Replication November 1955

In November 1955 science fiction writer Philip K. Dick published a short story entitled “Autofac” in Galaxy Science Fiction magazine, pp. 70-95. The story described a nationwide system of automated factories that produced food, consumer goods, and “miniature replicas” of more factories. Possibly influenced by John von Neumann's theory of self-reproducing automata (1948), this was one of the first descriptions of autonomous machine self-replication to appear in science fiction. Dick’s story ended with the almost destroyed factory shooting out a torrent of metal seeds that germinated into miniature factories. In December 2013, when I wrote this entry, these metal seeds might be viewed as self-replicating nanorobots.

View Map + Bookmark Entry

The Origins of The Term "Software" Within the Context of Computing 1956 – January 1958

The first published use of the term "software" in a computing context is often credited to American statistician John W. Tukey, who published the term in "The Teaching of Concrete Mathematics," American Mathematical Monthly 65 (January 1958) 1-9. Tukey wrote:

"Today the 'software' comprising the carefully planned interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its 'hardware' of tubes, transistors, wires, tapes and the like" (http://www.maa.org/mathland/mathtrek_7_31_00.html, accessed 02-02-2010).

Note that Tukey referred to computers as "calculators." Up to this time the word "computer" typically referred to people, and the use of the word computer for a machine was just coming into popular use.

On April 30, 2013 Paul Niquette informed me that Richard R. Carhart used the term in the Proceedings of the Second National Symposium on Quality Control and Reliability in Electronics: Washington, D.C., January 9-10, 1956.

In August 2014 William J. Rapaport of the Department of Computer Science at SUNY Buffalo emailed me the text of the paragraph in which Carhart used the word software. Carhart wrote:

"In short, we need a total systems approach to reliability. There are four aspects of such an approach which have an important bearing on how a reliablity program is shaped. First, the scope of the program should include the entire system. As an example a missile system includes the vehicle and warhead, the auxiliary ground or airborne equipment, the support and test equipment, and the operating personnel. In addition, the interactions between these various elements, hardware and software (people), must be recognized and included as the glue that holds the system together."

From this it is clear that Carhart did not use the term "software" within the specific context of programming, and priority for the term in that context may rest with Tukey. It is, of course, possible – even likely – that others used the word in spoken, rather than printed, context before either Carhart or Tukey. Paul Niquette stated that he used the term as early as 1953.

(This entry was last revised on 08-04-2014.)

View Map + Bookmark Entry

The First Video Tape Recorder 1956

In 1956 Ray Dolby, Charles Ginsburg, and Charles Anderson of Ampex in San Carlos, California, sold the first video tape recorder. It cost $50,000.

View Map + Bookmark Entry

Intelligence Amplification by Machines 1956

In 1956 English psychiatrist and cybernetician W[illiam] Ross Ashby wrote of intelligence amplification by machines in his book, An Introduction to Cybernetics.

View Map + Bookmark Entry

Changes in Tissue Density Can be Computed 1956 – 1964

In work initiated at the University of Cape Town and Groote Schuur Hospital in early 1956, and continued briefly in mid-1957, South African-born American physicist Allen M. Cormack showed that changes in tissue density could be computed from x-ray data. His results were subsequently published in two papers:

"Representation of a Function by its Line Integrals, with Some Radiological Applications," Journal of Applied Physics 34 (1963) 2722-27; "Representation of a Function by its Line Integrals, with Some Radiological Applications. II," Journal of Applied Physics 35 (1964) 2908-13.  

Because of limitations in computing power no machine was constructed during the 1960s. Cormack's papers generated little interest until Godfrey Hounsfield and colleagues invented computed tomography, and built the first CT scanner in 1971, creating a real application of Cormack's theories.
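Cormack's central claim, that interior densities are recoverable from line integrals alone, can be illustrated with a toy computation. The sketch below is not Cormack's method: for a 2×2 grid of attenuation values, two row sums, two column sums, and one diagonal sum determine all four cells in closed form.

```python
# Toy illustration (not Cormack's method): recover a 2x2 grid of
# "tissue densities"  a, b / c, d  from ray sums alone.
#   rows:     a+b = r0,  c+d = r1
#   columns:  a+c = c0,  b+d = c1
#   diagonal: a+d = diag
def reconstruct(r0, r1, c0, c1, diag):
    """Solve the five ray-sum equations for the four cell densities."""
    a = (c0 + diag - r1) / 2    # from r1 = c + d = (c0 - a) + (diag - a)
    return a, r0 - a, c0 - a, diag - a

# Hidden densities a=1, b=4, c=2, d=3 produce these measurable sums:
print(reconstruct(r0=5, r1=5, c0=3, c1=7, diag=4))   # (1.0, 4.0, 2.0, 3.0)
```

A real CT scanner solves the same kind of inverse problem over thousands of rays and cells, which is why practical machines had to wait for sufficient computing power.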

View Map + Bookmark Entry

Proving the Feasibility of Weather Prediction by Numerical Process 1956

In 1956 theoretical meteorologist Norman A. Phillips of the National Weather Service, National Meteorological Center, Silver Spring, Maryland, published "The General Circulation of the Atmosphere: A Numerical Experiment," Quarterly Journal of the Royal Meteorological Society 82, no. 352 (1956) 123-164.  By 1955 Phillips had completed a 2-layer, hemispheric, quasi-geostrophic computer model. "Despite its primitive nature, Phillips's model is now often regarded as the first AGCM" (P. N. Edwards, Atmospheric General Circulation Modeling: A Participatory History, accessed 06-20-2009).

"Norman Phillips was the first to show, with a simple General Circulation model, that weather prediction with numerical models was even feasible. The advent of numerical weather predictions in the 1950s also signaled the transformation of weather forecasting from a highly individualistic effort to one in which teams of experts developed complex computer programs, eventually for high-speed computers" (Franklin Institute, Franklin Laureate database, accessed 06-20-2009).

View Map + Bookmark Entry

Standing up to Censorship and McCarthyism During the "Second Red Scare" 1956

In 1956 Storm Center, an American drama film directed by screenwriter Daniel Taradash, from a screenplay by Taradash and Elick Moll, and starring Bette Davis as the librarian, Alicia Hull, was the first overtly anti-McCarthyism film to be produced in Hollywood during the height of the "Second Red Scare" (late 1940s through late 1950s).  During the Second Red Scare hundreds of Hollywood entertainment professionals lost their jobs as a result of the unofficial Hollywood blacklist, and thousands of people in other occupations also lost jobs.

"Alicia Hull is a widowed small town librarian dedicated to introducing children to the joy of reading. In exchange for fulfilling her request for a children's wing, the city council asks her to withdraw the book The Communist Dream from the library's collection. When she refuses to comply with their demand, she is fired and branded as a subversive. Judge Ellerbe feels she has been treated unfairly and calls a town meeting. Ambitious attorney and aspiring politician Paul Duncan, who is dating assistant librarian Martha Lockeridge, uses the meeting as an opportunity to make a name for himself by denouncing Alicia as a Communist. His forceful rhetoric turns the entire town, with the exception of young Freddie Slater, against her. The boy, increasingly upset by the mistreatment his mentor is suffering and affected by the influence of his narrow-minded father, finally turns on her himself and sets the library on fire. His action causes the residents to have a change of heart, and they ask Alicia to return and supervise the construction of a new building" (Wikipedia article on Storm Center, accessed 05-30-2009).

Raven, "Introduction: The Resonances of Loss," (Raven [ed.] Lost Libraries. The Destruction of Great Book Collections Since Antiquity [2004] 31).

View Map + Bookmark Entry

"Nineteen Eighty-Four" Filmed 1956

In 1956 English director Michael Anderson directed 1984, a science fiction drama film based on the novel Nineteen Eighty-Four by George Orwell, and starring Edmond O'Brien, Jan Sterling, Michael Redgrave, and Donald Pleasence.

This was the first cinema rendition of the novel. It was released on DVD in 2004.

View Map + Bookmark Entry

The First Sample-Playback Keyboard Circa 1956

About 1956 inventor Harry Chamberlin of Upland, California, introduced the Chamberlin, the first sample-playback keyboard.

View Map + Bookmark Entry

Filed under: Music

Semantic Networks for Machine Translation 1956

In 1956 Richard H. Richens of the Cambridge Language Research Unit created semantic networks for machine translation of natural languages, the first use of semantic nets in computing.

Richens, "General program for mechanical translation between any two languages via an algebraic interlingua [Abstract]" In: Report on Research: Cambridge Language Research Unit. Mechanical Translation 3 (2), November 1956; p. 37.

Richens, "Preprogramming for mechanical translation," Mechanical Translation 3 (1), July 1956, 20–25

View Map + Bookmark Entry

The First Step Toward Automation of Logic Minimization 1956

A primary goal in electronic circuit design is obtaining the smallest logic circuit (Boolean formula) that represents a given Boolean function or truth table. In his Ph.D. thesis in electrical engineering at MIT entitled Algebraic Minimization and the Design of Two-Terminal Contact Networks, Edward J. McCluskey developed the first algorithm for designing combinational circuits — the Quine-McCluskey logic minimization procedure. This was the first step toward automation of logic minimization that could be implemented on a computer.
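The core of the procedure, repeatedly merging implicants that differ in a single bit until only prime implicants remain, can be sketched briefly. The version below is a simplified illustration, not McCluskey's full method: it omits the usual grouping-by-ones optimization and the final prime-implicant chart that selects a minimal cover.

```python
# Simplified sketch of the merging step of Quine-McCluskey minimization.
# Implicants are strings over '0', '1', '-' ('-' marks an eliminated variable).
from itertools import combinations

def combine(a, b):
    """Merge two implicants differing in exactly one non-dash position, else None."""
    diff = [i for i in range(len(a)) if a[i] != b[i]]
    if len(diff) == 1 and "-" not in (a[diff[0]], b[diff[0]]):
        i = diff[0]
        return a[:i] + "-" + a[i + 1:]
    return None

def prime_implicants(minterms, width):
    """Repeatedly combine terms; anything never merged is a prime implicant."""
    terms = {format(m, "0{}b".format(width)) for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a, b in combinations(sorted(terms), 2):
            m = combine(a, b)
            if m:
                merged.add(m)
                used.update((a, b))
        primes |= terms - used
        terms = merged
    return primes

# f(A,B,C) = sum of minterms 0,1,2,3,7: minterms 0-3 collapse to "0--"
# (i.e. NOT A), and 3,7 combine to "-11" (i.e. B AND C).
print(sorted(prime_implicants([0, 1, 2, 3, 7], 3)))   # ['-11', '0--']
```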

View Map + Bookmark Entry

Television: A Retrospective in 1956 from the RCA / NBC Viewpoint 1956

In 1956 David Sarnoff of RCA/NBC had a film produced telling the early history of electronic television from their perspective.

View Map + Bookmark Entry

The First Japanese Stored-Program Computer March 1956

In March 1956 FUJIC, the first Japanese stored-program electronic computer, was operational. It was designed and built by essentially one person—Dr. Okazaki Bunji—for the Fuji Photo Film Company in Odawara, western Kanagawa Prefecture, Japan. The project began in 1949.

"Originally designed to perform calculations for lens design by Fuji, the ultimate goal of FUJIC's construction was to achieve a speed 1,000 times that of human calculation for the same purpose – amazingly, the actual performance achieved was double that number.

"Employing approximately 1,700 vacuum tubes, the computer's word length was 33 bits. It had an ultrasonic mercury delay line memory of 255 words, with an average access time of 500 microseconds. An addition or subtraction was clocked at 100 microseconds, multiplication at 1,600 microseconds, and division at 2,100 microseconds."

FUJIC is preserved in the National Museum of Nature and Science in Tokyo.

View Map + Bookmark Entry

First International Congress on Cybernetics June 26 – June 29, 1956

From June 26-29, 1956 the First International Congress on Cybernetics was held in Namur, Belgium. Few, if any, of the computer pioneers attended. By this time the field of cybernetics had become separate from computing and artificial intelligence, emphasizing issues of control and communication in learning, automation, and biology.

View Map + Bookmark Entry

The First Demonstration of Magnetic Ink Character Reading July 1956

In July 1956 MICR (Magnetic Ink Character Reading) was demonstrated to the Bank Management Committee of the American Bankers’ Association.

View Map + Bookmark Entry

Werner Buchholz Coins the Term "Byte" July 1956

In July 1956 German-born American computer scientist Werner Buchholz coined the term "byte" as a unit of digital information during the early design phase for the IBM 7030 Stretch, IBM's first transistorized supercomputer. A byte was an ordered collection of bits, the smallest units of data that a computer could process. The Stretch incorporated addressing to the bit, and variable field length (VFL) instructions with a byte size encoded in the instruction. "Byte" was a deliberate respelling of "bite" to avoid accidental confusion with "bit."

"Early computers used a variety of 4-bit binary coded decimal (BCD) representations and the 6-bit codes for printable graphic patterns common in the U.S. Army (Fieldata) and Navy. These representations included alphanumeric characters and special graphical symbols. These sets were expanded in 1963 to 7 bits of coding, called the American Standard Code for Information Interchange (ASCII) as the Federal Information Processing Standard which replaced the incompatible teleprinter codes in use by different branches of the U.S. government. ASCII included the distinction of upper and lower case alphabets and a set of control characters to facilitate the transmission of written language as well as printing device functions, such as page advance and line feed, and the physical or logical control of data flow over the transmission media. During the early 1960s, while also active in ASCII standardization, IBM simultaneously introduced in its product line of System/360 the 8-bit Extended Binary Coded Decimal Interchange Code (EBCDIC), an expansion of their 6-bit binary-coded decimal (BCDIC) representation used in earlier card punches. The prominence of the System/360 led to the ubiquitous adoption of the 8-bit storage size, while in detail the EBCDIC and ASCII encoding schemes are different" (Wikipedia article on Byte, accessed 01-15-2015).

View Map + Bookmark Entry

Chomsky's Hierarchy of Syntactic Forms September 1956

In September 1956 American linguist, philosopher, cognitive scientist, and activist Noam Chomsky published "Three Models for the Description of Language" in IRE Transactions on Information Theory IT-2, 113-24. In the paper Chomsky introduced two key concepts, the first being “Chomsky’s hierarchy” of syntactic forms, which was widely applied in the construction of artificial computer languages.

“The Chomsky hierarchy places regular (or linear) languages as a subset of the context-free languages, which in turn are embedded within the set of context-sensitive languages also finally residing in the set of unrestricted or recursively enumerable languages. By defining syntax as the set of rules that define the spatial relationships between the symbols of a language, various levels of language can be also described as one-dimensional (regular or linear), two-dimensional (context-free), three-dimensional (context sensitive) and multi-dimensional (unrestricted) relationships. From these beginnings, Chomsky might well be described as the ‘father of formal languages’ ” (Lee, Computer Pioneers [1995] 164). 

The second concept Chomsky presented here was his transformational-generative grammar theory, which attempted to define rules that can generate the infinite number of grammatical (well-formed) sentences possible in a language, and sought to identify rules (transformations) governing relations between parts of a sentence, on the assumption that beneath such surface aspects as word order a fundamental deep structure exists. As Chomsky expressed it in his abstract of the present paper,

"We investigate several conceptions of linguistic structure to determine whether or not they can provide simple and “revealing” grammars that generate all of the sentences of English and only these. We find that no finite-state Markov process [a random process whose future probabilities are determined by its most recent values] that produces symbols with transition from state to state can serve as an English grammar. We formalize the notion of “phrase structure” and show that this gives us a method for describing language which is essentially more powerful. We study the properties of a set of grammatical transformations, showing that the grammar of English is materially simplified if phrase-structure is limited to a kernel of simple sentences from which all other sentences are constructed by repeated transformation, and that this view of linguistic structure gives a certain insight into the use and understanding of language" (p. 113).

Minsky, "A Selected Descriptor-Indexed Bibliography to the Literature on Artificial Intelligence" in Feigenbaum & Feldman eds., Computers and Thought (1963) 453-523, no. 484. Hook & Norman, Origins of Cyberspace (2002) no. 531.

The First Commercial Computer Designed to Use a Moving Head Hard Drive for Secondary Storage September 4 – September 13, 1956

On September 4, 1956 IBM announced the IBM 350 disk storage unit (the 350 RAMAC) for the IBM 305 RAMAC, which it introduced on September 13, 1956. The IBM 305 RAMAC was the first commercial computer that used a moving-head hard disk drive (magnetic disk storage) for secondary storage. One day later, on September 14, 1956, IBM announced the 650 RAMAC system, which paired an IBM 650 computer with the IBM 355 RAMAC disk storage unit. However, the 650 RAMAC was a modification of the best-selling IBM 650 system rather than a new system designed specifically to use the RAMAC hard drives.

"The 305 was one of the last vacuum tube computers that IBM built. It weighed over a ton. The IBM 350 disk system stored 5 million 7-bit (6 data bits plus 1 parity bit) alphanumeric characters (5 MB). It had fifty 24-inch-diameter (610 mm) disks. Two independent access arms moved up and down to select a disk, and in and out to select a recording track, all under servo control. Average time to locate a single record was 600 milliseconds. Several improved models were added in the 1950s. The IBM RAMAC 305 system with 350 disk storage leased for $3,200 per month in 1957 dollars, equivalent to a purchase price of about $160,000. More than 1,000 systems were built. Production ended in 1961; the RAMAC computer became obsolete in 1962 when the IBM 1405 Disk Storage Unit for the IBM 1401 was introduced, and the 305 was withdrawn in 1969.

"The original 305 RAMAC computer system could be housed in a room of about 9 m (30 ft) by 15 m (50 ft); the 350 disk storage unit measured around 1.5 square metres (16 sq ft). The first hard disk unit was shipped September 13, 1956. The additional components of the computer were a card punch, a central processing unit, a power supply unit, an operator's console/card reader unit, and a printer. There was also a manual inquiry station that allowed direct access to stored records. IBM touted the system as being able to store the equivalent of 64,000 punched cards.

"Programming the 305 involved not only writing machine language instructions to be stored on the drum memory, but also almost every unit in the system (including the computer itself) could be programmed by inserting wire jumpers into a plugboard control panel. . . .

"Currie Munce, research vice president for Hitachi Global Storage Technologies (which has acquired IBM's hard disk drive business), stated in a Wall Street Journal interview that the RAMAC unit weighed over a ton, had to be moved around with forklifts, and was delivered via large cargo airplanes. According to Munce, the storage capacity of the drive could have been increased beyond five megabytes, but IBM's marketing department at that time was against a larger capacity drive, because they didn't know how to sell a product with more storage" (Wikipedia article on IBM 305 RAMAC, accessed 10-22-2013).

The 650 RAMAC: Hard Drive Storage at $10,000 per Megabyte September 14, 1956

On September 14, 1956, at IBM's Glendale Laboratory in Endicott, New York, IBM demonstrated the 650 RAMAC (Random Access Method of Accounting and Control) Magnetic Drum Data Processing Machine. The machine used a series of IBM 355 disk memory units. Also in September 1956 the machine was demonstrated at the U.S. Atomic Energy Commission exhibit at the Atoms for Peace Conference in Geneva.

"The addition of disk storage to the IBM 650 Magnetic Drum Data Processing Machine made possible 'single step processing.' Instead of accumulating data to be processed in stages, transactions could now be processed randomly as they occurred and every record affected by the transaction could be automatically adjusted in the same processing step. Each IBM 355 held 50 disks subdivided on each side into tracks for the storage of almost all active accounting records. Up to four IBM 355 units could be connected to the 650 system" (http://www-03.ibm.com/ibm/history/exhibits/storage/storage_355.html, accessed 10-21-2013).

The 355 disk memory unit, like the IBM 350 announced ten days earlier, was one of the first hard drives. It permitted random access to any of the million characters distributed over both sides of 50 two-foot-diameter disks. It stored about 2,000 bits of data per square inch and had a purchase price of about $10,000 per megabyte. (By 1997 the cost of storing a megabyte on a hard drive had dropped to around ten cents.)

First Computer Conference in Italy October 17 – October 18, 1956

On October 17 and 18, 1956 the first Italian computer conference was held in Rome.

First Japanese Conference on Electronic Computers November 1956

In November 1956 the first Japanese conference on electronic computers was held at Waseda University, Shinjuku, Tokyo.

IBM Phases Out Vacuum Tubes 1957

In 1957 IBM phased out vacuum tubes in computer design:

“It shall be the policy of IBM to use solid-state circuitry in all machine developments. Furthermore, no new commercial machines or devices shall be announced which make primary use of tube circuitry.”

By this time IBM was satisfied with the reliability of transistors and convinced of the advantages of solid state over vacuum tube technology.  

Control Unit Based on Microprogramming 1957

In 1957 EDSAC 2, the first large-scale computer with a control unit based on microprogramming, became operational at the University of Cambridge.

SAGE: Physically the Largest Computers Ever Built 1957

In 1957 the first SAGE (Semi-Automatic Ground Environment) AN/FSQ-7 (DC-01) computer became operational on a limited basis for the SAGE Air Defense System at McGuire Air Force Base in Burlington County, New Jersey. Fifty-two AN/FSQ-7 computers would eventually be built. Each AN/FSQ-7 contained 55,000 vacuum tubes, occupied 0.5 acres (2,000 m²) of floor space, weighed 275 tons, and used up to three megawatts of power. Performance was about 75,000 instructions per second. From the standpoint of physical dimensions, the fifty-two AN/FSQ-7s remain the largest computers ever built.

"Although the machines used a large number of vacuum tubes, the failure rate of an individual tube was low due to efforts in quality control and a novel quality assurance system called marginal checking that discovered tubes that were growing weak, before they failed. Each SAGE site included two computers for redundancy, with one processor on "hot standby" at all times. In spite of the poor reliability of the tubes, this dual-processor design made for remarkably high overall system uptime. 99% availability was not unusual."
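The quoted availability figure can be illustrated with a back-of-the-envelope model: a duplexed site is down only when both computers are down at once. This is a hypothetical sketch assuming independent failures; the 90% per-machine figure below is illustrative, not from the source.

```python
# Back-of-the-envelope model of SAGE's duplexed design: a site is down
# only when both computers are down at the same time. Assumes
# independent failures; the per-machine figure is illustrative, not a
# historical number.

def duplex_availability(single_machine_availability):
    """Availability of a hot-standby pair under independent failures."""
    downtime = 1 - single_machine_availability
    return 1 - downtime ** 2

# A machine up 90% of the time yields a 99%-available duplexed site.
print(duplex_availability(0.90))
```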

The system allowed online access, in graphical form, to data transmitted to and processed by its computers. Fully deployed by 1963, the IBM-built early warning system remained operational until 1984. With 23 direction centers situated on the northern, eastern, and western boundaries of the United States, SAGE pioneered the use of computer control over large, geographically distributed systems.

"Both MIT and IBM supported the project as contractors. IBM's role in SAGE (the design and manufacture of the AN/FSQ-7 computer, a vacuum tube computer with ferrite core memory based on the never-built Whirlwind II) was an important factor leading to IBM's domination of the computer industry, accounting for more than a half billion dollars in revenue, nearly 10% of IBM's income in the late 1950s" (Wikipedia article on Semi-Automatic Ground Environment, accessed 03-03-2012).

Mechanized Encoding of Library Information 1957

In 1957 Hans Peter Luhn of IBM published "A Statistical Approach to Mechanized Encoding of Library Information," IBM Journal of Research and Development I (1957) no. 4, 309-317, issued by the IBM T. J. Watson Research Center, Yorktown Heights, New York.


"Written communication of ideas is carried out on the basis of statistical probability in that a writer chooses that level of subject specificity and that combination of words which he feels will convey the most meaning. Since this process varies among individuals and since similar ideas are therefore relayed at different levels of specificity and by means of different words, the problem of literature searching by machines still presents major difficulties. A statistical approach to this problem will be outlined and the various steps of a system based on this approach will be described. Steps include the statistical analysis of a collection of documents in a field of interest, the establishment of a set of “notions” and the vocabulary by which they are expressed, the compilation of a thesaurus-type dictionary and index, the automatic encoding of documents by machine with the aid of such a dictionary, the encoding of topological notations (such as branched structures), the recording of the coded information, the establishment of a searching pattern for finding pertinent information, and the programming of appropriate machines to carry out a search."

So-Called Second Generation of Computers 1957

In 1957 commercial transistorized computers, including the UNIVAC Solid State 80 and the Philco TRANSAC S-2000, were introduced. These solid-state machines inaugurated the so-called second generation of electronic computers.

The First English-Language Data-Processing Compiler 1957

In 1957 Grace Hopper wrote B-0 (FLOW-MATIC) for the UNIVAC II, the first English-language data-processing compiler.

FORTRAN: The First Widely Used High-Level Programming Language 1957

In 1957 John Backus and his team at IBM shipped FORTRAN for the IBM 704. This software, proprietary to IBM, became the first widely-used high-level programming language.

"Fortran, released in 1957, was 'the turning point' in computer software, much as the microprocessor was a giant step forward in hardware, according to J.A.N. Lee, a leading computer historian.

"Fortran changed the terms of communication between humans and computers, moving up a level to a language that was more comprehensible by humans. So Fortran, in computing vernacular, is considered the first successful higher-level language.

"Mr. Backus and his youthful team, then all in their 20s and 30s, devised a programming language that resembled a combination of English shorthand and algebra. Fortran, short for Formula Translator, was very similar to the algebraic formulas that scientists and engineers used in their daily work. With some training, they were no longer dependent on a programming priesthood to translate their science and engineering problems into a language a computer would understand.

"In an interview several years ago, Ken Thompson, who developed the Unix operating system at Bell Labs in 1969, observed that '95 percent of the people who programmed in the early years would never have done it without Fortran' " (http://www.nytimes.com/2007/03/20/business/20backus.html, accessed 10-22-2013).

The First Significant Computer Music Composition 1957

In 1957 Lejaren Hiller and Leonard Isaacson of the University of Illinois at Urbana-Champaign collaborated on the first significant computer music composition, the Illiac Suite, composed on the University of Illinois ILLIAC I computer.

The ILLIAC I was the first von Neumann architecture computer built and owned by an American university.

Beginning of Doppler Ultrasound 1957

In 1957 Shigeo Satomura of the Institute of Scientific and Industrial Research, Osaka University, demonstrated the application of the Doppler shift in the frequency of ultrasound backscattered by moving cardiac structures.

This was the beginning of Doppler ultrasound for evaluating blood flow and pressure by bouncing high-frequency sound waves (ultrasound) off red blood cells.

S. Satomura, "Ultrasonic Doppler Method for the Inspection of Cardiac Functions," J. Acoust. Soc. Amer. 29 (1957) 1181-85.

The Movie "Desk Set", Satirizing the Role of Automation in Eliminating Jobs, and Librarians 1957

The romantic comedy film Desk Set, brought to the silver screen in 1957, was the first film to dramatize and satirize the role of automation in eliminating traditional jobs. The name of the computer in the film, EMERAC, and its room-size installation were an obvious take-off on UNIVAC, the best-known computer at the time. In the film, the computer was brought in to replace the library of books and its staff, an early foreshadowing of the physical information versus digital information issue. The film was directed by Walter Lang and starred Spencer Tracy, Katharine Hepburn, Gig Young, Joan Blondell, and Dina Merrill; the screenplay was written by Phoebe Ephron and Henry Ephron from the play by William Marchant.

The film "takes place at the "Federal Broadcasting Network" (exterior shots are of Rockefeller Center, in New York City, headquarters of NBC). Bunny Watson (Katharine Hepburn) is in charge of its reference library, which is responsible for researching and answering questions on all manner of topics, such as the names of Santa's reindeer. She has been involved for seven years with network executive Mike Cutler (Gig Young), with no marriage in sight.

"The network is negotiating a merger with another company, but is keeping it secret. To help the employees cope with the extra work that will result, the network head has ordered two computers (called "electronic brains" in the film). Richard Sumner (Spencer Tracy), the inventor of EMERAC and an efficiency expert, is brought in to see how the library functions, to figure out how to ease the transition. Though extremely bright, as he gets to know Bunny, he is surprised to discover that she is every bit his match.

"When they find out the computers are coming, the employees jump to the conclusion the machines are going to replace them. Their fears seem to be confirmed when everyone on the staff receives a pink slip printed out by the new payroll computer. Fortunately, it turns out to be a mistake; the machine fired everybody in the company, including the president" (Wikipedia article on Desk Set, accessed 12-23-2008).

The First Paper on Machine Learning 1957

In 1957 American mathematician and researcher in artificial intelligence Ray Solomonoff published "An Inductive Inference Machine," IRE Convention Record, Section on Information Theory, Part 2 (1957) 56-62. This was the first paper written on machine learning. It emphasized the importance of training sequences, and the use of parts of previous solutions to problems in constructing trial solutions to new problems. Solomonoff presented an early version of the paper at the 1956 Dartmouth Summer Research Conference on Artificial Intelligence.

Invention of the Image Scanner; Creation of the First Digital Image 1957

In 1957 Russell A. Kirsch and a team at the U.S. National Bureau of Standards, using the SEAC computer, built the first image scanner—a drum scanner. Using that device they took the first digital photograph: 

"The first image ever scanned on this machine was a 5 cm square photograph of Kirsch's then-three-month-old son, Walden. The black and white image had a resolution of 176 pixels on a side" (Wikipedia article on Image Scanner, accessed 04-01-2009).

The Helvetica Typeface Debuts Under a Different Name 1957

In 1957 Swiss typographer Max Miedinger and Eduard Hoffmann at the Haas'sche Schriftgiesserei (Haas type foundry) of Münchenstein, Switzerland designed the sans-serif typeface Helvetica. Its original name was Die Neue Haas Grotesk.

"The aim of the new design was to create a neutral typeface that had great clarity, had no intrinsic meaning in its form, and could be used on a wide variety of signage.

"In 1960, the typeface's name was changed by Haas' German parent company Stempel to Helvetica — derived from Confoederatio Helvetica, the Latin name for Switzerland — in order to make it more marketable internationally" (Wikipedia article on Helvetica, accessed 04-26-2009).

In 2007 Gary Hustwit produced Helvetica, a "feature-length independent film about typography, graphic design and global visual culture. It looks at the proliferation of one typeface (which recently celebrated its 50th birthday in 2007) as part of a larger conversation about the way type affects our lives" (from the superb website for the film, accessed 04-26-2009).

Chomsky's Syntactic Structures 1957

In 1957 Noam Chomsky's Syntactic Structures was published in 's-Gravenhage (The Hague), Netherlands, by Mouton & Co. That it did not initially find an American publisher may reflect the advanced nature of its contents. Through its numerous printings Syntactic Structures, a small book of 116 pages, was the vehicle through which Chomsky's innovative ideas first became more widely known.

Chomsky’s text was an expansion of the ideas first expressed in his “Three Models for the Description of Language," in particular the concept of transformational grammar. The cognitive scientist David Marr, who developed a general account of information-processing systems, described Chomsky’s theory of transformation grammar as a top-level computational theory, in the sense that it deals with the goal of a computation, why it is appropriate, and the logic of the strategy used to carry it out (Anderson and Rosenfeld, Neurocomputing: Foundations of Research [1988] 470–72). Chomsky’s work had profound influence in the fields of linguistics, philosophy, psychology, and artificial intelligence. 

Hook & Norman, Origins of Cyberspace (2002) no. 532.

There are Forty Computers on American University Campuses 1957

". . . in 1957 there were only 40 computers on university campuses across the country [the United States]" (Bowles (ed.), Computers in Humanistic Research [1967] v).

J. W. Ellison Issues the First Computerized Concordance of the Bible 1957

In Italy Roberto Busa began his experimentation with computerized indexing of the text of Thomas Aquinas using IBM punch-card tabulators in 1949-51. The first significant product of computerized indexing in the humanities in the United States, and one of the earliest large examples of humanities computing or digital humanities anywhere, was the first computerized concordance of the Bible: Nelson's Complete Concordance to the Revised Standard Version Bible edited by J. W. Ellison and published in New York and Nashville, Tennessee in 1957. The book consists of 2157 large quarto pages printed in two columns in small type. 

The Revised Standard Version of the Bible was completed in 1952, when the UNIVAC was little known. UNIVAC I, serial one, was not actually delivered to the U.S. Census Bureau until 1953, and the first UNIVAC delivered to a commercial customer was serial 8 in 1954. Using the UNIVAC to compile a concordance was highly innovative, and, of course, it substantially reduced compilation time, as Ellison wrote in his preface dated 1956. Though Ellison offered to make the program available, he did not provide data concerning the actual time spent inputting the data on punched cards and running the program:

"An exhaustive concordance of the Bible, such as that of James Strong, takes about a quarter of a century of careful, tedious work to guarantee accuracy. Few students would want to wait a generation for a CONCORDANCE of the REVISED STANDARD VERSION of the HOLY BIBLE. To distribute the work among a group of scholars would be to run the risk of fluctuating standards of accuracy and completeness. The use of mechanical or electronic assistance was feasible and at hand. The Univac I computer at the offices of Remington Rand, Inc. was selected for the task. Every means possible, both human and mechanical, was used to guarantee accuracy in the work.

"The use of a computer imposed certain limitations upon the Concordance. Although it could be 'exhaustive,' it could not be 'analytical'; the context and location of each and every word could be listed, but not the Hebrew and Greek words from which they were translated. For students requiring that information, the concordance of the Holy Bible in its original tongues or the analytical concordances of the King James Version must be consulted. . . .

"The problem of length of context was arbitrarily solved. A computer, at least in the present stage of engineering, can perform only the operations specified for it, but it will precisely and almost unerringly perform them. In previous concordances, each context was made up on the basis of a human judgment which took in untold familiarity with the text and almost unconscious decisions in grouping words into familiar phrases. This kind of human judgment could not be performed by the computer; it required a set of definite invariable rules for its operation. The details of the program are available for those whose interest prompts them to ask for them."

The March 1956 issue of Publishers' Weekly, pp. 1274-78, in an article entitled "Editing at the Speed of Light," reported that Ellison's concordance deliberately omitted 132 frequent words: articles, most conjunctions, adverbs, prepositions, and common verbs.

"From an account in the periodical Systems it appears that the text of the Bible was transferred direct to magnetic tape, using a keyboard device called the Unityper (McCulley 1956). This work took nine months (800,000 words). The accuracy of the tapes was checked by punching the text a second time, on punched cards, then transferring this material to magnetic tape using a card-to-tape converter. The two sets of tapes were then compared for divergences by the computer and discrepancies eliminated. The computer output medium was also magnetic tape and this operated a Uniprinter which produced the manuscript sheets ready for typesetting" (Hymes ed., The Use of Computers in Anthropology [1965] 225).
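The double-keying check described above (keying the text twice and comparing the two streams for divergences) can be sketched in a few lines. This is illustrative only; the actual comparison ran on UNIVAC magnetic tapes, not in Python.

```python
# Sketch of the double-keying verification described above: the text is
# keyed twice and the two streams compared position by position; any
# divergence marks a keying error to be resolved by hand.
# (Illustrative only; the historical comparison ran on magnetic tape.)

def find_divergences(first_keying, second_keying):
    """Return the positions at which two keyings of a text disagree."""
    positions = [i for i, (a, b) in enumerate(zip(first_keying, second_keying))
                 if a != b]
    if len(first_keying) != len(second_keying):
        # A length mismatch is itself a divergence, starting where the
        # shorter stream runs out.
        positions.append(min(len(first_keying), len(second_keying)))
    return positions
```

If both keyings agree everywhere, the list is empty and the text is accepted; otherwise each reported position is checked against the printed source.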

The First Widely Used Computer Program for Sound Generation 1957

In 1957 electrical engineer Max Mathews of Bell Labs wrote MUSIC, the first widely used program for sound generation, on an IBM 704 computer. MUSIC was the first computer program for generating digital audio waveforms through direct synthesis. Prior to this, the first computer music was generated in Sydney, Australia in 1951 by programmer Geoff Hill on the CSIRAC computer, which was designed and built by Trevor Pearcey and Maston Beard. However, CSIRAC produced sound by sending raw pulses to the speaker; it did not produce standard digital audio with PCM samples, like the MUSIC series of programs. According to the Wikipedia article on Music-N, Mathews' original MUSIC program spawned a family of computer music programs and programming languages.
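The distinction drawn above, computed PCM samples rather than raw pulses, can be illustrated with a minimal direct-synthesis sketch. The parameters are arbitrary and the code is modern Python, not Mathews' MUSIC.

```python
import math

# Minimal modern sketch of direct digital synthesis in the spirit of
# MUSIC: sound is produced as a stream of PCM samples computed from a
# formula, rather than as raw pulses sent to a speaker. Parameters are
# illustrative; this is not Mathews' program.

def synthesize_sine(freq_hz=440.0, duration_s=0.01, rate_hz=8000):
    """Return signed 16-bit PCM samples of a sine tone."""
    n_samples = int(duration_s * rate_hz)
    samples = []
    for i in range(n_samples):
        t = i / rate_hz                        # time of this sample
        value = math.sin(2 * math.pi * freq_hz * t)
        samples.append(int(value * 32767))     # scale to 16-bit range
    return samples
```

Each sample is a number; writing such a stream to a digital-to-analog converter at the chosen rate reproduces the waveform, which is the core idea the MUSIC family established.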

The Premature Death of John von Neumann February 8, 1957

On February 8, 1957 mathematician and physicist John von Neumann died of cancer at the age of fifty-four. Like the death of Alan Turing at the age of 42, von Neumann's premature death was an enormous loss for computer science, as well as for mathematics and physics.

Crick's "On Protein Synthesis" September 1957

In September 1957 molecular biologist Francis Crick delivered his paper “On Protein Synthesis,” published in Symp. Soc. Exp. Biol. 12 (1958): 138-63. In it Crick proposed two general principles:

1) The Sequence Hypothesis:

“The order of bases in a portion of DNA represents a code for the amino acid sequence of a specific protein. Each ‘word’ in the code would name a specific amino acid. From the two-dimensional genetic text, written in DNA, are forged the whole diversity of uniquely shaped three-dimensional proteins.”

"In this context, Crick discussed the 'coding problem'—how the ordered sequence of the four bases in DNA might constitute genes that encode and disburse information directing the manufacture of proteins. Crick hypothesized that, with four bases to DNA and twenty amino acids, the simplest code would involve "triplets"—in which sequences of three bases coded for a single amino acid" (Genome News Network, Genetics and Genomics Timeline 1957).

2) The Central Dogma:

“Information is transmitted from DNA and RNA to proteins but information cannot be transmitted from a protein to DNA.” This paper “permanently altered the logic of biology.” (Judson)
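The arithmetic behind the triplet hypothesis mentioned above can be checked in a few lines (an illustrative sketch, not part of Crick's paper): with four bases, one- and two-base codons give only 4 and 16 combinations, so three bases is the shortest codon that can name twenty amino acids.

```python
# Crick's "coding problem": how long must a codon built from four bases
# be to name twenty amino acids? 4^1 = 4 and 4^2 = 16 fall short;
# 4^3 = 64 is the first power of four to reach 20. (Illustrative only.)

def smallest_codon_length(n_bases=4, n_amino_acids=20):
    """Smallest codon length whose combinations cover the amino acids."""
    length = 1
    while n_bases ** length < n_amino_acids:
        length += 1
    return length

print(smallest_codon_length())   # 3, matching the triplet hypothesis
```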

The First Operational Satellite Navigation System October 4, 1957 – 1960

NAVSAT, also known as TRANSIT, was the first operational satellite navigation system. Its development was set in motion by the Soviet launch of Sputnik 1 on October 4, 1957, and the system's first successful satellite reached orbit in 1960.

"The TRANSIT satellite system was developed by the Applied Physics Laboratory (APL) of Johns Hopkins University for the U.S. Navy. Just days after the Soviet launch of Sputnik 1, the first man-made earth-orbiting satellite on October 4, 1957, two physicists at APL, William Guier and George Weiffenbach, found themselves in discussion about the microwave signals that would likely be emanating from the satellite. They were able to determine Sputnik's orbit by analyzing the Doppler shift of its radio signals during a single pass. Frank McClure, the chairman of APL's Research Center, suggested that if the satellite's position were known and predictable, the Doppler shift could be used to locate a receiver on Earth.

"Development of the TRANSIT system began in 1958, and a prototype satellite, Transit 1A, was launched in September 1959. That satellite failed to reach orbit. A second satellite, Transit 1B, was successfully launched April 13, 1960, by a Thor-Ablestar rocket. The first successful tests of the system were made in 1960, and the system entered Naval service in 1964" (Wikipedia article on Transit (satellite), accessed 12-26-2012).

Using a constellation of five satellites, NAVSAT was employed primarily by ballistic missile submarines to obtain accurate location information, and was also used as a general navigation system by the Navy and in hydrographic and geodetic surveying.

"Since no computer small enough to fit through a submarine's hatch existed (in 1958), a new computer was designed, named the AN/UYK-1. It was built with rounded corners to fit through the hatch and was about five feet tall and sealed to be water-proof. The principal design engineer was then-UCLA-faculty-member Lowell Amdahl, brother of Gene Amdahl. The AN/UYK-1 was built by the Ramo-Wooldridge Corporation (later TRW) for the Lafayette class SSBNs. It was equipped with 8,192 words of 15-bit core memory plus parity bit, threaded by hand at their Canoga Park factory. Cycle time was about one microsecond.

"The AN/UYK-1 was a "micro-programmed" machine with a 15-bit word length that lacked hardware commands to subtract, multiply or divide, but could add, shift, form one's complement, and test the carry bit. Instructions to perform standard fixed and floating point operations were software subroutines and programs were lists of links and operators to those subroutines. For example, the "subtract" subroutine had to form the one's complement of the subtrahend and add it. Multiplication required successive shifting and conditional adding.

"The most interesting feature of the AN/UYK-1 instruction set was that the machine-language instructions had two operators that could simultaneously manipulate the arithmetic registers, for example complementing the contents of one register while loading or storing another. It also may have been the first computer that implemented a single-cycle indirect addressing ability.

"During a satellite pass, a GE receiver would receive the orbital parameters and encrypted messages from the satellite, as well as measure the Doppler shifted frequency at intervals and provide this data to the AN/UYK-1 computer. The computer would also receive from the ship's inertial navigation system (SINS), a reading of latitude and longitude. Using this information the AN/UYK-1 ran the least squares algorithm and provided a location reading in about fifteen minutes" (http://en.wikipedia.org/wiki/Transit_(satellite)#The_AN.2FUYK-1_Computer, accessed 12-01-2013).
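The subtract-by-complementing routine described in the quotation can be sketched for a 15-bit word. This is an illustrative reconstruction: the 15-bit word size and complement-then-add routine are from the source, while the end-around carry is the standard one's-complement convention, assumed here rather than documented for the AN/UYK-1.

```python
# Illustrative reconstruction of the AN/UYK-1's software subtraction:
# form the one's complement of the subtrahend, then add. The 15-bit
# word size comes from the source; the end-around carry is the usual
# one's-complement convention, assumed rather than documented.

MASK = 0x7FFF   # 15-bit word

def ones_complement(x):
    """Bitwise complement within a 15-bit word."""
    return (~x) & MASK

def oc_add(a, b):
    """One's-complement addition with end-around carry."""
    total = a + b
    if total > MASK:                  # carry out of the top bit
        total = (total & MASK) + 1    # wraps around into the low bit
    return total & MASK

def oc_sub(a, b):
    """Subtract as the software subroutine did: complement, then add."""
    return oc_add(a, ones_complement(b))
```

For example, 10 − 3 becomes 10 plus the complement of 3; the carry out of the top bit wraps around, leaving 7.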

Sputnik is Launched October 4, 1957

On October 4, 1957 the Soviet Union launched Sputnik, the first artificial earth satellite, during the International Geophysical Year from Site No.1/5, at the 5th Tyuratam range, in Kazakh SSR (now at the Baikonur Cosmodrome).  This began the "Space Race" between the United States and the Soviet Union.

Invention of the "Planar" Manufacturing Process December 1, 1957 – January 1959

On December 1, 1957, on pages 3-4 of his manuscript patent notebook for Fairchild Semiconductor in Palo Alto, California, physicist Jean Hoerni recorded a "Method of protecting exposed p-n junctions at the surface of silicon transistors by oxide masking techniques."  This was his first expression of the planar process, a radically new transistor design in which the oxide layer is left in place on the silicon wafer to protect the sensitive p-n junctions underneath. Focused on getting its first semiconductor devices into production, Fairchild Semiconductor did not pursue Hoerni’s planar approach at that time.

Due to concerns about possible contaminants, conventional wisdom required removing the oxide layer after completion of oxide masking, thus exposing the junctions. Hoerni instead viewed the oxide as a possible solution: his "planar" approach, named after the flat topography of the finished device, would protect these junctions. Two years later it became an essential element of Robert Noyce's 1959 invention of the first commercially manufactured monolithic integrated circuit, the basis for virtually all semiconductor manufacturing today. Hoerni did not write a patent disclosure for what would become U.S. Patent 3,025,589 until January 1959.

Hoerni, J. A., "Method of Manufacturing Semiconductor Devices," U. S. Patent 3,025,589 (Filed May 1, 1959. Issued March 20, 1962). See also Hoerni’s U.S. Patent No. 3,064,167.  

Hoerni, J. A., "Planar Silicon Diodes and Transistors," paper presented at the 1960 Electron Devices Meeting, Washington, D. C. - October 1960 reprinted as Fairchild Semiconductor Technical Paper TP-14. (1961).

John Kendrew Reports the First Solution of the Three-Dimensional Molecular Structure of a Protein 1958 – 1960

In 1958 and 1960 molecular biologist John Kendrew published  "A Three-Dimensional Model of the Myoglobin Molecule Obtained by X-ray Analysis" (with G. Bodo, H. M. Dintzis, R. G. Parrish, H. Wyckoff,) Nature 181 (1958) 662-666, and "Structure of Myoglobin: A Three-Dimensional Fourier synthesis at 2 Å Resolution" (with R. E. Dickerson, B. E. Strandberg, R. G. Hart, D. R. Davies, D. C. Phillips, V. C. Shore). Nature 185 (1960) 422-27. These papers reported the first solution of the three-dimensional molecular structure of a protein, for which Kendrew received the 1962 Nobel Prize in chemistry, together with his friend and colleague Max Perutz, who solved the structure of the related and more complex protein, hemoglobin, two years after Kendrew’s achievement. 

Kendrew began his investigation into the structure of myoglobin in 1949, choosing this particular protein because it was “of low molecular weight, easily prepared in quantity, readily crystallized, and not already being studied by X-ray methods elsewhere” (Kendrew, “Myoglobin and the structure of proteins. Nobel Prize Lecture [1962],” pp. 676-677). Protein molecules, which contain, at minimum, thousands of atoms, have enormously convoluted and irregular formations that are extremely difficult to elucidate. In the 1930s J. D. Bernal, Dorothy Hodgkin and Max Perutz performed the earliest crystallographic studies of proteins at Cambridge’s Cavendish Laboratory; however, the intricacies of three-dimensional structure of proteins were too complex for analysis by conventional X-ray crystallography, and the process of calculating the structure factors by slide-rules and electric calculators was far too slow. It was not until the late 1940s, when Kendrew joined the Cavendish Laboratory as a graduate student, that new and more sophisticated tools emerged that could be used to attack the problem. The first of these tools was the technique of isomorphous replacement, developed by Perutz during his own researches on hemoglobin, in which certain atoms in a protein molecule are replaced with heavy atoms. When these modified molecules are subjected to X-ray analysis the heavy atoms provide a frame of reference for comparing diffraction patterns. The second tool was the electronic computer, which Kendrew introduced to computational biology in 1951. The first electronic computer, the ENIAC, which became operational in Philadelphia in 1945, was 10,000 times the speed of a human performing a calculation. 
In 1951 Cambridge University was one of only three or four places in the world with a high-speed stored-program electronic computer, and Kendrew took full advantage of the speed of Cambridge’s EDSAC computer, and its more powerful successors, to execute the complex mathematical calculations required to solve the structure of myoglobin. Kendrew was the first to apply an electronic computer to the solution of a complex problem in biology.

Nevertheless, even with the EDSAC computer performing the calculations, the research progressed remarkably slowly. Only by the summer of 1957 did Kendrew and his team succeed in creating a three-dimensional map of myoglobin at the so-called “low resolution” of 6 angstroms; thus myoglobin became “the first protein to be solved” (Judson, p. 538).

“A cursory inspection of the map showed it to consist of a large number of rod-like segments, joined at the ends, and irregularly wandering through the structure; a single dense flattened disk in each molecule; and sundry connected regions of uniform density. These could be identified respectively with polypeptide chains, with the iron atom and its associated porphyrin ring, and with the liquid filling the interstices between neighboring molecules. From the map it was possible to ‘dissect out’ a single protein molecule . . . The most striking features of the molecule were its irregularity and its total lack of symmetry” (Kendrew, “Myoglobin,” p. 681).  

The 6-angstrom resolution was too low to show the molecule’s finer features, but by 1960 Kendrew and his team were able to obtain a map of the molecule at 2-angstrom resolution. “To achieve a resolution of 2 Å it was necessary to determine the phases of nearly 10,000 reflections, and then to compute a Fourier synthesis with the same number of terms . . . the Fourier synthesis itself (excluding preparatory computations of considerable bulk and complexity) required about 12 hours of continuous computation on a very fast machine (EDSAC II)” (Kendrew, “Myoglobin,” p. 682).
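
The “Fourier synthesis” in this passage is the standard crystallographic electron-density summation, with one term per measured reflection, which makes clear why nearly 10,000 phased reflections implied hours of machine computation:

```latex
% Electron density at fractional coordinates (x, y, z) in a unit cell of
% volume V; |F_hkl| are the measured structure-factor amplitudes and
% phi_hkl the phases recovered by isomorphous replacement.
\rho(x, y, z) = \frac{1}{V} \sum_{h,k,l} \left| F_{hkl} \right|
  \exp\!\bigl[\, i\phi_{hkl} - 2\pi i\,(hx + ky + lz) \bigr]
```

Evaluating this sum over a fine grid of (x, y, z) points, with roughly 10,000 terms per point, accounts for the 12 hours of EDSAC II time Kendrew mentions.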

Von Neumann's "The Computer and the Brain" 1958

Because of failing health, John von Neumann did not finish his last book, The Computer and the Brain. The book, issued posthumously in 1958, was a published version of the Silliman Lectures which von Neumann was invited to deliver at Yale in 1956. Although von Neumann prepared the lectures by March 1956, he was already too sick to travel to New Haven and could not deliver them as scheduled. He continued to work on the manuscript until his death on February 8, 1957. The manuscript remained unfinished, as his widow Klara von Neumann explained in her preface to the posthumous edition.

Von Neumann's 82 page essay was divided into two parts. The first part discussed the computer: its procedures, control mechanisms, and other characteristics. The second part focused on the brain, systematically comparing the operations of the brain with what was then state-of-the-art in computer science. In what seems to have been the groundwork for a third part—but it was not organized as a separate part—von Neumann drew some conclusions from the comparison with respect to the role of code and language. Von Neumann wrote that "A deeper mathematical study of the nervous system may alter our understanding of mathematics and logic."

Seymour Cray Builds the First Transistorized Supercomputer 1958

In 1958 Seymour Cray of Control Data Corporation, Minneapolis, Minnesota, began designing the CDC 1604, the first transistorized supercomputer; the first production machine was delivered in 1960.

The First Video Game: "Tennis for Two" 1958

In 1958 William Higinbotham, head of the Instrumentation Division at Brookhaven National Laboratory, Upton, New York, invented the first video game, "Tennis for Two". It ran on an analog computer hooked up to an oscilloscope.

The IBM 1401, a Relatively Inexpensive Computer 1959

In 1959 IBM announced the 1401, a relatively inexpensive computer that proved very popular with businesses, and began to compete seriously with existing punched-card tabulating equipment.

Hans Peter Luhn of IBM Develops an Automatic Document Indexing Program 1958

In 1958 Hans Peter Luhn of IBM developed an automatic document indexing program for the production of literature abstracts.

"The complete text of an article in machine-readable form is scanned by an IBM 704 data-processing machine and analyzed in accordance with a standard program. Statistical information derived from word frequency and distribution is used by the machine to compute a relative measure of significance, first for individual words and then for sentences. Sentences scoring highest in significance are extracted and printed out to become the "auto-abstract."

An Improved Modem 1958

Though modems had existed for teletype service since the 1940s, they transmitted at speeds of only about 150 bits per second. To meet the demands of the U.S. military, in 1958 researchers at Bell Labs developed an improved modem (modulator-demodulator) that converted digital signals to analog signals and back, allowing transmission at speeds up to 1600 bits per second over analog telephone lines.
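
The digital-to-analog conversion a modem performs can be illustrated with frequency-shift keying (FSK), one audio tone per bit value. This is a toy sketch of the principle only; the modulation scheme of the actual 1958 Bell Labs equipment differed, and the rates and frequencies here are invented.

```python
import math

RATE = 8000          # samples per second
BAUD = 100           # bits per second
F0, F1 = 1000, 2000  # illustrative tone frequencies for 0 and 1

def modulate(bits):
    """Turn a bit string into a list of 'analog' audio samples."""
    spb = RATE // BAUD                       # samples per bit
    samples = []
    for i, b in enumerate(bits):
        f = F1 if b == "1" else F0
        for n in range(spb):
            t = (i * spb + n) / RATE
            samples.append(math.sin(2 * math.pi * f * t))
    return samples

def demodulate(samples, nbits):
    """Recover bits by comparing signal energy at the two tone frequencies."""
    spb = RATE // BAUD
    out = []
    for i in range(nbits):
        chunk = samples[i * spb:(i + 1) * spb]
        def energy(f):
            c = sum(s * math.cos(2 * math.pi * f * n / RATE) for n, s in enumerate(chunk))
            q = sum(s * math.sin(2 * math.pi * f * n / RATE) for n, s in enumerate(chunk))
            return c * c + q * q
        out.append("1" if energy(F1) > energy(F0) else "0")
    return "".join(out)
```

A round trip through `modulate` and `demodulate` recovers the original bit string, which is the whole job of a modem pair at either end of a telephone line.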

Semi Automatic Ground Environment (SAGE) 1958

In 1958 MITRE Corporation was founded to manage the development and production of SAGE (Semi Automatic Ground Environment) "an automated control system for collecting, tracking and intercepting enemy bomber aircraft."

SAGE was used by NORAD into the 1980s.

Longevity of Paper is a Function of its Acidity or Alkalinity Circa 1958

In the late 1950s it was recognized that the longevity of paper is a function of its acidity or alkalinity: the lower the acidity and higher the alkalinity, the greater the longevity of paper.

The First Obstetrical or Gynecological Sonograms 1958

In 1958 Ian Donald, Regius Professor of Midwifery at the University of Glasgow, and his colleagues John MacVicar, an obstetrician, and Tom Brown, an engineer, published a paper in The Lancet entitled "Investigation of Abdominal Masses by Pulsed Ultrasound." This article described their experience using an ultrasound scanner on 100 patients, and included 12 illustrations of various gynecologic disorders (e.g., ovarian cysts, fibroids) as well as demonstration of obstetric findings such as the fetal skull at 34 weeks' gestation, "hydramnios" (polyhydramnios), and twins in breech presentation. The somewhat grainy and indistinct "Compound B-mode contact scanner" images were the first published obstetrical or gynecological sonograms.

J. M. Norman (ed),  Morton's Medical Bibliography 5th ed.(1991) no. 2682.

Animated Title Sequence by Electromechanical Analog Computer 1958

Title sequence from Vertigo; titles designed by Saul Bass; spirographic images contributed by John Whitney.

In the late 1950s American animator, composer and inventor John Whitney purchased a WWII-vintage Kerrison Predictor electromechanical analog computer at an army surplus store. He connected the electrical outputs to servos controlling the positioning of small lit targets and light bulbs. Whitney's next step was to modify the "mathematics" of the system to move the targets in various mathematically controlled ways, a technique he referred to as incremental drift. As the power of the systems grew they eventually evolved into what is today known as motion control photography, a widely used technique in special effects filming.

Probably Whitney's best known work from this early period was the animated title sequence from Alfred Hitchcock's 1958 film Vertigo, on which Whitney collaborated with graphic designer Saul Bass.

A Model for Learning and Adaptation to a Complex Environment 1958

In 1958 English-American artificial intelligence pioneer Oliver Selfridge of MIT published "Pandemonium: A Paradigm for Learning," Mechanisation of Thought Processes: Proceedings of a Symposium Held at the National Physical Laboratory on 24th, 25th, 26th and 27th November 1958 (1959) 511–26. In it he proposed a collection of small components dubbed “demons” that together would allow machines to recognize patterns, and might trigger subsequent events according to patterns they recognized. This model of learning and adaptation to a complex environment based on multiple independent processing systems was influential in psychology as well as neurocomputing and artificial intelligence.
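
The architecture can be caricatured in a few lines of code: independent demons "shriek" in proportion to the evidence for their pattern, and a decision demon picks the loudest. The features, labels, and weights below are invented for illustration and are not from Selfridge's paper.

```python
def make_demon(label, weights):
    """A cognitive demon: shrieks in proportion to evidence for its pattern."""
    def demon(features):
        return label, sum(weights.get(f, 0.0) * v for f, v in features.items())
    return demon

def decision_demon(demons, features):
    """Listen to every demon and answer with the loudest shriek's label."""
    return max((d(features) for d in demons), key=lambda lv: lv[1])[0]

# Crude letter demons over made-up stroke features
demons = [
    make_demon("A", {"diagonal": 1.0, "crossbar": 1.0}),
    make_demon("H", {"vertical": 1.0, "crossbar": 1.0}),
    make_demon("O", {"curve": 2.0}),
]
```

Because each demon is independent, the weights can be adjusted separately by feedback, which is what made the model attractive as an account of learning.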

Hook & Norman, Origins of Cyberspace (2002) no. 878.

First Proof of the Semiconservative Replication of DNA 1958

The deciphering of the structure of DNA by James Watson and Francis Crick in 1953 suggested that each strand of the double helix would serve as a template for synthesis of a new strand. However, there was no way of knowing how the newly synthesized strands might combine with the template strands to form two double helical DNA molecules. The Meselson–Stahl experiment by American geneticists and molecular biologists Matthew Meselson and Franklin Stahl at Caltech in 1958 supported the hypothesis that DNA replication was semiconservative. In semiconservative replication, each of the two new double-stranded DNA helices consists of one strand from the original helix and one newly synthesized.

Meselson & Stahl, "The Replication of DNA in Escherichia coli," Proceedings National Academy of Sciences 44 (1958) 671-82. 

J. Norman (ed) Morton's Medical Bibliography 5th ed (1991) no. 256.6.

The U.S. Launches its First Artificial Satellite, Explorer-1 January 31, 1958

On January 31, 1958, four months after the Soviets launched Sputnik, the United States launched its first artificial satellite, Explorer-1, officially known as Satellite 1958 Alpha, from the Cape Canaveral Missile Annex, Florida. It was built at the Jet Propulsion Laboratory at Caltech, and it ceased transmission on May 23, 1958, after less than four months in orbit.

Explorer I is credited with the most important discovery of the International Geophysical Year: the discovery of one of the belts of radiation surrounding the earth. They were subsequently named the Van Allen Belts after James Van Allen, the scientist who identified them.

ARPA is Founded February 7, 1958

In response to the Soviet Union's launching of Sputnik, on February 7, 1958 President Dwight Eisenhower created the Advanced Research Projects Agency (ARPA) within the Department of Defense. It was renamed DARPA in 1972.

Kilby Conceives of the Integrated Circuit July 1958

In July 1958 Jack Kilby of Texas Instruments in Dallas, Texas, conceived of the integrated circuit. On September 12, 1958 he constructed the first integrated working prototype using germanium mesa p-n-p transistor slices he had etched to form transistor, capacitor, and resistor elements. Using fine gold "flying-wires" he connected the separate elements into an oscillator circuit. One week later he demonstrated an amplifier.

"In his patent application of February 6, 1959, Kilby described his new device as 'a body of semiconductor material . . . wherein all the components of the electronic circuit are completely integrated' ” (Wikipedia article on Integrated circuit, accessed 03-03-2012).

T.I. announced Kilby's "solid circuit" concept in March 1959 and introduced its first commercial device in March 1960, the Type 502 Binary Flip-Flop priced at $450 each. However, the flying-wire interconnections were not a practical production technique. In October 1961, T.I. introduced the Series 51 DCTL "fully-integrated circuit" family using deposited-metal planar technology invented by Jean Hoerni at Fairchild Semiconductor.

The Burroughs Atlas Guidance Computer July 19, 1958

On July 19, 1958 the Burroughs “Atlas Guidance” computer was used at Cape Canaveral to control the launch of the Atlas missile. It was one of the first computers to use transistors rather than vacuum tubes.

". . .the first machine was installed at the Cape Canaveral missile range in June 1957. Although Atlas missile launches started in September 1957, test patterns were transmitted to the missile in place of actual guidance commands for the first four flights. The first computer-controlled launch was on July 19, 1958. The computer had separate memory areas for instructions (2048 18-bit words) and data (256 24-bit words). The instruction area was increased to 2816 words, beginning with the Model III version, which was first delivered in December 1958. The Atlas guidance computer had no facilities for developing programs, so they were written on the UDEC II, the Datatron, and the 220, using simulator software. Burroughs was still doing Atlas programming on the 220 in 1964. In all, 18 Atlas guidance computers were built at a total project cost of $37 million. The computer was very reliable, and no Atlas launch was ever aborted due to computer failure." 

NASA is Founded July 29 – October 1, 1958

On July 29, 1958 President Dwight D. Eisenhower signed the National Aeronautics and Space Act, which Congress had passed the same day, establishing NASA, the National Aeronautics and Space Administration, and dissolving its predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency, responsible for America's space program and for civilian, rather than military, aerospace and aviation research, became operational on October 1, 1958.

From 1946, the National Advisory Committee for Aeronautics (NACA) had been experimenting with rocket planes such as the supersonic Bell X-1. In the early 1950s, the International Geophysical Year (1957–58) posed the challenge of launching an artificial satellite; the American effort was Project Vanguard. After the Soviet launch of the world's first artificial satellite (Sputnik 1) on October 4, 1957, the attention of the United States turned toward its own fledgling space efforts. The U.S. Congress, alarmed by the perceived threat to national security and technological leadership (known as the "Sputnik crisis"), urged immediate and swift action; President Dwight D. Eisenhower and his advisers counseled more deliberate measures. This led to an agreement that a new federal agency mainly based on NACA was needed to conduct all non-military activity in space. The Advanced Research Projects Agency (ARPA) was created in February 1958 to develop space technology for military application.

"On July 29, 1958, Eisenhower signed the National Aeronautics and Space Act, establishing NASA. When it began operations on October 1, 1958, NASA absorbed the 46-year-old NACA intact; its 8,000 employees, an annual budget of US$100 million, three major research laboratories (Langley Aeronautical LaboratoryAmes Aeronautical Laboratory, and Lewis Flight Propulsion Laboratory) and two small test facilities. A NASA seal was approved by President Eisenhower in 1959. Elements of the Army Ballistic Missile Agency and the United States Naval Research Laboratory were incorporated into NASA. A significant contributor to NASA's entry into the Space Race with the Soviet Union was the technology from the German rocket program (led by Wernher von Braun, who was now working for ABMA) which in turn incorporated the technology of American scientist Robert Goddard's earlier works. Earlier research efforts within the U.S. Air Force and many of ARPA's early space programs were also transferred to NASA. In December 1958, NASA gained control of the Jet Propulsion Laboratory, a contractor facility operated by the California Institute of Technology" (Wikipedia article NASA, accessed 12-02-2013).

BankAmericard is Launched September 1958

BankAmericard.

In September 1958 Bank of America, then headquartered in San Francisco, created the BankAmericard, the first credit card issued by a conventional bank. Together with its overseas affiliates, this product eventually evolved into the Visa system.

Game Tree Pruning October 1958

In October 1958 Allen Newell, Clifford Shaw, and Herbert Simon invented game tree pruning, an artificial intelligence technique.
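
The best-known modern descendant of game tree pruning is alpha-beta search, which abandons a branch as soon as it is provably worse than an alternative already examined. The sketch below runs over a toy nested-list tree, which is illustrative and not the chess representation Newell, Shaw, and Simon used.

```python
def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    """Minimax with alpha-beta pruning over a nested-list game tree."""
    if isinstance(node, (int, float)):      # leaf: static evaluation
        return node
    best = float("-inf") if maximizing else float("inf")
    for child in node:
        val = alphabeta(child, not maximizing, alpha, beta)
        if maximizing:
            best = max(best, val)
            alpha = max(alpha, best)
        else:
            best = min(best, val)
            beta = min(beta, best)
        if alpha >= beta:                   # cut off: this line will be avoided
            break
    return best
```

The payoff is that large parts of the tree are never visited at all, while the value returned at the root is exactly the full minimax value.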

The American Express Card October 1, 1958

On October 1, 1958 American Express launched the American Express card. Because American Express already had an international network of offices in place, and its traveler's cheques had been accepted throughout the world for decades, this was the first credit card accepted internationally.

". . . public interest had become so significant that they issued 250,000 cards prior to the official launch date. The card was launched with an annual fee of $6, $1 higher than Diners Club, to be seen as a premium product. The first cards were paper, with the account number and cardmember's name typed. It was not until 1959 that American Express began issuing embossed ISO 7810 plastic cards, an industry first" (Wikipedia article on American Express, accessed 12-27-2008).

The Perceptron November 1958 – 1960

In November 1958 Frank Rosenblatt invented the Perceptron, or Mark I, at the Cornell Aeronautical Laboratory. Completed in 1960, this was the first computer that could learn new skills by trial and error, using a type of neural network that simulated human thought processes.
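
The trial-and-error learning can be sketched in software: when the output is wrong, nudge each weight in proportion to its input. The Mark I implemented this in analog hardware; the toy problem below (logical OR) and the learning rate are illustrative.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Rosenblatt-style perceptron learning rule on 2-input samples."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out            # 0 when the guess was right
            w[0] += lr * err * x1         # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
```

On any linearly separable problem the rule is guaranteed to converge, which is what made the perceptron so striking in 1958.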

Keyword in Context (KWIC) Indexing November 1958

In November 1958 computer scientist Hans Peter Luhn of IBM published Bibliography and index: Literature on information retrieval and machine translation.  This contained titles indexed by the Key Words in Context system, or KWIC. The concept of Keyword in Context indexing had been first proposed and implemented manually by librarian Andrea Crestadoro in 1856-1864.

"The International Conference on Scientific Information (ICSI), Washington, DC, in November 1958, where Luhn introduced his new equipment and illustrated the practical results by producing the KWIC indexes for the conference program. Two new Luhn inventions, the 9900 Index Analyzer and the Universal Card Scanner, and the new Luhn Keyword-in-Context (KWIC) indexing technique were introduced. Following the conference, newspapers all over the country carried stories about the auto-abstracting and auto-indexing." (http://www.ischool.utexas.edu/~ssoy/organizing/l391d2c.htm, accessed 04-26-2009).

The First International Symposium on Artificial Intelligence November 24 – November 27, 1958

From November 24 to 27, 1958 the National Physical Laboratory at Teddington, England held the first international symposium on artificial intelligence, calling it Mechanisation of Thought Processes. 

The proceedings were published in 1959 by Her Majesty's Stationery Office in London as a two-volume set nearly 1000 pages long, also called Mechanisation of Thought Processes. In December 2013 both volumes were available from aitopics.org.

At this conference John McCarthy delivered his paper Programs with Common Sense. (See Reading 11.6.)

The First Voice Transmission from the First Communications Satellite December 19, 1958

On December 19, 1958 President Eisenhower's brief Christmas greeting was transmitted from the Project SCORE (Signal Communication by Orbiting Relay Equipment) satellite. This was the first voice transmission from the world's first communications satellite. Eisenhower said:

"This is the President of the United States speaking. Through the marvels of scientific advance, my voice is coming to you from a satellite traveling in outer space. My message is a simple one: Through this unique means I convey to you and all mankind, America's wish for peace on Earth and goodwill toward men everywhere."


ERMA and MICR 1959

Based on technology originally developed at the Stanford Research Institute, in 1959 General Electric delivered the first 32 ERMA (Electronic Recording Method of Accounting) computing systems to the Bank of America. The system used MICR (Magnetic Ink Character Recognition). ERMA served as the BofA's accounting computer and check-handling system until 1970.

The First Practical Monolithic Integrated Circuit Concept 1959

In 1959, independently of Jack Kilby at Texas Instruments, Robert N. Noyce of Fairchild Semiconductor, Mountain View, California, invented the first practical monolithic integrated circuit concept. Based on the "planar" technology invented in 1957 by physicist Jean Hoerni at Fairchild, Noyce's invention consisted of a complete electronic circuit inside a small silicon chip. Noyce's first description of his invention, entitled "Methods of isolating multiple devices," was written on January 23, 1959 on pp. 70-71 of his patent notebook for Fairchild Semiconductor.

Noyce filed for a patent on "Semiconductor Device-and-Lead Structure" on July 30, 1959.  U.S. patent 2,981,877 was granted on April 25, 1961.

Because Kilby and Noyce shared the invention of the integrated circuit, Fairchild and Texas Instruments engaged in litigation over integrated circuit patents for many years. The courts eventually ruled in Noyce's and Fairchild Semiconductor's favor, but by then the companies had already settled on a cross-licensing agreement that included a net payment to Fairchild.

The TX-2 Computer for the Study of Human-Computer Interaction 1959

In 1959 Wesley A. Clark designed and built the TX-2 computer at MIT’s Lincoln Laboratory in Lexington, Massachusetts. It had 320 kilobytes of fast memory, about twice the capacity of the biggest commercial machines. Other features were magnetic tape storage, an online typewriter, the first Xerox printer, paper tape for program input, and a nine inch CRT screen. Among its applications were development of interactive graphics and research on human-computer interaction.

The Complicated Discovery of the LASER 1959

"In 1957, Charles Hard Townes and Arthur Leonard Schawlow, then at Bell Labs, began a serious study of the infrared laser. As ideas developed, they abandoned infrared radiation to instead concentrate upon visible light. The concept originally was called an "optical maser". In 1958, Bell Labs filed a patent application for their proposed optical maser; and Schawlow and Townes submitted a manuscript of their theoretical calculations to the Physical Review, published that year in Volume 112, Issue No. 6.

"Simultaneously, at Columbia University, graduate student Gordon Gould was working on a doctoral thesis about the energy levels of excited thallium. When Gould and Townes met, they spoke of radiation emission, as a general subject; afterwards, in November 1957, Gould noted his ideas for a "laser", including using an open resonator (later an essential laser-device component). Moreover, in 1958, Prokhorov independently proposed using an open resonator, the first published appearance (the USSR) of this idea. Elsewhere, in the U.S., Schawlow and Townes had agreed to an open-resonator laser design – apparently unaware of Prokhorov's publications and Gould's unpublished laser work.

"At a conference in 1959, Gordon Gould published the term LASER in the paper The LASER, Light Amplification by Stimulated Emission of Radiation. Gould's linguistic intention was using the "-aser" word particle as a suffix – to accurately denote the spectrum of the light emitted by the LASER device; thus x-rays: xaser, ultraviolet: uvaser, et cetera; none established itself as a discrete term, although "raser" was briefly popular for denoting radio-frequency-emitting devices.

"Gould's notes included possible applications for a laser, such as spectrometry, interferometry, radar, and nuclear fusion. He continued developing the idea, and filed a patent application in April 1959. The U.S. Patent Office denied his application, and awarded a patent to Bell Labs, in 1960. That provoked a twenty-eight-year lawsuit, featuring scientific prestige and money as the stakes. Gould won his first minor patent in 1977, yet it was not until 1987 that he won the first significant patent lawsuit victory, when a Federal judge ordered the U.S. Patent Office to issue patents to Gould for the optically pumped and the gas discharge laser devices. The question of just how to assign credit for inventing the laser remains unresolved by historians.

On May 16, 1960, Theodore H. Maiman operated the first functioning laser, at Hughes Research Laboratories, Malibu, California, ahead of several research teams, including those of Townes, at Columbia University, Arthur Schawlow, at Bell Labs, and Gould, at the TRG (Technical Research Group) company. Maiman's functional laser used a solid-state flashlamp-pumped synthetic ruby crystal to produce red laser light, at 694 nanometres wavelength; however, the device only was capable of pulsed operation, because of its three-level pumping design scheme. Later in 1960, the Iranian physicist Ali Javan, and William R. Bennett, and Donald Herriott, constructed the first gas laser, using helium and neon that was capable of continuous operation in the infrared (U.S. Patent 3,149,290); later, Javan received the Albert Einstein Award in 1993. Basov and Javan proposed the semiconductor laser diode concept. In 1962, Robert N. Hall demonstrated the first laser diode device, made of gallium arsenide and emitted at 850 nm the near-infrared band of the spectrum. Later, in 1962, Nick Holonyak, Jr. demonstrated the first semiconductor laser with a visible emission. This first semiconductor laser could only be used in pulsed-beam operation, and when cooled to liquid nitrogen temperatures (77 K). In 1970, Zhores Alferov, in the USSR, and Izuo Hayashi and Morton Panish of Bell Telephone Laboratories also independently developed room-temperature, continual-operation diode lasers, using the heterojunction structure." (Wikipedia article on Laser, accessed 04-25-2013).

The Nautical Almanac is Finally Produced by an Electronic Computer 1959

Having been computed by human computers since 1767, in 1959 the Nautical Almanac was finally produced by an electronic computer.

"The computation of the data for the almanacs involved a considerable amount of effort. As late as the mid-20th century, HMNAO employed a small army of human computers to carry out this work. They used the latest technology available at the time: logarithm tables, mechanical calculating machines and electro-mechanical calculating machines. In 1959 the Office obtained its own electronic computer, making it the first part of the RGO to use this emerging technology."

First Book on Computer Music 1959

In 1959 Lejaren Hiller and Leonard Isaacson published the first book on computer-generated music: Experimental Music: Composition with an Electronic Computer, based on work done on the University of Illinois’s ILLIAC computer.

The U.S. Banking Industry Adopts Magnetic Ink Character Recognition 1959 – 1960

Between 1959 and 1960 the United States banking industry adopted MICR, (Magnetic Ink Character Recognition), which allowed computers to read the data printed on checks.

The Most Voluminous Printed Catalogue of a Single Library 1959 – 1972

From 1959 to 1966 the British Museum (now the British Library) published its General Catalogue of Printed Books. Photolithographic Edition to 1955 in 263 folio volumes. These volumes reproduced the catalogue cards of 4,350,000 items. In 1971 and 1972 the BM issued a Ten-Year Supplement, 1956-1970 in 23 volumes. This set of nearly 300 folio volumes was the "most voluminous" printed catalogue of a single library ever published.

Breslauer & Folter, Bibliography: Its History and Development (1984) no. 109.

Between 1967 and 1980 Readex Microprint of New York issued the Compact Edition of the entire British Museum Catalogue in a microprint edition (8 or 10 volumes in one). This was complete in 37 volumes, occupying 7 feet of shelf space. When it was published this was widely viewed as a very valuable reference source, and many antiquarian booksellers, such as myself, bought it. However, I don't think we ever got much use out of it, and it was one of the first large sets we sold when it was evident that online resources would replace sets of this kind. By November 2013 the value of the Microprint Edition was limited; a colleague, Ian Jackson, offered a set for $100 in Cedules from a Berkeley Bookshop, No. 28. (Readex Microprint evolved into Readex, an online publisher mainly of historical source materials in digital form.)

Auto-Encoding of Documents for Information Retrieval 1959

In 1959 computer scientist Hans Peter Luhn published "Auto-Encoding of Documents for Information Retrieval Systems," M. Boaz (ed.), Modern Trends in Documentation (1959) 45-58.

"Luhn believed that the growing rate of information and document production necessitated the invention of methods allowing data to be retrieved from stores of documents without expensive human intervention. This paper discusses auto-encoding based on statistical procedures performed by a machine on the original text of a document already in machine-readable form. The prevalent machine-readable form of that time was primarily punched cards or paper tape and less frequently magnetic tape. The auto-encoding method used word frequency rates, a special thesaurus, and the development of multi-dimensional patterns based on word proximity. At the time, application of the method was limited to articles of 500 to 5000 words, but Luhn was confident that the logical capabilities of electronic machines, statistical methods, and "further research into the characteristics of human behavior as manifested in writing" would lead to better information dissemination and retrieval. Earlier articles by this author discuss the automatic creation of abstracts and the development of thesauri" (http://www.ischool.utexas.edu/~ssoy/organizing/l391d2b.htm, accessed 04-26-2009).
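Luhn's statistical approach can be loosely illustrated in a few lines of modern Python. The stop-word list and the plain frequency ranking below are illustrative stand-ins, not Luhn's actual thesaurus or thresholds:

```python
from collections import Counter
import re

# Illustrative stand-in for Luhn's list of common "noise" words; his
# system used frequency thresholds and a special thesaurus instead.
STOP_WORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "that", "for", "on"}

def significant_words(text, top_n=10):
    """Rank the remaining words of a document by raw frequency,
    in the spirit of Luhn's statistical auto-encoding."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [w for w, _ in counts.most_common(top_n)]
```

Luhn's full method went further, weighting words by proximity to one another, but the frequency ranking is its statistical core.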

Human Versus Machine Intelligence and Communication (1959) 1959

"Somewhat the same problem arises in communicating with a machine entity that would arise in communicating with a person of an entirely different language background than your own. A system of logical definition and translation would have to be available. In order that meanings should not be lost, such a system of translation would also need to be precise. We are all familiar with the unhappy results of language translations which are either lacking in precision or where suitable words of equivalent meaning cannot be found. Likewise, translating into a machine language cannot be anything but an exact operation. Machines even more than people must be addressed with clarity and unambiguity, for machines cannot improvise on their own or imagine that about which they have not been specifically informed, as a human might do within reasonable limits of error. . . .

"We must now ascertain how concepts are formulated within the framework of computer language. For analogy, let us first consider the manner in which instructions are usually given to a non-mechanical entity. When we instruct, for example, a human being, we are aided by the fact that the human is usually able to fill in gaps in our instructions through acumen acquired from his own past experiences. It is seldom necessary that instructions be either detailed or literal, although we may have lost sight of this fact.

"The computer in a correlate example is a mechanical 'being' which must be instructed at each and every step. But it can be given a very long list of instructions upon which it can be expected to subsequently act with great speed and accuracy and with untiring repetition. Machine traits are: low comprehension, high retention, extreme reliability, and tremendous speed. The use of superlatives here to describe these traits is not exaggerative. Since speed becomes in practice the equivalent of number, the machine might be, and has sometimes been, equated to legions — an army, if you will — of lowgrade morons whose conceptualization is entirely literal, who remember as long as is necessary or as you desire them to, whose loyalty and subservience is complete, who require no holidays, no spurious incentives, no morale programs, pensions, not even gratitude for past service, and who seemingly never tire of doing elementary repetitive tasks such as typing, accounting, bookkeeping, arithmetic, filling in forms, and the like. In about all these respects the machine may be seen to be the exact opposite of nature's loftiest creature, the intelligent human being, who becomes bored with the petty and repetitious, who is unreliable, who wanders from the task for the most trivial reasons, who gets out of humor, who forgets, who requires constant incentives and rewards, who improvises on his own even when to do so is impertinent to the objectives being undertaken, and who in summary (let's face it) is unsuitable to most forms of industry as the latter are ideally and practically conceived in our times. It becomes apparent in retrospect that the only excuse we might ever have had for employing him to do many of civilization's more literal and repetitious tasks was the absence of something more efficient with which to replace him!

"It is not the purpose of this volume to explore further the ramifications of the above statements of fact. . . ."(Nett & Hetzler, An Introduction to Electronic Data Processing [1959] 86-88).

Randolph Quirk Founds the Survey of English Usage: Origins of Corpus Linguistics 1959

In 1959 Randolph Quirk founded the Survey of English Usage, the first research center in Europe to carry out research in corpus linguistics.

"The original Survey Corpus predated modern computing. It was recorded on reel-to-reel tapes, transcribed on paper, filed in filing cabinets, and indexed on paper cards. Transcriptions were annotated with a detailed prosodic and paralinguistic annotation developed by Crystal and Quirk (1964). Sets of paper cards were manually annotated for grammatical structures and filed, so, for example, all noun phrases could be found in the noun phrase filing cabinet in the Survey. Naturally, corpus searches required a visit to the Survey.

"This corpus is now known more widely as the London-Lund Corpus (LLC), as it was the responsibility of co-workers in Lund, Sweden, to computerise the corpus" (Wikipedia article on Survey of English Usage, accessed 06-07-2010).

Merle Curti's "The Making of an American Community": the First "Large Scale" Application of Humanities Computing in the U. S. 1959

The first "large scale" use of machine methods in humanities computing in the United States was Merle Curti's study of Trempealeau County, Wisconsin, The Making of an American Community: A Case Study of Democracy in a Frontier County (1959).

"Confronted with census material for the years 1850 through 1880–actually several censuses covering population, agriculture, and manufacturing–together with a population of over 17,000 persons by the latter date, Curti turned to punched cards and unit record equipment for the collection and analysis of his data. By this means a total of 38 separate items of information on each individual were recorded for subsequent manipulation. Quite obviously, the comprehensive nature of this study was due in part to the employment of data processing techniques" (Bowles [ed.] Computers in Humanistic Research (1967) 57-58).

The First Digital Poetry 1959

In 1959 German computer scientist Theo Lutz of Hochschule Esslingen created the first digital poetry using a text-generating program called "Stochastische Texte" written for the ZUSE Z22 computer. The program consisted of only 50 commands but could theoretically generate over 4,000,000 sentences.

Working with his teacher, Max Bense, one of the earliest theorists of computer poetry, Lutz used a random number generator to create texts in which key words were randomly inserted within a set of logical constants to create a syntax. The program thus demonstrated how logical structures like mathematical systems could work with language.
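The method can be sketched in modern Python. The word lists below are hypothetical miniatures (Lutz reportedly drew his sixteen subjects and sixteen predicates from Kafka's Das Schloss), and the two-clause frame is a simplification of his actual sentence patterns:

```python
import random

# Hypothetical miniature vocabulary standing in for Lutz's word lists.
SUBJECTS = ["THE CASTLE", "THE VILLAGE", "THE STRANGER", "THE COUNT"]
PREDICATES = ["IS OLD", "IS QUIET", "IS NEAR", "IS FREE"]
QUANTIFIERS = ["EVERY", "NOT EVERY", "A", "NO"]
CONNECTIVES = ["AND", "OR", "SO"]

def stochastic_sentence(rng=random):
    """Insert randomly chosen key words into a fixed frame of logical
    constants, roughly as Lutz's Z22 program did."""
    def clause():
        return f"{rng.choice(QUANTIFIERS)} {rng.choice(SUBJECTS)} {rng.choice(PREDICATES)}"
    return f"{clause()}. {rng.choice(CONNECTIVES)} {clause()}."
```

Even this toy frame yields thousands of distinct sentences, which is how Lutz's 50-command program could claim millions of possible outputs.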

Funkhouser, Prehistoric Digital Poetry: An Archaeology of Forms 1959-1995 (2007).

The First Computer Matching Dating Service 1959

In 1959 Philip A. Fialer and James Harvey, students in Professor Jack Herriot’s computer course, "Math 139, Theory and Operation of Computing Machines," at Stanford University, devised the "Happy Families Planning Service" as a final math class project, pairing up 49 men and 49 women. For the project Fialer and Harvey had limited access to Stanford's newly acquired IBM 650 computer.

"The notion of melding Math 139 with the Great- Date-Matching Party surfaced early in the quarter when Fialer and Harvey needed to come up with a term project. For some time, Fialer and Harvey had hosted parties in houses that they rented with several electrical engineering and KZSU buddies at 1203 and 1215 Los Trancos Woods Road in Portola Valley. Student nurses from the Veterans Administration psychiatric hospital on Willow Road in Menlo Park were often invited. The boys represented themselves to the nurses as the “SRI Junior Engineers Social Club”—which was at least partially true, since one Los Trancos housemate worked summers and part-time as a junior engineer at Stanford Research Institute (SRI). KZSU radio station parties also were held in Los Trancos, featuring the KZSU musical band marching around the Los Trancos circle loop road at midnight. (This somewhat impromptu band was the basis for the current Los Trancos Woods Community Marching Band, officially organized at a KZSU party on New Year’s Eve in 1960.)  

"Fialer and Harvey figured a KZSU-Los Trancos type party could emerge as a positive by-product of Math 139, using the computer to match "a given number of items of one class to the same number of items of another class." The classes would be male and female subjects, and the population would be Stanford students, with a few miscellaneous Los Trancos Woods residents thrown in.

"The pair wrote a program to measure the differences in respondents’ answers to a questionnaire. A “difference” score was then computed for each possible male-female pair.  

"The program compared one member of a “class”—one man—with all members of the other class—women—and then repeated this for all members of the first class. The couple—a member from each class—with the lowest difference score was then matched, and the process repeated for the remaining members of each class. Thus, the first couple selected was the “best” match. As fewer couples remained in the pool, the matched couples had larger and larger difference scores.

"Given the limitations of computer time available and the requirements of the course, Fialer and Harvey did not use a “best-fit” algorithm, so the last remaining pairs were indeed truly “odd” couples. Two of the women in the sample, not Stanford students, were single mothers with two or three children. One of them, age 30, ended up paired with a frosh member of the Stanford Marching Band" (Computers in Love: Stanford and the First Trials of Computer Date Matching by C. Stewart Gillmor http://www.mgb67.com/computersinlove.htm, accessed 02-14-2013).
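The pairing procedure described above (score every remaining cross-pair, match the minimum, remove the couple, repeat) is a greedy assignment, sketched here in Python. The numeric encoding of questionnaire answers is an assumption made for the sketch:

```python
def difference(a, b):
    """Difference score between two questionnaire answer vectors
    (numeric encoding is an assumption for this sketch)."""
    return sum(abs(x - y) for x, y in zip(a, b))

def greedy_match(men, women):
    """Pair the remaining man and woman with the lowest difference score,
    remove them from the pool, and repeat. Because this is greedy rather
    than a global best-fit assignment, the last pairs can be poor matches,
    exactly as the account above notes."""
    men, women = dict(men), dict(women)
    pairs = []
    while men and women:
        m, w = min(((m, w) for m in men for w in women),
                   key=lambda p: difference(men[p[0]], women[p[1]]))
        pairs.append((m, w, difference(men[m], women[w])))
        del men[m], women[w]
    return pairs
```

A global best-fit (minimum-cost assignment) would require something like the Hungarian algorithm, which, as the account notes, was beyond the computer time available to the students.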

On February 13, 2013 The New York Times published a video interview with Fialer and Harvey regarding their early experiment in computer dating. In the interview they called the project the "Marriage Planning Service." The video showed pages from the program they wrote for the matching process, as apparently their complete file for the project was preserved.

One of the First Computer Models of How People Learn 1959 – 1961

For his 1960 Ph.D. thesis at Carnegie Institute of Technology (now Carnegie Mellon University), carried out under the supervision of Herbert A. Simon, computer scientist Edward Feigenbaum developed EPAM (Elementary Perceiver and Memorizer), a computer program designed to model elementary human symbolic learning. Feigenbaum's thesis first appeared as An Information Processing Theory of Verbal Learning, RAND Corporation Mathematics Division Report P-1817, October 9, 1959. In December 2013 a digital facsimile of Feigenbaum's personal corrected copy of the thesis was available from Stanford University's online archive of Feigenbaum papers at this link.

Feigenbaum's first publication on EPAM may have been "The Simulation of Verbal Learning Behavior," Proceedings of the Western Joint Computer Conference.... May 9-11, 1961 (1961) 121-32. In December 2013 a digital facsimile of this was also available at the same link.

Hook & Norman, Origins of Cyberspace (2002) no. 598.

Stephen Parrish's Concordance of the Poems of Matthew Arnold: the First Computerized Literary Concordance 1959

The first computer-generated concordance of a literary work was probably Stephen M. Parrish's A Concordance to the Poems of Matthew Arnold, published by Cornell University Press in 1959. According to the Cornell Daily Sun newspaper issue for February 15, 1960, p. 6:

"The University Press introduced the use of an electronic computer to prepare "A Concordance to the Poems of Matthew Arnold," edited by Prof. Stephen M. Parrish of the Department of English.

"The device eliminates years of tedious work previously needed to prepare such volumes, and will serve as a model for future editions.

"The IBM 704 Computer reads 15,000 characters and makes 42,000 logical decisions per second. The computer run took 38 hours and the printing took 10 hours.

"The new process produces finished pages ready for offset reproduction and greatly reduces the number of errors.

"One feature of the concordance, unavailable in hand-edited volumes, is the Appendix, which lists the words of Arnold's vocabulary in order of frequency, and also gives the frequency of the word."

Parrish's concordance was reproduced by offset from line printer output in uppercase letters, with punctuation omitted, causing such ambiguities as making shell indistinguishable from she'll.
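The mechanical process (index every word to its lines, then append a frequency-ordered vocabulary) can be sketched in Python. Uppercasing and stripping punctuation here deliberately reproduce the line-printer limitation that merged shell and she'll:

```python
from collections import defaultdict

def concordance(lines):
    """Index every word to the line numbers where it occurs. Uppercasing
    and dropping punctuation mimic the line-printer output limitations
    of the 1959 edition."""
    index = defaultdict(list)
    for n, line in enumerate(lines, 1):
        for token in line.upper().replace("'", "").split():
            word = "".join(c for c in token if c.isalpha())
            if word:
                index[word].append(n)
    return index

def frequency_appendix(index):
    """Vocabulary in descending order of frequency, like Parrish's Appendix."""
    return sorted(index, key=lambda w: len(index[w]), reverse=True)
```

Running `concordance(["She'll go.", "A shell."])` files both lines under the single entry SHELL, which is exactly the ambiguity described above.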

The Inspiration for Artificial Neural Networks, Building Blocks of Deep Learning 1959

In 1959 Harvard neurophysiologists David H. Hubel and Torsten Wiesel inserted a microelectrode into the primary visual cortex of an anesthetized cat. They then projected patterns of light and dark on a screen in front of the cat, and found that some neurons fired rapidly when presented with lines at one angle, while others responded best to another angle. They called these neurons "simple cells." Still other neurons, which they termed "complex cells," responded best to lines of a certain angle moving in one direction. These studies showed how the visual system builds an image from simple stimuli into more complex representations. Many artificial neural networks, fundamental components of deep learning, may be viewed as cascading models of cell types inspired by Hubel and Wiesel's observations.
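A "simple cell" in this sense is often modeled as a linear filter tuned to one orientation, which is also the computation performed by a single unit in a convolutional layer. A minimal sketch, with illustrative 3x3 weights not fitted to any real neural data:

```python
# An orientation-selective linear filter: strong response to a vertical
# bar of light, no response to a horizontal one.
VERTICAL_FILTER = [(-1, 2, -1),
                   (-1, 2, -1),
                   (-1, 2, -1)]

def response(patch, weights=VERTICAL_FILTER):
    """Dot product of a 3x3 image patch with the filter weights."""
    return sum(w * p for wrow, prow in zip(weights, patch)
                     for w, p in zip(wrow, prow))

vertical_bar = [(0, 1, 0)] * 3                      # a bright vertical line
horizontal_bar = [(0, 0, 0), (1, 1, 1), (0, 0, 0)]  # a bright horizontal line
```

Here `response(vertical_bar)` is large while `response(horizontal_bar)` is zero: the unit fires for one orientation and not the other, as Hubel and Wiesel's simple cells did.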

For two later contributions Hubel and Wiesel shared the 1981 Nobel Prize in Physiology or Medicine with Roger W. Sperry.

". . . firstly, their work on development of the visual system, which involved a description of ocular dominance columns in the 1960s and 1970s; and secondly, their work establishing a foundation for visual neurophysiology, describing how signals from the eye are processed by the brain to generate edge detectors, motion detectors, stereoscopic depth detectors and color detectors, building blocks of the visual scene. By depriving kittens from using one eye, they showed that columns in the primary visual cortex receiving inputs from the other eye took over the areas that would normally receive input from the deprived eye. This has important implications for the understanding of deprivation amblyopia, a type of visual loss due to unilateral visual deprivation during the so-called critical period. These kittens also did not develop areas receiving input from both eyes, a feature needed for binocular vision. Hubel and Wiesel's experiments showed that the ocular dominance develops irreversibly early in childhood development. These studies opened the door for the understanding and treatment of childhood  cataracts  and strabismus. They were also important in the study of cortical plasticity.

"Furthermore, the understanding of sensory processing in animals served as inspiration for the SIFT descriptor (Lowe, 1999), which is a local feature used in computer vision for tasks such as object recognition and wide-baseline matching, etc. The SIFT descriptor is arguably the most widely used feature type for these tasks" (Wikipedia article on David H. Hubel, accessed 11-10-2014). 

Grace Hopper and Colleagues Introduce COBOL May 28, 1959 – December 7, 1960

On May 28 and 29, 1959 a group representing computer users, programmers, manufacturers, universities, and the government met at The Pentagon, Arlington, Virginia, to plan COBOL (COmmon Business Oriented Language), a non-proprietary computer language designed for business use that could be run on all electronic computers. Its specifications were inspired by the FLOW-MATIC language invented by Grace Hopper, and by the IBM COMTRAN language.

The first report on Cobol, Initial Specifications for a COmmon Business-Oriented Language for Programming Electronic Digital Computers was issued by the Defense Department and published in Washington, D.C. in April 1960.  

On December 6 and 7, 1960  essentially the same program written in COBOL was run on two different makes of computers— an RCA computer and a Remington-Rand Univac computer— demonstrating for the first time that compatibility between computers produced by different manufacturers could be achieved.

In 1961 a revised version of the initial COBOL report, titled COBOL-61: Revised Specifications for a Common Business-Oriented Language, was issued by the COBOL committee. Even after this revision, COBOL still lacked certain major components necessary for business data-processing programming; these were provided in the revised edition "which contained major (Report Writer facilities and SORT verb), and a few minor additions to COBOL-61" (Sammet, History of Programming Languages [1969] 333). This was followed by COBOL-61 Extended. Report ... Including Specifications for a Common Business Oriented Language (COBOL) [Washington, D.C.:] U. S. Department of Defense, 1962. 

Pertinent to the durability of COBOL in 2014 Vikram Chandra wrote:

"COBOL, a language first introduced in 1959 by Grace Hopper (‘Grandma COBOL’), still processes 90 per cent of the planet’s financial transactions, and 75 per cent of all business data. You can make a comfortable living maintaining code in languages like COBOL, the computing equivalents of Mesopotamian cuneiform dialects" (Chandra, "Most Code is an Ugly Mess. Here's How to Make it Beautiful," Wired, 9.02.14).

Hook & Norman, Origins of Cyberspace (2002) nos. 543, 544.

(This entry was last revised on 12-26-2014.)

The First Formal Definition of Hacker June 1959

In June 1959 Peter R. Samson, Public Relations Committee of the MIT Tech Model Railroad Club, defined the term "hack" in the Tech Model Railroad Club Dictionary as:

"1) an article or project without constructive end

"2) a project undertaken on bad self-advice

"3) an entropy booster

"4) to produce, or attempt to produce, a hack(3)."

Samson defined hacker as "one who hacks, or makes them."

Much of the Tech Model Railroad Club jargon was later incorporated into early computer culture. In 2005 Samson commented:

"I saw this as a term for an unconventional or unorthodox application of technology, typically deprecated for engineering reasons. There was no specific suggestion of malicious intent (or of benevolence, either). Indeed, the era of this dictionary saw some 'good hacks:' using a room-sized computer to play music, for instance; or, some would say, writing the dictionary itself" (http://www.gricer.com/tmrc/dictionary1959.html, accessed 06-01-2009).

The Corona Satellite Series: America's First Imaging Satellite Program June 1959 – May 31, 1972

In June 1959 KH-1, the first of the Corona series of American strategic imaging reconnaissance satellites, was launched. Produced and operated by the Central Intelligence Agency Directorate of Science and Technology with assistance from the U.S. Air Force, the Corona satellites were used for photographic surveillance of the Soviet Union, the People's Republic of China and other areas. The 145th and last Corona satellite was launched on May 25, 1972, and its film was recovered on May 31, 1972. Over its lifetime, CORONA provided photographic coverage totaling approximately 750,000,000 square miles of the earth’s surface.

"The Corona satellites used 31,500 feet (9,600 meters) of special 70 millimeter film with 24 inch (60 centimeter) focal length cameras. Initially orbiting at altitudes from 165 to 460 kilometers above the surface of the Earth, the cameras could resolve images on the ground down to 7.5 meters in diameter. The two KH-4 systems improved this resolution to 2.75 meters and 1.8 meters respectively, because they operated at lower orbital altitudes. . . .

"The first dozen or more Corona satellites and their launches were cloaked with disinformation as being part of a space technology development program called the Discoverer program. The first test launches for the Corona/Discoverer were carried out early in 1959. The first Corona launch containing a camera was carried out in June 1959 with the cover name Discoverer 4. This was a 750 kilogram satellite launched by a Thor-Agena rocket.

"The plan for the Corona program was for its satellites to return canisters of exposed film to the Earth in re-entry capsules, called by the slang term "film buckets", which were to be recovered in mid-air by specially-equipped U.S. Air Force planes during their parachute descent. (The buckets were designed to float on the water for a short period of time for possible recovery by U.S. Navy ships, and then to sink if the recovery failed, via a water-dissolvable plug made of salt at the base of the capsule. This was for secrecy purposes.)" (Wikipedia article on Corona (satellite) accessed 11-29-2010).

"The return capsule of the Discoverer 13 mission, which launched August 10, 1960, was successfully recovered the next day. This was the first time that any object had been recovered successfully from orbit. After the mission of Discoverer 14, launched on August 18, 1960, its film bucket was successfully retrieved two days later by a C-119 Flying Boxcar transport plane. This was the first successful return of photographic film from orbit."

"CORONA enabled the US to specify verifiable terms of the Strategic Arms Limitation Treaty (SALT) with the Soviet Union in 1971. US negotiators confidently knew that photointerpreters could monitor changes in the size and characteristics of missile launchers, bombers, and submarines. Satellite imagery became the mainstay of the US arms-control verification process" (Central Intelligence Agency, CORONA: America's First Imaging Satellite Program, accessed 11-08-2014).

Machines Can Learn from Past Errors July 1959

In July 1959 Arthur Lee Samuel published "Some Studies in Machine Learning Using the Game of Checkers," IBM Journal of Research and Development 3 (1959) no. 3, 210-29. In this work Samuel demonstrated that machines can learn from past errors — one of the earliest examples of non-numerical computation.
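Samuel tuned the weights of a polynomial evaluation function for checkers positions. A loose sketch of that idea (not his exact update rule) nudges each weight to shrink the gap between the current evaluation and a better-informed target, such as a value backed up from deeper search:

```python
def evaluate(features, weights):
    """Linear evaluation of a board position, in the spirit of Samuel's
    scoring polynomial over board features."""
    return sum(w * f for w, f in zip(weights, features))

def learn_from_error(weights, features, target, rate=0.1):
    """Move each weight to reduce the error between the current evaluation
    and a better-informed target value. A hedged sketch of the learning
    idea, not Samuel's actual update procedure."""
    error = target - evaluate(features, weights)
    return [w + rate * error * f for w, f in zip(weights, features)]
```

Each correction shrinks, but does not eliminate, the error, so the program improves gradually over many games: learning from past errors in exactly the sense Samuel demonstrated.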

Hook & Norman, Origins of Cyberspace (2002) no. 874.

The Beginning of Expert Systems for Medical Diagnosis July 3, 1959

"Reasoning Foundations of Medical Diagnosis," by Robert S. Ledley and Lee B. Lusted, published in Science 130, No. 3366 (July 3, 1959) 9-21, represented the beginning of the development of clinical decision support systems (CDSS) — interactive computer programs, or expert systems, designed to assist physicians and other health professionals with decision making tasks.

"Areas covered included: symbolic logic, Bayes’ theorem (probability), and value theory. In the article, physicians were instructed how to create diagnostic databases using edge-notched cards to prepare for a time when they would have the opportunity to enter their data into electronic computers for analysis. Ledley and Lusted expressed hope that by harnessing computers, much of physicians’ work would become automated and that many human errors could therefore be avoided.

"Within medicine, Ledley and Lusted’s article has remained influential for decades, especially within the field of medical decision making. Among its most enthusiastic readers was cardiologist Homer R. Warner, who emulated Ledley and Lusted’s methods at his research clinic at LDS Hospital in Utah. Warner’s work, in turn, shaped many of the practices and priorities of the heavily computerized Intermountain Healthcare, Inc., which was in 2009 portrayed by the Obama administration as an exemplary model of a healthcare system that provided high-quality and low-cost care.

"The article also brought national media attention to Ledley and Lusted’s work. Articles about the work of the two men ran in several major US newspapers. A small demonstration device Ledley built to show how electronic diagnosis would work was described in the New York World Telegram as a “A Metal Brain for Diagnosis,” while the New York Post ran a headline: “Dr. Univac Wanted in Surgery.” On several occasions, Ledley and Lusted explained to journalists that they believed that computers would aid physicians rather than replace them, and that the process of introducing computers to medicine would be very challenging due to the non-quantitative nature of much medical information. They also envisioned, years before the development of ARPANET, a national network of medical computers that would allow healthcare providers to create a nationally-accessible medical record for each American and would allow rapid mass data analysis as information was gathered by individual clinics and sent to regional and national computer centers" (Wikipedia article on Robert Ledley, accessed 05-03-2014.)
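The Bayesian core of this kind of diagnostic reasoning reduces to a one-line posterior computation. This is a generic illustration of Bayes' theorem, not Ledley and Lusted's actual formulation, and the prevalence, sensitivity, and false-positive rate below are made-up numbers:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive finding) by Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive
```

With an illustrative prevalence of 1%, sensitivity of 90%, and false-positive rate of 5%, the posterior probability of disease after one positive finding is only about 15%, the kind of counterintuitive result that motivated formal, computer-assisted reasoning aids for diagnosis.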

(This entry was last revised on 05-03-2014.)

The First Photograph of Earth from an Orbiting Satellite August 14, 1959

The first photograph of the earth from an orbiting satellite was taken by the U.S. Explorer 6 on August 14, 1959. The crude image shows a sun-lit area of the Central Pacific Ocean and its cloud cover. The picture was made when the satellite was about 17,000 miles above the surface of the earth and crossing Mexico. The signals were received at the tracking station at South Point, Hawaii (also known as Ka Lae).

(This entry was last revised on 11-08-2014.)

The Xerox 914 September 16, 1959

On September 16, 1959 Haloid Xerox, Rochester, New York, introduced the Xerox 914, the first successful commercial plain paper xerographic copier, roughly the size of a desk.

". . .  commercial models were not available until March 1960. The first machine, delivered to a Pennsylvania metal-fastener maker, weighed nearly 650 pounds. It needed a carpenter to uncrate it, an employee with 'key operator' training, and its own 20-amp circuit. In an episode of Mad Men, set in 1962, the arrival of the hulking 914 helps get Peggy Olson her own office, after she tells her boss, 'It’s hard to do business and be credible when I’m sharing with a Xerox machine' " (http://www.theatlantic.com/magazine/archive/2010/07/the-mother-of-all-invention/8123/, accessed 06-11-2010).

Highlights of the Digital Equipment Corporation PDP Series of Minicomputers December 1959 – 1975

In December 1959, at the Eastern Joint Computer Conference in Boston, Digital Equipment Corporation (DEC) of Maynard, Massachusetts, demonstrated the prototype of its first computer, the PDP-1 (Programmed Data Processor-1), designed by a team headed by Ben Gurley.

"The launch of the PDP-1 (Programmed Data Processor-1) computer in 1959 marked a radical shift in the philosophy of computer design: it was the first commercial computer that focused on interaction with the user rather than the efficient use of computer cycles" (http://www.computerhistory.org/collections/decpdp-1/, accessed 06-25-2009).

Selling for $120,000, the PDP-1 was a commercialization of the TX-0 and TX-2 computers designed at MIT’s Lincoln Laboratory. On advice from the venture-capital firm that financed the company, DEC did not call it a “computer,” but instead called the machine a “programmed data processor.” The PDP-1 has been credited as the machine most important to the creation of hacker culture.

In 1963 DEC introduced the PDP-5, its first 12-bit computer. The PDP-5 was later called “the world’s first commercially produced minicomputer.” However, the PDP-8 introduced in 1965 was also given this designation.

Two years later, in 1965, DEC introduced the PDP-8, the first “production model minicomputer”: “Small in physical size, selling in minimum configuration for under $20,000.”

In 1970 DEC introduced the PDP-11 minicomputer, which popularized the notion of a “bus” (the “Unibus”) onto which a variety of additional circuit boards or peripheral products could be placed. DEC sold 20,000 PDP-11s by 1975.
