
Codes / Cryptography / Cryptanalysis Timeline


8,000 BCE – 1,000 BCE

Linear B and its Decipherment: Records of Mycenaean Civilization Circa 1,450 BCE – 1953

"Before the advent of the Greek alphabet, the written records of Mainland Greece, Crete, and Cyprus were recorded using a family of five related scripts. The earliest of these was Cretan Hieroglyphic, devised by the Minoans on Crete at the end of the 3rd millennium BCE; their later script, Linear A, is based on Cretan Hieroglyphic. Linear A in turn served as the model for two more scripts near the end of the Bronze Age: Cypro-Minoan, the script of the pre-Greek inhabitants of Cyprus; and Linear B, the script of the Mycenaeans, used for writing Mycenaean Greek. Finally, in the early Iron Age, the Greek-speaking peoples of Cyprus used Cypro-Minoan as the model for a new script, the Cypriot Syllabary, and employed it to write in their own dialect of Greek" (Brent Davis, Introduction to the Aegean Pre-Alphabetic Scripts [2010]).

About 1450 BCE, during or shortly after the period in which Cypro-Minoan was created on Cyprus, the Mycenaeans devised their own script based on the still undeciphered script known today as Linear A, and began using it to create administrative records. This syllabic script, distinct from Linear A and first encountered in the 1878 discoveries at Knossos on Crete, was named Linear B by archaeologist Arthur Evans.

"Mycenaean artifacts have been found well outside the limits of the Mycenaean world: namely Mycenaean swords are known from as far away as Georgia in the Caucasus, an amber object inscribed with Linear B symbols has been found in BavariaGermany and Mycenaean bronze double axes and other objects dating from the 13th century BC have been found in Ireland and in Wessex and Cornwall in England" (Wikipedia article on Mycenaean Greece, accessed 10-13-2014).

During 1952 and 1953 English architect and classical scholar Michael Ventris deciphered Linear B without the aid of a bilingual document—the use of which was so instrumental in the decipherment of other ancient scripts such as Phoenician, Egyptian hieroglyphs, and cuneiform. Ventris's remarkable achievement proved that Linear B is an early form of Greek (Mycenaean Greek) used from about 1450 to 1200 BCE.

During the Bronze Age Collapse, from circa 1200-1150 BCE, Linear B, found mainly in the palace archives at Knossos, Cydonia, Pylos, Thebes and Mycenae, disappeared with the fall of Mycenaean civilization. The succeeding period, known as the Greek Dark Ages, left no evidence of the use of writing. With the collapse of the palatial centers at Knossos and elsewhere—one possible but no longer widely accepted explanation for which was the eruption of the volcano on the island of Thera (Santorini)—no more monumental stone buildings were built and the practice of wall painting may have ceased. Writing in Linear B also ceased, vital trade links were lost, and towns and villages were abandoned. The population of Greece was reduced, and the world of organized state armies, kings, officials, and redistributive economies disappeared. Most of the information concerning the Greek Dark Ages comes from burial sites and the artifacts contained within the graves. To what extent the earliest Greek literary sources, the Iliad and Odyssey (products of the oral tradition) and Hesiod's Works and Days, all written down after writing was reintroduced to Greece, describe life in the Greek Dark Ages or earlier remains a question debated by scholars.

Ventris & Chadwick, Documents in Mycenaean Greek (1956), chapters 1-2. Chadwick, The Decipherment of Linear B (1958).

In February 2014 a very useful anonymous illustrated historical summary of "The Decipherment Process" was available from the classics department at the University of Cambridge at this link. The latest bibliographical reference in this PDF was dated 2013.

In 2013 attention was drawn to the work of the American classicist Alice Kober, who worked for years on the decipherment of Linear B, but died of lung cancer in 1950 at the early age of 43. It was suggested that Ventris may have been assisted in his discovery by work done by Kober. 

(This entry was last revised on October 13, 2014.)


1,000 BCE – 300 BCE

The Skytale: An Early Greek Cryptographic Device Used in Warfare Circa 650 BCE

The skytale (scytale, σκυτάλη, "baton"), a cylinder with a strip of parchment wrapped around it on which a message was written, was used by the ancient Greeks, and especially the Spartans, to communicate secretly during military campaigns. It was first mentioned by the Greek poet Archilochus (fl. 7th century BCE), but the first clear indication of its use as a cryptographic device appeared in the writings of the poet and Homeric scholar Apollonius of Rhodes, who also served as librarian at the Royal Library of Alexandria.

Plutarch, writing in the first century CE, provided the first detailed description of the operation of the skytale:

The dispatch-scroll is of the following character. When the ephors send out an admiral or a general, they make two round pieces of wood exactly alike in length and thickness, so that each corresponds to the other in its dimensions, and keep one themselves, while they give the other to their envoy. These pieces of wood they call scytalae. Whenever, then, they wish to send some secret and important message, they make a scroll of parchment long and narrow, like a leathern strap, and wind it round their scytale, leaving no vacant space thereon, but covering its surface all round with the parchment. After doing this, they write what they wish on the parchment, just as it lies wrapped about the scytale; and when they have written their message, they take the parchment off and send it, without the piece of wood, to the commander. He, when he has received it, cannot otherwise get any meaning out of it,--since the letters have no connection, but are disarranged,--unless he takes his own scytale and winds the strip of parchment about it, so that, when its spiral course is restored perfectly, and that which follows is joined to that which precedes, he reads around the staff, and so discovers the continuity of the message. And the parchment, like the staff, is called scytale, as the thing measured bears the name of the measure.
—Plutarch, Lives (Lysander 19), ed. Bernadotte Perrin (quoted in Wikipedia article on Scytale, accessed 04-05-2014).

From Plutarch's description one might conclude that the skytale was used to transmit a transposition cipher. However, because earlier accounts do not corroborate Plutarch's, and because of the cryptographic weakness of the device, it has been suggested that the skytale was used for conveying messages in plaintext, and that Plutarch's description is mythological. Another hypothesis is that the skytale was used for "message authentication rather than encryption. Only if the sender wrote the message around a scytale of the same diameter as the receiver's would the receiver be able to read it. It would therefore be difficult for enemy spies to inject false messages into the communication between two commanders" (Wikipedia article on Scytale, accessed 08-05-2014).
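Read as a transposition cipher, the skytale is easy to model: winding the strip n times around the staff is equivalent to writing the message in rows of length n and reading it off in columns. Here is a minimal Python sketch under that interpretation (the function names and the padding letter are illustrative):

```python
def skytale_encrypt(plaintext: str, circumference: int) -> str:
    """Write along the staff (rows across the wound strip); unwinding the
    strip then reads the grid off column by column."""
    text = plaintext.replace(" ", "").upper()
    while len(text) % circumference:          # pad so the strip wraps evenly
        text += "X"
    rows = [text[i:i + circumference] for i in range(0, len(text), circumference)]
    return "".join(row[c] for c in range(circumference) for row in rows)

def skytale_decrypt(ciphertext: str, circumference: int) -> str:
    # Re-wrapping the strip performs the same transposition with the
    # grid dimensions swapped (rows become the number of turns).
    return skytale_encrypt(ciphertext, len(ciphertext) // circumference)

strip = skytale_encrypt("HELP ME I AM UNDER ATTACK", 5)   # 'HENTEIDTLAEAPMRCMUAK'
assert skytale_decrypt(strip, 5) == "HELPMEIAMUNDERATTACK"
```

Without a staff of the right diameter the strip reads as a scrambled column order, but the number of plausible circumferences is small enough to try them all, which is exactly the cryptographic weakness noted above.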


The Hydraulic Telegraph 350 BCE


According to Polybius, a Greek historian of the Hellenistic period, Aeneas Tacticus, one of the earliest Greek writers on the art of war, invented the hydraulic telegraph about 350 BCE. It was a semaphore system used during the First Punic War to send messages between Sicily and Carthage.

"The system involved identical containers on separate hills; each container would be filled with water, and a vertical rod floated within. The rods were inscribed with various predetermined codes.

"To send a message, the sending operator would use a torch to signal the receiving operator; once the two were synchronized, they would simultaneously open the spigots at the bottom of their containers. Water would drain out until the water level reached the desired code, at which point the sender would lower his torch, and the operators would simultaneously close their spigots."


300 BCE – 30 CE

The Rosetta Stone: Key to the Decipherment of Egyptian Hieroglyphs 196 BCE – 1822

On July 15, 1799 French Capitaine Pierre-François Bouchard, with Napoleon in Egypt, discovered a dark stone 112.3 cm tall, 75.7 cm wide and 28.4 cm thick in the ruins of Fort St. Julien near the coastal city of Rosetta (Arabic: رشيد‎ Rašīd, French: Rosette), 65 kilometers east of Alexandria. This stone, which had been used in the construction of a fortress by the fifteenth-century Mamluk ruler of Egypt, Al-Ashraf Sayf ad-Din Qa'it Bay (Sultan Qaitbay), was later understood to be a fragment of an ancient stela (stele)—a stone on which one of a series of Ptolemaic decrees, issued over the reign of the Hellenistic Ptolemaic dynasty, which ruled Egypt from 305 BCE to 30 BCE, was inscribed; such stelae were put up in major temple complexes in Egypt. The decree, known as the third Memphis decree, passed by a council of priests in 196 BCE, affirmed the royal cult of the 13-year-old Ptolemy V as a living god on the first anniversary of his coronation. The decree was written in Egyptian hieroglyphs (the language of the priests, suitable for a priestly decree), in Egyptian Demotic script (the native script used for daily purposes), and in classical Greek (the language of the Hellenistic administration).

The stele found at Rosetta could not have originally been placed there, because the land on which it was found did not exist at the time of its carving but was the result of later sedimentation. Another decree, also written in the same languages, known as the Canopus Decree, was discovered at Tanis in 1866 by Egyptologist Karl Richard Lepsius. A second Canopus Decree was found in 1881. A third decree in the same languages, known as the Decree of Memphis (Ptolemy IV), is known in two versions: the Raphia Decree, found in 1902 at the site of ancient Memphis, and the Pithom Stele, No. II, found in 1923, which has hieroglyphs on the front, 42 lines in Demotic on the back, providing an almost complete translation, and Greek on the side.

Following the death of Alexander the Great in 323 BCE, the Ptolemaic dynasty in Egypt had been established by the first Ptolemy, known as Ptolemy I Soter, one of Alexander's generals. Ignorant of the Egyptian language, the Ptolemies required their officials to speak Greek and made Greek the language of their administration, a situation that outlasted their dynasty: Greek remained the administrative language of Egypt for roughly a thousand years. During their rule the Ptolemies made their capital city Alexandria the most advanced cultural center in the Greek-speaking world, for centuries second only to Rome. Among their most famous projects were the Royal Library of Alexandria and the Pharos Lighthouse, or Lighthouse of Alexandria, one of the Seven Wonders of the Ancient World.

Because the Ptolemies replaced hieroglyphics with Greek among the educated non-priestly class, Egyptians outside of the priesthood lost the ability to read their ancient pictographic language. Later, on February 27, 380, emperors Theodosius I, Gratian, and Valentinian II made Nicene Christianity the official state religion of the Roman Empire by the Edict of Thessalonica, also known as Cunctos populos, stating that all their subjects should profess the faith of the bishops of Rome and Alexandria. In 392 CE Theodosius issued a decisive edict closing Egyptian temples. As a result, the latest known inscription written in Egyptian hieroglyphs is dated August 23, 394 CE.

During the centuries of Muslim rule, one scholar in Egypt in the ninth to tenth centuries, Ahmad bin Abu Bakr ibn Washshiyah, wrote a treatise on scripts in which he not only interpreted hieroglyphs as pictorial images but, by relating them to the Coptic language used by Coptic priests during his time, also provided an alphabet in which hieroglyphs represented single letters, though only occasionally correctly. This text, which was read in manuscript by the seventeenth-century polymath Athanasius Kircher, was later translated into English by Joseph Hammer, Secretary of the Imperial Legation at Constantinople, and published in 1806 as Ancient Alphabets and Hieroglyphic Characters Explained, with an Account of the Egyptian Priests. Following Kircher's early but incorrect attempts to understand hieroglyphs, by the mid-18th century deciphering the ancient Egyptian hieroglyphic script had become one of the most challenging problems for European archaeologists and linguists. Probably in 1761 Abbé Jean-Jacques Barthélemy was the first to suggest that the cartouches, or oval-shaped framed sections of hieroglyphic inscriptions, contained the names of gods and kings.

The Rosetta Stone was forfeited to the English in 1801 under the terms of the Treaty of Alexandria. Following its arrival in England in 1801, the Rosetta Stone was placed in the Society of Antiquaries, where casts were made and sent to the universities of Oxford, Cambridge, Edinburgh and Dublin, and to scholars in France for incorporation in the Description de l'Égypte, eventually published between 1809 and 1828. In June 1802 the stone was placed in the British Museum, where it remains. The Society of Antiquaries issued full-size reproductions of the stone between 1802 and 1803. Once the texts were available to scholars, the three approximately parallel texts on the Rosetta Stone became key pieces of evidence in the research on hieroglyphics by Antoine Isaac Silvestre de Sacy, Johan David Åkerblad and Thomas Young, culminating in Jean-François Champollion's translation of the hieroglyphic text on the stone in 1822.

The first scholarly publication on the Rosetta Stone was de Sacy's pamphlet Lettre au Citoyen Chaptal . . . au sujet de l'inscription Égyptienne du monument trouvé à Rosette (Paris, 1802). In this brief work, illustrated with one transcription of a portion of the stone, the orientalist and linguist Sacy, a teacher of Champollion, made some progress in identifying proper names in the demotic inscription. Within the same year another student of Sacy, the Swedish diplomat and orientalist Johan David Åkerblad, published another "lettre" which described how he had managed to identify all proper names in the demotic text in just two months.

"He could also read words like "Greek", "temple" and "Egyptian" and found out the correct sound value from 14 of the 29 signs, but he wrongly believed the demotic hieroglyphs to be entirely alphabetic. One of his strategies of comparing the demotic to Coptic later became a key in Champollion's eventual decipherment of the hieroglyphic script and the Ancient Egyptian language" (Wikipedia article on Johan David Akerblad, accessed 12-27-2012).

"At some period after its arrival in London, the inscriptions on the stone were coloured in white chalk to make them more legible, and the remaining surface was covered with a layer of carnauba wax designed to protect the Rosetta Stone from visitors' fingers. This gave a dark colour to the stone that led to its mistaken identification as black basalt. These additions were removed when the stone was cleaned in 1999, revealing the original dark grey tint of the rock, the sparkle of its crystalline structure, and a pink vein running across the top left corner. Comparisons with the Klemm collection of Egyptian rock samples showed a close resemblance to rock from a small granodiorite quarry at Gebel Tingar on the west bank of the Nile, west of Elephantine in the region of Aswan; the pink vein is typical of granodiorite from this region. . . . (Wikipedia article on Rosetta Stone, accessed 06-10-2011).

♦ When I revised this database entry in October 2012 the Rosetta Stone was the most widely viewed object in the British Museum. Reflective of this intense interest, the British Museum shop then offered a remarkably wide range of products with the Rosetta Stone motif, ranging from facsimiles of the stone in various sizes to umbrellas, coffee mugs, mousepads, neckties, and iPhone cases. In their British Museum Objects in Focus series of booklets they also issued a very useful 64-page compact reference: The Rosetta Stone by Richard Parkinson (2005). Parkinson was the author of the more definitive work entitled Cracking Codes. The Rosetta Stone and Decipherment, with Contributions by W[hitfield] Diffie, M. Fischer, and R.S. Simpson also published by the British Museum in 1999.

(This entry was last revised on August 12, 2014.)


800 – 900

Carmina Figurata Word Pictures Circa 810

One of the most outstanding illuminated manuscripts of De laudibus sanctae crucis, preserved in the Vatican Library, depicting Christ.

About 810 the Frankish Benedictine monk Hrabanus Maurus wrote De laudibus sanctae crucis, a collection of 28 encrypted religious poems in praise of the holy cross. Arranged in the carmina figurata style of word pictures, in which shapes appropriate to the textual context are created by the outlines of letters, phrases or verses of poetry, these poems became much admired and were often copied.

In February 2014 a digital facsimile of an excellent 11th century illuminated manuscript of the text was available from the Burgerbibliothek, Bern, Switzerland at this link.

Bischoff, Latin Paleography: Antiquity and Middle Ages (1990) 210.


The First Treatise on Cryptanalysis Circa 850

The first recorded exposition of any kind of cryptanalysis was the discussion of frequency analysis by the Muslim Arab philosopher, mathematician, physician and musician Abu Yusuf Yaʻqūb ibn Isḥāq al-Sabbah al-Kindī (Arabic: ابو يوسف يعقوب بن اسحاق الصبّاح الكندي‎) in his treatise on Deciphering Cryptographic Messages written in Baghdad about 850 CE.

It has been suggested that close textual study of the Qur'an first revealed that Arabic has a characteristic letter frequency, to which frequency analysis could be applied.
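At its core, frequency analysis is a counting exercise: tally the symbols of the ciphertext and compare their ranking with the known letter profile of the target language. A minimal sketch (the function name is illustrative):

```python
from collections import Counter

def letter_frequencies(text: str) -> list:
    """Relative frequency of each letter, most common first -- the statistic
    al-Kindi proposed comparing against the known profile of the language."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    return [(ch, n / len(letters)) for ch, n in counts.most_common()]

# Against a simple substitution cipher, the most frequent ciphertext symbols
# are matched to the most frequent letters of the language (for English,
# roughly E, T, A, O, I, N ...) to recover candidate plaintext.
```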


1400 – 1450

The Voynich Manuscript: Uncrackable Code or Great Written Hoax? Progress in its Deciphering Circa 1404 – 1438

Several pages from the indecipherable Voynich Manuscript.

The Voynich manuscript, a mysterious illustrated manuscript book written in what long appeared to be an indecipherable text, has been the subject of much research and speculation for centuries. Its author, script and language remain unknown, and for centuries it has been suspected that the manuscript might be intentionally meaningless.

"Over its recorded existence, the Voynich manuscript has been the object of intense study by many professional and amateur cryptographers, including some top American and British codebreakers of World War II fame (all of whom failed to decrypt a single word). This string of failures has turned the Voynich manuscript into a famous subject of historical cryptology, but it has also given weight to the theory that the book is simply an elaborate hoax — a meaningless sequence of arbitrary symbols" (Wikipedia article on the Voynich Manuscript).

The book is named after the Polish-American book-dealer Wilfrid M. Voynich, who acquired it in 1912. Since 1969 it has been preserved in the Beinecke Rare Book and Manuscript Library of Yale University, having been donated by the American rare book and manuscript dealer, H.P. Kraus.

Progress on deciphering the manuscript was made in the 21st century:

♦ In 2011 scientists, using carbon-14 dating, were able to date the vellum on which the manuscript was written to between 1404 and 1438. This pushed its origin back perhaps 50 years.  However, the meaning, if any, of the circa 250,000 characters and the many diagrams in the manuscript, remained unknown.

In June 2013 Marcelo Montemurro, a theoretical physicist from the University of Manchester, UK, published a study which he believed showed that the manuscript was unlikely to be a hoax. Using a computerised statistical method to analyse the text, Montemurro and his co-author Zanette found that it followed the structure of "real languages":

Montemurro MA, Zanette DH (2013) "Keywords and Co-Occurrence Patterns in the Voynich Manuscript: An Information-Theoretic Analysis," PLoS ONE 8(6): e66344. doi:10.1371/journal.pone.0066344

In issue no. 100 of the American Botanical Council's HerbalGram, published in 2013, Arthur O. Tucker and Rexford H. Talbert identified some of the plants illustrated in the manuscript and suggested that the manuscript possibly originated in Mexico:

Tucker & Talbot, "A Preliminary Analysis of the Botany, Zoology, and Mineralogy of the Voynich Manuscript," herbalgram.org, Issue 100, 70-84 (reprodcuing numerous color illustrations, and with a bibliography of 74 citations.

In January 2014 Stephen Bax, an expert in applied linguistics from Bedfordshire University, reported that he had deciphered 10 words in the Voynich manuscript and was optimistic that using his methods more words would be deciphered:

"A proposed partial decoding of the Voynich script," Version 1, January 2014.  http://www.academia.edu/5932444/A_proposed_partial_decoding_of_the_Voynich_script#

In January 2014 Bax also produced a video on the issues involved.

In January 2015 palaeographer Lisa Fagin Davis posted an exceptionally interesting and well-illustrated account of the Voynich Manuscript in her blog, Manuscript Road Trip: The World's Most Mysterious Manuscript. Highly recommended!


One of the Earliest Surviving Italian Manuscripts on Technology and War Machines Circa 1420

Folio 2r of Bellicorum instrumentorum liber, showing an 'Oriental siege machine.'

The Bellicorum instrumentorum liber, cum figuris et fictitys litoris conscriptus, written and drawn by the Italian engineer, self-styled magus, and physician to the Venetian army in Brescia, Giovanni Fontana, may be the earliest extant illustrated Italian manuscript on technology and war machines.

Fontana accompanied each of his roughly 140 illustrations of siege engines, fountains and pumps, lifting and transporting machines, defensive towers, dredges, combination locks, battering rams, a "rocket-powered" craft, the first ever depiction of the magic lantern, scaling ladders, alchemical furnaces, clockwork, robotic automata, and measuring instruments with a caption that was partially encoded in a substitution cipher.

♦ You can view a digital facsimile of Fontana's manuscript at the Bayerische Staatsbibliothek website at this link: http://daten.digitale-sammlungen.de/~db/0001/bsb00013084/images/index.html?id=00013084&fip=67.164.64.97&no=4&seite=21 (accessed 01-16-2010).


Another manuscript by Fontana, preserved in the Bibliothèque nationale de France (Nouvelles Acquisitions Latin 635), entitled Secretum de thesauro experimentorum ymaginationis hominum, concerned mnemonic devices and memory: 

"The entire manuscript, excepting the table of contents, title and concluding formula is in cipher; this consists  almost entirely of straight lines and circles. Abbreviation marks are  placed under the script. . . .

"where one sees several projects of combiantorial machines, concentric disks, cylinders, rolls that allow the permutation of isolated elements of writing (letters or words): and engineer's realization of the Lullian dream. However the connection between the theater in the first book and the devices of the second is not one of mere juxtaposition: the Secretum is actually a treatise of mnemotechnics, or, as Battisti put it, "the blueprint for a compact database of the mind (http://www.voynich.net/Arch/2002/09/msg00136.html, accessed 01-16-2010).


1450 – 1500

Leon Battista Alberti Describes "The Alberti Cipher" 1467

An engraved portrait of Leon Battista Alberti. Engraved by G. Benaglia and published in the 18th century.

An Alberti cipher disk.

Italian author, artist, architect, poet, priest, linguist, philosopher, cryptographer and general Renaissance humanist polymath Leon Battista Alberti wrote De Cifris, describing the first polyalphabetic substitution with mixed alphabets and variable period. Compared to previous ciphers of the period, the Alberti Cipher was impossible to break without knowledge of the method, because the frequency distribution of the letters was masked and frequency analysis - the only known technique for attacking ciphers at that time - was no help. To facilitate the encryption process Alberti employed the first mechanical cipher device, known as the Alberti cipher disk, also called the formula.

The cipher disk "is made up of two concentric disks, attached by a common pin, which can rotate one with respect to the other.

"The larger one is called Stabilis [stationary or fixed], the smaller one is called Mobilis [movable]. The circumference of each disk is divided into 24 equal cells. The outer ring contains one uppercase alphabet for plaintext and the inner ring has a lowercase mixed alphabet for ciphertext.

"The outer ring also includes the numbers 1 to 4 for the superencipherment of a codebook containing 336 phrases with assigned numerical values. This is a very effective method of concealing the code-numbers, since their equivalents cannot be distinguished from the other garbled letters.

" The sliding of the alphabets is controlled by key letters included in the body of the cryptogram" (Wikipedia article on Alberti cipher disk, accessed 03-30-2012).


1500 – 1550

Johannes Trithemius Issues the First Book on Cryptography July 1518

The 'square table' of abbot Johannes Trithemius's Polygraphiae libri sex. - Clavis polygraphiae was an example of how a message might be encoded through the use of multiple alphabets.

The abbot Johannes Trithemius's (Tritheim's) Polygraphiae libri sex. - Clavis polygraphiae, ostensibly a book on many forms of writing but actually the first book on codes and cryptography, was posthumously published in Basel in 1518, two years after his death. Publication had been delayed because of ecclesiastical disapproval.

The codes that Tritheim invented and described in this book were used for centuries, notably the "Ave Maria" cipher, which takes up the bulk of the work (each word representing a letter, with consecutive tables making it possible to so arrange a code that it will read as a prayer), and the "square table," a sophisticated system of coding using multiple alphabets. The remarkable title page is composed of 7 woodcut blocks, showing the author presenting his book, and a bearded monk presenting a pair of keys, to the Emperor Maximilian. The central block is set within historiated woodcut borders of scholars holding emblems of science, arms of Maximilian and three other armorial shields at corners, and a reclining portrait of Trithemius himself at bottom.
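The "square table" (tabula recta) is straightforward to reconstruct: row i is the plain alphabet shifted i places, and in Trithemius's progressive cipher the first letter of the message is enciphered with row 0, the second with row 1, and so on, cycling after 26 letters. A short sketch:

```python
import string

A = string.ascii_uppercase

# The square table: row i is the alphabet rotated i places.
TABLE = [A[i:] + A[:i] for i in range(26)]

def trithemius_encipher(plaintext: str) -> str:
    """Progressive encipherment: letter i of the message uses row i mod 26,
    so the substitution alphabet changes with every letter."""
    text = [c for c in plaintext.upper() if c in A]
    return "".join(TABLE[i % 26][A.index(c)] for i, c in enumerate(text))

print(trithemius_encipher("HELLO"))   # HFNOS
```

The progression is fixed rather than keyed, so the scheme carries no secret; its importance lay in introducing the table that Bellaso and Vigenère would later drive with keywords.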

Kahn, The Codebreakers  (1967) 134-35.


1550 – 1600

Giovan Battista Bellaso Describes the First "Unbreakable" Text Autokey Cipher 1553

Table of reciprocal alphabet from a 1555 book by Giovan Battista Bellaso.

In 1553 Italian cryptologist Giovan Battista Bellaso published La Cifra del Sig. Giovan Battista Bel[l]aso, describing a text autokey cipher that was considered unbreakable for four centuries. "He suggested identifying the alphabets by means of an agreed-upon countersign or keyword off-line. He also taught various ways of mixing the cipher alphabets in order to free the correspondents from the need to exchange disks or prescribed tables.

"In 1550 Bellaso "was in the service of Cardinal Duranti in Camerino and had to use secret correspondence in the state affairs while his master was in Rome for a conclave. Versed in research, able in mathematics, Bellaso dealt with secret writing at a time when this art enjoyed great admiration in all the Italian courts, mainly in the Roman Curia. In this golden period of the history of cryptography, he was just one of many secretaries who, out of intellectual passion or for real necessity, experimented with new systems during their daily activities. His cipher marked an epoch and was considered unbreakable for four centuries. As a student of ciphers, he mentioned among his enthusiasts many eminent gentlemen and ‘‘great princes’’. In 1552, he met count Paolo Avogadro, count Gianfrancesco Gambara, and the renowned writer Girolamo Ruscelli, also an expert in secret writing, who urged him to reprint a reciprocal table that he was circulating in loose-leaf form, in print and manuscript. The table was to be duly completed with the instructions. Copies of these tables exist in contemporary private collections in Florence and Rome" (Wikipedia article on Giovan Battista Belaso, accessed 12-22-2008).


Giambattista della Porta Publishes the First Known Digraphic Substitution Cypher 1563

Italian scientist, polymath, playwright, and "professor of secrets" Giambattista della Porta published De Furtivis Literarum Notis in Naples at the press of Giovanni Maria Scoto. In this work on cryptography Porta described the first known digraphic substitution cipher.
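In a digraphic substitution the unit of encipherment is the letter pair rather than the single letter, so the substitution table has one cell for every possible two-letter combination. Porta's table used invented symbols; the sketch below stands numbers in for those symbols to show the principle:

```python
import string

A = string.ascii_uppercase

def digraphic_encipher(plaintext: str) -> list:
    """Each pair (a, b) maps to cell a*26 + b of a 26x26 table;
    the cell numbers stand in for Porta's invented symbols."""
    text = [c for c in plaintext.upper() if c in A]
    if len(text) % 2:
        text.append("X")                       # pad a trailing odd letter
    return [A.index(a) * 26 + A.index(b) for a, b in zip(text[::2], text[1::2])]

def digraphic_decipher(symbols: list) -> str:
    return "".join(A[s // 26] + A[s % 26] for s in symbols)

assert digraphic_decipher(digraphic_encipher("HELLO")) == "HELLOX"
```

Working on pairs flattens single-letter frequencies, forcing an attacker to collect statistics over the far larger space of digraphs.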


Blaise de Vigenère Describes What is Later Known as the Vigenère Cipher 1586

In 1586 French diplomat and cryptographer Blaise de Vigenère published Traicté des chiffres, ou secrètes manières d'escrire in Paris. Vigenère's book described a text autokey cipher that became known as the Vigenère cipher because it was misattributed to Vigenère in the 19th century. The actual inventor of the text autokey cipher was Giovan Battista Bellaso (see the 1553 entry above).

“Vigenère became acquainted with the writings of Alberti, Trithemius, and Porta when, at the age of twenty-six, he was sent to Rome on a two year diplomatic mission. To start with, his interest in cryptography was purely practical and was linked to his diplomatic work. Then, at the age of thirty-nine, Vigenère decided that he had accumulated enough money for him to be able to abandon his career and concentrate on a life of study. It was only then that he examined in detail the ideas of Alberti, Trithemius, and Porta, weaving them into a coherent and powerful new cipher … The cipher is known as the Vigenère cipher in honour of the man who developed it into its final form. The strength of the Vigenère cipher lies in its using not one, but 26 distinct cipher alphabets to encode a message… To unscramble the message, the intended receiver needs to know which row of the Vigenère square has been used to encipher each letter, so there must be an agreed system of switching between rows. This is achieved by using a keyword… Vigenère’s work culminated in his Traicté des Chiffres, published in 1586. Ironically, this was the same year that Thomas Phelippes was breaking the cipher of Mary Queen of Scots. If only Mary’s secretary had read this treatise, he would have known about the Vigenère cipher, Mary’s messages to Babington would have baffled Phelippes, and her life might have been spared” (Singh, The Code Book. The Secret History of Codes and Codebreaking, 46-51).

The Vigenère cipher was regarded as unbreakable for over 300 years, until Charles Babbage and Friedrich Kasiski independently developed methods of finding repetitions in the ciphertext to determine the key length and carry out successful cryptanalysis.
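The scheme itself is compact: the keyword repeats along the message, and each key letter selects one shifted row of the Vigenère square. A brief sketch, in which encryption and decryption differ only in the sign of the shift:

```python
import string

A = string.ascii_uppercase

def vigenere(text: str, keyword: str, decrypt: bool = False) -> str:
    """Letter i is shifted by keyword[i mod len(keyword)]; each key letter
    picks one row of the Vigenere square."""
    sign = -1 if decrypt else 1
    kw = keyword.upper()
    letters = [c for c in text.upper() if c in A]
    return "".join(
        A[(A.index(c) + sign * A.index(kw[i % len(kw)])) % 26]
        for i, c in enumerate(letters)
    )

secret = vigenere("DIVERT TROOPS TO EAST RIDGE", "WHITE")
assert vigenere(secret, "WHITE", decrypt=True) == "DIVERTTROOPSTOEASTRIDGE"
```

The periodic reuse of the keyword is precisely what Babbage and Kasiski exploited: repeated plaintext fragments enciphered at the same key position produce repeated ciphertext fragments whose spacing betrays the keyword length.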

Leaves CCCXXVII-CCCXXXVI of Vigenère's work contain the first representations of Chinese and Japanese writing in a European printed book.

Galland, An Historical and Analytical Bibliography of the Literature of Cryptography, 193.


Christophorus Guyot Issues the Earliest Surviving Catalogue of a Book Auction July 6, 1599

The first book auctions with lot numbers and printed catalogues took place in Holland. The first book auction with a printed catalogue took place in Leiden in 1593, though no catalogue survives. The earliest surviving catalogue of a book auction was issued by Christophorus Guyot in Leiden: Catalogus Librorum Bibliothecae Nobilissimi Clarissimique viri piae memoriae D. Philippi Marnixii. The sale took place in the house of the widow of the owner of the library, Filips van Marnix, heer van Sint-Aldegonde, on July 6, 1599.

Marnix was a Dutch and Flemish writer and statesman and the probable author of the text of the Dutch national anthem, the Wilhelmus.

"Less known to the general public is his work as a cryptographer. St. Aldegonde is considered to be the first Dutch cryptographer (cfr. The Codebreakers). For Stadholder William the Silent, he deciphered secret messages that were intercepted from the Spaniards. His interest in cryptography possibly shows in the Wilhelmus, where the first letters of the couplets form the name Willem van Nassov, i.e. William 'the Silent' of Nassau, the Prince of Orange, but such musical games -often far more intricate- were commonly practiced by polyphony composers since the Gothic period." 

Only two copies survive. Breslauer & Folter, Bibliography: Its History and Development (1984) no. 40.


1600 – 1650

Descartes Discusses the Idea of an Artificial Language 1629

In a letter dated 1629 to theologian, philosopher, and mathematician Marin Mersenne, philosopher, mathematician and physicist René Descartes proposed an artificial universal language, with equivalent ideas in different tongues sharing one symbol:

"Et si quelqu’un avait bien expliqué quelles sont les idées simples qui sont en l’imagination des hommes, desquelles se compose tout ce qu’ils pensent, et que cela fût reçu par tout le monde, j’oserais espérer ensuite une langue universelle, fort aisée à apprendre, à prononcer et à écrire."

"The notion of a universal language was based upon the idea of precisely cataloging the elements of the human imagination. The great advantage of such a language would be that it would represent everything 'distinctement.' Yet, the great problem faced by someone who wanted to create such a language was the nature of the human imagination itself. Although separate from the mind and reason, which were the foundations of Cartesian thought, the imagination nevertheless played an important role for Descartes. As he wrote elsewhere in the Meditations, the imagination not only conceptualized external things but also considers them, 'as being present by the power and internal application of my mind.' Imagination, in other words, produced the illusion of presence, figures appearing so that can the person can 'look upon them as present with the eyes of my mind.' As a result, Descartes remains highly suspicious of the imagination because it can produce appearances that have no corresponding reality. Descartes concluded his letter to Mersenne by dismissing hopes for a universal language or a real character as only being possible in a 'terrestrial paradise' or 'fairyland' because of the confused nature of signification and the variation of human understanding.

"Mais n’espérez pas de la voir jamais en usage; cela présuppose de grands changements en l’ordre des choses, et il faudrait que tout le Monde ne fût qu’un paradis terrestre, ce qui n’est bon à proposer que dans le pays des romans.

 "A universal language that would work at the level of the imagination, describing the actual 'things' of the external world, could only produce uniform results in the perfection of Eden or the ideal of fiction. One should, instead, stick with the institution of geometry as a method of rationalizing nature, a divine language grounded upon the cogito’s transmission of being. Descartes ultimately remains skeptical about any possibility of using alternative language games aside from mathematics in the project of rationalizing the world" (Batchelor, The Republic of Codes: Cryptographic Theory and Scientific Networks in the Seventeenth Century [1999] http://www.stanford.edu/dept/HPS/writingscience/Cryptography.html, accessed 01-22-2010).


1750 – 1800

Raimondo di Sangro Issues the First Extensive Treatise on the Peruvian Knot-Based Counting Language, the Quipu 1750

The first extensive treatise on the Peruvian knot-based counting language, the Quipu, was the Lettera apologetica dell'esercitato accademico della Crusca contenente la Difesa del Libro Intitolato Lettere d'una Peruana per rispetto alla supposizione de'Quipu, published in 1750 by the Neapolitan polymath and inventor Raimondo di Sangro, Prince of Sansevero, from the press of Gennaro Morelli of Naples. The work was printed in color using a polychromatic printing process invented by the Prince.

Quipu used a decimal positional system: a knot in a row farthest from the main strand represented one, the next farthest ten, and so on; the absence of knots on a cord implied zero. The colors of the cords, the way the cords are connected together, the relative placement of the cords, the spaces between the cords, the types of knots on the individual cords, and the relative placement of the knots are all important parts of the recording system. 'Quipucamayocs,' the accountants of the Inca Empire, created and deciphered the Quipu knots, and were also capable of performing simple mathematical calculations such as adding, subtracting, multiplying, and dividing. Quipu accounts covering hundreds of years of history were kept by court historians in Peru, but after the Conquest the Spaniards began to resent having this second set of record-keepers contradict them. The Quipu was classified as idolatrous at the Third Council of Lima (1581-3), and many examples were destroyed. Thus, by the time Raimondo di Sangro published his book the Quipu was no longer practiced, and attempting to understand the language was a research project in cryptanalysis.
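The decimal-positional core of the system can be modeled in a few lines: each cluster of knots along a pendant cord records one decimal digit, with an empty stretch of cord marking zero. A simplified sketch (it ignores the distinct knot types the Inca used, such as the long and figure-eight knots of the units position):

```python
def quipu_knots(n: int) -> list:
    """Knot counts per cluster, reading from the main strand outward:
    the nearest cluster is the highest power of ten, the farthest the
    ones; 0 means a deliberately empty stretch of cord."""
    return [int(digit) for digit in str(n)]

print(quipu_knots(1204))   # [1, 2, 0, 4]: 1 knot, 2 knots, a gap, 4 knots
```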

"To date, no link has yet been found between a quipu and Quechua, the native language of the Peruvian Andes. This suggests that quipus are not a glottographic writing system and have no phonetic referent. Frank Salomon at the University of Wisconsin has argued that quipus are actually a semasiographic language, a system of representative symbols—such as music notation or numerals—that relay information but are not directly related to the speech sounds of a particular language. The Khipu Database Project (KDP), begun by Gary Urton, may have already decoded the first word from a quipu—the name of a village, Puruchuco, which Urton believes was represented by a three-number sequence, similar to a ZIP code. If this conjecture is correct, quipus are the only known example of a complex language recorded in a 3-D system. (Wikipedia article on Quipu, accessed 04-07-2013).


Jean-Jacques Barthélemy Achieves the First Significant Decipherment of an Ancient Script: Palmyrene 1754

The first significant decipherment of an ancient script was that of Palmyrene, a West Aramaic dialect spoken in the city of Palmyra, Syria in the early centuries CE. This was known from the church fathers to be similar to Syriac. "Copies" of Palmyrene script had been available in print since 1616, but these were not helpful for decipherment. It was only after the British clergyman and orientalist John Swinton published accurate copies of paired inscriptions in Greek and Palmyrene in his paper, "An Explication of All the Inscriptions in the Palmyrene Language and Character Hitherto Publish'd," Philosophical Transactions of the Royal Society of London 48/2 (1755) 690-756, that the French writer, numismatist and linguist Abbé Jean-Jacques Barthélemy could correlate the two. The first word in one of the inscriptions was the name Septimios. Barthélemy was able to match the Palmyrene letters with the Greek, and he also discovered that they were recognizably similar to both Hebrew and Syriac. Swinton's paper was read to the Royal Society in a series of letters beginning on June 20, 1754. Barthélemy published his results in a 32-page pamphlet entitled Réflexions sur l'alphabet et sur la langue dont on se servoit autrefois à Palmyre, issued in Paris in 1754. From the Approbation of the pamphlet, dated July 18, 1754, it is evident that the reading of Swinton's initial letter, and the inscriptions made available, provided key information for Barthélemy, who reproduced on the three plates published with his pamphlet inscriptions initially brought to light by Swinton, without referencing Swinton in any way. Barthélemy's pamphlet was reprinted in Mémoires de l'Académie des Inscriptions et Belles Lettres 26 (1759) 577-97.

Parkinson, Cracking Codes. The Rosetta Stone and Decipherment (1999) 16. Daniels & Bright, The World's Writing Systems (1996) 145ff.

(This entry was last revised on 08-29-2014.)


Barthélemy Deciphers the Phoenician Language 1758

Four years after deciphering Palmyrene, the first ancient script to be deciphered, in 1758 French writer, numismatist and linguist Abbé Jean-Jacques Barthélemy deciphered Phoenician on the basis of bilingual Phoenician and Hebrew inscriptions found in Malta and two bilingual Phoenician and Hebrew inscriptions found in Cyprus by Richard Pococke. Barthélemy confirmed his reading with bilingual coins from Tyre and Sidon, and a set of Sicilian-Punic tetradrachms.

Barthélemy published his discovery in "Réflexions sur quelques monuments phéniciens, et sur les alphabets qui en résultent," Memoires de l'Académie des Inscriptions et Belles Lettres 30 (1764) 405-26. In this paper Barthélemy postulated four rules of decipherment which withstood the test of time.  

My copy of Barthélemy's paper is a preprint paginated 1-23, with 5 plates. According to a note published in the margin of p. 1, Barthélemy read his report to the Académie des Inscriptions on April 12, 1758. It was not formally published until six years later, and from a setting of type different from my copy. The first of his plates reproduced the Malta inscriptions, the second reproduced the recto and verso of 10 bilingual coins, the third reproduced the inscriptions found on Cyprus, and the fourth plate set out his understanding of the Phoenician alphabet. A fifth plate in my copy reproduces an inscription on a pitcher.

Daniels & Bright, The World's Writing Systems (1996) 144, 155.


The Copiale Cipher is Decrypted: Initiation into a Secret Society of Oculists Circa 1760 – 1780

The Copiale Cipher, an encrypted manuscript from circa 1760-80, preserved at the German Academy of Sciences at Berlin and consisting of 75,000 characters on 105 pages, was decoded in April 2011 by an international team led by Kevin Knight of the University of Southern California, using computer techniques.

The cipher employed in the manuscript consists of 90 different characters, from Roman and Greek letters, to diacritics and abstract symbols. Catchwords (preview fragments) of one to three or four characters are written at the bottom of left–hand pages. The plain-text letters of the message were found to be encoded by accented Roman letters, Greek letters and symbols, with unaccented Roman letters serving only to represent spaces.

"The researchers found that the initial portion of 16 pages describes an initiation ceremony for a secret society, namely the "high enlightened (Hocherleuchtete) oculist order" of Wolfenbüttel. A parallel manuscript is kept at the Staatsarchiv Wolfenbüttel. The document describes, among other things, an initiation ritual in which the candidate is asked to read a blank piece of paper and, on confessing inability to do so, is given eyeglasses and asked to try again, and then again after washing the eyes with a cloth, followed by an 'operation' in which a single eyebrow hair is plucked "(Wikipedia article on Copiale Cipher, accessed 12-11-2011).


1800 – 1850

Champollion Deciphers Egyptian Hieroglyphs September 22, 1822

Having studied the three texts on the Rosetta Stone, as well as other texts brought back from Napoleon's Egyptian campaigns, on September 22, 1822 French scholar, philologist and linguist Jean-François Champollion announced his decipherment of Egyptian hieroglyphs in a report to Bon-Joseph Dacier, Perpetual Secretary of the Académie des Inscriptions et Belles Lettres. This was published in Paris as Lettre à M. Dacier relative à l'alphabet des hiéroglyphes phonétiques. In this 55-page report, read to the Académie on September 27,

"Champollion described the alphabet that was used to write non-Egyptian names, and in the concluding pages he tentatively announced that he was certain that the phonetic signs were an integral part of 'pure hieroglyphic writing'. Among the select audience was the great Prussian natural scientist and explorer Alexander von Humboldt (1769-1859) and also Thomas Young, whose initial reaction is recorded in a letter written to W.R. Hamilton on the Sunday after the reading:

" 'I have found here, or rather recovered, Mr. Champollion, junior, who has been living for these ten years on the Inscription of Rosetta, and who has lately been making some steps in Egyptian literature, which really appear to be gigantic. It may be said that he found the key in England which has opened the gate for him, and it is often observed that c'est le premier pas qui coûte [it's the first step that takes the effort]: but if he did borrow an English key, the lock was so dreadfully rusty, that no common arm would ahve strength enough to turn it. . . .' (Parkinson, The Rosetta Stone [2005] 43-44).

Two years after his preliminary report of the discovery, Champollion published a fuller exposition as Précis du système hiéroglyphique des anciens Égyptiens, marking the decisive step in the decipherment of Egyptian hieroglyphs.

"His decipherment opened up the millennia of human history and resolved the pharaonic chronology that had been a major concern of the period. It also showed that human history went back much further than was accepted in the Church's chronology based on the Bible" (Parkinson, op. cit., 45).

(This entry was last revised on 08-04-2014).

 


The Contributions of Thomas Young Toward Deciphering Egyptian Hieroglyphs 1823

In response to French linguist Jean-François Champollion's 1822 report of the decipherment of the Egyptian hieroglyphs, in 1823 English physician, scientist and polymath Thomas Young published An Account of Some Recent Discoveries in Hieroglyphical Literature, and Egyptian Antiquities. Young believed that his discoveries were the basis for Champollion's system. In this book Young emphasized that many of his findings had been published and sent to Paris in 1816. Although Young had correctly found the sound value of six signs, he had not deduced the grammar of the language, and had therefore not deciphered the entire written language.

"Young was also one of the first who tried to decipher Egyptian hieroglyphs, with the help of a demotic alphabet of 29 letters built up by Johan David Åkerblad in 1802 (15 turned out to be correct), but Åkerblad wrongly believed that demotic was entirely alphabetic. 'Dr Young however showed that neither the alphabet of Akerblad, nor any modification of it which could be proposed, was applicable to any considerable part of the enchorial portion of the Rosetta inscription beyond the proper names.'  By 1814 Young had completely translated the "enchorial" (demotic, in modern terms) text of the Rosetta Stone (he had a list with 86 demotic words), and then studied the hieroglyphic alphabet but initially failed to recognise that the demotic and hieroglyphic texts were paraphrases and not simple translations. Some of Young's conclusions appeared in the famous article "Egypt" he wrote for the 1818 edition of the Encyclopædia Britannica" (Wikipedia article on Thomas Young, accessed 07-28-2009).

(This entry was last revised on 08-04-2014).


Rafinesque Deciphers the Mayan System of Counting 1832

Because of the destruction of most of the Maya codices in the sixteenth century, scholars had extremely limited access to the original texts. It was not until 1810 that any portion of a Mayan codex was reproduced: Alexander von Humboldt published five pages from the Dresden Codex in his Vues des cordillères, et monuments des peuples indigènes de l'Amérique. From this very limited reproduction the European-American autodidact polymath, mathematician, botanist, zoologist, and malachologist Constantine Samuel Rafinesque, while working in Philadelphia, deciphered the Maya's system of numerals in 1832.

In 1832 Rafinesque published his discovery in his periodical, the Atlantic Journal, and Friend of Knowledge: A Cyclopedic Journal and Review of Universal Science and Knowledge: Historical, Natural, and Medical Arts and Sciences: Industry, Agriculture, Education, and Every Useful Information. He announced it in a three-part article addressed to Jean-François Champollion, whose name he misspelled, "on the Graphic systems of America, and the Glyphs of Otolum or Palenque, in Central America." In the second part of this article, on page 42, Rafinesque briefly explained his discovery of the meaning of the Maya bar and dot system in which a dot equals one and a bar equals five. 

 "Later findings proved him right and also revealed that the Maya even had a symbol for zero, which appeared on Mesoamerican carvings as early as 36 B.C. (Zero didn't appear in Western Europe until the 12th century)"  (http://www.pbs.org/wgbh/nova/mayacode/time-flash.html, accessed 10-10-2009).

Like most of Rafinesque's numerous other publications, his Atlantic Journal enjoyed very limited success, and folded after only eight issues.  Copies of the original edition are extremely rare.  My copy is a facsimile reprint issued by the Arnold Arboretum, Boston, in 1946.


Origins of the Morse Code 1837

In 1837 Samuel F. B. Morse invented a practical form of electromagnetic telegraph using an early version of his “Morse code.” 

Morse originally devised a cipher code similar to that used in existing semaphore telegraphs, by which words were assigned three or four-digit numbers and entered into a codebook. The sending operator converted words to these number groups and the receiving operator converted them back to words using this codebook. Morse spent several months compiling this code dictionary.


Morse Transmits the First Message by Morse Code May 24, 1844

On May 24, 1844 Samuel F. B. Morse transmitted the first message on a United States experimental telegraph line (Washington to Baltimore) using the "Morse code" that became standard in the United States and Canada. The message, taken from the Bible, Numbers 23:23, and recorded on a paper tape, had been suggested to Morse by Annie Ellsworth, the young daughter of a friend. It was "What hath God wrought?" The recipient was Morse's associate in developing the telegraph, machinist and inventor Alfred Vail.

Vail, who had worked with Morse since September 1837, expanded Morse's original experimental numeric code, based on optical telegraph codes, to include letters and special characters, so it could be used more generally. Vail determined the frequency of use of letters in the English language by counting the movable type he found in the type-cases of a local newspaper in Morristown. The shorter marks were called "dots" and the longer ones "dashes," and the letters most commonly used were assigned the shorter sequences of dots and dashes. Vail was thus responsible for inventing the most useful and efficient features of the Morse Code.

The Morse Code became the first widely used data code.
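Vail's frequency principle is visible in the code itself: the commonest English letters get the shortest signals (E is a single dot, T a single dash), while rare letters get long ones. A short encoder covering the letters of the famous first message, using the International Morse values (the 1844 American Morse differed in some assignments):

```python
# International Morse for the letters of "WHAT HATH GOD WROUGHT".
MORSE = {
    "A": ".-",  "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "T": "-",   "U": "..-", "W": ".--",
}

def to_morse(message: str) -> str:
    """One space between letters, ' / ' between words."""
    return " / ".join(
        " ".join(MORSE[c] for c in word) for word in message.upper().split()
    )

print(to_morse("WHAT HATH GOD WROUGHT"))
# .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -
```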

Probably the first publication of the Morse Code was in Vail's Description of the American ElectroMagnetic Telegraph: Now in Operation between the Cities of Washington and Baltimore (Washington: Printed by J. & G. S. Gideon, 1845). Vail issued two versions of this in 1845: a 24-page pamphlet, with the title just mentioned, which was probably the first, and a much-expanded 208-page book "with the Reports of Congress, and a Description of All the Telegraphs Known, Employing Electricity or Galvanism." The rear wrapper of the 24-page pamphlet states that it was sold for 12.5 cents, and that the larger work, "just published" by Lea & Blanchard, Philadelphia, was available for 75 cents.

Hook & Norman, Origins of Cyberspace  (2002) no. 208.


1850 – 1875

Abraham Lincoln's Surveillance of Telecommunications During the American Civil War 1862

President Abraham Lincoln appointed Edwin M. Stanton Secretary of War on January 15, 1862, and soon thereafter Stanton requested sweeping powers, including total control of telegraph lines, as a security measure. By routing all telegraph lines through his office Stanton could monitor vast amounts of communication—journalistic, governmental and personal. This early example of governmental surveillance of telecommunications came to my attention in an op-ed piece by David T. Z. Mindich entitled "Lincoln's Surveillance State" in The New York Times, July 5, 2013. The piece was published in the context of the leaks by Edward Snowden in June 2013 concerning the vast PRISM telecommunications surveillance program:

"Having the telegraph lines running through Stanton’s office made his department the nexus of war information; Lincoln visited regularly to get the latest on the war. Stanton collected news from generals, telegraph operators and reporters. He had a journalist’s love of breaking the story and an autocrat’s obsession with information control. He used his power over the telegraphs to influence what journalists did or didn’t publish. In 1862, the House Judiciary Committee took up the question of 'telegraphic censorship' and called for restraint on the part of the administration’s censors.  

"When I first read Stanton’s requests to Lincoln asking for broad powers, I accepted his information control as a necessary evil. Lincoln was fighting for a cause of the utmost importance in the face of enormous challenges. The benefits of information monitoring, censorship and extrajudicial tactics, though disturbing, were arguably worth their price.  

"But part of the reason this calculus was acceptable to me was that the trade-offs were not permanent. As the war ended, the emergency measures were rolled back. Information — telegraph and otherwise — began to flow freely again.  

"So it has been with many wars: a cycle of draconian measures followed by contraction. During the First World War, the Supreme Court found that Charles T. Schenck posed a “clear and present danger” for advocating opposition to the draft; later such speech became more permissible. During the Second World War, habeas corpus was suspended several times — most notably in Hawaii after the Pearl Harbor attack — but afterward such suspensions became rare.  

"This is why, if you are a critic of the N.S.A.’s surveillance program, it is imperative that the war on terror reach its culmination. In May, President Obama declared that 'this war, like all wars, must end.' If history is any guide, ending the seemingly endless state of war is the first step in returning our civil liberties. 

"Until then, we will continue to see acts of governmental overreach that would make even Stanton blush. “I, sitting at my desk, certainly had the authorities to wiretap anyone, from you or your accountant, to a federal judge or even the President, if I had a personal e-mail,” Mr. Snowden told The Guardian. And unlike Stanton’s telegraph operation, which housed just a handful of telegraphers, the current national security apparatus is huge. An estimated 483,000 government contractors had top-secret security clearances in 2012. That’s a lot of Snowdens to trust with your information."


Émile Baudot Invents the Baudot Code, the First Means of Digital Communication 1870 – 1874

In 1870 French telegraph engineer Émile Baudot invented the Baudot code, a character set predating EBCDIC and ASCII, which has been called the first means of digital communication. In Baudot's code each character in the alphabet is represented by a series of bits sent over a communication channel. The symbol rate measurement (symbols per second or pulses per second) is known as the baud in Baudot's honor.

"Baudot invented his original code during 1870 and patented it during 1874. It was a 5-bit code, with equal on and off intervals, which allowed telegraph transmission of the Roman alphabet and punctuation and control signals. It was based on an earlier code developed by Carl Friedrich Gauss and Wilhelm Weber in 1834.

"Baudot's original code was adapted to be sent from a manual keyboard, and no teleprinter equipment was ever constructed that used it in its original form. The code was entered on a keyboard which had just five piano type keys, operated with two fingers of the left hand and three fingers of the right hand. Once the keys had been pressed they were locked down until mechanical contacts in a distributor unit passed over the sector connected to that particular keyboard, when the keyboard was unlocked ready for the next character to be entered, with an audible click (known as the "cadence signal") to warn the operator. Operators had to maintain a steady rhythm, and the usual speed of operation was 30 words per minute." (Wikipedia article on Baudot code, accessed 12-22-2011).

View Map + Bookmark Entry

Charles Babbage's Library: the First Catalogue of a Library on Computing and its History 1872

In 1872, the year after his death, Charles Babbage’s scientific library was sold at auction. The auction catalogue, containing over two thousand items on topics such as mathematical tables, cryptography, and calculating machines, and including many rare volumes, may be the first catalogue of a library on computing and its history.

View Map + Bookmark Entry

1910 – 1920

Early Versions of the Enigma 1919

In 1919 early versions of the Enigma cipher machine were built in Europe.

View Map + Bookmark Entry

1920 – 1930

The Index of Coincidence Method of Code-Breaking 1922

In 1922 U.S. Army cryptologist William F. Friedman published The Index of Coincidence and its Applications in Cryptography (Riverbank Publication No. 22). Geneva, Illinois: Riverbank Laboratories, Department of Ciphers.

Friedman's report presented the coincidence counting, or index of coincidence method of code-breaking.
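
In modern terms the index of coincidence of a text is the probability that two letters drawn at random from it (without replacement) are identical; it is high for natural language and much lower for uniformly random or well-enciphered text. A minimal Python illustration:

```python
from collections import Counter

def index_of_coincidence(text: str) -> float:
    """Probability that two letters drawn at random from the text match."""
    letters = [c for c in text.upper() if c.isalpha()]
    n = len(letters)
    counts = Counter(letters)
    return sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))

# English plaintext scores near 0.066; a flat random letter stream near 1/26 = 0.038.
print(round(index_of_coincidence("DEFEND THE EAST WALL OF THE CASTLE"), 3))
```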

View Map + Bookmark Entry

The Enigma Machine is Introduced 1923

In 1923 German electrical engineer and inventor Arthur Scherbius began marketing a mechanical cipher rotor machine based on rotating wired wheels, and called Enigma. Thousands of the machines are thought to have been produced from the 1920s to the end of World War II, during which the devices were used by the Third Reich to encrypt messages in a form they believed was undecipherable.

On September 11, 2011 a three-rotor Enigma machine in its original wooden box, dated circa 1939, sold at Christie's London for £133,250, a record price for an Enigma machine. The machine had been used in the 2001 film Enigma.

View Map + Bookmark Entry

A Logarithmic Law for Communication 1924

In “Certain Factors Affecting Telegraph Speed,” Bell System Technical Journal 3 (1924) 324–346, information theorist Harry Nyquist analyzed factors affecting telegraph transmission speed, presenting the first statement of a logarithmic law for communication, and the first examination of the theoretical bounds for ideal codes for the transmission of information.
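
In modern notation the law Nyquist stated can be written as

$$W = K \log m$$

where W is the speed of transmission of intelligence, m is the number of distinct current values (signal levels) the line can distinguish, and K is a constant; doubling the number of levels therefore adds only a fixed increment to the attainable speed.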

View Map + Bookmark Entry

1930 – 1940

The Voder, the First Electronic Speech Synthesizer: a Simplified Version of the Vocoder 1936 – 1939

Between 1936 and 1939 electronic and acoustic engineer Homer Dudley and a team of engineers at Bell Labs produced the first electronic speech synthesizer, called the Voder ("Voice Operation DEmonstratoR").

The Voder was demonstrated at the 1939-1940 World's Fair in Flushing Meadows, New York and the 1939 Golden Gate International Exposition on Treasure Island, San Francisco Bay, by experts who used a keyboard and foot pedals to play the machine and emit speech.

♦ The Voder was a simplified version of the Vocoder (short for voice encoder) developed by Dudley from 1926 onward, and for which Dudley received US patent 2151091 A for Signal Transmission on March 21, 1939. Dudley's vocoder was used in the SIGSALY system built by Bell Labs engineers in 1943. SIGSALY was used for encrypted high-level voice communications during World War II.  Since then the Vocoder has been widely applied in music, television production, filmmaking and games, usually for robots or talking computers.

On August 19, 2014 Nate Lavey and Jay Caspian Kang posted an outstanding video on NewYorker.com as Object of Interest: The Vocoder.

On April 14, 2016 Episode 208, Vox Ex Machina of 99percentinvisible.org posted this outstanding page on Vocoder and SIGSALY: http://99percentinvisible.org/episode/vox-ex-machina/

(This entry was last revised on 05-04-2016.)


View Map + Bookmark Entry

Highlights of Alan Turing and Colleagues' Cryptanalysis Work at Bletchley Park Circa September 1939 – 1945

On September 1, 1939 Germany invaded Poland, beginning World War II. Two days later, on September 3, Britain and France declared war on Germany. The following day Alan Turing appeared for work at the Government Code and Cypher School at Bletchley Park, England, with the goal of deciphering military communications encoded by means of Enigma machines.

As early as December 1932 the Biuro Szyfrów ("Cipher Bureau") in Warsaw, the Polish interwar agency charged with both cryptography and cryptanalysis, had broken the German Enigma machine cipher. Over the nearly seven years before World War II, the Polish Cipher Bureau overcame the growing structural and operating complexities of the plugboard-equipped Enigma, the main German cipher device of the Second World War.

Prior to the beginning of World War II, in October 1938 Polish Cipher Bureau mathematician and cryptologist Marian Rejewski designed the bomba, or bomba kryptologiczna ("bomb" or "cryptologic bomb"), a special-purpose machine for breaking German Enigma machine ciphers. On July 25, 1939 the Biuro Szyfrów revealed Poland's Enigma-decryption techniques and equipment, which it had achieved using the bomba device, to the French and British. Poland thereby made possible the western Allies' vitally important decryption of Nazi German secret communications (Ultra) during World War II.

"Up to July 25, 1939, the Poles had been breaking Enigma messages for over six and a half years without telling their  French  and British allies. On December 15, 1938, two new rotors, IV and V, were introduced (three of the now five rotors being selected for use in the machine at a time). As Rejewski wrote in a 1979 critique of appendix 1, volume 1 (1979), of the official history of British Intelligence in the Second World War, 'we quickly found the [wirings] within the [new rotors], but [their] introduction [...] raised the number of possible sequences of drums from 6 to 60 [...] and hence also raised tenfold the work of finding the keys. Thus the change was not qualitative but quantitative. We would have had to markedly increase the personnel to operate the bombs, to produce the perforated sheets (60 series of 26 sheets each were now needed, whereas up to the meeting on July 25, 1939, we had only two such series ready) and to manipulate the sheets.'

"Harry Hinsley suggested in British Intelligence . . . that the Poles decided to share their Enigma-breaking techniques and equipment with the French and British in July 1939 because they had encountered insuperable technical difficulties. Rejewski refuted this: 'No, it was not [cryptologic] difficulties [. . .] that prompted us to work with the British and French, but only the deteriorating political situation. If we had had no difficulties at all we would still, or even the more so, have shared our achievements with our allies as our contribution to the struggle against Germany' ' (Wikipedia article on Bomba (cryptography), accessed 12-21-2008).

In the first few months after arriving at Bletchley Turing made a key deduction that led to his development of Banburismus, a cryptanalytic process used by Turing and his co-workers at Bletchley's Hut 8 to help break German Kriegsmarine (Naval) messages enciphered by Enigma.

"The process used sequential conditional probability to infer information about the likely settings of the Enigma machine. It gave rise to Turing's invention of the ban as a measure of the weight of evidence in favour of a hypothesis. This concept was later applied in Turingery and all the other methods used for breaking the Lorenz cipher.

"The aim of Banburismus was to reduce the time required of the electromechanical Bombe machines by identifying the most likely right-hand and middle wheels of the Enigma. Hut 8 performed the procedure continuously for two years, stopping only in 1943 when sufficient bombe time became readily available. Banburismus was a development of the "clock method" invented by the Polish cryptanalyst Jerzy Różyck

To develop Banburismus Turing

"deduced that the message-settings of Kriegsmarine Enigma signals were enciphered on a common G rundstellung (starting position of the rotors), and were then super-enciphered with a bigram and a trigram lookup table. These trigram tables were in a book called the Kenngruppenbuch (K book). However, without the bigram tables, Hut 8 were unable to start attacking the traffic. A breakthrough was achieved after the Narvik pinch in which the disguised armed trawler Polares, which was on its way to Narvik in Norway, was seized by HMS Griffin in the North Sea on 26 April 1940. The Germans did not have time to destroy all their cryptographic documents, and the captured material revealed the precise form of the indicating system, supplied the plugboard connections and Grundstellung for April 23 and 24 and the operators' log, which gave a long stretch of paired plaintext and enciphered message for the 25th and 26th.

"The bigram tables themselves were not part of the capture, but Hut 8 were able to use the settings-lists to read retrospectively, all the Kriegsmarine traffic that had been intercepted from 22 to 27 April. This allowed them do a partial reconstruction of the bigram tables and start the first attempt to use Banburismus to attack Kriegsmarine traffic, from 30 April onwards. Eligible days were those where at least 200 messages were received and for which the partial bigram-tables deciphered the indicators. The first day to be broken was 8 May 1940, thereafter celebrated as "Foss's Day" in honour of Hugh Foss, the cryptanalyst who achieved the feat.

"This task took until November that year, by which time the intelligence was very out of date, but it did show that Banburismus could work. It also allowed much more of the bigram tables to be reconstructed, which in turn allowed April 14 and June 26 to be broken. However, the Kriegsmarine had changed the bigram tables on 1 July. By the end of 1940, much of the theory of the Banburismus scoring system had been worked out.

"The First Lofoten pinch from the trawler Krebs on 3 March 1941 provided the complete keys for February - but no bigram tables or K book. The consequent decrypts allowed the statistical scoring system to be refined so that Banburismus could become the standard procedure against Kriegsmarine Enigma until mid-1943" (This and the earlier quotation are from the Wikipedia article on Banburismus, accessed 01-04-2015.)

About December 1940 Alan Turing and Gordon Welchman at Bletchley Park designed an improved Bombe cryptanalysis machine for deciphering Enigma messages.

In 1943 Max Newman and his team at Bletchley created the top-secret Heath Robinson cryptanalysis machine, named after the cartoonist-designer of fantastic machines. This special-purpose machine was used not against Enigma but against the Lorenz ("Tunny") teleprinter cipher, and its success in principle, despite chronic unreliability, led directly to Colossus.

In July 1942 Turing developed the hand codebreaking method known as Turingery or Turing's Method (playfully dubbed Turingismus by Peter Ericsson, Peter Hilton and Donald Michie) for use in cryptanalysis of the Lorenz cipher produced by the SZ40 and SZ42 teleprinter rotor stream cipher machines, one of the Germans' Geheimschreiber (secret writer) machines. The British codenamed non-Morse traffic "Fish", and that from this machine "Tunny".

"Reading a Tunny message required firstly that the logical structure of the system was known, secondly that the periodically changed pattern of active cams on the wheels was derived, and thirdly that the starting positions of the scrambler wheels for this message—the message key—was established.The logical structure of Tunny had been worked out by William Tutte and colleagues over several months ending in January 1942. Deriving the message key was called "setting" at Bletchley Park, but it was the derivation of the cam patterns—which was known as "wheel breaking"—that was the target of Turingery.

"German operator errors in transmitting more than one message with the same key, producing a "depth", allowed the derivation of that key. Turingery was applied to such a key stream to derive the cam settings" (Wikipedia article on Turingery, accessed 01-04-2015).

In 1943 Alan Turing traveled to New York to consult with Claude Shannon and Harry Nyquist at Bell Labs concerning the encryption of speech signals between Roosevelt and Churchill.

In January 1944 the top-secret Colossus programmable cryptanalysis machine designed by Tommy Flowers and his team at the Post Office Research Station, Dollis Hill, in North West London, was installed at Bletchley Park to crack the higher level encryption of the Nazi Lorenz SZ40 machine. Colossus employed vacuum tubes and was between one hundred and one thousand times faster than Heath Robinson. "It exceeded all expectations and was able to derive many of the Lorenz settings for each message within a few hours, compared to weeks previously" (http://googleblog.blogspot.com/2012/03/remembering-colossus-worlds-first.html, accessed 03-0-2012). The Colossus machines have been called the first operational programmable electronic digital computers.

On June 1, 1944 the first improved Colossus Mark 2 with 2400 vacuum tubes was operational at Bletchley Park just in time for the Normandy Landings. By the end of the war there were ten Colossus computers operating. They enabled the decryption of 63,000,000 characters of high-grade German messages. Even though these machines incorporated features of special purpose electronic digital computers, and had incalculable influence on the outcome of WWII, they had little influence in the conventional sense on the development of computing technology because they remained top secret until about 1970.

"The Colossus computers were used to help decipher teleprinter  messages which had been encrypted using the Lorenz SZ40/42 machine — British codebreakers referred to encrypted German teleprinter traffic as "Fish" and called the SZ40/42 machine and its traffic as 'Tunny'. Colossus compared two data streams, counting each match based on a programmable Boolean function. The encrypted message was read at high speed from a paper tape. The other stream was generated internally, and was an electronic simulation of the Lorenz machine at various trial settings. If the match count for a setting was above a certain threshold, it would be sent as output to an electric typewriter" (Wikipedia article on Colossus computer, accessed 11-23-2008).

In March 2012 the Colossus Rebuild Project at the National Museum of Computing at Bletchley Park had completed an operating reconstruction of a Colossus II, after 10 years and over 6,000 man-days of volunteer effort. The Rebuild stands in its historically correct place, the room in H Block, in Bletchley Park, where Colossus No. 9 stood in WW II.

(This entry was last revised on 01-17-2015.)

View Map + Bookmark Entry

1940 – 1950

Communication Theory of Secrecy Systems 1945 – 1949

Claude Shannon's report, originally issued as a classified document entitled A Mathematical Theory of Cryptography, Memorandum MM 45-110-02, September 1, 1945, was formally published in 1949 as "Communication Theory of Secrecy Systems" in Bell System Technical Journal 28 (1949): 656–715. This paper, discussing cryptography from the viewpoint of information theory, contained a proof that all theoretically unbreakable ciphers must have the same requirements as the one-time pad.
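
Shannon's requirement is realized by the one-time pad itself: a key as long as the message, truly random, and never reused. A minimal Python sketch of the scheme:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

message = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(message))       # random key, as long as the message
ciphertext = xor_bytes(message, pad)          # encryption: XOR with the pad
assert xor_bytes(ciphertext, pad) == message  # decryption: XOR with the pad again
```

Reusing the pad for a second message destroys the perfect secrecy, since XORing the two ciphertexts together cancels the key.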

View Map + Bookmark Entry

Warren Weaver Suggests Applying Cryptanalysis Techniques to Translation March 4 – May 9, 1947

On March 4, 1947 mathematician and Director of the Division of Natural Sciences at the Rockefeller Foundation in New York Warren Weaver sent the following letter to Norbert Wiener, suggesting that cryptanalysis techniques might be applied to translation, and that a computer could be built for the purpose. This letter, preserved at the Rockefeller Archives Center, may be the origin of efforts at machine translation:

"Dear Norbert:

I was terribly sorry, when in Cambridge recently, that I got unavoidably held up by several unexpected jobs, and did not get a chance to see you.

One thing I wanted to ask you about is this. A most serious problem, for UNESCO and for the constructive and peaceful future of the planet, is the problem of translation, as it unavoidably affects the communication between peoples. Huxley has recently told me that they are appalled by the magnitude and the importance of the translation job.

 Recognizing fully, even though necessarily vaguely, the semantic difficulties because of multiple meanings, etc., I have wondered if it were unthinkable to design a computer which would translate. Even if it would translate only scientific material (where the semantic difficulties are very notably less), and even if it did produce an inelegant (but intelligible) result, it would seem to me worth while.

Also knowing nothing official about, but having guessed and inferred considerable about, powerful new mechanized methods in cryptography - methods which I believe succeed even when one does not know what language has been coded - one naturally wonders if the problem of translation could conceivably be treated as a problem in cryptography. When I look at an article in Russian, I say "This is really written in English, but it has been coded in some strange symbols. I will now proceed to decode."

Have you ever thought about this? As a linguist and expert on computers, do you think it is worth thinking about?

Cordially,

Warren Weaver

In his reply dated April 30, 1947 Wiener was not optimistic regarding the possibility of machine translation:

"Dear Warren:  

First, I want to thank you and The Rockefeller Foundation for the almost unlimited number of favors that I have been receiving. I think and hope, at any rate, that we shall be able to come across in such a way as to at least partly justify your expenditure.

Second - as to the problem of mechanical translation, I frankly am afraid the boundaries of words in different languages are too vague and the emotional and international connotations are too extensive to make any quasi mechanical translation scheme very hopeful. I will admit that basic English seems to indicate that we can go further than we have generally done in the mechanization of speech, but you must remember that in certain respects basic English is the reverse of mechanical and throws upon such words as "get," a burden, which is much greater than most words carry in conventional English. At the present time, the mechanization of language, beyond such a stage as the design of photoelectric reading opportunities for the blind, seems very premature. By the way, I have been fascinated by McCulloch's work on such apparatus, and, as you probably know, he finds the wiring diagram of apparatus of this kind turns out to be surprisingly like the microscopic analogy of the visual cortex in the brain.

"I have heard that your health is much better, and I certainly hope so. I shall try to look you up before I sail for France.  

"Sincerely yours,

"Norbert Wiener

Weaver, however, maintained his belief in the possibility of machine translation in spite of Wiener's pessimism, writing back on May 9, 1947:

"Dear Norbert:

Thank you for your letter of April 30. I am sure that Dr. Morrison and I will both be very glad to have you tell us, from time to time, about the progress of your collaborative program with Rosenblueth. And I will be most interested, after your return from France, to hear your comments on your trip there.

"I am disappointed but not surprised by your comments on the translation problem. The difficulty you mention concerning Basic seems to me to have a rather easy answer. It is, of course, true that Basic puts multiple use on an action verb such as "get." But even so, the two-word combinations such as "get up," "get over," "get back," etc., are, in Basic, not really very numerous. Suppose we take a vocabulary of 2,000 words, and admit for good measure all the two-word combinations as if they were single words. The vocabulary is still only four million: and that is not so formidable a number to a modern computer, is it?

Cordially,

Warren Weaver"

(http://www.mt-archive.info/Weaver-1947-original.pdf, accessed 10-25-2011).

 

View Map + Bookmark Entry

1950 – 1960

The Hamming Codes 1950

In 1950 Richard W. Hamming of Bell Labs and the City College of New York published "Error Detecting and Error Correcting Codes," Bell System Technical Journal 29 (1950): 147-160, introducing the family of error-correcting codes now known as Hamming codes.
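
Hamming's best-known construction, the (7,4) code, protects four data bits with three parity bits and corrects any single-bit error. A minimal Python sketch:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword (parity bits at positions 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(c):
    """Locate and flip a single-bit error using the parity syndrome."""
    c = list(c)
    syndrome = 0
    for k in (1, 2, 4):
        parity = 0
        for pos in range(1, 8):
            if pos & k:
                parity ^= c[pos - 1]
        if parity:
            syndrome += k
    if syndrome:               # syndrome is the 1-based position of the error
        c[syndrome - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                   # corrupt one bit in transit
assert hamming74_correct(word) == hamming74_encode([1, 0, 1, 1])
```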

View Map + Bookmark Entry

The National Security Agency is Founded November 4, 1952

The National Security Agency/Central Security Service (NSA/CSS), a cryptologic intelligence agency of the United States Department of Defense responsible for the collection and analysis of foreign communications and foreign signals intelligence, as well as protecting U.S. government communications and information systems, officially came into existence on November 4, 1952. 

"The National Security Agency's predecessor was the Armed Forces Security Agency (AFSA), created on May 20, 1949. This organization was originally established within the U.S. Department of Defense under the command of the Joint Chiefs of Staff. The AFSA was to direct the communications and electronic intelligence activities of the U.S. military intelligence units: the Army Security Agency, the Naval Security Group, and the Air Force Security Service. However, that agency had little power and lacked a centralized coordination mechanism. . . . As the change in the security agency's name indicated, the role of NSA was extended beyond the armed forces" (Wikipedia article on National Security Agency, accessed 01-14-2012).

View Map + Bookmark Entry

The Idea of a Genetic Code 1953 – 1954

In 1953 and 1954 Russian-American theoretical physicist, cosmologist and science writer George Gamow, while at George Washington University, came up with the idea of a genetic code in his paper “Possible Mathematical Relation between Deoxyribonucleic Acids and Proteins” (Det Kongelige Danske Videnskabernes Selskab: Biologiske Meddelelser 22, no. 3 [1954]: 1-13).

In the fall of 1953 Gamow gave Crick an earlier draft of this paper entitled “Protein synthesis by DNA molecules.”

“Gamow’s scheme was decisive, Crick has often said since, because it forced him, and soon others, to begin to think hard and from a particular slant—that of the coding problem—about the next stage, now that the structure of DNA was known” (Judson, The Eighth Day of Creation, 236).

View Map + Bookmark Entry

The First Amino Acid Sequence of a Protein 1955

In 1955 English biochemist Frederick Sanger sequenced the amino acids of insulin, the first of any protein.

Sanger's work “revealed that a protein has a definite constant, genetically determined sequence—and yet a sequence with no general rule for its assembly. Therefore it had to have a code” (Judson, Eighth Day of Creation, 188).

"Sanger's first triumph was to determine the complete amino acid sequence of the two polypeptide chains of bovine insulin, A and B, in 1952 and 1951, respectively. Prior to this it was widely assumed that proteins were somewhat amorphous. In determining these sequences, Sanger proved that proteins have a defined chemical composition. For this purpose he used the "Sanger Reagent", fluorodinitrobenzene (FDNB), to react with the exposed amino groups in the protein and in particular with the N-terminal amino group at one end of the polypeptide chain. He then partially hydrolysed the insulin into short peptides, either with hydrochloric acid or using an enzyme such as trypsin. The mixture of peptides was fractionated in two dimensions on a sheet of filter paper, first by electrophoresis in one dimension and then, perpendicular to that, by chromatography in the other. The different peptide fragments of insulin, detected with ninhydrin, moved to different positions on the paper, creating a distinct pattern that Sanger called 'fingerprints'. The peptide from the N-terminus could be recognised by the yellow colour imparted by the FDNB label and the identity of the labelled amino acid at the end of the peptide determined by complete acid hydrolysis and discovering which dinitrophenyl-amino acid was there. By repeating this type of procedure Sanger was able to determine the sequences of the many peptides generated using different methods for the initial partial hydrolysis. These could then be assembled into the longer sequences to deduce the complete structure of insulin. Finally, because the A and B chains are physiologically inactive without the three linking disulfide bonds (two interchain, one intrachain on A), Sanger and coworkers determined their assignments in 1955. Sanger's principal conclusion was that the two polypeptide chains of the protein insulin had precise amino acid sequences and, by extension, that every protein had a unique sequence. It was this achievement that earned him his first Nobel prize in Chemistry in 1958. This discovery was crucial for the later sequence hypothesis of Crick for developing ideas of how DNA codes for proteins" (Wikipedia article on Frederick Sanger, accessed 11-20-2013).

View Map + Bookmark Entry

Crick's "On Protein Synthesis" September 1957

In September 1957 molecular biologist Francis Crick delivered his paper “On Protein Synthesis,” published in Symp. Soc. Exp. Biol. 12 (1958): 138-63. In it Crick proposed two general principles:

1) The Sequence Hypothesis:

“The order of bases in a portion of DNA represents a code for the amino acid sequence of a specific protein. Each ‘word’ in the code would name a specific amino acid. From the two-dimensional genetic text, written in DNA, are forged the whole diversity of uniquely shaped three-dimensional proteins.”

"In this context, Crick discussed the 'coding problem'—how the ordered sequence of the four bases in DNA might constitute genes that encode and disburse information directing the manufacture of proteins. Crick hypothesized that, with four bases to DNA and twenty amino acids, the simplest code would involve "triplets"—in which sequences of three bases coded for a single amino acid" (Genome News Network, Genetics and Genomics Timeline 1957).

2) The Central Dogma:

“Information is transmitted from DNA and RNA to proteins but information cannot be transmitted from a protein to DNA.” This paper “permanently altered the logic of biology.” (Judson)

View Map + Bookmark Entry

1960 – 1970

Crick & Brenner Propose The Genetic Code 1961

At Cambridge in 1961 Francis Crick, Sydney Brenner and colleagues proposed that DNA code is written in “words” called codons formed of three DNA bases. DNA sequence is built from four different bases, so a total of 64 (4 x 4 x 4) possible codons can be produced. They also proposed that a particular set of RNA molecules subsequently called transfer RNAs (tRNAs) act to “decode” the DNA.
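
The combinatorics are easy to verify; a short check in Python enumerates all the three-base words:

```python
from itertools import product

codons = ["".join(c) for c in product("ACGT", repeat=3)]
print(len(codons))  # 64 possible three-base "words"
print(codons[:4])   # ['AAA', 'AAC', 'AAG', 'AAT']
```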

“There was an unfortunate thing at the Cold Spring Harbor Symposium that year. I said, ‘We call this messenger RNA’ Because Mercury was the messenger of the gods, you know. And Erwin Chargaff very quickly stood up in the audience and said he wished to point out that Mercury may have been the messenger of the gods, but he was also the god of thieves. Which said a lot for Chargaff at the time! But I don’t think that we stole anything from anybody— except from nature. I think it’s right to steal from nature, however” (Brenner, My Life, 85).

Francis Crick, L. Barnett, Sydney Brenner and R. J. Watts-Tobin, “General Nature of the Genetic Code for Proteins,” Nature 192 (1961): 1227-32.

J. Norman (ed) Morton's Medical Bibliography 5th ed (1991) no. 256.8.

View Map + Bookmark Entry

ASCII is Promulgated 1963

In 1963 the ASCII (American Standard Code for Information Interchange) standard was promulgated, specifying the pattern of seven bits to represent letters, numbers, punctuation, and control signals in computers.

"Historically, ASCII developed from telegraphic codes. Its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on ASCII formally began October 6, 1960, with the first meeting of the American Standards Association's (ASA) X3.2 subcommittee. The first edition of the standard was published during 1963, a major revision during 1967, and the most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists, and added features for devices other than teleprinters. ASCII includes definitions for 128 characters: 33 are non-printing control characters (now mostly obsolete) that affect how text and space is processed; 94 are printable characters, and the space is considered an invisible graphic. The most commonly used character encoding on the World Wide Web was US-ASCII until 2008, when it was surpassed by UTF-8" (Wikipedia article on ASCII, accessed 01-29-2010).

View Map + Bookmark Entry

1970 – 1980

Public Key Cryptography is Suggested 1976

In 1976 cryptologists Bailey Whitfield 'Whit' Diffie and Martin E. Hellman published "New Directions in Cryptography," IEEE Transactions on Information Theory IT-22, 6 (1976): 644–654.

This paper suggested public key cryptography and presented the Diffie-Hellman key exchange.
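
The exchange rests on the fact that modular exponentiation commutes: (g^a)^b = (g^b)^a mod p. A toy Python sketch with an illustrative prime (real deployments use moduli of 2048 bits or more, or elliptic curves):

```python
import secrets

# Toy Diffie-Hellman exchange. The public modulus p (the Mersenne prime
# 2**31 - 1) and base g are illustrative only, far too small for real use.
p, g = 2147483647, 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent, never transmitted
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent, never transmitted

A = pow(g, a, p)                   # Alice sends A over the open channel
B = pow(g, b, p)                   # Bob sends B over the open channel

assert pow(B, a, p) == pow(A, b, p)  # both sides compute the same shared secret
```

An eavesdropper sees only p, g, A and B; recovering the shared secret from those values is the discrete logarithm problem, believed intractable at proper key sizes.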

In March 2016 Diffie and Hellman received the A. M. Turing Award, presented by the ACM, for the invention of public key cryptography.

View Map + Bookmark Entry

1980 – 1990

The Unicode Universal Character Set is Introduced August 29, 1988

Joseph D. Becker of Xerox Corporation, Rochester, New York, Lee Collins (also at Xerox) and Mark Davis of Apple Computer developed a universal character set, the name for which Becker coined in his report Unicode 88, issued on August 29, 1988:

"1.1. Abstract

"This document is a draft proposal for the design of an international/multilingual text character coding system, tentatively called Unicode.

"Unicode is intended to address the need for a workable, reliable world text encoding. Unicode could be roughly described as 'wide-body ASCII' that has been stretched to 16 bits to encompass the characters of all the world's living languages. In a properly engineered design, 16 bits per character are more than sufficient for this purpose.

"In the Unicode system, a simple unambiguous fixed-length character encoding is integrated into a coherent overall architecture of text processing. The design aims to be flexible enough to support many disparate (vendor-specific) implementations of text processing software.

"A general scheme for character code allocations is proposed (and materials for making specific individual character code assignments are well at hand), but specific code assignments are not proposed here. Rather, it is hoped that this document will evoke interest from many organizations, which could cooperate in perfecting the design and in determining the final character code assignments" (http://www.unicode.org/history/unicode88.pdf, accessed 01-29-2010).

View Map + Bookmark Entry

1990 – 2000

An Encoded Sculpture, Still Not Decoded November 3, 1990

On November 3, 1990 American sculptor James Sanborn completed the cryptographic sculpture, Kryptos, on the grounds of the Central Intelligence Agency in Langley, Virginia.

"The name Kryptos comes from the Greek word for 'hidden', and the theme of the sculpture is 'intelligence gathering.' The most prominent feature is a large vertical S-shaped copper screen resembling a scroll, or piece of paper emerging from a computer printer, covered with characters comprising encrypted text. The characters consist of the 26 letters of the standard Roman alphabet and question marks cut out of the copper. This 'inscription' contains four separate enigmatic messages, each apparently encrypted with a different cipher."

"The ciphertext on one half of the main sculpture contains 869 characters in total, however Sanborn released information in April 2006 stating that an intended letter on the main half of Kryptos was missing. This would bring the total number of characters to 870 on the main portion. The other half of the sculpture comprises a Vigenère encryption tableau, comprising 869 characters, if spaces are counted. Sanborn worked with a retiring CIA employee named Ed Scheidt, Chairman of the CIA Cryptographic Center, to come up with the cryptographic systems used on the sculpture. Sanborn has since revealed that the sculpture contains a riddle within a riddle which will be solvable only after the four encrypted passages have been decrypted. He said that he gave the complete solution at the time of the sculpture's dedication to CIA director William H. Webster. However, in an interview for wired.com in January 2005, Sanborn said that he had not given Webster the entire solution. He did, however, confirm that where in part 2 it says "Who knows the exact location? Only WW," that "WW" was intended to refer to William Webster. He also confirmed that should he die before it becomes deciphered that there will be someone able to confirm the solution" (Wikipedia article on Kryptos, accessed 05-09-2009).

Steven Levy, "Mission Impossible: The Code that Even the CIA Can't Crack," Wired 17.05 (May 2009).

View Map + Bookmark Entry

Formulation of Shor's Algorithm for Quantum Computers 1994

In 1994 American applied mathematician Peter Shor, working at Bell Labs in Murray Hill, New Jersey, formulated Shor's algorithm, a quantum algorithm for integer factorization. Because Shor's algorithm shows that a quantum computer with a sufficient number of qubits, operating without succumbing to noise or other quantum interference phenomena, could theoretically break public-key cryptography schemes such as the widely used RSA scheme, its formulation in 1994 was a powerful motivator for the design and construction of quantum computers, and for the study of new quantum computer algorithms. It also stimulated research on new cryptosystems secure against quantum computers, collectively called post-quantum cryptography.

"In 2001, Shor's algorithm was demonstrated by a group at IBM, who factored 15 into 3 × 5, using an NMR implementation of a quantum computer with 7 qubits. However, some doubts have been raised as to whether IBM's experiment was a true demonstration of quantum computation, since no entanglement was observed. Since IBM's implementation, several other groups have implemented Shor's algorithm using photonic qubits, emphasizing that entanglement was observed. In 2012, the factorization of 15 was repeated. Also in 2012, the factorization of 21 was achieved, setting the record for the largest number factored with a quantum computer" (Wikipedia article on Shor's algorithm, accessed 12-24-2013).

Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer", SIAM J. Comput. 26 (1996) 1484–1509, arXiv:quant-ph/9508027v2.

View Map + Bookmark Entry

2005 – 2010

The First Intelligible Word from an Extinct South American Civilization? August 12, 2005

[Images in the original entry: Gary Urton with some khipu; Carrie Brezine studying khipu; an example of khipu.]

On August 12, 2005 anthropologists Gary Urton and Carrie Brezine published "Khipu Accounting in Ancient Peru," Science 309 (2005): 1065-1067.

"Khipu [quipu] are knotted-string devices that were used for bureaucratic recording and communication in the Inka [Inca] Empire. We recently undertook a computer analysis of 21 khipu from the Inka administrative center of Puruchuco, on the central coast of Peru. Results indicate that this khipu archive exemplifies the way in which census and tribute data were synthesized, manipulated, and transferred between different accounting levels in the Inka administrative system" (Science).

"Researchers in the US believe they have come closer to solving a centuries-old mystery - by deciphering knotted string used by the ancient Incas.

"Experts say one bunch of knots appears to identify a city, marking the first intelligible word from the extinct South American civilisation.

"The coloured, knotted pieces of string,known as khipu, are believed to have been used for accounting information.

"The researchers say the finding could unlock the meaning of other khipu.

"Harvard University researchers Gary Urton and Carrie Brezine used computers to analyse 21 khipu.

"They found a three-knot pattern in some of the strings which they believe identifies the bunch as coming from the city of Puruchuco, the site of an Inca palace.

" 'We hypothesize that the arrangement of three figure-eight knots at the start of these khipu represented the place identifier, or toponym, Puruchuco,' they wrote in their report, published in the journal Science.

" 'We suggest that any khipu moving within the state administrative system bearing an initial arrangement of three figure-eight knots would have been immediately recognisable to Inca administrators as an account pertaining to the palace of Puruchuco.' (http://news.bbc.co.uk/2/hi/americas/4143968.stm, accessed 04-28-2009).

View Map + Bookmark Entry

The Genetic Code of the 1918 Influenza Virus is Deciphered October 5, 2005

[Images in the original entry: the Armed Forces Institute of Pathology logo; colorized transmission electron micrograph of avian influenza A H5N1 viruses (gold) grown in MDCK cells (green).]

On October 5, 2005 scientists at the Armed Forces Institute of Pathology announced that they had deciphered the genetic code of the 1918 influenza virus, an H1N1 virus of avian origin (not the H5N1 strain circulating in 2005), which killed as many as 50,000,000 people worldwide. The virus was recovered from a victim exhumed in 1997 from the Alaskan permafrost; the scientists reconstructed it in the laboratory and published the genetic sequence.

View Map + Bookmark Entry

Decoding Printer Tracking Dots October 19, 2005

[Images in the original entry: the Electronic Frontier Foundation logo; one repetition of the dot grid from a Xerox DocuColor 12 page, magnified 10x under blue LED illumination; the dot grid under 60x magnification.]

In October 2005 the Electronic Frontier Foundation decoded the grid of faint yellow tracking dots that Xerox DocuColor color laser printers add to every page, showing that the dots encode the date and time of printing and the serial number of the printer.

View Map + Bookmark Entry

The PRISM Surveillance Program September 11, 2007 – June 6, 2013

[Images in the original entry: a Boundless Informant global heat map of data collection, color-coded from green (areas least subjected to surveillance) through yellow and orange to red (areas most subjected to surveillance); the PRISM program logo.]

On September 11, 2007, U.S. President George W. Bush signed the Protect America Act of 2007, allowing the National Security Agency (NSA) to start a massive domestic surveillance data-collection program known officially by the SIGAD US-984XN, code name PRISM.

"The program is operated under the supervision of the U.S. Foreign Intelligence Surveillance Court (FISC) pursuant to the Foreign Intelligence Surveillance Act (FISA). Its existence was leaked five years later by NSA contractor Edward Snowden, who claimed the extent of mass data collection was far greater than the public knew, and included 'dangerous' and 'criminal' activities in law. The disclosures were published by [by Glenn Greenwald in] The Guardian and The Washington Post on June 6, 2013.

A document included in the leak indicated that PRISM was 'the number one source of raw intelligence used for NSA analytic reports.' The leaked information came to light one day after the revelation that the FISC had been ordering a business unit of the telecommunications company Verizon Communications to turn over to the NSA logs tracking all of its customers' telephone calls on an ongoing daily basis." (Wikipedia article on PRISM (surveillance program) accessed 07-07-2013).

Here is the link to Glenn Greenwald's article in www.guardian.co.uk publishing the first of Snowden's disclosures.  When I linked to this on July 7, 2013 it had been friended on Facebook by 141,922 people.

♦ A more general survey of the extent of what was characterized as the "2013 mass surveillance scandal," with a summary of NSA spying programs, was available from the Wikipedia in August 2013.

♦ On August 13, 2013 The New York Times published an article by Peter Maass regarding the work of the documentary film maker Laura Poitras, telling how she and lawyer / journalist Glenn Greenwald helped Edward Snowden publish his secrets.

View Map + Bookmark Entry

2012 – 2016

The Enigma Database for Deciphering Difficult to Read Words in Medieval Latin Manuscripts June 2014

Enigma, a database developed by the Digital Humanities program of the CIHAM - UMR 5648 research center at CNRS, Université Lyon 2, EHESS, ENS de Lyon, Université d'Avignon et des Pays de Vaucluse, and Université Lyon 3, was designed to help scholars decipher difficult-to-read words in medieval Latin manuscripts.

"If you type the letters you can read and add wildcards, Enigma will list the possible Latin forms, drawing from its database of more than 400,000 forms. 
Nota bene: Enigma does NOT solve abbreviations. To do so, you can resort to A. Cappelli's famous dictionary, available online (ed. Milan, 1912, and ed. Leipzig, 1928). If you cannot resolve an abbreviation, replace it by a wildcard in your Enigma query. . . ." 
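
The kind of lookup the quotation describes can be imitated in a few lines. In this toy Python sketch the word list and the wildcard syntax are stand-ins for Enigma's database of more than 400,000 Latin forms:

```python
import re

forms = ["dominus", "dominicus", "domus", "donum", "dictum"]  # stand-in word list

def lookup(pattern: str):
    """Match a partially read word: '.' = one unreadable letter, '*' = any run."""
    regex = re.compile("^" + pattern.replace("*", ".*") + "$")
    return [w for w in forms if regex.match(w)]

print(lookup("d.mus"))    # ['domus']
print(lookup("domin*s"))  # ['dominus', 'dominicus']
```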
View Map + Bookmark Entry

"How Edward Snowden Changed Journalism" October 21, 2014

On October 21, 2014 Steve Coll, dean of the Tow Center for Digital Journalism at Columbia University's Graduate School of Journalism, published an article entitled "How Edward Snowden Changed Journalism" in The New Yorker, from which I quote:

". . . . one of the least remarked upon aspects of the [Edward] Snowden matter is that he has influenced journalistic practice for the better by his example as a source. Famously, when Snowden first contacted [Glenn] Greenwald, he insisted that the columnist communicate only through encrypted channels. Greenwald couldn’t be bothered. Only later, when [Laura] Poitras told Greenwald that he should take the trouble, did Snowden take him on as an interlocutor.

"It had been evident for some time before Snowden surfaced that best practices in investigative reporting and source protection needed to change—in large part, because of the migration of journalism (and so many other aspects of life) into digital channels. The third reporter Snowden supplied with National Security Agency files, Barton Gellman, of the Washington Post, was well known in his newsroom as an early adopter of encryption. But it has been a difficult evolution, for a number of reasons.

"Reporters communicate copiously; encryption makes that habit more cumbersome. Most reporters don’t have the technical skills to make decisions on their own about what practices are effective and efficient. Training is improving (the Tow Center for Digital Journalism, at Columbia Journalism School, where I serve as dean, offers a useful place to start), but the same digital revolution that gave rise to surveillance and sources like Snowden also disrupted incumbent newspapers and undermined their business models. Training budgets shrank. In such an unstable economic and audience environment, source protection and the integrity of independent reporting fell on some newsrooms’ priority lists.

"Snowden has now provided a highly visible example of how, in a very high-stakes situation, encryption can, at a minimum, create time and space for independent journalistic decision-making about what to publish and why. Snowden did not ask to have his identity protected for more than a few days—he seemed to think it wouldn’t work for longer than that, and he also seemed to want to reveal himself to the public. Yet the steps he took to protect his data and his communications with journalists made it possible for the Guardian and the Post to publish their initial stories and bring Snowden to global attention.

"It took an inside expert with his life and liberty at stake to prove how much encryption and related security measures matter. 'There was no risk of compromise,' Snowden told the Guardian, referring to how he managed his source relationship with Poitras and the others before their meeting in Hong Kong. 'I could have been screwed,'but his encryption and other data-security practices insured that it 'wasn’t possible at all' to intercept his transmissions to journalists “'unless the journalist intentionally passed this to the government' "

View Map + Bookmark Entry