
Conservation, Preservation & Restoration Timeline


1,000 BCE – 300 BCE

The Royal Library of Alexandria: The Largest Collection of Recorded Information in the Ancient World Circa 300 BCE

The Royal Library of Alexandria, associated with the Museum or Mouseion at Alexandria (Μουσεῖον τῆς Ἀλεξανδρείας), was probably founded around 300 BCE under the reign of Ptolemy I Soter or Ptolemy II. Though it was the largest library in the ancient world, and the repository of much Greek literature that was eventually passed down to us, as well as much that was eventually lost, the number of papyrus rolls held at Alexandria at its peak, or at any other time, is unknown. Scholars have estimated, without any reliable evidence, that at its peak the library might have held as many as 400,000 to 700,000 rolls, or as few as 40,000, or even fewer. A typical papyrus roll probably contained a text about the length of one book of Homer.

Writing in 2002, American classical scholar Roger Bagnall argued that the very high numbers of rolls traditionally estimated for the Royal Library of Alexandria, such as 400,000 to 700,000, may reflect modern expectations rather than the extent of the written literature actually produced by ancient Greek writers:

"The computer databank of ancient Greek literature, the Thesurus Linguae Graecae, contains about 450 authors of whom at least a few words survive in quotation and whose lives are thought to have begun by the late fourth century. No doubt there were authors extant in the early Hellenistic period of whom not a line survives today, but we cannot estimate their numbers. Of most of these 450, we have literally a few sentences. There are another 175 known whose lives are placed, or whose births are placed in the third century B. C. Most of these authors probably wrote what by modern standards was a modest amount—a few book-rolls full, perhaps. Even the most voluminous authors of the group, like the Athenian dramatists, probably filled nor more than a hundred rolls or so. If the average writer filled 50 rolls, our known authors to the end of the third century would have produced 31,250 rolls. . . .

"To look at matters another way, just, 2,871,000 words of Greek are preserved for all authors known to have lived at least in part in the fourth century or earlier. Adding the third and second centuries brings the total to 3,773,000 words (or about 12,600 pages of 300 words each). At an average of 15,000 words per roll, this corpus would require a mere 251 rolls. Even at an average of 10,000 words per roll the figure would be only 377 rolls. It was estimated by one eminent ancient historian that the original bulk of historical writings in ancient Greece amounted to something like forty times what has survived. If so, our estimate would run to an original body of 10,000 to 15,000 rolls. This may be too low, but is it likely that it is too low by a factor of thirty or forty, and that only one word in 1,500 or 2,000 has survived? . . . (Roger S. Bagnall, "Alexandria: Library of Dreams," Proceedings of the American Philosophical Society, 146 (2002) 348-62, quoting from 352-53).

Traditionally the Alexandrian Library is thought to have been based upon the library of Aristotle. By tradition it is also believed, without concrete evidence, that much of the collection of rolls was acquired by order of Ptolemy III, who supposedly required all visitors to Alexandria to surrender any rolls in their possession. These writings were then copied by official scribes, the originals were deposited in the Library, and the copies were delivered to the previous owners.

The Alexandrian Library was associated with a school and a museum. Scholars at Alexandria were responsible for the editing and standardization of many earlier Greek texts. One of the best known of these editors was Aristophanes of Byzantium, a director of the library, whose work on the text of the Iliad may be preserved in the Venetus A manuscript, and who was also known for editing authors such as Pindar and Hesiod.

Though it is known that portions of the Alexandrian Library survived for several centuries, the various accounts of the library's eventual destruction are contradictory. The Wikipedia article on the Library of Alexandria outlines four possible scenarios for its destruction:

  1. Julius Caesar's fire during the Alexandrian War, in 48 BCE
  2. The attack of Aurelian in the third century CE
  3. The decree of Theophilus in 391 CE (the destruction of pagan literature by early Christians)
  4. The Muslim conquest in 642 CE, or thereafter

Other factors in the eventual destruction of the contents of the Alexandrian Library might have included the decay of the papyrus rolls as a result of the climate. Most of the papyrus rolls and fragments that survived from antiquity did so in the dry sands of the Egyptian desert. Papyrus rolls do not keep well in dampness or in salty sea air, to which they were likely exposed in the library's location at the port of Alexandria. Thus, independently of which destruction scenario one selects, it is probable that significant portions of the information in the Alexandrian Library were lost to decay of the storage medium, fires, rodent damage, natural catastrophes, or neglect before the library was physically destroyed.

Whatever the circumstances and timing of the physical destruction of the Library, it is evident that by the eighth century the Alexandrian Library was no longer a significant institution. 

(This entry was last revised on March 22, 2014.)


300 BCE – 30 CE

The Portland Vase: Classical Connoisseurship, Influence, Destruction & Conservation 30 BCE – 25 CE

The Portland Vase. Shown is the first of two scenes.

A Roman cameo glass vase, the Portland Vase, created between 30 BCE and 25 CE, and known since the Renaissance, served as an inspiration to many glass and porcelain makers from about the beginning of the 18th century onwards. It is about 25 centimeters high and 56 centimeters in circumference, made of violet-blue glass, and surrounded with a single continuous white glass cameo depicting seven figures of humans and gods. "On the bottom was a cameo glass disc, also in blue and white, showing a head, presumed to be of Paris or Priam on the basis of the Phrygian cap it wears. This roundel clearly does not belong to the vase, and has been displayed separately since 1845. It may have been added to mend a break in antiquity or after, or as the result of a conversion from an original amphora form (paralleled by a similar blue-glass cameo vessel from Pompeii) - it was definitely attached to the bottom from at least 1826."

"The meaning of the images on the vase is unclear and controversial. Interpretations of the portrayals have included that of a marine setting (due to the presence of a ketos or sea-snake), and of a marriage theme/context (i.e. as a wedding gift). Many scholars (even Charles Towneley) have concluded that the figures do not fit into a single iconographic set."

"Cameo-glass vessels were probably all made within about two generations as experiments when the blowing technique (discovered in about 50 BC) was still in its infancy. Recent research has shown that the Portland vase, like the majority of cameo-glass vessels, was made by the dip-overlay method, whereby an elongated bubble of glass was partially dipped into a crucible (fire-resistant container) of white glass, before the two were blown together. After cooling the white layer was cut away to form the design."

"The work towards making a 19th century copy proved to be incredibly painstaking, and based on this it is believed that the Portland Vase must have taken its original artisan no less than two years to produce. The cutting was probably performed by a skilled gem-cutter. It is believed that the cutter may have been Dioskourides, as gems cut by him of a similar period and signed by him."

Traditionally the vase was believed to have been discovered by Fabrizio Lazzaro in the sepulchre of the Emperor Alexander Severus, at Monte del Grano near Rome, and excavated some time around 1582.

The first documented reference to the vase is a 1601 letter from the French scholar Nicolas Claude Fabri de Peiresc to the painter Peter Paul Rubens, where it is recorded as in the collection of Cardinal Francesco Maria Del Monte in Italy. It then passed to the Barberini family collection (which also included sculptures such as the Barberini Faun and Barberini Apollo) where it remained for some two hundred years, being one of the treasures of Maffeo Barberini, later Pope Urban VIII.

In 1778 Sir William Hamilton, British ambassador in Naples, purchased it from James Byres. "Byres, a Scottish art dealer, had acquired it after it was sold by Donna Cornelia Barberini-Colonna, Princess of Palestrina. She had inherited the vase from the Barberini family. Hamilton brought it to England on his next leave, after the death of his first wife, Catherine. In 1784, with the assistance of his niece, Mary, he arranged a private sale to Margaret Cavendish-Harley, widow of William Bentinck, 2nd Duke of Portland and so dowager Duchess of Portland. She passed it to her son William Cavendish-Bentinck, 3rd Duke of Portland in 1786.

"The 3rd Duke loaned the original vase to Josiah Wedgwood (see below) and then to the British Museum for safe-keeping, at which point it was dubbed the "Portland Vase". It was deposited there permanently by the fourth Duke in 1810, after a friend of his broke its base. The original Roman vase has remained in the British Museum ever since 1810, apart from three years (1929-32) when William Cavendish-Bentinck, 6th Duke of Portland put it up for sale at Christie's. It failed to reach its reserve. It was purchased by the Museum from William Cavendish-Bentinck, 7th Duke of Portland in 1945 with the aid of a bequest from James Rose Vallentin. . . .

"The 3rd Duke lent the vase to Josiah Wedgwood, who had already had it described to him as 'the finest production of Art that has been brought to England and seems to be the very apex of perfection to which you are endeavouring' by the sculptor John Flaxman. Wedgwood devoted four years of painstaking trials at duplicating the vase - not in glass but in jasperware. He had problems with his copies ranging from cracking and blistering (clearly visible on the example at the Victoria and Albert Museum) to the reliefs 'lifting' during the firing, and in 1786 he feared that he could never apply the Jasper relief thinly enough to match the glass original's subtlety and delicacy. He finally managed to perfect it in 1790, with the issue of the "first-edition" of copies (with some of this edition, including the V&A one, copying the cameo's delicacy by a combination of undercutting and shading the reliefs in grey), and it marks his last major achievement.

"Wedgwood put the first edition on private show between April and May 1790, with that exhibition proving so popular that visitor numbers had to be restricted by only printing 1900 tickets, before going on show in his public London showrooms. (One ticket to the private exhibition, illustrated by Samuel Alkin and printed with 'Admission to see Mr Wedgwood's copy of The Portland Vase, Greek Street, Soho, between 12 o'clock and 5', was bound into the Wedgwood catalogue on view in the Victoria and Albert Museum's British Galleries.) As well as the V&A copy (said to have come from the collection of Wedgwood's grandson, the naturalist Charles Darwin), others are held at the Fitzwilliam Museum (this is the copy sent by Wedgwood to Erasmus Darwin which his descendants loaned to the Museum in 1963 and later sold to them) and the Department of Prehistory and Europe at the British Museum.

"The Vase also inspired a 19th century competition to duplicate its cameo-work in glass, with Benjamin Richardson offering a £1000 prize to anyone who could achieve that feat. Taking three years, glass maker Philip Pargeter made a copy and John Northwood engraved it, to win the prize. This copy is in the Corning Museum of Glass in Corning, New York.

Vandalism and Reconstruction

"On February 7, 1845, the vase was shattered by William Lloyd, who drunkenly threw a nearby sculpture on top of the case smashing both it and the vase. The vase was pieced together with fair success, though the restorer was unable to replace all of the pieces and thirty-seven small fragments were lost. It appears they had been put into a box and forgotten. In 1948, the keeper Bernard Ashmole received thirty-seven fragments in a box from Mr. Croker of Putney, who did not know what they were. In 1845 Mr. Doubleday, the first restorer, did not know where these fragments went. A colleague had taken these to Mr. Gabb, a box maker, who was asked to make a box with thirty seven compartments, one for each fragment. The colleague died, the box was never collected, Gabb died and his executrix Miss Revees asked Croker to ask the museum if they could identify them. The Duke's descendants finally sold the vase to the museum in 1945.

"By 1948, the restoration appeared aged and it was decided to restore the vase again, but the restorer was only successful in replacing three fragments. The adhesive from this weakened, by 1986 the joints rattled when the vase was gently tapped. The third and current reconstruction took place in 1987, when a new generation of conservators assessed the vase's condition during its appearance as the focal piece of an international exhibition of Roman glass and, at the conclusion of the exhibition, it was decided to go ahead with reconstruction and stabilisation. The treatment had scholarly attention and press coverage. The vase was photographed and drawn to record the position of fragments before dismantling; the BBC filmed the conservation process. All previous adhesives had failed, so to find one that would last, conservation scientists at the museum tested many adhesives for long term stability. Finally, an epoxy resin with excellent ageing properties was chosen. Reassembly of the vase was made more difficult as the edges of some fragments were found to have been filed down during the restorations. Nevertheless, all of the fragments were replaced except for a few small splinters. Areas that were still missing were gap-filled with a blue or white resin.

"The newly conserved Portland Vase was returned to display. Little sign of the original damage is visible and except for light cleaning, the vase should not require major conservation work for many years." (Wikipedia article on Portland Vase, accessed 11-10-2009)


30 CE – 500 CE

Over 11,000 Wall Inscriptions Survived from Pompeii 79 CE

An inscription depicting a contemporaneous politician.

The eruption of Mt. Vesuvius over two days in 79 CE buried the cities of Herculaneum and Pompeii in ash and pyroclastic material, destroying life but preserving buildings in a remarkable way.

From the ruins of Pompeii over 11,000 inscriptions have been recorded—of many different kinds—carved, painted, or scratched into walls; formal, humorous, erotic, and scatological. They reflect wide use of writing and comparatively wide availability of literacy in Roman society.

"Some of them [the inscriptions] are very grand and formal, like the dedications of public buildings and the funerary epitaphs, similar to others found all over the Roman world. Inscriptions such as these are not necessarily good evidence of widespread literacy. The enormous numbers that were produced in Roman times could reflect a fashion for this particular medium of display, rather than a dramatic spread of the ability to read and write.

"Other Pompeian inscriptions are perhaps more telling, because they display a desire to cummunicate in a less formal and more ephemeral way with fellow citizens. Walls on the main streets of Pompeii are often decorated with painted messages, whose regular script and layout reveal the work of professional sign-writers. Some are advertisements for events such as games in the amphitheatre; others are endoresements of candiates for civic office, by individuals and groups within the city. . . .

"Graffiti offer even more striking evidence of the spread and use of writing in Pompeian society. These are found all over the city, scratched into stone or plaster by townspeople with time on their hands and a message to convey to future idlers. . . .

"Even though we cannot estimate the proportion of Pompeians who were literate (was it 30 per cent, or more, or perhaps on 10 per cent ?) we can say with confidence that writing was an essential, and a day-to-day part of the city's life" (Ward-Perkins, The Fall of Rome and the End of Civilization [2005] 153-54, & 155-57).

Because graffiti such as those preserved at Pompeii were intended to be widely shared, some have called them evidence of early social media.


Pamphilus Establishes a Library and Scriptorium and is Executed During the Diocletianic Persecution of Christians 275 CE – 309 CE

A map of Israel, with Caesarea Maritima highlighted in blue.

Between 275 and his martyrdom in 309 Pamphilus of Caesarea (Pamphilius), presbyter and teacher of Eusebius, devoted his life to searching out and obtaining copies of manuscript texts, some of which he copied himself. He established a library that may have contained 30,000 manuscripts, and a scriptorium, at a Christian theological school at Caesarea Palaestina, now Caesarea Maritima, a town on the coast of Israel between Tel Aviv and Haifa. Because of this library Caesarea became the capital of Christian scholarship in the 3rd century.

"This Pamphilus was of a noble family in the Phoenician city of Berytus [Beirut], where he received his early education. Probably in the early and mid-280's, he studied in Alexandria under the presbyter Pierius, who was himself known as 'the Younger Origen.' From there Pamphilius seems to have come to Caesarea, where his great learning in philosophy and theology enabled him to open a successful school at Caesarea. Pamphilus' school could boast no unbroken descent from Origen's school, because there was no continuous sucession of masters at Caesarea between Origen and Pamphilus. . . ." (Carriker, The Library of Eusebius of Caesarea [2003] 12-13).

During the Diocletianic persecution, the last and bloodiest persecution of Christians before Constantine granted Christianity toleration in the Roman Empire, Pamphilus was arrested and imprisoned in November 307. He was martyred on February 16, 309.

"By the end of 307 Pamphilius was arrested under the orders of Urbanus, the local Roman governor, tortured cruelly, and placed in prison. Yet, in prison and suffering from his torture wounds, Pamphilius did not remain idle but continued editing the Septuagint and with Eusebius, wrote a Defense of Origen that he sent to the confessors in the mines of Phaeno, Egypt [i.e. South Palestine, "in a mining area lying east of the Wadi Arabah, between the south end of the Dead Sea and Petra."]

"After being in prison for two years, Pamphilius was ordered killed by the new governor, Firminius. He was then beheaded on February 16, 309 with several of his disciples. In his memory Eusebius called himself Eusebius Pamphili, to denote his close friendship with Pamphilius" (Orthodox Wiki article on Pamphilius, accessed 02-02-2013).


800 – 900

The Second Oldest Arabic Manuscript on Arabic Paper November – December 867

Folio 241b of MS Leiden Or. 298, a manuscript of the 'Gharib al-Hadith' by Abu `Ubayd al-Qasim b. Sallam.

The second oldest surviving Arabic book on Arabic paper, and the earliest Arabic manuscript on paper preserved in Europe

"is generally believed to be a fragmentary copy of Abu Ubayd's work on unusual terms in the traditions of the Prophet dated Dhu'l-Qada 252, November—December 967 and preserved in Leiden University Library [Legatum Warnerianum]. It bears no indication of where it was copied. The opaque stiff paper has turned dark brown and has a tendency to split along the edges. This feature had led some observers to suggest that the pages of early manuscripts were pasted together, back to back, from two separate sheets made in floating molds, which leave one side rougher than the other and unsuitable for writing. This tendency for the pages to split is actually a result of delamination, a condition seen in many early papers, such as the Vatican manuscript [Doctrina patrum]. When the pulp was not sufficiently beaten, the outer layers of the cellulose fibers did not detach and form physical and chemical bonds with adjacent microfibrils, and the resulting paper has weak internal cohension. The condition was exacerbated when the paper was given a hard surface with the application of size. The weaker interior splits easily in two, revealing a rough, woolly and feltlike inner surface" (Bloom, Paper Before Print. The History and Impact of Paper in the Islamic World [2001] 59-60 and figure 27).


1000 – 1100

Beowulf: Known from a Unique Medieval Manuscript Circa 1000

The first page of the Beowulf manuscript.

Beowulf, a traditional heroic epic poem written in Old English alliterative verse, and representing with its 3,182 lines about 10% of all surviving Old English poetry, may have been composed at any time between the 8th and the early 11th century, but it is known from a single medieval manuscript, probably written in the first decade after 1000. The manuscript, known as the Nowell Codex or Cotton Vitellius A. xv, is preserved in the British Library.

The volume as it was bound in the 17th century contains two manuscripts: a 12th century manuscript containing four prose works, and the Nowell Codex, named after the 16th century English antiquarian, cartographer, and scholar of Anglo-Saxon language and literature Laurence Nowell, whose name is inscribed on its first page, and who owned the manuscript in the mid-16th century. It was then acquired by Sir Robert Cotton, who placed it in his library as the 15th manuscript on the first shelf of the bookcase that was headed by a bust of Vitellius.

"The unique copy of Beowulf is preserved in the Cottonian collection of manuscripts that suffered from a great fire in 1731. It remained in its burnt binding until the middle of the nineteenth century, when Sir Frederic Madden, Keeper of Manuscripts at the British Museum, undertook to restore these damaged manuscripts in his care. His bookbinder first traced the outline of each burnt leaf, cut out the center of the tracing except for a retaining edge of about 2mm, and pasted and taped the vellum leaf to the paper frame. Then he rebound the framed leaves in a new cover. The method well preserved the fragile bits of text along the burnt edges of the leaves, but the retaining edges of the paper mounts, and the paste and tape used to secure the leaves to them, hide from view many hundreds of letters and bits of letters. Today they are visible only if one holds a bright light directly behind them, an ineffectual solution if one lacks the manuscript, the bright light, or the permission to use them together" (The Electronic Beowulf, 1993, accessed 06-15-2009).


The Norman Conquest Recorded on the Bayeux Tapestry 1077

A scene from the Bayeux Tapestry, showing Odo, Bishop of Bayeux, on horseback.

The Bayeux Tapestry, an embroidery roughly 70 meters long, was produced in England, possibly in Canterbury, commemorating events leading up to and after the Battle of Hastings.

"The tapestry has text in Latin describing what is happening in the scenes. This work of art includes 623 humans, 202 horses, 41 ships, 2000 Latin words and 8 different colors of yarn."

A view of one half of the gallery in which the Bayeux Tapestry is preserved.

 

"The tapestry was most likely first put on display in the Cathedral of Notre Dame [in Bayeux] built by Bishop Odo in 1077. Then, no mention of it is found for the next 300 years. Then, it was mentioned in 1750 when it was referred to in a book by the name of Palaeographia Britannicus. Soon afterward, the people of Bayeux, who were fighting for the Republic, needed cloth to cover their wagons. As such, the tapestry was removed from the cathedral and used to cover an ammunition wagon. A lawyer saved the tapestry by replacing it with another cloth. In 1803 Napoleon seized it and transported it to Paris. Napoleon wanted to use the tapestry as inspiration for his planned attack on England. When this plan was cancelled, the tapestry was returned to Bayeux. The townspeople wound the tapestry up and stored it like a scroll. The tapestry spent World War II wound up in the Louvre. Now it is stored in a museum in a dark room with special lighting to avoid damaging it."


1100 – 1200

King Roger Bans the Use of Paper 1145

At the Martorana in Palermo, Italy, a mural depicting the divine coronation of Roger II.

King Roger II of Sicily banned the use of paper for official documents, believing it to be less permanent than parchment (Tsien Tsuen-Hsuin 5). Europeans were initially distrustful of paper, which was introduced to Europe from the Arab world during the period of the Crusades.


Early Autograph Draft of Maimonides' Guide for the Perplexed Circa 1185 – 1190

T-S_10Ka4.1,r: a page from an early autograph draft of Maimonides's 'Guide for the Perplexed.'

About 1185-1190 Moses Maimonides, rabbi, physician, and philosopher in Spain, Morocco and Egypt, wrote the Guide for the Perplexed, of which an early autograph draft is preserved in the Taylor-Schechter Cairo Genizah collection at Cambridge University Library, gathered from the Genizah of the Ben Ezra Synagogue in Old Cairo, along with several other autograph manuscripts and fragments by Maimonides.

Maimonides' Guide for the Perplexed, thought to have been completed by 1190,

"is the main source of the Rambam's philosophical views, as opposed to his opinions on Jewish law. Since many of the philosophical concepts, such as his view of theodicy and the relationship between philosophy and religion, are relevant beyond strictly Jewish theology, it has been the work most commonly associated with Maimonides in the non-Jewish world and it is known to have influenced several major non-Jewish philosophers. . . . Within Judaism, the Guide became widely popular and controversial, with many Jewish communities requesting copies of the manuscript."


1300 – 1400

Philobiblon, Perhaps the Earliest Treatise on Book Collecting and on Preserving Books & Creating a Library 1345 – 1473

The seal of Richard de Bury.

Shortly before his death in 1345, the priest, bishop, politician, diplomat and bibliophile Richard Aungerville, commonly known as Richard de Bury, wrote Philobiblon, perhaps the earliest treatise on the value of preserving neglected or decaying manuscripts, on building a library, and on book collecting. De Bury was appointed tutor to the future King Edward III while Edward was Prince of Wales and, according to Thomas Frognall Dibdin, inspired the prince with his own love of books.

Having connections at court, de Bury somehow became involved in the intrigues preceding the deposition of King Edward II, and in 1325 supplied Queen Isabella and her lover, Roger Mortimer, then in Paris, with money from the revenues of Guienne, of which province he was treasurer. For a time he had to hide in Paris from the officers sent by Edward II to apprehend him.

Upon his accession to the throne in 1327 Edward III rapidly promoted de Bury, appointing him cofferer to the king, treasurer of the wardrobe and afterwards, in 1329, Lord Privy Seal. The king repeatedly recommended him to the pope, and twice sent him, in 1330 and 1333, as ambassador to the papal court in exile at Avignon. On the first of these visits Richard met a fellow bibliophile, Petrarch, who recorded his impression of Aungerville as "not ignorant of literature and from his youth up curious beyond belief of hidden things." Pope John XXII made de Bury his principal chaplain, and presented him with a rochet in earnest of the next vacant bishopric in England.

During his absence from England in February 1333 de Bury was appointed Dean of Wells. In September of the same year he was appointed Bishop of Durham by the king. In February 1334 de Bury was made Lord Treasurer, an appointment he exchanged later in the year for that of Lord Chancellor. Richard may have sometimes exploited his political power to collect manuscripts. According to Wikipedia, an abbot of St Albans bribed him with four valuable books, and de Bury, who procured certain coveted privileges for the monastery, bought from him thirty-two other books for fifty pieces of silver, far less than their normal price. In Philobiblon

"Richard de Bury gives an account of the wearied efforts made by himself and his agents to collect books. He records his intention of founding a hall at Oxford, and in connection with it a library in which his books were to form the nucleus. He even details the dates to be observed for the lending and care of the books, and had already taken the preliminary steps for the foundation. The bishop died, however, in great poverty on 14 April 1345 at Bishop Auckland, and it seems likely that his collection was dispersed immediately after his death. Of it, the traditional account is that the books were sent to the Durham Benedictines Durham College, Oxford which was shortly thereafter founded by Bishop Hatfield, and that on the dissolution of the foundation by Henry VIII they were divided between Duke Humphrey of Gloucester's library, Balliol College, Oxford, and George Owen. Only two of the volumes are known to be in existence; one is a copy of John of Salisbury's works in the British Museum, and the other some theological treatises by Anselm and others in the Bodleian.

"The chief authority for the bishop's life is William de Chambre, printed in Wharton's Anglia Sacra, 1691, and in Historiae conelmensis scriptores tres, Surtees Soc., 1839, who describes him as an amiable and excellent man, charitable in his diocese, and the liberal patron of many learned men, among these being Thomas Bradwardine, afterwards Archbishop of CanterburyRichard Fitzralph, afterwards Archbishop of Armagh, the enemy of the mendicant ordersWalter Burley, who translated Aristotle, John Mauduit the astronomer, Robert Holkot and Richard de KilvingtonJohn Bale and Pits I mention other works of his, Epistolae Familiares and Orationes ad Principes. The opening words of the Philobiblon and the Epistolaeas given by Bale represent those of the Philobiblon and its prologue, of that he apparently made two books out of one treatise. It is possible that the Orationes may represent a letter book of Richard de Bury's, entitled Liber Epistolaris quondam dominiis cardi de Bury, Episcopi Dunelmensis, now in the possession of Lord Harlech.

"This manuscript, the contents of which are fully catalogued in the Fourth Report (1874) of the Historical Manuscripts Commission (Appendix, pp. 379–397), contains numerous letters from various popes, from the king, a correspondence dealing with the affairs of the university of Oxford, another with the province of Gascony, beside some harangues and letters evidently meant as models to be used on various occasions. It has often been asserted that the Philobiblon itself was not written by Richard de Bury at all, but by Robert Holkot. This assertion is supported by the fact that in seven of the extant manuscripts of Philobiblon it is ascribed to Holkote in an introductory page, in these or slightly varying terms: Incipit prologus in re philobiblon ricardi dunelmensis episcopi que libri composuit ag. The Paris manuscript has simply Philobiblon olchoti anglici, and does not contain the usual concluding note of the date when the book was completed by Richard. As a great part of the charm of book lies in the unconscious record of the collector's own character, the establishment of Holkot's authorship would materially alter its value. A notice of Richard de Bury by his contemporary Adam Murimuth (Continuatio ChronicarumRolls series, 1889, p. 171) gives a less favourable account of him than does William de Chambre, asserting that he was only moderately learned, but desired to be regarded as a great scholar (Wikipedia article on Richard de Bury, accessed 02-04-2014).

Philobiblon was published in print for the first time in Cologne in 1473.


1450 – 1500

Discovery of a Lost Painting by Michelangelo? 1487 – 1488

According to Vasari, when he was twelve or thirteen Michelangelo painted a version of The Torment of St. Anthony based on an engraving by Martin Schongauer. This was one of only four known easel paintings by Michelangelo. For centuries art historians debated the existence of such a painting.

In 2008 a painting of The Torment of St. Anthony from a private collection was sold at Sotheby's London with an attribution to the Florence workshop of Ghirlandaio, to whom Michelangelo was apprenticed. Adam Williams, a New York dealer, bought the painting, believing that it was by Michelangelo. Williams took it to the Metropolitan Museum of Art for cleaning and study. In 2009 Williams sold it to the Kimbell Art Museum, Fort Worth, Texas.

" 'I had never seen it before,” Mr. Christiansen said. “I looked at it and said this is self-evidently Michelangelo. There’s a section of the rocks with cross-hatching. Nobody else did this kind of emphatic cross-hatching.”

"Michael Gallagher, conservator of paintings at the Metropolitan, cleaned and studied the painting.

" 'It was incredibly dirty,' he said. 'But once the centuries of varnish were removed, its true quality was evident.'

"Claire M. Barry, the Kimbell’s chief curator, heard about the work and came to the Met to see it. She then contacted Mr. Lee, who also inspected it and persuaded his board to buy it. Although no one will disclose the price, experts in the field say they believe the figure was more than $6 million.

"For centuries, art historians have known that Michelangelo copied an engraving of St. Anthony by the 15th-century German master Martin Schongauer for a painting. Michelangelo’s biographer and former student, Ascanio Condivi, said the young Michelangelo told him that while he was working on the painting, he had visited a local market to learn how to depict fish scales, a feature not found in the engraving.

"A painting of St. Anthony is also mentioned in Giorgio Vasari’s chronicle of Michelangelo’s life, although Vasari at first ascribed the original engraving to Dürer. But after Michelangelo complained, Vasari changed his account, naming Schongauer.

"Measuring 18 ½ inches by 13 1/4 inches, 'The Torment of St. Anthony' is at least one-third larger than the engraving. It is also not an exact copy; Michelangelo took liberties. In addition to adding the fish scales, he depicted St. Anthony holding his head more erect and with an expression more detached than sad.

"He also added a landscape to the bottom of the composition, and created monsters that are more dramatic than those in the engraving.

"Mr. Christiansen said studying 'The Torment of St. Anthony' with infrared reflectography had exposed layers of pentimenti, or under drawing, revealing what he called the master’s hand at work. And once the centuries of varnish were removed, the colors suddenly came alive. There is eggplant, lavender, apple green and even a brilliant salmon, which was used to depict the scales of the spiny demons. The palette, Mr. Christiansen said, is a prelude to the colors chosen for the Sistine Chapel’s vault" (Vogel, "By the Hand of a Very Young Master?," NY Times, May 12, 2009).


Trithemius Favors Vellum over Paper for Long Term Information Storage 1494

In his treatise De laude scriptorum (In Praise of Scribes), written in reaction to the information revolution caused by printing and published as a printed book in 1494, Benedictine abbot Johannes Trithemius (Tritheim) advocated preserving the medieval tradition of manuscript copying in spite of the advantages of printing for information distribution. He was well aware of these advantages, since he exploited them to expand his abbey library after the invention of printing, and also because thirty printed editions of his own writings appeared during the 15th century.

In the context of the fifteenth-century information revolution Tritheim is most remembered for questioning the durability of the media used for long-term information storage: he compared the known long-term durability of information written on traditional parchment, examples of which had already lasted over 700 years, with that of information written or printed on the newer and less proven medium of paper.

Tritheim wrote:

"Brothers, nobody should say or think: 'What is the sense of bothering with copyring by hand when the art of printing has brought to light so many important books; a huge library can be acquired inexpensively.' I tell you, the man who ways this only tries to conceal his own laziness.

"All of you know the difference between a manuscript and a printed book. The word written on parchment will last a thousand years. The printed word is on paper. How long will it last? The most you can expect a book of paper to survive is two hundred years. Yet, there are many who think they can entrust their works to paper. Only time will tell.

"Yes, many books are now available in print but no matter how many books will be printed, there will always be some left unprinted and worth copying. No one will ever be able to locate and buy all printed books. . . ." (Translated in Tribble and Trubek eds., Writing Material: Readings from Plato to the Digital Age [2003]).

Taking an expansive view of libraries and the history of information, Tritheim also pointed out that all recorded information could never be published in print or collected in a single library. He also believed that in spite of the new technology it remained the responsibility of monks to continue to copy and preserve obscure texts which might not be economically viable to print. Working manually, the monks could produce copies of higher quality, or include decorative elements (ceteros librorum ornatus) not possible in a printed edition. 

Perhaps not surprisingly, Tritheim's retrograde treatise, which took issue with the new technology, was not a best-seller. It underwent only one printed edition during the 15th century, issued at Mainz by the press of Peter von Friedberg.

ISTC no. it00442000.  Wagner, Als die Lettern laufen lernten. Medienwandel im 15. Jahrhundert (2009) no. 32.

In November 2013 a digital facsimile was available from the Bayerische Staatsbibliothek website at this link.


1500 – 1550

Leonardo's Lost Painting, "Salvator Mundi", Discovered Circa 1500

On July 10, 2011 artdaily.org reported that:

"A lost painting by Leonardo da Vinci has been identified in an American collection and will be exhibited for the first time this November. Titled Salvator Mundi (Savior of the World) and dating around 1500, the newly discovered masterpiece depicts a half-length figure of Christ facing frontally, holding a crystal orb in his left hand as he raises his right in blessing. One of some 15 surviving Leonardo oil paintings, the work will be included in 'Leonardo da Vinci: Painter at the Court of Milan,' to be held at the National Gallery in London from November 9, 2011 until February 5, 2012. The last time a Leonardo painting was discovered was in 1909, when the Benois Madonna, now in the Hermitage in St. Petersburg, came to light.

"DOCUMENTED HISTORY  

"Leonardo's painting of the Salvator Mundi was long known to have existed, but was presumed to have been destroyed. The composition was documented in two preparatory drawings by Leonardo and more than 20 painted copies by students and followers of the artist, as well as a meticulous 1650 etching made after the original painting by the Bohemian artist Wenceslaus Hollar.

"ROYAL PROVENANCE  

"The recently rediscovered painting was first recorded in the art collection of King Charles I of England in 1649. It was sold after his death, returned to the Crown upon the accession of Charles II, and later passed to the collection of the Duke of Buckingham, whose son put it at auction in 1763 following the sale of Buckingham House (now Palace) to the King. All trace of the work was then lost until 1900, when the picture was acquired by Sir Frederick Cook, but by then the painting had been damaged, disfigured by overpaint, and its authorship by Leonardo forgotten. Cook's descendants sold the painting at auction in 1958, when it brought 45 pounds Sterling. A photograph taken before 1912 records its compromised appearance at that time. This photograph has recently been circulated in the media, as has another photo [with Christ in a red tunic], incorrectly identified as the (recently rediscovered) work. In 2005, the painting was acquired from an American estate and brought to a New York art historian and private dealer named Robert Simon for study. The Salvator Mundi is privately owned and not currently for sale.

"CONSERVATION & AUTHENTICATION  

"After an extensive conservation treatment, the painting was examined by a series of international scholars. An unequivocal consensus was reached that the Salvator Mundi was the original by Leonardo da Vinci. Opinions vary slightly in the matter of dating, with some assigning the work to the late 1490's, and others placing it after 1500.

"Scholars were convinced of Leonardo's authorship due to the painting's adherence in style to the artist's known paintings; the quality of execution; the relationship of the painting to the two preparatory drawings; its correspondence to Wenceslaus Hollar's etching; its superiority to the numerous versions of the known composition; and the presence of pentimenti, or changes by the artist not found in copies" (http://www.artdaily.org/index.asp?int_sec=2&int_new=48949, accessed 07-10-2011).

On March 4, 2014 AFAnews.com reported on the sale of the painting:

"Leonardo da Vinci's 'Salvator Mundi," which was discovered by American art dealer Alexander Parish at an estate sale in the mid-2000s, was sold to an unidentified collector for between $75 milllion and $80 million in May 2013. The details of the sale, which was organized by Sotheby's, remained confidential until this week.

" 'Salvator Mundi,' a half-length protrait of Christ holding a crystal orb in one hand, was created around 1500. Since 1900, the heavily over-painted canvas was attributed to Boltraffio, an artist who worked in da Vinci's studio. It wasn't until Paris acquired the work and it underwent  extensive cleaning and research that it was deemed an original da Vinci formerly owned by King Charles I of England. Prior to last year's sale, Paris and two other art dealers shared ownership of the work.

"In 2012, after raising tens of millions of dollars, the Dallas Museum of Art attempted to buy 'Salvator Mundi.' Museum officials made a formal offer to Paris and the painting's other owners but were rebuffed after some discussion."


1550 – 1600

Archbishop Matthew Parker Assembles the First Major Antiquarian Book Collection in England 1568

In 1568 Archbishop of Canterbury Matthew Parker secured a license from Queen Elizabeth to seek out "auncient records or monuments" from the former libraries of the monasteries suppressed by Henry VIII, and from old cathedral priories converted to the use of the Church of England.

"He thus had first choice of many hundreds of manuscripts of the very highest importance. This was the earliest major antiquarian collection ever asssembled in England, long before those of Thomas Bodley (1545-1613) or Robert Cotton (1571-1631), which became the foundations of the libraries of the Bodleian in Oxford and, eventually, the British Library in London" (de Hamel, The Parker Library: Treasures from the Collection [2000] 8). 


Sir Robert Bruce Cotton Forms One of the Most Important Private Collections of Manuscripts Ever Collected in England 1588 – 1631

In 1588 English politician Sir Robert Bruce Cotton began collecting original manuscripts, an activity which he continued until his death in 1631. One of the foundations of the British Museum since 1753, and hence of the British Library, Cotton's library of 958 manuscripts has been called the most important collection of manuscripts ever assembled in Britain by a private individual. Competing for this designation would, of course, be Archbishop of Canterbury Matthew Parker's library at Corpus Christi College, Cambridge. Parker, who began collecting in 1568, preceded Cotton in his collecting by a generation. The Sir Thomas Phillipps library, though formed in the nineteenth century and dispersed, was many times larger than either Cotton's or Parker's libraries, and also needs to be considered for the designation. 

Among Cotton's many treasures were the Lindisfarne Gospels, two of the contemporary exemplifications of Magna Carta, and the only surviving manuscript of Beowulf.  The first published catalogue of the Cottonian Library was Thomas Smith's Catalogus Librorum Manuscriptorum Bibliothecae Cottonianae, a substantial folio volume including a life of Robert Cotton and a history of the library published in Oxford in 1696. 

On October 23, 1731 Cotton's library suffered very significant damage in a fire at Ashburnham House in London, where it was then stored. Of its 958 manuscripts 114 were "lost, burnt or intirely spoiled" and another 98 were damaged enough to be considered defective. The Wikipedia article on Ashburnham House states:

"a contemporary records the librarian, Dr. Bentley, leaping from a window with the priceless Codex Alexandrinus under one arm. The manuscript of Beowulf was damaged, and reported in 'The Gentleman's Magazine.' "  

An expert committee was formed to investigate the cause of the fire and assess the damage. This resulted in A Report from the Committee appointed to view the Cottonian Library and such of the Publick Records of this Kingdom as they think proper and to Report to the House the Condition thereof together with what they shall judge fit to be done for the better Reception Preservation and more convenient Use of the same (London, 1732). David Casley (1681/2-1754), deputy librarian of both the Royal and Cottonian collections, and a member of this committee, compiled the list of damaged and destroyed Cotton manuscripts, which was printed in an appendix to the committee's report. Casley described a number of manuscripts as "burnt to a crust." The Committee was also "empowered to investigate the state of the public records as a whole. They found that for the most part they were 'in great Confusion and Disorder' and much in need of care and attention" (Miller, That Noble Cabinet, 36).

The 1732 report also contained an appendix consisting of "A Narrative of the Fire. . . and of the Methods used for preserving and recovering the Manuscripts of the Royal and Cottonian libraries," compiled by the Reverend William Whiston the younger, the clerk in charge of the records kept in the Chapter House at Westminster, another notorious firetrap. Almost immediately after the fire, attempts at restoration or stabilization of some of the damaged manuscripts were undertaken, mostly by inexperienced workers under the supervision of members of the committee, using whatever methods were available, and thus potentially damaging as much as preserving what remained.

In April 1837, palaeographer Frederic Madden, Assistant Keeper of Manuscripts at the British Museum, was shown a garret of the old museum building which contained a large number of burnt and damaged fragments and vellum codices. Madden immediately identified these as part of the Cottonian Library. During his tenure as Keeper of Manuscripts, Madden undertook extensive conservation work on the Cottonian manuscripts, often in the face of opposition from the Museum's board, who deemed the enterprise prohibitively expensive.

In collaboration with the bookbinder Henry Gough, Madden developed a conservation strategy that restored even the most badly damaged fragments and manuscripts to a usable state. Vellum sheets were cleaned, flattened and mounted in paper frames. Where possible, they were rebound in their original codices. Madden also carried out conservation work on the rest of the Cottonian Library. By 1845 the conservation work was largely complete, though Madden was to suffer one more setback when a fire broke out in the Museum bindery, destroying some additional manuscripts in the Cottonian Library. The process of restoring and conserving these precious manuscripts, which continues to this day, was studied extensively by Andrew Prescott in " 'Their Present Miserable State of Cremation': the Restoration of the Cotton Library," Sir Robert Cotton as Collector: Essays on an Early Stuart Courtier and His Legacy, edited by C. J. Wright (1997) 391-454. This paper, and its 357 footnotes, was available online in April 2012.

"The Cottonian Library was the richest private collection of manuscripts ever amassed; of secular libraries it outranked the Royal library, the collections of the Inns of Court and the College of Arms; Cotton's house near the Palace of Westminster became the meeting-place of the Society of Antiquaries and of all the eminent scholars of England; it was eventually donated to the nation by Cotton's grandson and now resides at the British Library.

"The physical arrangement of Cotton's Library continues to be reflected in citations to manuscripts once in his possession. His library was housed in a room 26 feet (7.9 m) long by six feet wide filled with bookpresses, each with the bust of a figure from classical antiquity on top. Counterclockwise, these are catalogued as Julius (i.e., Julius Caesar), Augustus, Cleopatra, Faustina, Tiberius, Caligula, Claudius, Nero, Galba, Otho, Vitellius, Vespasian, Titus, and Domitian. (Domitian had only one shelf, perhaps because it was over the door.) Manuscripts are now designated by library, bookpress, and number: for example, the manuscript of Beowulf is designated Cotton Vitellius A.xv, and the manuscript of Pearl is Cotton Nero A.x" (Wikipedia article on Sir Robert Cotton, accessed 11-22-2008).

The most useful version of Smith's 1696 catalogue of Cotton's library, published in somewhat reduced format, was the offset reprint done from Sir Robert Harley's copy, annotated by his librarian Humfrey Wanley, together with documents relating to the fire of 1731. This annotated edition included translations into English of the Latin essays on the life of Robert Cotton and the history of the library. Edited by C.G.C. Tite, it was published in 1984. See also Tite, The Early Records of Sir Robert Cotton's Library. Formation, Cataloguing, Use (2003). 

Sharpe, Sir Robert Cotton 1586-1631. History and Politics in Early Modern England (1979).


1600 – 1650

At Hereford Cathedral, the Largest Historic Chained Library in the World 1611

The library at Hereford Cathedral.

The working library of Hereford Cathedral in England originated in the eleventh century. The chained library at the cathedral, containing 229 medieval manuscripts, remains the largest historic chained library in the world, with all its rods, chains and locks intact. It has been preserved in the form in which it was maintained from 1611 to 1841.


"The Great Parchment Book" and Its Digital Restoration After Three Centuries 1639 – 2013

In 1639 a Commission instituted under the Great Seal by Charles I ordered compilation of The Great Parchment Book of the Honourable The Irish Society, a major survey of all estates in Derry managed by the City of London through the Irish Society and the City of London livery companies. It remained part of the City of London’s collections held at London Metropolitan Archives (LMA reference CLA/049/EM/02/018), and it represents a key source for the City of London’s role in the Protestant colonization and administration of the Irish province of Ulster.

However, in February 1786, a fire in the Chamber of London at the Guildhall in the City of London destroyed most of the early records of the Irish Society, so that very few 17th century documents remain. Among those which survived is the Great Parchment Book, but the fire caused such dramatic shrivelling and damage to the manuscript that it has been effectively unavailable to researchers ever since.

"As part of the 2013 commemorations in Derry of the 400th anniversary of the building of the city walls, it was decided to attempt to make the Great Parchment Book available as a central point of an exhibition in Derry’s Guildhall.

Box of Pages from the Great Parchment Book (before rehousing)

"The manuscript consisted of 165 separate parchment pages, all of which suffered damage in the fire in 1786. The uneven shrinkage and distortion caused by fire had rendered much of the text illegible. The surviving 165 folios (including fragments and unidentified folios) were stored in 16 boxes, in an order drawing together as far as possible the passages dealing with the particular lands of different livery companies and of the Society.

"It soon became apparent that traditional conservation alone would not produce sufficient results to make the manuscript accessible or suitable for exhibition, since the parchment was too shrivelled to be returned to a readable state. However, much of the text was still visible (if distorted) so following discussions with conservation and computing experts, it was decided that the best approach was to flatten the parchment sheets as far as possible, and to use digital imaging to gain legibility and to enable digital access to the volume.

"A partnership with the Department of Computer Science and the Centre for Digital Humanities at University College London (UCL) established a four year EngD in the Virtual Environments, Imaging and Visualisation programme in September 2010 (jointly funded by the Engineering and Physical Sciences Research Council and London Metropolitan Archives) with the intention of developing software to enable the manipulation (including virtual stretching and alignment) of digital images of the book rather than the object itself. The aim was to make the distorted text legible, and ideally to reconstitute the manuscript digitally. Such an innovative methodology clearly had much wider potential application.

Before and after virtual flattening

"During the imaging work a set of typically 50-60 22MP images was captured for each page and used to generate a 3D model containing 100-170MP, which allowed viewing at archival resolution. These models could be flattened and browsed virtually, allowing the contents of the book to be accessed more easily and without further handling of the document. UCL’s work on the computational approach to model, stretch, and read the damaged parchment will be applicable to similarly damaged material as part of the development of best practice computational approaches to digitising highly distorted, fire-damaged, historical documents" (http://www.greatparchmentbook.org/the-project/, accessed 10-26-2014).


1650 – 1700

John Wilkins Creates A Universal Language Based on a Classification Scheme or Ontology, and a Universal System of Measurement 1668

In An Essay towards a Real Character and a Philosophical Language English clergyman and natural philosopher John Wilkins attempted to create a universal, artificial language, based upon an innovative classification of knowledge, by which philosophers, diplomats, scholars, and merchants could communicate. Wilkins intended his "universal language" as a supplement to, rather than a replacement for, existing "natural" languages. His scheme has been called ingenious but completely unworkable.

In this book Wilkins also called for the institution of a "universal measure" or "universal metre," which would be based on a natural phenomenon rather than royal decree, and would also be decimal rather than the various systems of multipliers, often duodecimal, that coexisted at the time. The meter or metre would not gain traction until after the French Revolution.

"During the final stages of work on his Essay Wilkins lost his house and most of his belongs and papers, in the great fire of London, but being eager to complete his scheme he enlisted the help of John Ray and Francis Willioughby to improve the botanical and zoological nomenclature. This was a major factor in stimulating Ray to develop his own classificatory studies. Similarly, Samuel Pepys reported that he helped to draw up a table of naval terms, such as the names of rigging. Even with this and other help, Wilkins admited his scheme's shortcomings and called upon the Royal Society to improve it. Although various fellows of the society spoke highly of the scheme for a while, only Robert Hooke showed any lasting commitment to it, and the committee established to improve on the Essay never reported. Scholars have argued about the major influences upon Wilkins's linguistic studies. There is little evidence that the universal language schemes of Amos Comenius played any significant role; Mersenne may have been an inspiration but George Dalgarno, to help whom Wilkins had begun to draw up classifactory tables of knowledge after 1657, was a more dirrect influence" (ODNB).

By "real character" Wilkins meant:

"an ingeniously constructed family of symbols corresponding to an elaborate classification scheme developed at great labor by Wilkins and his colleagues, which was intended to provide elementary building blocks from which could be constructed the universe's every possible thing and notion. The Real Character is emphatically not an orthography in that it is not a written representation of oral speech. Instead, each symbol represents a concept directly, without (at least in the early parts of the Essay's presentation) there being any way of vocalizing it at all; each reader might, if he wished, give voice to the text in his or her own tongue. Inspiration for this approach came in part from (partially mistaken) accounts of the Chinese writing system.

"Later in the Essay Wilkins introduces his "Philospophical Language," which assigns phonetic values to the Real Characters, should it be desired to read text aloud without using any of the existing national languages. (The term philosophical language is an ill-defined one, used by various authors over time to mean a variety of things; most of the description found at the article on "philosophical languages" applies to Wilkins' Real Character on its own, even excluding what Wilkins called his "Philosophical Language")

"For convenience, the following discussion blurs the distinction between Wilkins' Character and his Language. Concepts are divided into forty main Genera, each of which gives the first, two-letter syllable of the word; a Genus is divided into Differences, each of which adds another letter; and Differences are divided into Species, which add a fourth letter. For instance, Zi identifies the Genus of “beasts” (mammals); Zit gives the Difference of “rapacious beasts of the dog kind”; Zitα gives the Species of dogs. (Sometimes the first letter indicates a supercategory— e.g. Z always indicates an animal— but this does not always hold.) The resulting Character, and its vocalization, for a given concept thus captures, to some extent, the concept's semantics.

"The Essay also proposed ideas on weights and measure similar to those later found in the metric system. The botanical section of the essay was contributed by John Ray; . . .  

 "Jorge Luis Borges wrote a critique of Wilkins' philosophical language in his essay El idioma analítico de John Wilkins (The Analytical Language of John Wilkins). He compares Wilkins’ classification to the fictitious Chinese encyclopedia Celestial Emporium of Benevolent Knowledge, expressing doubts about all attempts at a universal classification. Modern information theory also suggests that it is a bad idea to have words with similar but distinct meanings also sound similar, because mishearings and the resulting confusion would be much more prominent than in real-world languages. In The Search for the Perfect Language, Umberto Eco catches Wilkins himself making this kind of mistake in his text, using Gαde (barley) instead of Gαpe (tulip)" (Wikipedia article on An Essay towards a Real Character and a Philosophical Language, accessed 06-16-2010).


1750 – 1800

Thomas Jefferson Describes Printing as a Way to Preserve Information February 18, 1791

In a letter to Ebenezer Hazard written during Jefferson's tenure as Secretary of State, Thomas Jefferson wrote concerning the preservation of information:

". . . let us save what remains: not by vaults and locks which fence them from the public eye and use in consigning them to the waste of time, but by such a multiplication of copies, as shall place them beyond the reach of accident."

Jefferson's idea of preserving texts by distributing copies had been anticipated in the second half of the fifteenth century by exponents of the new invention of printing by movable type, who believed, and rightly so, that printing an edition of a text that survived in only one or a handful of manuscript copies was a way of safeguarding the existence of that text.


1800 – 1850

Mairet Issues a Manual of Lithography, Bookbinding, and Cleaning and Restoring Paper 1818 – 1824

In 1818 F. Mairet published in Dijon his Notice sur la lithographie. Mairet, a paper merchant and distinguished bookbinder, set up the second lithographic press in Dijon, and became the first lithographic printer, besides Senefelder himself, to write a manual on lithography. The book sold successfully, and six years later Mairet issued a revised edition, adding to it an essay on bookbinding and on the bleaching (blanchiment) of books and prints. The title of the second edition, issued from Chatillon-sur-Seine, was Notice sur la lithographie. . . suivi d'un essai sur la relieure et le blanchiment des livres et gravures. The second edition thus became one of the earliest discussions in book form of the methods of restoring books and prints.

Bigmore & Wyman II, 14; Twyman, Lithography, 93-94.


Michael Faraday on Decay in Leather Bookbindings April 7, 1843

In a paper on Light and Ventilation delivered on April 7, 1843 at the Royal Institution, where he worked, chemist and physicist Michael Faraday attributed decay in leather bookbindings and chairs to the heat and sulphur fumes emanating from the illuminating gas then used. Faraday began his career as a bookbinder.


Friedrich Keller Rediscovers Paper Making from Wood Pulp & Industrializes the Process October 26, 1844 – August 1845

Though Matthias Koops in England produced paper from wood pulp as early as 1801, credit for the discovery of the industrial process for making wood pulp paper is generally given to the German machinist and inventor Friedrich Gottlob Keller, and to the Canadian poet and inventor Charles Fenerty, both of whom appear to have independently announced the discovery of similar processes in 1844. However, neither Fenerty nor Keller exploited the process; that was accomplished by the German industrialists Heinrich Voelter and Johann Matthäus Voith.

Fenerty began experimenting with wood pulp around 1838. On October 26, 1844 he took a sample of his paper to the leading newspaper in Halifax, Nova Scotia, the Acadian Recorder. According to Wikipedia, he wrote the following letter on this piece of wood pulp paper:

Messrs. English & Blackadar,

Enclosed is a small piece of PAPER, the result of an experiment I have made, in order to ascertain if that useful article might not be manufactured from WOOD. The result has proved that opinion to be correct, for- by the sample which I have sent you, Gentlemen- you will perceive the feasibility of it. The enclosed, which is as firm in its texture as white, and to all appearance as durable as the common wrapping paper made from hemp, cotton, or the ordinary materials of manufacture is ACTUALLY COMPOSED OF SPRUCE WOOD, reduced to a pulp, and subjected to the same treatment as paper is in course of being made, only with this exception, VIZ: my insufficient means of giving it the required pressure. I entertain an opinion that our common forest trees, either hard or soft wood, but more especially the fir, spruce, or poplar, on account of the fibrous quality of their wood, might easily be reduced by a chafing machine, and manufactured into paper of the finest kind. This opinion, Sirs, I think the experiment will justify, and leaving it to be prosecuted further by the scientific, or the curious.

I remain, Gentlemen, your obdt. servant,

CHARLES FENERTY.

The Acadian Recorder Halifax, N.S. Saturday, October 26, 1844

Fenerty seems never to have exploited his process. Keller, on the other hand, sold his process to the paper specialist Heinrich Voelter, and in August 1845 both Keller and Voelter received a German patent, which reverted entirely to Voelter, and Keller became unemployed. In 1848 industrialist Johann Matthäus Voith began working with Voelter to develop means of mass producing paper by wood pulp processing, and by 1852 Voelter was selling numerous wood-grinding machines for the papermaking process, and producing wood pulp paper at his mill in Heidenheim. Voith continued to improve the process, and in 1859 he created the first Raffineur, a machine that refined the raw wood pulp and significantly improved the quality of paper products. Voelter and Voith's business continues today as a division of the German industrial company Voith AG.

"Throughout his life, Keller received no royalties from his invention. In 1870 he received from a number of German paper makers and other associations a small sum of money, which he used to buy a house in Krippen, Germany. Then towards the end of his life, various countries put together a fair sum of money for him, enough for a worry-free retirement, and he also received several awards in recognition of his invention" (Wikipedia article on Friedrich Gottlob Keller, accessed 03-26-2012). 


Alfred Bonnardot Issues the First Book on the Restoration of Rare Books and their Bindings 1846

In 1846 French bookbinder, restorer, and writer Alfred Bonnardot published Essai sur la restauration des anciennes estampes et des livres rares, ou Traité sur les meilleurs procédés à suivre pour réparer, détacher, décolorier et conserver les gravures, dessins et livres. Ouvrage spécialement utile aux artistes, aux collectionneurs, aux marchands d'estampes, aux bibliophiles, etc. This was the first book on the restoration of rare books and their bindings. It also covered issues of restoration of works of art on paper, and was directed toward artists, collectors, print dealers and bibliophiles. The small work consisted of 80 pages, including an index.

Bonnardot later issued a Supplément of 31 pages with 15 pages of revisions to the previous work and an additional Chapter XV (pp. 16-31) "De la restauration et de la reliure provisoire des livres rares." The Table des Chapitres was published on the first leaf of the index (p. 79). Fifteen pages of revisions to a text of only 80 pages, plus an additional chapter added as an afterthought, suggest a work that was rapidly published, probably before the author had the opportunity to make sufficient revisions. 400 copies were printed.

In 1858 Bonnardot published a greatly revised second edition of this work. According to his preface to the later edition, the first edition was sold out by 1850, but presumably, having rushed the first edition, Bonnardot took sufficient time to put out a more definitive second edition. The revised edition, published 12 years after the first, consisted of eight preliminary pages, and 352 pages of text. In addition to the greatly expanded text, this edition is useful for its chronological listing, with comments, of rare works on the topics covered in the text. The list includes some books that Bonnardot knew about but was not able to see. The second edition also included an "Exposé des divers systèmes de reproduction des anciennes estampes et des livres rares." This covered lithographic, photographic, and other means of reproduction.  A German translation of the 1858 edition was published in 1859.

Portions of Bonnardot's 1858 edition were translated into English in Buck, Book Repair and Restoration . . . including some Translated Selections from Essai sur l'art de Restaurer les Estampes et les Livres par A. Bonnardot, Paris 1858 (1918).


1850 – 1875

James Glaisher Proposes Using Microphotography for Document Preservation 1851 – 1852

Impressed by the exhibition of photography at the Great Exhibition of 1851, English meteorologist and aeronaut James Glaisher proposed that microphotography be used as a method for document preservation. According to the Wikipedia article on Microform, astronomer and photography pioneer Sir John Herschel supported this view in 1853.

Great Exhibition of the Works of Industry of All Nations of 1851. Reports by the Juries (1852). Carter & Muir, Printing and the Mind of Man (1967) no. 331.


Benjamin Tilghman Invents the Sulfite Pulping Process for Manufacturing Paper from Wood Pulp 1866

In 1866 American soldier, chemist and inventor Benjamin Chew Tilghman developed the sulfite pulping process for the manufacture of paper from wood pulp, receiving the US patent on the use of calcium bisulfite, Ca(HSO3)2, to pulp wood in 1867. The first mill using this process was built in Bergvik, Sweden in 1874. It used magnesium as the counter ion and was based on work by Swedish chemical engineer Carl Daniel Ekman.

"The soda process was for many years the only practical process for pulping of straw, wood, and similar fibrous materials. However, in 1866, 1867, and 1869 the American chemist Benjamin Chew Tilghman of Philadelphia was granted British and American patents on a new process of pulping of wood or other vegetable fibrous substances. This process involved essentially the heating, under pressure, of lignified fibrous material with an aqueous solution of sulfurous acid, with or without the addition of sulfite of an alkali such as calcium sulfite or bisulfite. These patents were the results of extensive experiments conducted by Benjamin Chew Tilghman in cooperation with his younger brother Richard Albert at the pulp and paper mills of W. W. Harding and Sons in Manayunk near Philadelphia" (Phillips, Benjamin Chew Tilghman, and the Origin of the Sulfite Process for the Delignification of WoodJ. Chem. Educ., 1943, 20 (9), p. 444, DOI: 10.1021/ed020p444).

Throughout the 19th century it was increasingly necessary to find workable substitutes for scarce linen rags, the supply of which could not possibly keep up with the growing demand for paper. While wood pulp enabled greatly increased production of paper, the bleaching agents used in this new process reduced its longevity. The pulping, bleaching, and sizing processes generated hydrochloric and sulfuric acids, which over time resulted in brittleness and deterioration of the paper, and the possible loss of information.


1875 – 1900

Listening to the Earliest Surviving Recording of a Musical Performance June 22, 1878 – October 2012

In October 2012 computing technology made it possible to listen to the oldest playable recording of an American voice and the first-ever capture of a musical performance. The recording on tinfoil, which lasts 78 seconds, was made on a phonograph in St. Louis, Missouri on June 22, 1878, months after Thomas Edison invented the phonograph.

" 'In the history of recorded sound that's still playable, this is about as far back as we can go,' said John Schneiter, a trustee at the Museum of Innovation and Science in Schenectady, where it was played Thursday night in the city where Edison helped found the General Electric Co.

"The recording opens with a 23-second cornet solo of an unidentified song, followed by a man's voice reciting 'Mary Had a Little Lamb' and 'Old Mother Hubbard.' The man laughs at two spots during the recording, including at the end, when he recites the wrong words in the second nursery rhyme.

" 'Look at me; I don't know the song,' he says.

"When the recording was played using modern technology during a presentation Thursday at a nearby theater, it was likely the first time it had been played at a public event since it was created during an Edison phonograph demonstration held June 22, 1878, in St. Louis, museum officials said. The recording was made on a sheet of tinfoil, 5 inches wide by 15 inches long, placed on the cylinder of the phonograph Edison invented in 1877 and began selling the following year. A hand crank turned the cylinder under a stylus that would move up and down over the foil, recording the sound waves created by the operator's voice. The stylus would eventually tear the foil after just a few playbacks, and the person demonstrating the technology would typically tear up the tinfoil and hand the pieces out as souvenirs, according to museum curator Chris Hunter.

"Popping noises heard on this recording are likely from scars left from where the foil was folded up for more than a century.

" 'Realistically, once you played it a couple of times, the stylus would tear through it and destroy it,' he said. Only a handful of the tinfoil recording sheets are known to known to survive, and of those, only two are playable: the Schenectady museum's and an 1880 recording owned by The Henry Ford museum in Michigan.

"Hunter said he was able to determine just this week that the man's voice on the museum's 1878 tinfoil recording is believed to be that of Thomas Mason, a St. Louis newspaper political writer who also went by the pen name I.X. Peck. Edison company records show that one of his newly invented tinfoil phonographs, serial No. 8, was sold to Mason for $95.50 in April 1878, and a search of old newspapers revealed a listing for a public phonograph program being offered by Peck on June 22, 1878, in St. Louis, the curator said. A woman's voice says the words 'Old Mother Hubbard,' but her identity remains a mystery, he said. Three weeks after making the recording, Mason died of sunstroke, Hunter said" (http://www.google.com/hostednews/ap/article/ALeqM5izrvFWaR6h-FWye-Eq2bZN5RCqOg?docId=c9195e25da6f473e90e726152ddbc4d6, accessed 10-26-2012).


One of the Most Dramatic Problems in the Preservation of Media 1889 – 1955

In 1889 inventor and entrepreneur George Eastman of Rochester, New York used cellulose nitrate as a base for photographic roll film. Cellulose nitrate was used for photographic and professional 35mm motion picture film until the 1950s, eventually creating one of the most dramatic problems in the preservation of media.

"It is highly inflammable and also decomposes to a dangerous condition with age. When new, nitrate film could be ignited with the heat of a cigarette; partially decomposed, it can ignite spontaneously at temperatures as low as 120 F (49C). Nitrate film burns rapidly, fuelled by its own oxygen, and releases toxic fumes.

"Decomposition: There are five stages in the decomposition of nitrate film:

"(i) Amber discolouration with fading of picture.
"(ii) The emulsion becomes adhesive and films stick together; film becomes brittle.
"(iii) The film contains gas bubbles and gives off a noxious odour
"(iv) The film is soft, welded to adjacent film and frequently covered with a viscous froth
"(v) The film mass degenerates into a brownish acrid powder.

"Film in the first and second stages can be copied, as may parts of films at the third stage of decomposition. Film at the fourth or fifth stages is useless and should be immediately destroyed by your local fire brigade because of the dangers of spontaneous combustion and chemical attack on other films. Contact your local environmental health officer about this.

"It has been estimated that the majority of nitrate film will have decomposed to an uncopiable state by the year 2000, though archives are now deep-freezing film."


The Largest and Most Diverse Collection of Medieval Manuscripts in the World 1896 – 1902

In 1896 Agnes Smith Lewis and Margaret Dunlop Gibson, identical-twin sisters and Semitic scholars, who between them learned twelve languages, returned to Cambridge from a trip to the Middle East bearing leaves from several ancient Hebrew manuscripts that they had purchased from a Cairo bookseller. They showed the parchment leaves to Solomon Schechter, reader in Talmudic Studies at Cambridge, who was surprised to discover among them in May 1896 an 11th- or 12th-century copy of the Hebrew proverbs of Ben Sira, a second-century BCE Hebrew book of wisdom. Through translations, in which it is known as Sirach in Greek or Ecclesiasticus in Latin (not to be confused with Ecclesiastes), the work became part of the Christian Bible. Schechter published the fragment with English translation, introduction, and notes in the Expositor for July 1896 (p. i seqq.).

Wanting to share news of his discovery Schechter wrote to his friend Adolf Neubauer, sublibrarian at the Bodleian Library and reader in Rabbinic Hebrew at Oxford, that he had discovered a fragment of Sirach (xxxix. 15 to xl. 7) in Hebrew. In response to Schechter's postcard, Neubauer replied two weeks later that he and his assistant, Arthur Cowley, had "coincidentally" discovered nine pages of Ben Sira at Oxford. Of course, this was no coincidence. Schechter's discovery had prompted Neubauer to restudy much more carefully a collection of Hebrew manuscripts that he had previously dismissed and had intended to sell—a box of about 10,000 pages of manuscripts that had been obtained from the genizah in 1895 by Oxford Assyriologist and linguist Archibald Sayce. Using Schechter's discovery and finds from Sayce's donation, in 1897 Neubauer and A. E. Cowley published The Original Hebrew of a Portion of Ecclesiasticus (xxxix.14 to xlix.11) Together with the Early Versions and an English Translation Followed by the Quotations from Ben Sira in Rabbinical Literature. This was probably the first scholarly book in English on manuscripts from the Cairo genizah.

Not wanting to miss out on any more discoveries, Schechter set out for Egypt where, with the financial assistance of Hebraist Charles Taylor, then Master of St. John's College, Cambridge, he purchased what he considered the most significant portion of the contents of the genizah (Geniza), a sacred storeroom in the loft of the Ben Ezra Synagogue in Fustat, presently Old Cairo.

"According to rabbinic law (see, for instance, Mishna Shabbat 16:1), once a holy book can no longer be used (because it is too old, or because its text is no longer relevant) it cannot be destroyed or casually discarded: texts containing the name of God should be buried or, if burial is not possible, placed in a genizah.  

"At least from the early 11th century, the Jews of Fustat, one of the most important and richest Jewish communities of the Mediterranean world, reverently placed their old texts in the Genizah. Remarkably, however, they placed not only the expected religious works, such as Bibles, prayer books and compendia of Jewish law, but also what we would regard as secular works and everyday documents: shopping lists, marriage contracts, divorce deeds, pages from Arabic fables, works of Sufi and Shi'ite philosophy, medical books, magical amulets, business letters and accounts, and hundreds of letters: examples of practically every kind of written text produced by the Jewish communities of the Near East can now be found in the Genizah Collection, and it presents an unparalleled insight into the medieval Jewish world" (http://cudl.lib.cam.ac.uk/collections/genizah, accessed 12-14-2012).

Schechter sent back to Cambridge about 193,000 manuscripts from the genizah. These became the Taylor-Schechter Genizah Collection at Cambridge University Library. In 2012 this entire collection was in the process of being digitized and placed online as part of the Cambridge Digital Library.

When Schechter assumed the presidency of the Jewish Theological Seminary of America in New York in 1902 he brought an additional collection of manuscripts from the genizah to that library. Currently the Jewish Theological Seminary holds about 40,000 manuscripts or fragments from the Cairo genizah. An additional 11,000 fragments are at the John Rylands University Library at the University of Manchester, purchased from the estate of Dr. Moses Gaster in 1954. Smaller portions are preserved in other universities around the world.

"The Cairo Genizah, mostly discovered late in the nineteenth century but still resurfacing in our own day, is a collection of over 200,000 fragmentary Jewish texts (which may well equal three times that number of folios). Many of these were stored in the loft of the ancient Ben Ezra Synagogue in Fustat medieval Cairo, to the south-west of the modern city) between the 11th and 19th centuries. A genizah is a storage room where copies of respected texts with scribal errors or physical damaged, or unusable documents, are kept until they can be ritually buried. The dark, sealed, room in the arid Egyptian climate contributed to the preservation of the documents, the earliest of which may go back to the eighth and ninth centuries.

"These manuscripts outline a 1,000-year continuum of Middle-Eastern history and comprise the largest and most diverse collection of medieval manuscripts in the world. The Genizah can be described as one of the greatest Jewish treasures ever found.

"Early visitors to the Genizah were wary of examining its contents because of the local superstition that foretold disaster for anyone who might remove any of its contents. This, too, contributed to the preservation of the documents.

"In the second half of the 19th century some texts were sold by synagogue officials to dealers, scholars and visitors. Famous libraries in St. Petersburg, Paris, London, Oxford, Cambridge and Philadelphia acquired major collections.

"In the early 1890's Rabbi Shlomo Aharon Wertheimer, a Torah scholar, collector and researcher, living in Jerusalem, began publishing manuscripts that he had purchased from the Cairo Genizah with his identifications and explanations – among them rare and important texts. He also sold some of these manuscripts to collectors in order to finance the purchase of additional ones. To some extent, he was one of the first to recognize the treasure trove that was the Cairo Genizah."

These quotations were from the website of the Friedberg Genizah Project, an effort underway in Jerusalem to digitize and preserve all surviving portions of the Cairo Genizah from around the world.

_________

In December 2013 BBC News announced that historic rivals Oxford and Cambridge Universities had jointly raised £1.2m to purchase the Lewis-Gibson Genizah collection, containing about 1,700 documents and fragments, which twin sisters Agnes Smith Lewis and Margaret Dunlop Gibson had acquired in Cairo and donated to Westminster College, Cambridge.


The Questionable Quality of Paper 1898

In his annual report for 1898 Librarian of Congress John Russell Young commented on the "questionable quality of the paper upon which so much of the Library material is printed." Referring to wood pulp paper, which was inferior to the paper previously made from linen rags, Young warned that many of the works coming into the Library "threaten in a few years to crumble into a waste heap, with no value as record."


1900 – 1910

Revealing a Hidden Image in a 1901 Painting by Picasso in a 2012 Newspaper Article 1901 – October 24, 2012

Since 1989 conservators and art historians have known that hidden beneath the surface of Picasso's “Woman Ironing”  preserved in the Solomon R. Guggenheim Museum, New York, is the upside-down ghost of another painting — a three-quarter-length portrait of a man with a mustache. The hidden image was first seen in photographs of this painting from Picasso's Blue Period (1901-1904) taken with an infrared camera in 1989.  

On October 24, 2012 The New York Times published an article by Carol Vogel on this painting and the painting hidden underneath entitled "Under One Picasso, Another."  From the standpoint of this database on the history of media what I find most interesting about this is the "interactive feature" published in association with the article entitled "Scratching the Surface, Two Picassos Revealed."

A very clever imaging program in the interactive feature invited the reader to "click and drag your mouse over the painting to see what was hidden beneath it." As I wiped the top image of the painting off with mouse strokes the painting underneath was revealed.  I could also rotate the image and reset it back to the top layer.


Problems with Leather Used in Bookbinding 1905

The final "Report of the Committee of the Society of Arts on Leather for Bookbinding" published in London in 1905 confirmed the view that bookbinding leathers being used were inferior to those used 50 years earlier. It attributed degradation to changes in methods of manufacture and tanning, and also to the "injurious effect of light and gas fumes" which were common in many libraries.


The Photomicrographic Book 1907

In 1907 engineer Robert Goldschmidt and Belgian author, entrepreneur, visionary, lawyer and peace activist Paul Otlet published "Sur une forme nouvelle du livre — le livre microphotographique" in the Bulletin de l'Institut international de bibliographie. In this paper they "proposed the livre microphotographique as a way to alleviate the cost and space limitations imposed by the codex format. Otlet's overarching goal was to create a World Center Library of Juridical, Social and Cultural Documentation, and he saw microfiche as a way to offer a stable and durable format that was inexpensive, easy to use, easy to reproduce, and extremely compact" (Wikipedia article on Microform, accessed 04-26-2009).


1930 – 1940

Otto Bettmann Founds The Bettmann Archive: the Beginning of "The Visual Age" 1938

The Bettmann Archive, founded in New York in 1936 by Otto Bettmann, a refugee from Nazi Germany, contained 15,000 images by 1938.  Bettmann later characterized this period of time as "the beginning of the visual age." By 1980, the year before Bettmann sold the archive to the Kraus-Thomson Organization, the archive contained 2,000,000 images, carefully selected for their historical value, mainly under the five categories of world events, personalities, lifestyles, advertising art, and art and illustrations.

In 1984 the Kraus-Thomson Organization acquired the extensive United Press International (UPI) collection, containing millions of worldwide news and lifestyle photographs taken by photographers working for United Press International, International News Photos, Acme Newspictures, and Pacific and Atlantic.

In 1995 Corbis, a company controlled by Bill Gates, bought the Bettmann Archive.

"Beginning in 1997, Corbis spent five years selecting images of maximum historical value and saleability for digitization. More than 1.3 million images (26% of the collection) have been edited and 225,000 have been digitized. Because of this effort, more images from the Bettmann Archive are available now than ever before.

"In 2002, the Archive was moved to a state-of-the-art, sub-zero film preservation facility in western Pennsylvania. The 10,000-square-foot underground storage facility is environmentally-controlled, with specific conditions (minus -20°C, relative humidity of 35%) calculated to preserve prints, color transparencies, negatives, photographs, enclosures, and indexing systems" (http://www.corbis.com/BettMann100/Archive/Preservation.asp, accessed 01-17-2010).


1940 – 1950

The Fitzwilliam Museum Exhibition of Printing: Precursor to "Printing and the Mind of Man" May 6 – May 16, 1940

Detail of cover of catalogue for An Exhibition of Printing at the Fitzwilliam Museum.

An Exhibition of Printing at the Fitzwilliam Museum in Cambridge was planned for May 6 to June 23, 1940, taking the year 1940 as the quincentenary of Gutenberg's invention of printing, just as had been done in 1840 for the quatercentenary, in 1740 for the tricentennial, and in 1640 for the bicentennial. Exhibitions of this kind normally require years of advance planning, but from the brief account in Nicolas Barker's Stanley Morison (1972) it appears that the prospectus for this exhibition was sent out only at the beginning of March, 1940:

"At the beginning of March a prospectus was circulated to librarians, members of the Bibliographical Scoiety, the Roxburghe Club, and others.

"Though more than half Europe is at present too tragically absorbed in the future of its civilisation to be able to pay much thought to its past, the five-hundredth anniversary of Gutenberg's invention none the less demands to be recognized. The conditions which make it impractical to hold a worthy exhibition in London are happily absent in Cambridge; and plans for stage here a modest tribute to Gutenberg's memory have developed into a resolution to make good the general deficiency with a major exhibition.

"The theme of the exhibition was then set out; a full representation of every aspect of human thought and action served by Gutenberg's invention; 'wherever civilization has called upon the craft of printing from movable type to promote its ends, there is subject matter for this exhibition'.

"The response for the request for loans was conspicuously prompt and generous. Nearly 100 lenders produced over 600 exhibits. . . " (Barker, op. cit., 376-77).

According to Brooke Crutchley, "The Gutenberg Exhibition at Cambridge, 1940," Matrix 12 (1992) 77-82:

"The decision to celebrate the quincentenary of Gutenberg's invention by holding an exhibition in Cambridge in 1940 was largely an act of defiance. The outbreak of war in September 1939 and the swift conquest of Poland were followed by an uneasy quiet in western Europe while armies lined up against each other in preparation for the battle that was to come. Meanwhile the Fitzwilliam Museum had sent its principal treasures to Wales for safe keeping, the windows of King's College chapel were boarded up, civilisation seemed to have been put on ice. An exhibition to show the contribution that printing had made over five hundred years, and would continue to make when the madness was over, might be seen as a challenge to the forces of destruction." 

As a guide and record of the exhibition, an unillustrated catalogue describing 641 items was published by Cambridge University Press and offered for sale for one shilling. On the cover was an emblem symbolizing Gutenberg's type designed by wood engraver Reynolds Stone.

The Foreword to the catalogue read as follows:

"There is no moral to this exhibition. It aims at portraying, as objectively as possible, the uses to which printing from movable type has been put since Gutenberg and his associates invented it five hundred years ago; the spread of knowledge more quickly and accurately than was possible before, the storing of human experience, the providing of entertainment, the simplication of the increasingly complicated business of living. Those books, papers, and other printing have been chosen (so far as the difficulties of the times would permit) which made most effective use of the medium of type; in other words, those which, composed and multiplied, most strongly influenced people and events. Others have been chosen for their illustration of events and trends of particular importance or interest; others again for their intrinsic curiosity as examples of the exploitation of print. All are shewn so far as possible in the original editions in which they were first presented to the world.

"The exhibition has been designed therefore to illustrate the development of man's use of movable type as a tool; its spread from Mainz through the countries of the world, through all the fields of knowledge, through the whole range of man's activities. Running through the story another theme presents itself and draws occasional comment--the development of the actual form of printing. The technical display deals with the old and modern methods fo type-founding and composition, and briefly illustrates the development of type design. That part of the exhibition is education; for the rest, though there is much to learn from it, it does not set out to teach. It is simply an illustration to that proud but unattributed saying: With my twenty-six soldiers of lead I have conquered the world."

Persons involved with organizing the exhibition and writing catalogue entries included writer on typography Beatrice Warde, antiquarian bookseller and writer Percy Muir, typographer John Dreyfus, writer and antiquarian bookseller John Carter, economist and book collector John Maynard Keynes, and scientist, sinologist and historian of science Joseph Needham. According to Sebastian Carter, "Printing & the Mind of Man," Matrix 20 (2000) 172-180, typographer Stanley Morison, typographic advisor to Cambridge University Press, was involved in the planning, but the bulk of the organization of the exhibition was done by the Assistant University Printer, Brooke Crutchley, helped by John Dreyfus. The largest private lender to the exhibition was stockbroker (later intelligence agent), book collector and writer, Ian Fleming, who had pioneered in collecting influential books, or those which, in the words of Sebastian Carter, had "started something."

Among several innovative aspects of the exhibition was a display of books published in the year 1859, including, among others, Darwin On the Origin of Species, Mill On Liberty, Fitzgerald's Rubaiyat of Omar Khayyam, and Mrs. Beeton's Book of Household Management.

The catalogue did not appear until June 1940, after the exhibition had been closed because of the war on May 16, only ten days after it had opened. It was reprinted in the following month. In my copy of the second printing the following statement appeared:

"As this catalogue was about to go to press, a sudden change in the war situation made it appear advisable to close the Exhibition when it had been open only ten days. The catalogue was printed off, nevertheless, so that copies might be sent to all who had helped and others be available for sale. The demand proved greater than had been expected, and this reprint was in hand in which a few errors and oversights have been made good."

When I originally wrote this entry for From Cave Paintings to the Internet on October 25, 2011, I had never previously seen a copy of the 1940 exhibition catalogue, in spite of my roughly 50 years' experience in the world of books. Until reading the catalogue I was unaware how much this forgotten exhibition held early in World War II had influenced the 1963 exhibition, Printing and the Mind of Man. The overlap in choices between the 1940 and 1963 catalogues is significant, especially as Carter & Muir were heavily involved in both exhibitions held 23 years apart, and some of the same lenders, especially Ian Fleming, contributed notable items to both exhibitions. It would be useful some day to compare the selections of the two exhibitions carefully. Before doing that I would observe that the organizers of the 1940 exhibition must have been well aware of the significance of Hitler's writings leading up to World War II, as they included the February 24, 1920 Munich Auszug aus dem Programm der national-sozialistischen Deutschen Arbeiterpartei as item 620 in their exhibition, and Hitler's Mein Kampf as item number 623.


Sealing of the Crypt of Civilization May 25, 1940

On May 25, 1940 Presbyterian minister and president of Oglethorpe University in Brookhaven, Georgia, Thornwell Jacobs sealed the Oglethorpe Atlanta Crypt of Civilization in a ceremony broadcast on Atlanta's WSB radio. It was intended to be opened on May 28, 8113 CE.

Modelled after a chamber in an Egyptian pyramid, the Crypt of Civilization was a subterranean chamber, twenty feet long, ten feet wide, and ten feet high. Among the many elements of the time capsule were microfilm media (film and thin metal) used to store written information, recorded sound, and moving pictures. Apparently little or no print-on-paper material was included, even though by the time the capsule was created there was already ample evidence that print on paper had survived for centuries, and writing on parchment for millennia, while microfilm and other microform media were new and untested for durability.

"In this room had been a swimming pool, the foundation of which was impervious to water. The floor was raised with concrete with a heavy layer of damp proofing applied. The gallery's extended granite walls were lined with vitreous porcelain enamel embedded in pitch. The crypt had a two-foot thick stone floor and a stone roof seven feet thick. Jacobs consulted the Bureau of Standards in Washington for technical advice for storing the contents of the crypt. Inside would be sealed stainless steel receptacles with glass linings, filled with the inert gas of nitrogen to prevent oxidation or the aging process. A stainless steel door would seal the crypt."

"Articles on the crypt in the New York Times caught the attention of Thomas Kimmwood Peters (1884-1973), an inventor and photographer of versatile experience. Peters had been the only newsreel photographer to film the San Francisco earthquake of 1906. He had worked at Karnak and Luxor, Peters was also the inventor of the first microfilm camera using 35 millimeter film to photograph documents. In 1937 Jacobs appointed Peters as archivist of the crypt." 

"From 1937 to 1940, Peters and a staff of student assistants conducted an ambitious microfilming project. The cellulose acetate base film would be placed in hermetically sealed receptacles. Peters believed, based on the Bureau of Standards testing, that the scientifically stored film would last for six centuries; he took however, as a method of precaution, a duplicate metal film, thin as paper. Inside the crypt are microfilms of the greatest classics, including the Bible, the Koran, the Iliad, and Dante's Inferno. Producer David O. Selznick donated an original copy of the script of 'Gone With the Wind.' There are more than 640,000 pages of microfilm from over eight hundred works on the arts and sciences. Peters also used similar methods for capturing and for storing still and motion pictures. Voice recordings of political leaders such as Hitler, Stalin, Mussolini, Chamberlain, and Roosevelt were included, as were voice recordings of Popeye the Sailor and a champion hog caller. To view and to hear these picture and sound records, Peters placed in the vault electric machines, microreaders, and projectors. In the event that electricity would not be in use in 8113 A.D., there is in the crypt a generator operated by a windmill to drive the apparatus as well as a seven power magnifier to read the microbook records by hand. The first item one would see upon entering the chamber is a thoughtful precaution-a machine to teach the English language so that the works would be more readily decipherable if found by people of a strange tongue.

"Thornwell Jacobs envisioned the crypt as a synoptic compilation and thus aimed for a whole 'museum' of not only accumulated formal knowledge of over six thousand years, but also 1930s popular culture. The list of items in the crypt is seemingly endless. All of the items were donated, with contributors as diverse as King Gustav V of Sweden and the Eastman Kodak Company. Some of the more curious items Peters included in the crypt were plastic toys - a Donald Duck, the Lone Ranger, and a Negro doll, as well as a set of Lincoln Logs. Peters also arranged with Anheuser Busch for a specially sealed ampule of Budweiser beer. The chamber of the crypt when finally finished in the spring of 1940, resembled a cell of an Egyptian pyramid, cluttered with artifacts on shelves and on the floor" (http://www.oglethorpe.edu/about_us/crypt_of_civilization/history_of_the_crypt.asp, accessed 04-22-2011). 


1950 – 1960

Archival Records Include "Machine-Readable Materials" 1950

The Federal Records Act of 1950 expanded the definition of "record" to include "machine-readable materials." At this time machine-readable records consisted primarily of punched cards.


One of the Earliest Surviving British Television Dramas December 12 – December 14, 1954

From December 12-14, 1954 the BBC presented a television production of George Orwell's Nineteen Eighty-Four, adapted for television by Nigel  Kneale.

"Kneale's script was a largely faithful adaptation of the novel as far as was practical with the limitations of the medium. The writer did, however, make some small additions of his own, the most notable being the creation of a sequence in which O'Brien observes Julia at work in PornoSec, and reads a small segment from one of the erotic novels being written by the machines there."

"When it had become clear what an important production Nineteen Eighty-Four was, it was arranged for the second performance [December 14, 1954] to be telerecorded onto 35mm film – the first performance having simply disappeared off into the ether, as it was shown live, seen only by those who were watching on the Sunday evening. At this stage, Videotape recording was still at the development stage and television images could only be preserved on film by using a special recording apparatus (known as "telerecording" in the UK and "kinescoping" in the USA), but was only used sparingly, then in Britain for historic preservation reasons and not for pre-recording. It is thus the second performance that survives in the archives, one of the earliest surviving British television dramas" (Wikipedia article on Nineteen Eight-Four (TV Programme), accessed 07-26-2009).


Longevity of Paper is a Function of its Acidity or Alkalinity Circa 1958

In the late 1950s it was recognized that the longevity of paper is a function of its acidity or alkalinity: the lower the acidity and higher the alkalinity, the greater the longevity of paper.


1960 – 1970

ICPSR, The Largest Archive of Digital Social Science Data, is Founded at the University of Michigan 1962

In 1962 ICPSR, the Inter-university Consortium for Political and Social Research, was founded at the University of Michigan, Ann Arbor. ICPSR became the world's largest archive of digital social science data,  acquiring, preserving, and distributing original research data, and providing training in its analysis.


NCR Issues the Smallest Published Edition of the Bible, and the First to Reach the Moon 1966

In 1966 the Research and Development department of National Cash Register (NCR) of Dayton, Ohio produced an edition of all 1245 pages of the World Publishing Company's No. 715 Bible on a single 2" x 1-3/4" photochromatic microform (PCMI). The microform contained both the Old Testament on 773 pages and the New Testament on 746 pages, and was issued in a paper sleeve with the title on the cover and information about the process inside and on the back.

On the microform each page of double-column Bible text was about 0.5 mm wide and 1 mm high. Each text character was 8 µm high (i.e., 8/1000ths of a millimeter). NCR noted on the paper wallet provided with the microform that this represented a linear reduction of about 250:1 or an area reduction of 62,500:1. This would correspond to the original text being circa 2 mm high. To put this into perspective, NCR also noted that if this reduction was used on the millions of books on the 270+ miles of shelving in the Library of Congress, the entire Library of Congress as it existed in 1966 could be stored in six standard filing cabinets.
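
The reduction figures NCR quoted are internally consistent, as this short check (plain arithmetic, using only the numbers in the paragraph above) shows:

# A 250:1 linear reduction squares to a 62,500:1 area reduction, and an
# 8-micrometre character on the microform corresponds to a 2 mm original.
linear_reduction = 250
area_reduction = linear_reduction ** 2        # 62,500:1
char_on_film_mm = 0.008                       # 8 µm expressed in millimetres
char_in_original_mm = char_on_film_mm * linear_reduction
print(area_reduction)                         # 62500
print(char_in_original_mm)                    # 2.0 (mm)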

♦ In 1971 Apollo 14 lunar module pilot Edgar D. Mitchell carried 100 of the microform bibles aboard the lunar module Antares, as confirmed by NASA's official manifest. Launched January 31, 1971, Mitchell and the bibles reached the Fra Mauro formation of the Moon on February 5 aboard the Antares before returning to the command module for the voyage back to Earth. This was the first edition of the Bible to reach the Moon, and probably the first book of any kind to reach the Moon and return. A second parcel containing 200 microform Bibles flew in Edgar Mitchell's command module "PPK" bag in lunar orbit, and did not land. These 200 copies represented extra Bibles to be used if something happened to the lunar module copies.


1970 – 1980

Acquiring New Archival Material at the Rate of 1 Mile per Year Circa 1970

During the 1970s The National Archives of Great Britain in Kew, Richmond, Surrey, measured the extent of its holdings by shelf length. It held about 80 miles of physical information, and acquired new material at the rate of about 1 mile per year.


Launching "Messages in a Bottle" into the Cosmic Ocean August 20, 1977 – September 5,

The Voyager Golden Records were included on the Voyager 1 and 2 spacecraft, launched on September 5, 1977 and August 20, 1977 respectively, as a kind of time capsule intended to communicate a story of our world to extraterrestrials. Each was a 12-inch gold-plated copper disk-shaped phonograph record containing sounds and images selected to portray the diversity of life and culture on Earth. The contents of the record were selected for NASA by a committee chaired by Carl Sagan of Cornell University. Sagan and associates assembled 115 images and a variety of natural sounds, such as those made by surf, wind and thunder, birds, whales, and other animals. To this they added musical selections from different cultures and eras, spoken greetings in fifty-five languages, and printed messages from President Jimmy Carter and U.N. Secretary General Kurt Waldheim.

Because it was believed that the Voyager spacecraft would not encounter another solar system for 40,000 years, the production of these records seems to have involved a naive faith in the permanent accessibility of analog data, and in the ability of such data to survive over extremely long periods of time.

"Each record is encased in a protective aluminum jacket, together with a cartridge and a needle. Instructions, in symbolic language, explain the origin of the spacecraft and indicate how the record is to be played. The 115 images are encoded in analog form. The remainder of the record is in audio, designed to be played at 16-2/3 revolutions per minute. It contains the spoken greetings, beginning with Akkadian, which was spoken in Sumer about six thousand years ago, and ending with Wu, a modern Chinese dialect. Following the section on the sounds of Earth, there is an eclectic 90-minute selection of music, including both Eastern and Western classics and a variety of ethnic music. Once the Voyager spacecraft leave the solar system (by 1990, both will be beyond the orbit of Pluto), they will find themselves in empty space. It will be forty thousand years before they make a close approach to any other planetary system. As Carl Sagan has noted, 'The spacecraft will be encountered and the record played only if there are advanced spacefaring civilizations in interstellar space. But the launching of this bottle into the cosmic ocean says something very hopeful about life on this planet' (http://voyager.jpl.nasa.gov/spacecraft/goldenrec.html, accessed 02-27-2011).


1980 – 1990

Flexible Image Transport System (FITS) 1981

D. C. Wells, E. W. Greisen, and R. H. Harten developed the open source FITS (Flexible Image Transport System), which was first standardized in 1981. It is

"a digital file format used to store, transmit, and manipulate scientific and other images. FITS is the most commonly used digital file format in astronomy. Unlike many image formats, FITS is designed specifically for scientific data and hence includes many provisions for describing photometric and spatial calibration information, together with image origin metadata.

"A major feature of the FITS format is that image metadata is stored in a human readable ASCII header, so that an interested user can examine the headers to investigate a file of unknown provenance. Each FITS file consists of one or more headers containing ASCII card images (80 character fixed-length strings) that carry keyword/value pairs, interleaved between data blocks. The keyword/value pairs provide information such as size, origin, coordinates, binary data format, free-form comments, history of the data, and anything else the creator desires: while many keywords are reserved for FITS use, the standard allows arbitrary use of the rest of the name-space" (Wikipedia article on FITS, accessed 03-24-2010).

Because of its special features FITS became a very useful format for the long-term preservation of digital images. It was adopted by NASA as a standard and was also adopted by the Vatican Library.
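
Because the header consists of plain ASCII keyword/value "cards," a FITS file can be written and inspected with a few lines of code. A minimal sketch using the widely available astropy library (the file name and header values below are placeholders, not taken from any source mentioned here) might look like this:

import numpy as np
from astropy.io import fits

# Write a minimal FITS file: an image array plus keyword/value header cards.
data = np.zeros((100, 100), dtype=np.int16)
hdu = fits.PrimaryHDU(data)
hdu.header["ORIGIN"] = "Example observatory"   # free-form provenance keyword
hdu.header["COMMENT"] = "Header cards are human-readable 80-character records"
hdu.writeto("example.fits", overwrite=True)

# Read it back and inspect the self-describing header.
with fits.open("example.fits") as hdul:
    hdul.info()                                 # list the header/data units
    header = hdul[0].header
    print(header["NAXIS1"], header["NAXIS2"])   # image dimensions from the header
    print(repr(header))                         # the ASCII card images themselves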


U.S. Newspaper Program Microfilms Newspapers 1982

In cooperation with the Library of Congress, in 1982 the National Endowment for the Humanities began funding the United States Newspaper Program—"a cooperative national effort among the states and the federal government to locate, catalog, and preserve on microfilm newspapers published in the United States from the eighteenth century to the present."


The Digital Domesday Project--Doomed to Early Digital Obsolescence 1984 – 1986

From 1984 to 1986 Acorn Computers Ltd, Philips, Logica and the BBC (with some funding from the European Commission's ESPRIT program) marked the 900th anniversary of the original Domesday Book—an 11th century census of England—with the multimedia BBC Domesday Project. This publication is frequently cited as an example of digital obsolescence.

The Project "included a new 'survey' of the United Kingdom, in which people, mostly school children, wrote about geography, history or social issues in their local area or just about their daily lives. This was linked with maps, and many colour photos, statistical data, video and 'virtual walks'. Over 1 million people participated in the project. The project also incorporated professionally-prepared video footage, virtual reality tours of major landmarks and other prepared datasets such as the 1981 census.

"The project was stored on adapted laserdiscs in the LaserVision Read Only Memory (LV-ROM) format, which contained not only analog video and still pictures, but also digital data, with 300 MB of storage space on each side of the disc. The discs were mastered, produced, and tested by Philips at their Eindhoven headquarters factory. Viewing the discs required an Acorn BBC Master expanded with an SCSI controller and an additional coprocessor controlled a Philips VP415 "Domesday Player", a specially-produced laserdisc player. The user interface consisted of the BBC Master's keyboard and a trackball (known at the time as a trackerball). The software for the project was written in BCPL (a precursor to C), to make cross platform porting easier, although BCPL never attained the popularity that its early promise suggested it might.

"In 2002, there were great fears that the discs would become unreadable as computers capable of reading the format had become rare (and drives capable of accessing the discs even more rare). Aside from the difficulty of emulating the original code, a major issue was that the still images had been stored on the laserdisc as single-frame analogue video, which were overlaid by the computer system's graphical interface. The project had begun years before JPEG image compression and before truecolour computer video cards had become widely available.

"However, the BBC later announced that the CAMiLEON project (a partnership between the University of Leeds and University of Michigan) had developed a system capable of accessing the discs using emulation techniques. CAMiLEON copied the video footage from one of the extant Domesday laserdiscs. Another team, working for the UK National Archives (who hold the original Domesday Book) tracked down the original 1-inch videotape masters of the project. These were digitised and archived to Digital Betacam.

"A version of one of the discs was created that runs on a Windows PC. This version was reverse-engineered from an original Domesday Community disc and incorporates images from the videotape masters. It was initially available only via a terminal at the National Archives headquarters in Kew, Surrey but has been available since July 2004 on the web.

"The head of the Domesday Project, Mike Tibbets, has criticized the bodies to which the archive material was originally entrusted" (Wikipedia article on BBC Domesday Project, accessed 12-21-2008).

The Perseus Digital Library Project at Tufts University Begins 1985

The Perseus Digital Library Project began at Tufts University, Medford/Somerville, Massachusetts in 1985. Though the project began with Greek and Roman literature and culture, it evolved into an exploration of the ways that digital collections could enhance scholarship with new research tools that take libraries and scholarship beyond the physical book. The following quote came from their website around 2010:

"Since planning began in 1985, the Perseus Digital Library Project has explored what happens when libraries move online. Two decades later, as new forms of publication emerge and millions of books become digital, this question is more pressing than ever. Perseus is a practical experiment in which we explore possibilities and challenges of digital collections in a networked world.

"Our flagship collection, under development since 1987, covers the history, literature and culture of the Greco-Roman world. We are applying what we have learned from Classics to other subjects within the humanities and beyond. We have studied many problems over the past two decades, but our current research centers on personalization: organizing what you see to meet your needs.

"We collect texts, images, datasets and other primary materials. We assemble and carefully structure encyclopedias, maps, grammars, dictionaries and other reference works. At present, 1.1 million manually created and 30 million automatically generated links connect the 100 million words and 75,000 images in the core Perseus collections. 850,000 reference articles provide background on 450,000 people, places, organizations, dictionary definitions, grammatical functions and other topics."

In December 2013 I found this description of their activities on their website:

"Perseus has a particular focus upon the Greco-Roman world and upon classical Greek and Latin, but the larger mission provides the distant, but fixed star by which we have charted our path for over two decades. Early modern English, the American Civil War, the History and Topography of London, the History of Mechanics, automatic identification and glossing of technical language in scientific documents, customized reading support for Arabic language, and other projects that we have undertaken allow us to maintain a broader focus and to demonstrate the commonalities between Classics and other disciplines in the humanities and beyond. At a deeper level, collaborations with colleagues outside of classical studies make good on the claim that a classical education generally provides those critical skills and that intellectual adaptability that we claim to instill in our students. We offer the combination of classical and non-classical projects that we pursue as one answer to those who worry that a classical education will leave them or their children with narrow, idiosyncratic skills.

"Within this larger mission, we focus on three categories of access:

Human readable information: digitized images of objects, places, inscriptions, and printed pages, geographic information, and other digital representations of objects and spaces. This layer of functionality allows us to call up information relevant to a longitude and latitude coordinate or a library call number. In this stage digital representations provide direct access to the physical senses of actual people in particular places and times. In some cases (such as high resolution, multi-spectral imaging), digital sources already provide better physical access than has ever been feasible when human beings had direct contact with the physical artifact.

"Machine actionable knowledge: catalogue records, encyclopedia articles, lexicon entries, and other structured information sources. Physical access can serve our senses but provides no information about what we are encountering - in effect, physical access is like visiting a historical site about which we may know nothing and where any visible documentation is in a language that we cannot understand. Machine actionable knowledge allows us to retrieve information about what we are viewing. Thus, if we encounter a page from a Greek manuscript of Homer, we could at this stage find cleanly printed modern editions of the Greek, modern language translations, commentaries and other background information about the passage on that manuscript page. If we moved through a virtual Acropolis, we could retrieve background information about the buildings and the sculpture.

"Machine generated knowledge: By analyzing existing information automated systems can produce new knowledge. Machine actionable knowledge allows, for example, us to look up a dictionary entry (e.g., facio, "to do, make") in a dictionary or to find pre-existing translations for a passage in Latin or Greek. Machine generated knowledge allows a machine to recognize that fecisset is a pluperfect subjunctive form of facio and to provide reading support where there is no pre-existing human translation. Such reading support might include full machine translation but also finer grained services such as word and phrase translation (e.g., recognizing whetherorationes in a given context more likely corresponds to English "speeches," "prayers" or some other term), syntactic analysis (e.g., recognizing that orationes in a given passage is the object of a given verb), named entity identification (e.g., identifying Antonium in a given passage as a personal name and then as a reference to Antonius the triumvir)." 

Among the Earliest Practical Digital Libraries 1985

In 1985 an IBM team began scanning the papers related to Columbus' discovery of the New World at the Archivo General de Indias (AGI) in Seville, Spain.

"To coincide with the 500th anniversary of Columbus' landfall in the West Indies, the AGI project was to capture 10% of the collection estimated to consist of 86,000,000 pages. By 1992, it had indeed collected about 9,000,000 digital image pages onto optical disks, together with a set of finding aids." This was among the earliest practical digital libraries.

Rediscovery of Electronic Images Created by Andy Warhol on an Amiga Computer 1985

On April 25, 2014 The Andy Warhol Museum (The Warhol) in Pittsburgh announced the discovery of previously unknown artistic experiments created by Warhol on an Amiga computer in 1985. The electronic images, preserved on floppy disks, were commissioned by Amiga to demonstrate the graphic capabilities of their personal computer. They were found along with the computer and manuals that Warhol used. When this story was published I thought it was the first instance in which electronic imagery by a major artist was restored from obsolete software and an obsolete computer.

The First Digital Image Database of Cultural Materials 1987

To photograph, store, and organize the work of the painter Andrew Wyeth in Chadds Ford, Pennsylvania, Fred Mintzer, Henry Gladney, and colleagues at IBM developed in 1987 a high-resolution digital camera for photographing works of art and a PC-based database system to store and index the images. Wyeth's staff used the system to capture and catalog about 10,000 images. "Pictures were scanned at a spatial resolution of 2500 by 3000 pixels and a color depth of 24 bits-per-pixel, and were color calibrated." This was the first digital image database of cultural materials.

"Slow Fires: On the Preservation of the Human Record" 1987

In 1987 American filmmaker Terry Sanders, the American Film Foundation, and the Council on Library and Information Resources issued Slow Fires: On the Preservation of the Human Record, a film narrated by Robert MacNeil. The AFF characterized the film as:

"The unforgettable story of the deterioration and destruction of our world’s intellectual heritage and the global crisis in preserving library materials. . . .

"Millions of pages of paper in books, photographs, drawings and maps are disintegrating and turning to dust. This remarkable film provides a comprehensive assessment of the worldwide situation, demonstrates methods of restoration and preservation and suggests ways to prevent new documents from facing ultimate destruction".

In December 2013 the film could be purchased on DVD from the American Film Foundation's website, in 33- and 58-minute versions.

1990 – 2000

The Memory of the World Program 1992

In 1992 UNESCO launched the Memory of the World Program, an international initiative to guard against collective amnesia, by promoting preservation and dissemination of valuable archive holdings and library collections worldwide.

Preserving Access to Digital Information 1993

At the Towards Federation 2001 (TF2001) meeting in 1993 a group from the Australian library and archives sectors was organized to develop appropriate guidelines for the preservation of information in electronic form. This evolved into the National Library of Australia's Preserving Access to Digital Information Initiative (PADI).

The Electronic Beowulf 1993

In 1993 the British Library and Kevin S. Kiernan of the University of Kentucky embarked on the Electronic Beowulf project, an effort to photograph and publish high-resolution electronic copies of the sole surviving manuscript of the poem. The Electronic Beowulf was a pioneering effort in the digital preservation, restoration, and dissemination of manuscript material.

"The equipment we are using to capture the images is the Roche/Kontron ProgRes 3012 digital camera, which can scan any text, from a letter or a word to an entire page, at 2000 x 3000 pixels in 24-bit color. The resulting images at this maximum resolution are enormous, about 21-25 MB, and tax the capabilities of the biggest machines. Three or four images - three or four letters or words if that is what we are scanning - will fill up an 88 MB hard disk, and we have found that no single image of this size can be processed in real time without at least 64 MB of RAM. In our first experiments in June with the camera and its dedicated hardware, we transmitted a half-dozen images by phone line from the Conservation Studio of the British Library to the Wenner Gren Imaging Laboratory at the University of Kentucky, where identical hardware was set up to receive the data. Most of these images are now available on the Internet through anonymous ftp or Mosaic."

The First Sourcebook on Digital Libraries? December 6, 1993

In December 1993 Edward A. Fox of Virginia Polytechnic Institute and State University (Virginia Tech) issued Sourcebook on Digital Libraries. Version 1.0. The earliest reference in the bibliography is the April 1991 issue of Byte magazine. Most other references are to works published in 1992.

Digital Library: Gross Structure and Requirements March 1, 1994

On March 1, 1994 "A one-day, constrained-size workshop addendum to the annual CAIA conference" was held in San Antonio, Texas, on the emerging topic of digital libraries. It issued the report: Digital Library: Gross Structure and Requirements.

The Digital Library Federation is Founded May 1, 1994

On May 1, 1994 the directors of 15 major academic libraries in the United States, the President of the Commission on Preservation and Access, and the Council on Library and Information Resources founded The Digital Library Federation for:

"The implementation of a distributed, open digital library conforming to the overall theme and accessible across the global Internet. This library shall consist of collections--expanding over time in number and scope -- to be created from the conversion to digital form of documents contained in our and other libraries and archives, and from the incorporation of holdings already in electronic form."

The First Conference on the Theory and Practice of Digital Libraries June 19 – June 21, 1994

The first annual conference on the Theory and Practice of Digital Libraries met at College Station, Texas from June 19-21, 1994.

The NSF Digital Libraries Initiative: The Origins of Google September 1, 1994

In 1994 the National Science Foundation Digital Libraries Initiative made its first six awards. 

"DLI and DLI-2: 1994-2003. From 1994 to 1998, NSF, DARPA and NASA funded six digital library projects in the $30 million Phase 1 of the Digital Libraries Initiative. In 1999, NSF, DARPA, the National Library of Medicine, the Library of Congress, NASA and the National Endowment for the Humanities, with participation from the National Archives and the Smithsonian Institution, provided $55 million for Phase 2 (DLI-2). DLI-2 funded 36 projects to extend and develop innovative digital library technologies and applications" (http://www.nsf.gov/news/special_reports/cyber/digitallibraries.jsp, accessed 11-15-2013).

One of the six initial awards, funded on September 1, 1994, was for The Stanford Integrated Digital Library Project, in which Larry Page and Sergey Brin participated.

"This project . . . is to develop the enabling technologies for a single, integrated and universal' library, proving uniform access to the large number of emerging networked information sources and collections. These include both on-line versions of pre-existing works and new works and media of all kinds that will be available on the globally interlinked computer networks of the future. The Integrated Digital Library is broadly defined to include everything from personal information collections, to the collections that one finds today in conventional libraries, to the large data collections shared by scientists. The technology developed in this project will provide the "glue" that will make this worldwide collection usable as a unified entity, in a scalable and economically viable fashion."

The National Digital Library Program is Announced October 13, 1994

On October 13, 1994 the Library of Congress announced The National Digital Library Program.

The Task Force on Digital Archiving is Created December 1994

In December 1994 the Commission on Preservation and Access and the Research Libraries Group (RLG), Mountain View, California, created the Task Force on Digital Archiving. The purpose of the Task Force was to investigate the means of ensuring “continued access indefinitely into the future of records stored in digital electronic form.” On May 1, 1996 the group issued its report: Preserving Digital Information.

An Online Searchable Archive of Over 1000 Academic Journals 1995

JSTOR (short for Journal Storage), an online system for archiving academic journals, was founded in 1995.  In 2012 it provided online searchable texts of more than 1000 academic journals to member educational institutions. 

"JSTOR was originally conceived as a solution to one of the problems faced by libraries, especially research and university libraries, due to the increasing number of academic journals in existence. The founder, William G. Bowen, was the president of Princeton University from 1972 to 1988. Most libraries found it prohibitively expensive in terms of cost and space to maintain a comprehensive collection of journals. By digitizing many journal titles, JSTOR allowed libraries to outsource the storage of these journals with the confidence that they would remain available for the long term. Online access and full-text search ability improved access dramatically. JSTOR originally encompassed ten economics and history journals and was initiated in 1995 at seven different library sites. As of November 2010, there were 6,425 participating libraries. JSTOR access was improved based on feedback from these sites and it became a fully searchable index accessible from any ordinary Web browser. Special software was put in place to make pictures and graphs clear and readable.

"With the success of this limited project, Bowen and Kevin Guthrie, then-president of JSTOR, were interested in expanding the number of participating journals. They met with representatives of the Royal Society of London, and an agreement was made to digitize the Philosophical Transactions of the Royal Society back to its beginning in 1665. The work of adding these volumes to JSTOR was completed by December 2000. As of November 2, 2010, the database contained 1,289 journal titles in 20 collections representing 53 disciplines, and 303,294 individual journal issues, totaling over 38 million pages of text (Wikipedia article on JSTOR, accessed 01-12-2012).

D-Lib Magazine Begins July 1995

In July 1995 the Corporation for National Research Initiatives, with sponsorship from DARPA, began web-only publication of D-Lib Magazine, the magazine of the Digital Library Forum.

The Kulturarw3 Project 1996

In 1996 the National Library of Sweden (Kungl. Biblioteket) initiated the Kulturarw3 Project - The Royal Swedish Web Archiw3e.

Brewster Kahle Founds the Internet Archive 1996

In 1996 computer engineer, Internet entrepreneur, activist, and digital librarian Brewster Kahle founded the Internet Archive in San Francisco.  After the modern Bibliotheca Alexandrina opened in 2002 the Internet Archive established a mirror site at that historic location.

LexisNexis Exceeds One Billion Documents 1996

In 1996 the database of LexisNexis online services of Miamisburg, Ohio, exceeded one billion documents.

The First ACM International Conference on Digital Libraries March 20 – March 23, 1996

The first ACM International Conference on Digital Libraries occurred in Bethesda, Maryland from March 20-23, 1996.

The IEEE Technical Committee on Digital Libraries is Established 1997

In 1997 the IEEE Computer Society established the Technical Committee on Digital Libraries. "It is to promote research in the theory and practice of all aspects of Collective Memory, i.e. the fields of Digital Libraries, Digital Museums and Digital Archives of all kinds."

The California Digital Library is Founded 1997

At a news conference in San Francisco in 1997 the California Digital Library was founded "by University of California President Emeritus Richard Atkinson to build the University's digital library, assist campus libraries with sharing their resources and holdings more effectively, and provide leadership in applying technology to the development of library collections and services."

Origins of Australia's Web Archive 1998

In 1998 the National Library of Australia, Canberra, initiated its Digital Services Project with the goal of establishing a web archive. This evolved into PANDORA, Australia's Web Archive.

NARA Begins ERA for Preservation of Digital Archives 1998

In 1998 the U.S. National Archives and Records Administration (NARA) began the Electronic Records Archives Program (ERA) for the eventual preservation of digital archives.

On the Preservation of Knowledge in the Electronic Age 1998

In 1998 American filmmaker Terry Sanders, the American Film Foundation, the Council on Library and Information Resources, and the American Council of Learned Societies issued Into the Future: On the Preservation of Knowledge in the Electronic Age.

This film, narrated by Robert MacNeil, was a sequel to Slow Fires (1987). It "explores the hidden crisis of the digital information age. Will digitally stored information and knowledge survive into the future? Will humans twenty, fifty, one hundred years from now have access to the electronically recorded history of our time?" (from the American Film Foundation blurb; it was available in 33- and 58-minute versions on July 28, 2009). The film included interviews with Peter Norton and Tim Berners-Lee.

The Digital Michelangelo Project 1998

Marc Levoy and team began The Digital Michelangelo Project at Stanford University in 1998 using laser scanners to digitize the statues of Michelangelo, as well as 1,163 fragments of the Forma Urbis Romae, a giant marble map of ancient Rome.

The quality of the scans was so high that the Italian government would not permit the release of the full data set on the Internet; however, the Stanford researchers built a system called ScanView that allowed viewing of details of specific parts of the statue, including parts that would be inaccessible to a normal museum visitor. In December 2013 ScanView could still be downloaded from the project's website.

The laser scan data for Michelangelo's David was utilized in its cleaning and restoration that began in September 2002. This eventually resulted in a 2004 book entitled Exploring David: Diagnostic Tests and State of Conservation.

"In preparation for this restoration, the Galleria dell'Accademia undertook an ambitious 10-year program of scientific study of the statue and its condition. Led by Professor Mauro Matteini of CNR-ICVBC, a team of Italian scientists studied every inch of the statue using color photography, radiography (i.e. X-rays), ultraviolet fluorescence and thermographic imaging, and several other modalities. In addition, by scraping off microsamples and performing in-situ analyses, the mineralogy and chemistry of the statue and its contaminants were characterized. Finally, finite element structural analyses were performed to determine the origin of hairline cracks that are visible on his ankles and the tree stump, to decide if intervention was necessary. (They decided it wasn't; these cracks arose in 1871, when the statue briefly tilted forward 3 degrees due to settling of the ground in the Piazza Signoria. This tilt was one of the reasons they moved the statue to the Galleria dell'Accademia.)  

"The results of this diagnostic campaign are summarized in the book Exploring David . . . . The book, written in English, also contains a history of the statue and its past restorations, a visual analysis of the chisel marks of Michelangelo as evident from the statue surface, and an essay by museum director Franca Falletti on the difficulties of restoring famous artworks. . . .  

"Aside from its sweeping scientific vision, what is remarkable about this book is that many of the studies employed a three-dimensional computer model of the statue - the model created by us during the Digital Michelangelo Project. Although we worked hard to create this model, and we envisioned 3D models eventually being used to support art conservation, we did not expect such uses to become practical so soon. After all, our model of the David is huge; outside our laboratory and a few others in the computer graphics field, little software exists that can manipulate such large models. However, with help from Roberto Scopigno and his team at CNR-Pisa, museum director Franca Falletti prodded, encouraged, and cajoled the scientists working under her direction to use our model wherever possible. We contributed a chapter to this book, on the scanning of the statue, but we take no credit for its use in the rest of the book. In fact, to us at Stanford University, the timing of our scanning project relative to the statue's restoration and the creation of this book seems merely fortuitious. However, Falletti insists that she had this use of our model in mind all along! In any case, this is a landmark book - the most extensive use that has ever been made of a 3D computer model in an art conservation project" (http://graphics.stanford.edu/projects/mich/book/book.html, accessed 12-23-2009).

On July 21, 2009 the team announced that they had a "full-resolution (1/4mm) 3D model of Michelangelo's 5-meter statue of David", containing "about 1 billion polygons."
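The 2009 figure gives a feel for why such models strained contemporary software: even a compact binary mesh of a billion triangles runs to tens of gigabytes. The sketch below assumes a plain indexed triangle mesh with 32-bit floats and 32-bit indices; these are ordinary illustrative assumptions, not the project's actual file format.

```python
# Back-of-the-envelope storage for a billion-triangle mesh such as the
# full-resolution David model described above. The layout assumed here
# (32-bit floats, 32-bit indices, no per-vertex attributes) is illustrative,
# not the Digital Michelangelo Project's own format.

triangles = 1_000_000_000
vertices = triangles // 2     # a closed triangle mesh has roughly half as many vertices as faces

vertex_bytes = 3 * 4          # x, y, z as 32-bit floats
triangle_bytes = 3 * 4        # three 32-bit vertex indices per triangle

total_gb = (vertices * vertex_bytes + triangles * triangle_bytes) / 1e9
print(f"About {total_gb:.0f} GB before attributes or compression")   # ~18 GB
```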

Storing Public Records Electronically 1999

In 1999 the British Government issued a white paper entitled Modernising Government, setting among its goals that by 2004 all newly created public records would be electronically stored and retrieved. 

"Lots of Copies Keep Stuff Safe" (LOCKSS): a Digital Information Preservation System for Libraries and Publishers 1999

In 1999 the LOCKSS open-source, library-led digital preservation system ("Lots of Copies Keep Stuff Safe") began intensive testing at Stanford University. The system provides libraries with:

"low-cost, open source digital preservation tools to preserve and provide access to persistent and authoritative digital content.

 "Librarians need to know that the digital content they acquire today will not disappear when they cancel subscriptions, and that their electronic collections can be preserved and accessed by readers far into the future.

"Publishers need to know that the integrity of their web-based content will remain unchanged and available in perpetuity—even if their own website is no longer available." 

"The LOCKSS system is the first and only mechanism to apply the traditional purchase-and-own library model to electronic materials. The LOCKSS system allows librarians at each institution to take custody of and preserve access to the e-content to which they subscribe, restoring the print purchase model with which librarians are familiar. Using their computers and network connections, librarians can obtain, preserve and provide access to purchased copies of e-content. This is analogous to libraries’ using their own buildings, shelves and staff to obtain, preserve and provide access to paper content. The LOCKSS model restores libraries’ ability to build and preserve local collections.

A LOCKSS network functions in much the same way as traditional library networks, reinforcing the strength of the library community. Participating libraries acquire copies of important “stuff,” but instead of paper, LOCKSS libraries acquire digital content in their local LOCKSS Box. Through a LOCKSS distributed network, libraries are cooperating with one other to ensure their preserved content remains authentic and authoritative. This collaboration measure and validates the integrity of the participants’ holdings. As a result, libraries are self-reliant and self-sustainable in their communities.

When the publisher’s web site is unavailable for any reason, content is served from the library’s “LOCKSS Box,” guaranteeing immediate and continuous user access. There are no “trigger events” that require human intervention. LOCKSS delivers a copy of the original publication to authorized users in real time, whenever it is needed. Because LOCKSS preserves the original publisher’s copy of each item, it ensures that the most authoritative version persists, unchanged, with full credit to the publisher" (http://www.lockss.org/about/what-is-lockss/, accessed 12-08-2013).
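The core idea (many libraries holding independent copies and checking them against one another) can be illustrated in a few lines of code. The sketch below is a deliberately simplified, hypothetical illustration of majority comparison of content hashes; the real LOCKSS polling protocol is far more elaborate, with nonced polls, repair from peers, and defenses against misbehaving nodes.

```python
# Simplified, hypothetical illustration of the idea behind LOCKSS-style
# integrity polling: each participating library hashes its copy of an item,
# and a copy whose digest disagrees with the majority is flagged for repair.
# The real LOCKSS protocol is far more sophisticated than this sketch.

import hashlib
from collections import Counter

def digest(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def find_damaged_copies(copies: dict) -> list:
    """Return the names of peers whose copy disagrees with the majority digest."""
    digests = {peer: digest(data) for peer, data in copies.items()}
    majority_digest, _ = Counter(digests.values()).most_common(1)[0]
    return [peer for peer, d in digests.items() if d != majority_digest]

copies = {
    "library_a": b"e-journal issue, volume 1",
    "library_b": b"e-journal issue, volume 1",
    "library_c": b"e-journal issue, volume 1 ",   # silently corrupted copy
}
print(find_damaged_copies(copies))   # ['library_c'] -- repair it from the intact peers
```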

Continuing to Print the British Parliamentary Papers on Vellum November 2, 1999

On November 2, 1999 an unlikely alliance of disgruntled Labour backbenchers and Tories in the British Parliament defeated, by 121 votes to 53, a move to end the centuries-old tradition of printing copies of Acts of Parliament on vellum. Remarkably, the vote shows that the centuries-old debate over whether paper or vellum is the more permanent material for storing information was still alive.

"Under the scheme, already approved by the Lords, instead of two copies printed on vellum, only one would be produced on archive paper which has a life expectancy of 500 years.

“Labour's Nick Palmer, a Commons administration committee member, urged MPs to approve the change - which would have saved £30,000 a year and the skin of several goats.

“But opposition to it was led by Labour's Brian White (Milton Keynes NE) who said it would almost certainly put 12 people at William Cowley, a parchment and vellum printing company in his constituency, out of work and mean the death of the industry in Britain.

"He claimed the committee had not consulted the firm about the change until it was too late, and urged MPs to find a "different way forward that doesn't destroy an industry".

“Acts of Parliament dating back to 1497 recorded on vellum are currently held in the House of Lords Public Record Office.

“Under the proposed change duplicate copies of Acts of Parliament would also no longer be placed in the Public Record Office at Kew, replacing a resolution decreed in 1849 that two copies of every Act should be printed on vellum.

“Opening the short debate, Dr Palmer (Broxtowe) said the committee considered the change "appropriate and justifiable".

“Continuing to deposit duplicate record copies of both public and private Acts at the Public Record Offices appeared to "serve no useful purpose".

“Dr Palmer dismissed concerns about the durability of archive paper compared with vellum as "groundless".

“He said vellum and archive paper were both flammable so security could not depend alone on the document.

“Dr Palmer said he found it "attractive" that Parliament would not be using animal products where it was not necessary, although it was not one of the arguments advanced by the committee report.

“'We didn't have sentiment or animal welfare consideration affecting our judgment here, we reached it for practical, you might even say prosaic, reasons,' he said.

“Dr Palmer said British Library conservation department laboratory tests had proved that archival paper could have a life expectancy exceeding 500 years.

“But Tory Gerald Howarth (Aldershot) said: "I don't believe that this kind of tradition should lightly be tossed aside."

“Mr Howarth said the death warrant of Charles I was recorded on vellum and added: 'Who is to say whether archival paper will last 300 to 400 years? We shouldn't take the chance.' "

(Quoted from BBC News, http://news.bbc.co.uk/2/hi/uk_news/politics/502342.stm, accessed 12-04-2008.)

2000 – 2005

Over 5,000,000 Items in the National Digital Library Program 2000

By 2000 the National Digital Library Program, sponsored by the Library of Congress, had digitized and made available online over 5,000,000 items.

MINERVA to Preserve Open-Access Web Resources 2000

In 2000 the Library of Congress initiated a prototype system called MINERVA (Mapping the Internet Electronic Resources Virtual Archive) to collect and preserve open-access Web resources.

National Digital Information Infrastructure and Preservation Program December 21, 2000

On December 21, 2000 the U.S. Congress appropriated $99,800,000 for the planning and implementation of the National Digital Information Infrastructure and Preservation Program (NDIIPP), a collaborative initiative led by the Library of Congress.

High Density Rosetta Archival Preservation Technology 2001

"NanoRosetta archival preservation technology utilizes unique microscopic processes to provide analog and / or digital data, including information as text, line illustrations, or photos on nickel plates. Density can be as high as 50,000 pages per plate."

In 2001 Norsam Technologies, Santa Fe, New Mexico, developed High Density Rosetta (HD-Rosetta) archival preservation technology, which "uses unique microscopic processes to provide analog and/or digital data, information or pictures on nickel plates." Density could be 20 times that of microfilm/microfiche. 

196,000 pages of text could be etched with an electron microscope on a two square-inch plate. 

"Benefits of the HD-ROSETTA Nickel Tablet System:

"Few environmental controls required

"Immune to technology obsolescence

"High temperature tolerance

"Immune to water damage

"Unaffected by electromagnetic radiation

"Highly durable over long periods of time."

The Digital Preservation Coalition January 2001

In January 2001 the Digital Preservation Coalition was established in Heslington, York, United Kingdom "to foster joint action to address the urgent challenges of securing the preservation of digital resources in the UK and to work with others internationally to secure our global digital memory and knowledge base."

A Reference Model for an Open Archival Information System January 2001

In January 2001 The Consultative Committee for Space Data Systems (CCSDS), Washington, D.C., issued Reference Model for an Open Archival Information System (OAIS).

"An OAIS is an archive, consisting of an organization of people and systems, that has accepted the responsibility to preserve information and make it available for a Designated Community. It meets a set of such responsibilities as defined in this document and this allows an OAIS archive to be distinguished from other uses of the term ‘archive’. The model provides a framework for the understanding and increased awareness of archival concepts needed for long-term digital information preservation and access, and for describing and comparing architectures and operations of existing and future archives. It also guides the identification and production of OAIS related standards." ISO Number : 1472

Double Fold: Libraries and the Assault on Paper April 2001

In April 2001 American writer Nicholson Baker, of South Berwick, Maine, published Double Fold: Libraries and the Assault on Paper. Prior to the book's publication an excerpt appeared in the July 24, 2000 issue of The New Yorker under the title "Deadline: The Author's Desperate Bid to Save America's Past."

Baker's exhaustively researched polemic detailed his quest to expose the fate of thousands of books and newspapers that were replaced and often destroyed during the microfilming boom of the 1980s and '90s.

"The term 'double fold' refers to the test used by many librarians and preservation administrators to determine the brittleness and 'usability' of paper. The test consists of folding down the corner of a page of a book or newspaper, then folding it back in the opposite direction—one double fold. The action is then repeated until the paper breaks or is about to break. The more folds the page can withstand, the more durable it is. (In the late 1960s, preservation founding father William Barrow was fond of using a machine-run fold tester to back up his claims about the number of endangered books.) This experiment was used by library officials to identify their institution's brittle books, and, in some case, to justify withdrawing items from the shelves or replacing them with another format (most often microfilm). Baker's take on the double-fold test? '...utter horseshit and craziness. A leaf of a book is a semi-pliant mechanism. It was made for non-acute curves, not for origami.' (p. 157)"

"In 1999, Baker took matters into his own hands and founded the American Newspaper Repository in order to save some of the collections being auctioned off by the British Library. A year later he became the owner of thousands of volumes of old newspapers, including various runs of the New York Times, the Chicago Tribune, the New York Herald Tribune, and the New York World. In May 2004 the entire collection was moved to Duke University, where it is stored on climate-controlled shelves and looked after by the Rare Books and Special Collections division. As part of the gift agreement between the American Newspaper Repository and Duke, the collection will kept together in perpetuity, and no disbinding or experimental deacidification will be allowed.

"Baker makes four recommendations in Double Fold's epilogue: that libraries should be required to publish lists of discarded holdings on their websites, that the Library of Congress should fund a building that will serve as a storage repository for publications and documents not housed on-site, that some U.S. libraries should be designated with saving newspapers in bound form, and that both the U.S. Newspaper and the Brittle Books Programs should be abolished, unless they can promise that all conservation procedures will be non-destructive and that originals will be saved" (Wikipedia article on Double Fold, accessed 07-28-2009).

"The Wayback Machine" Becomes Operational October 24, 2001

On October 24, 2001 The Internet Archive first made its retrospective data available through the Wayback Machine. The name Wayback Machine is a droll reference to a plot device in the animated cartoon series The Rocky and Bullwinkle Show, in which Mr. Peabody and Sherman routinely used a time machine called the "WABAC machine" (pronounced "Wayback") to witness, participate in, and, more often than not, alter famous events in history.

"In 1996 Brewster Kahle, with Bruce Gilliat, developed software to crawl and download all publicly accessible World Wide Web pages, the Gopher hierarchy, the Netnews bulletin board system, and downloadable software. The information collected by these "crawlers" does not include all the information available on the Internet, since much of the data is restricted by the publisher or stored in databases that are not accessible. These "crawlers" also respect the robots exclusion standard for websites whose owners opt for them not to appear in search results or be cached. To overcome inconsistencies in partially cached websites, Archive-It.org was developed in 2005 by the Internet Archive as a means of allowing institutions and content creators to voluntarily harvest and preserve collections of digital content, and create digital archives.

"Information was kept on digital tape for five years, with Kahle occasionally allowing researchers and scientists to tap into the clunky database.When the archive reached its five-year anniversary, it was unveiled and opened to the public in a ceremony at the University of California" (Wikipedia article on Wayback Machine, accessed 12-06-2013).

Trusted Digital Repositories: Attributes and Responsibilities May 2002

In May 2002 RLG in Mountain View, California, and OCLC in Dublin, Ohio issued the report, Trusted Digital Repositories: Attributes and Responsibilities.

How Much Information? 2003

How Much Information? 2003, the research project from the University of California at Berkeley first published on the web in 2000, updated its findings in 2003. Strikingly, it estimated that almost 800 MB of recorded information was produced per person worldwide each year, more than three times the per-capita figure the same project had reported in its 2000 study. The remaining data in this entry of the database is quoted from the 2003 website; a short arithmetic check of the headline figures follows the quotation:

"How much new information is created each year? Newly created information is stored in four physical media -- print, film, magnetic and optical --and seen or heard in four information flows through electronic channels -- telephone, radio and TV, and the Internet. This study of information storage and flows analyzes the year 2002 in order to estimate the annual size of the stock of new information recorded in storage media, and heard or seen each year in information flows. Where reliable data was available we have compared the 2002 findings to those of our 2000 study (which used 1999 data) in order to describe a few trends in the growth rate of information.

  1. Print, film, magnetic, and optical storage media produced about 5 exabytes of new information in 2002. Ninety-two percent of the new information was stored on magnetic media, mostly in hard disks.
    • How big is five exabytes? If digitized with full formatting, the seventeen million books in the Library of Congress contain about 136 terabytes of information; five exabytes of information is equivalent in size to the information contained in 37,000 new libraries the size of the Library of Congress book collections.
    • Hard disks store most new information. Ninety-two percent of new information is stored on magnetic media, primarily hard disks. Film represents 7% of the total, paper 0.01%, and optical media 0.002%.
    • The United States produces about 40% of the world's new stored information, including 33% of the world's new printed information, 30% of the world's new film titles, 40% of the world's information stored on optical media, and about 50% of the information stored on magnetic media.
    • How much new information per person? According to the Population Reference Bureau, the world population is 6.3 billion, thus almost 800 MB of recorded information is produced per person each year. It would take about 30 feet of books to store the equivalent of 800 MB of information on paper.
  2. We estimate that the amount of new information stored on paper, film, magnetic, and optical media has about doubled in the last three years.
    • Information explosion? We estimate that new stored information grew about 30% a year between 1999 and 2002.
    • Paperless society? The amount of information printed on paper is still increasing, but the vast majority of original information on paper is produced by individuals in office documents and postal mail, not in formally published titles such as books, newspapers and journals.
  3. Information flows through electronic channels -- telephone, radio, TV, and the Internet -- contained almost 18 exabytes of new information in 2002, three and a half times more than is recorded in storage media. Ninety eight percent of this total is the information sent and received in telephone calls - including both voice and data on both fixed lines and wireless.
    • Telephone calls worldwide – on both landlines and mobile phones – contained 17.3 exabytes of new information if stored in digital form; this represents 98% of the total of all information transmitted in electronic information flows, most of it person to person.
    • Most radio and TV broadcast content is not new information. About 70 million hours (3,500 terabytes) of the 320 million hours of radio broadcasting is original programming. TV worldwide produces about 31 million hours of original programming (70,000 terabytes) out of 123 million total hours of broadcasting.
    • The World Wide Web contains about 170 terabytes of information on its surface; in volume this is seventeen times the size of the Library of Congress print collections.
    • Instant messaging generates five billion messages a day (750GB), or 274 Terabytes a year.
    • Email generates about 400,000 terabytes of new information each year worldwide.
    • P2P file exchange on the Internet is growing rapidly. Seven percent of users provide files for sharing, while 93% of P2P users only download files. The largest files exchanged are video files larger than 100 MB, but the most frequently exchanged files contain music (MP3 files).
    • How we use information. Published studies on media use say that the average American adult uses the telephone 16.17 hours a month, listens to radio 90 hours a month, and watches TV 131 hours a month. About 53% of the U.S. population uses the Internet, averaging 25 hours and 25 minutes a month at home, and 74 hours and 26 minutes a month at work – about 13% of the time."
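The study's headline figures are internally consistent, as the short check below shows. The 5-exabyte total, the 6.3 billion population, the 136-terabyte Library of Congress estimate, and the 17 million books are all taken from the quotation above; only the arithmetic is added here.

```python
# Checking the per-capita and Library of Congress comparisons quoted above.

EB, TB, MB = 10**18, 10**12, 10**6

new_stored_info = 5 * EB          # new information stored in 2002
world_population = 6.3e9
loc_print_collection = 136 * TB   # 17 million books, fully formatted
loc_books = 17e6

print(f"Per person per year: {new_stored_info / world_population / MB:.0f} MB")  # ~794 MB ('almost 800 MB')
print(f"LoC-sized libraries: {new_stored_info / loc_print_collection:,.0f}")     # ~36,765 ('37,000 libraries')
print(f"Per book: {loc_print_collection / loc_books / MB:.0f} MB")               # ~8 MB per digitized book
```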

Collecting and Preserving the World Wide Web February 23, 2003

On February 23, 2003 Michael Day of UKOLN, University of Bath, published a comprehensive review of worldwide projects for preservation of web data: Collecting and Preserving the World Wide Web.

The First Automatic Page-Turning Scanner: "Moving Knowledge from Books to Bytes" April 7 – April 9, 2003

At the AIIM Exhibition in New York City on April 7, 2003 Lotfi Belkhir (formerly of the Venture Lab at Xerox) introduced the Kirtas BookScan 1200, produced by Kirtas Technologies of Victor, New York.

The BookScan 1200 was the first automatic, page-turning scanner for the conversion of bound volumes to digital files. The manufacturers claimed that it could scan volumes at up to 1200 pages per hour. The motto of the company was "Moving knowledge from Books to Bytes."

Netpreserve.org is Founded July 2003

In July 2003 the International Internet Preservation Consortium (IIPC), netpreserve.org, was founded.

"In July 2003 the national libraries of Australia, Canada, Denmark, Finland, France, Iceland, Italy, Norway, Sweden, The British Library (UK), The Library of Congress (USA) and the Internet Archive (USA) acknowledged the importance of international collaboration for preserving Internet content for future generations. This group of 12 institutions chartered the IIPC to fund and participate in projects and working groups to accomplish the Consortium’s goals. The initial agreement was in effect for three years, during which time the membership was limited to the charter institutions. Since then, membership has expanded to include additional libraries, archives, museums and cultural heritage institutions involved in Web archiving.

"The goals of the consortium are:

" * To enable the collection, preservation and long-term access of a rich body of Internet content from around the world.

" * To foster the development and use of common tools, techniques and standards for the creation of international archives.

" * To be a strong international advocate for initiatives and legislation that encourage the collection, preservation and access to Internet content.

" * To encourage and support libraries, archives, museums and cultural heritage institutions everywhere to address Internet content collecting and preservation."

OCLC Serves More than 50,000 Libraries, Contains 56 Million Records 2004

In 2004 OCLC (Online Computer Library Center), Dublin, Ohio, served more than 50,540 libraries of all types in the U.S. and 84 countries and territories around the world. OCLC WorldCat contained 56 million catalogue records, representing 894 million holdings.

Approximately 530 miles of Bookshelves 2004

In 2004 the Library of Congress contained 130,000,000 physical items on approximately 530 miles of bookshelves. Its collections included more than 29 million books and other printed materials, 2.7 million recordings, 12 million photographs, 4.8 million maps, and 58 million manuscripts.

The National Digital Newspaper Program March 2004

In March 2004 the National Endowment for the Humanities and the Library of Congress founded the National Digital Newspaper Program (NDNP).

"Ultimately over a period of approximately 20 years, NDNP will create a national, digital resource of historically significant newspapers from all the states and U.S. territories published between 1836 and 1922. This searchable database will be permanently maintained at the Library of Congress (LC) and be freely accessible via the Internet. An accompanying national newspaper directory of bibliographic and holdings information on the website will direct users to newspaper titles available in all types of formats."

The Google Print Project is Announced October 2004

At the Frankfurt Book Fair in October 2004 Google announced the Google Print project to scan and make searchable on the Internet the texts of more than ten million books from the collections of the New York Public Library and the libraries of the University of Michigan, Stanford, Harvard, and Oxford.

The project was renamed Google Books in December 2005.

2005 – 2010

Digitizing the Matthew Parker Library Begins 2005

In 2005 the Parker Library on the Web project began digitizing one of the greatest collections of medieval manuscripts, assembled in the sixteenth century by Archbishop Matthew Parker. It was:

"a multi-year undertaking of Corpus Christi College, the Stanford University Libraries and the Cambridge University Library, to produce a high-resolution digital copy of every imageable page in the 538 manuscripts described in M. R. James Descriptive Catalogue of the Manuscripts in the Library of Corpus Christi College, Cambridge (Cambridge University Press, 1912), and to build an interactive web application in which the manuscript page images can be used by scholars and students in the context of editions, translations and secondary sources" (Parker Library on the Web site, accessed 11-27-2008).

The project was expected to be completed in 2009.

Proposal for a World Digital Library June 6, 2005

At the Plenary Session of the U.S. National Commission for UNESCO, held at Georgetown University on June 6, 2005, Librarian of Congress James H. Billington offered a Proposal for a World Digital Library.

"The invention of the printing press with movable type fanned religious wars in the 16th century. The onset of telegraphy, photography, and the power-driven printing press in the 19th century created mass journalism that fulminated nationalistic passions and world wars in the 20th century. The arrival in the late 20th century of instantaneous, networked, global communication may well have facilitated the targeted propaganda, recruitment, and two-way communication of transnational terrorist organizations more than it has helped combat them.

"We are now discovering—painfully and much too slowly—that deep conflict between cultures is in many ways being fired up rather than cooled down by this revolution in communications, as was the case in the 16th and 19th centuries. Whenever new technology suddenly brings different peoples closer together and makes them aware of certain commonalities, it seems simultaneously to create a compensatory psychological need for the different peoples to define—and even assert aggressively—what is unique and distinctive about their own historic cultures."

Electronic Records Archives System September 8, 2005

On September 8, 2005 the National Archives and Records Administration (NARA) selected Lockheed Martin Corporation to build the Electronic Records Archives (ERA) system, a permanent electronic archives system to preserve, manage, and make accessible the electronic records created by the federal government. The ERA system would capture electronic information, regardless of its format, save it permanently, and make it accessible on whatever hardware or software is in use at the time of access. Development of the system was expected to continue over the next six years and to cost $308,000,000.

Second International Conference on the Preservation of Digital Objects September 15 – September 16, 2005

On September 15-16, 2005 the second International Conference on the Preservation of Digital Objects took place in Göttingen, Germany. (The first international conference in this series took place in 2004 in Beijing.)

Google Print Morphs in Two October 2005

In October 2005 Google Print morphed into the Google Print Publisher Program and the Google Print Library Program.

The Genetic Code of the 1918 Influenza Virus is Deciphered October 5, 2005

On October 5, 2005 scientists at the Armed Forces Institute of Pathology announced that they had deciphered the genetic code of the 1918 influenza virus, an H1N1 virus of avian origin which killed as many as 50,000,000 people worldwide, working from tissue of a victim exhumed in 1997 from the Alaskan permafrost. The scientists reconstructed the virus in the laboratory and published the genetic sequence.

A Plan to Create a World Digital Library November 11, 2005

On November 11, 2005 the Library of Congress announced a plan to create the World Digital Library of works in the public domain. Google donated $3,000,000 toward the costs of planning this project.

Google Books December 2005

In December 2005 the Google Print project morphed into Google Books.

Maybe the World's Largest Physical Library December 2005

The British Library, with about 150,000,000 physical items on 625 km of shelves, might have been the world's largest physical library in 2005, though the U.S. Library of Congress also made this claim. The British Library added about 3,000,000 physical items per year, which occupied about 12 km of new shelving. At the end of 2005 the Library of Congress held about 130,000,000 physical items and had more than 8,000,000 digital items online.
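The two shelving figures are mutually consistent, implying roughly a quarter of a million items per kilometre of shelving in both cases. A quick check (figures from the entry above; only the arithmetic is added here):

```python
# British Library shelving density implied by the figures above.

total_items, total_shelving_km = 150_000_000, 625
new_items_per_year, new_shelving_km_per_year = 3_000_000, 12

print(f"{total_items / total_shelving_km:,.0f} items per km overall")                      # 240,000
print(f"{new_items_per_year / new_shelving_km_per_year:,.0f} items per km of new intake")  # 250,000
```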

The Heritage Health Index Report on the State of America's Collections December 2005

In December 2005 Heritage Preservation (the U.S. National Institute for Conservation) and the Institute of Museum and Library Services published The Heritage Health Index Report on the State of America's Collections. Among the conclusions of this report were that U.S. institutions held 4.8 billion cultural heritage items and that over 1.3 billion of those items were at risk. Forty percent of the surveyed institutions that housed those items reported no budget allocated for preservation, while 80% had no disaster plan.

Data Curation as a Profession 2006

In 2006 the Center for Informatics Research in Science and Scholarship (CIRSS), formerly the Library Research Center (LRC), of the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign launched the Data Curation Education Program (DCEP).

"Data curation is the active and on-going management of data through its lifecycle of interest and usefulness to scholarly and educational activities across the sciences, social sciences, and the humanities. Data curation activities enable data discovery and retrieval, maintain data quality, add value, and provide for re-use over time. This new field includes representation, archiving, authentication, management, preservation, retrieval, and use. Our program offers a focus on data collection and management, knowledge representation, digital preservation and archiving, data standards, and policy, providing the theory and skills necessary to work directly with academic and industry researchers who need data curation expertise. To this end, DCEP has established a number of educational collaborations with premier science, social science, and humanities data centers across the country to prepare a new generation of library and information science professionals to curate materials from databases and other formats. We anticipate that our graduates will be employed across a range of information-oriented institutions, including museums, data centers, libraries, institutional repositories, archives, and private industry."

The program began with a focus on "data curation curriculum and best practices for the LIS and scientific communities. IMLS provided additional funding in 2008 to extend the curriculum to include humanities data" (Data Curation Education Program website, accessed 01-28-2009).

A Research Library Based on Historical Collections of the Internet Archive February 2006

A screenshot of the D-Lib Magazine homepage

In the February 2006 issue of D-Lib Magazine researchers at Cornell University from the departments of Computer Science, Information Science, and the Cornell Theory Center described plans for A Research Library Based on the Historical Collections of the Internet Archive. The library, a super-computing application consisting of 10 billion web pages, was intended to be used by social scientists.

View Map + Bookmark Entry

World Wide Web History Center is Founded March 2006

The World Wide Web History Center logo

William B. Pickett

In March 2006 Marc Weber and William B. Pickett founded the World Wide Web History Center.

View Map + Bookmark Entry

Studies on Digital Library Evolution March 2006

In March 2006 D-Lib Magazine, produced by the Corporation for National Research Initiatives, Reston, Virginia, published a special issue on "Digital Library Evolution."

View Map + Bookmark Entry

Damage to Codex Atlanticus Caused by Efforts at Preservation April 2006

A self-portrait by Leonardo da Vinci in red chalk

A reconstruction, made by Mario Taddei in 2007, of the Codex Atlanticus as it was in the 1600s, when Pompeo Leoni gathered Leonardo's loose pages into a single volume

In April 2006 Carmen Bambach of the Metropolitan Museum of Art, New York, discovered an extensive invasion of molds of various colors, "including black, red, and purple, along with swelling of pages," on the priceless manuscripts of Leonardo da Vinci's Codex Atlanticus, preserved in the Biblioteca Ambrosiana in Milan.

In 2008 the Opificio delle Pietre Dure in Florence "determined that the colors found on the pages weren't the product of mold, but instead caused by mercury salts added to protect the Codex from mold."

View Map + Bookmark Entry

A Critical Review at the Library of Congress April 3, 2006

Representing the Library of Congress Professional Guild, Thomas Mann published A Critical Review of Karen Calhoun's report issued on March 17, 2006. The review rebutted various assertions in the Calhoun report.

View Map + Bookmark Entry

"The entire works of humankind, from the beginning of recorded history, in all languages" would amount to 50 petabytes of data. May 14, 2006

Kevin Kelly

In the New York Times Magazine on May 14, 2006 Kevin Kelly of Pacifica, California, published Scan this Book!—an account of developments leading to the "universal" digital library on the Internet.

"From the days of Sumerian clay tablets till now, humans have "published" at least 32 million books, 750 million articles and essays, 25 million songs, 500 million images, 500,000 movies, 3 million videos, TV shows and short films and 100 billion public Web pages. All this material is currently contained in all the libraries and archives of the world. When fully digitized, the whole lot could be compressed (at current technological rates) onto 50 petabyte hard disks. Today you need a building about the size of a small-town library to house 50 petabytes. With tomorrow's technology, it will all fit onto your iPod. When that happens, the library of all libraries will ride in your purse or wallet — if it doesn't plug directly into your brain with thin white cords. Some people alive today are surely hoping that they die before such things happen, and others, mostly the young, want to know what's taking so long. (Could we get it up and running by next week? They have a history project due.)"

View Map + Bookmark Entry

OCLC Merges with RLG July 1, 2006

The OCLC logo

The RLG logo

On July 1, 2006 OCLC merged with RLG. The combination of programs and services was expected to "advance offerings and drive efficiencies for libraries, archives, museums and other research organizations worldwide."

View Map + Bookmark Entry

The Royal Society Digital Journal Archive October 29, 2006

The entrance to the Royal Society of London

On October 29, 2006 the Royal Society of London announced that the Royal Society Digital Journal Archive, dating back to 1665 and containing the full text and illustrations of more than 60,000 articles published in the Philosophical Transactions of the Royal Society, was available online.

View Map + Bookmark Entry

The EPA Begins to Close its Scientific Libraries November 20, 2006

The Environmental Protection Agency seal

The Boston Globe reported on November 20, 2006 that the Environmental Protection Agency (EPA) had begun to close its nationwide network of scientific libraries, effectively preventing EPA scientists and the public from accessing vast amounts of data and information on issues from toxicology to pollution. Several libraries had already been dismantled, with their contents either destroyed or shipped to repositories where they were uncataloged and inaccessible.

View Map + Bookmark Entry

Demanding that the U.S. EPA Desist from Destroying its Libraries November 30, 2006

Stephen Johnson

On November 30, 2006 ranking members of congressional committees wrote to Stephen Johnson, Administrator of the U.S. Environmental Protection Agency, demanding that the agency desist from destroying its libraries:

"Over the past 36 years, EPA's libraries have accumulated a vast and invaluable trove of public health and environmental information, including at least 504,000 books and reports, 3,500 journal titles, 25,000 maps, and 3.6 million information objects on microfilm, according to the report issued in 2004: Business Case for Information Services: EPA's Regional Libraries and Centers prepared for the Agency by Stratus Consulting. Each one of EPA's libraries also had information experts who helped EPA staff and the public access and use the Agency's library collection and information held in other library collections outside of the Agency. It now appears that EPA officials are dismantling what is likely one of our country's most comprehensive and accessible collections of environmental materials.
The press has reported on the concerns over the library reorganization plan voiced by EPA professional staff of the Office of Enforcement and Compliance Assurance (OECA), 16 local union Presidents representing EPA employees, and the American Library Association. In response to our request of September 19, 2006, (attached), the Government Accountability Office has initiated an investigation of EPA's plan to close its libraries. Eighteen Senators sent a letter on November 3, 2006, to leaders of the Senate Appropriations Committee asking them
to direct EPA "to restore and maintain public access and onsite library collections and services at EPA's headquarters, regional, laboratory and specialized program libraries while the Agency solicits and considers public input on its plan to drastically cut its library budget and services"
(attached). Yet, despite the lack of Congressional approval and the concerns expressed over this plan, your Agency continues to move forward with dismantling the EPA libraries. It is imperative that the valuable government information maintained by EPA's libraries
be preserved. We ask that you please confirm in writing by no later than Monday, December 4, 2006, that the destruction or disposition of all library holdings immediately ceased upon the Agency's receipt of this letter and that all records of library holdings and dispersed materials are being maintained."

View Map + Bookmark Entry

A Printed Book on Preserving Digital Information 2007

The cover art of Preserving Digital Information by Henry M. Gladney

In 2007 Henry M. Gladney, Saratoga, California, issued his monograph, Preserving Digital Information, as a printed book.

View Map + Bookmark Entry

Data-Storing Bacteria Could Last Thousands of Years February 27, 2007

The Keio University crest

Bacillus subtilis, the bacterium on which the data was stored

A technology developed at Keio University, Tokyo, Japan, and announced on February 27, 2007, carried with it the possibility that bacterial DNA could be used as a medium for storing digital information long-term—potentially thousands of years.

"Keio University Institute for Advanced Biosciences and Keio University Shonan Fujisawa Campus announced the development of the new technology, which creates an artificial DNA that carries up to more than 100 bits of data within the genome sequence, according to the JCN Newswire. The universities said they successfully encoded "e= mc2 1905!" -- Einstein's theory of relativity and the year he enunciated it -- on the common soil bacteria,  Bacillius subtilis."

View Map + Bookmark Entry

It Could Take 1800 Years to Convert the Paper Records . . . . March 10, 2007

Bookshelves inside the Library of Congress

On March 10, 2007 the U.S. National Archives estimated that, at the current rate of digitization of its 9 billion text records, it could take 1,800 years to convert its paper text records to digital form. The estimate was reported in an article in The New York Times entitled History Digitized (and Abridged), which pointed out that economic and copyright considerations required the digitization of library and archival collections to be very selective.
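
The 1,800-year figure implies a digitization pace that is easy to make explicit, assuming, as the estimate itself does, that the rate stays constant:

```python
# Implied digitization rate behind the National Archives' 1,800-year estimate.
total_records = 9_000_000_000        # paper text records held
years_needed  = 1_800                # estimated time at the then-current pace

per_year = total_records / years_needed
per_day  = per_year / 365
print(f"Implied pace: {per_year:,.0f} records per year (~{per_day:,.0f} per day)")
# Implied pace: 5,000,000 records per year (~13,699 per day)
```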

View Map + Bookmark Entry

DROID, an Archives Analysis and Identification Tool September 27, 2007

The National Archives in London

"An innovative tool to analyse and identify computer file formats has won the 2007 Digital Preservation Award. DROID, developed by The National Archives in London, can examine any mystery file and identify its format. The tool works by gathering clues from the internal 'signatures' hidden inside every computer file, as well as more familiar elements such as the filename extension (.jpg, for example), to generate a highly accurate 'guess' about the software that will be needed to read the file. . . .

"Now, by using DROID and its big brother, the unique file format database known as PRONOM, experts at the National Archives are well on their way to cracking the problem. Once DROID has labelled a mystery file, PRONOM's extensive catalogue of software tools can advise curators on how best to preserve the file in a readable format. The database includes crucial information on software and hardware lifecycles, helping to avoid the obsolescence problem. And it will alert users if the program needed to read a file is no longer supported by manufacturers.

"PRONOM's system of identifiers has been adopted by the UK government and is the only nationally-recognised standard in its field."

View Map + Bookmark Entry

The World's Oldest Oil Paintings Restored After Taliban Dynamite February 19, 2008

"The oldest known oil painting, dating from 650 A.D., has been found in caves in Afghanistan's Bamiyan Valley, according to a team of Japanese, European and U.S. scientists.

"The discovery reverses a common perception that the oil painting, considered a typically Western art, originated in Europe, where the earliest examples date to the early 12th century A.D.

"Famous for its 1,500-year-old massive Buddha statues, which were destroyed by the Taliban in 2001, the Bamiyan Valley features several caves painted with Buddhist images.

"Damaged by the severe natural environment and Taliban dynamite, the cave murals have been restored and studied by the National Research Institute for Cultural Properties in Tokyo, as a UNESCO/Japanese Fund-in-Trust project.

"Since most of the paintings have been lost, looted or deteriorated, we are trying to conserve the intact portions and also try to understand the constituent materials and painting techniques," Yoko Taniguchi, a researcher at the National Research Institute for Cultural Properties in Tokyo, told Discovery News.

" 'It was during such analysis that we discovered oily and resinous components in a group of wall paintings.'

"Painted in the mid-7th century A.D., the murals have varying artistic influences and show scenes with knotty-haired Buddhas in vermilion robes sitting cross-legged amid palm leaves and mythical creatures.

"Most likely, the paintings are the work of artists who traveled on the Silk Road, the ancient trade route between China, across Central Asia's desert to the West" (http://dsc.discovery.com/news/2008/02/19/oldest-oil-painting.html, accessed 07-11-2009).

View Map + Bookmark Entry

The Effect of Decay Fungi on Wood Used in the Production of Violins June 28, 2008

On June 28, 2008 Francis W. M. R. Schwarze of the Section of Wood Protection and Biotechnology, Wood Laboratory, Swiss Federal Laboratories for Materials Testing and Research (Empa), St. Gallen, together with Melanie Spycher and Siegfried Fink, published "Superior wood for violins – wood decay fungi as a substitute for cold climate," New Phytologist 179 (2008) 1095-1104.

ABSTRACT 

"• Violins produced by Antonio Stradivari during the late 17th and early 18th centuries are reputed to have superior tonal qualities. Dendrochronological studies show that Stradivari used Norway spruce that had grown mostly during the Maunder Minimum, a period of reduced solar activity when relatively low temperatures caused trees to lay down wood with narrow annual rings, resulting in a high modulus of elasticity and low density. 

"• The main objective was to determine whether wood can be processed using selected decay fungi so that it becomes acoustically similar to the wood of trees that have grown in a cold climate (i.e. reduced density and unchanged modulus of elasticity). 

"• This was investigated by incubating resonance wood specimens of Norway spruce (Picea abies) and sycamore (Acer pseudoplatanus) with fungal species that can reduce wood density, but lack the ability to degrade the compound middle lamellae, at least in the earlier stages of decay. 

"• Microscopic assessment of the incubated specimens and measurement of five physical properties (density, modulus of elasticity, speed of sound, radiation ratio, and the damping factor) using resonance frequency revealed that in the wood of both species there was a reduction in density, accompanied by relatively little change in the speed of sound. Thus, radiation ratio was increased from 'poor' to 'good', on a par with 'superior' resonance wood grown in a cold climate."

View Map + Bookmark Entry

Creation of the HathiTrust Digital Library October 2008 – March 2012

In October 2008 thirteen universities in the Committee on Institutional Cooperation and the University of California founded the HathiTrust, a very large-scale collaborative repository of digital content from research libraries, including content digitized via the Google Books project and Internet Archive digitization initiatives, as well as content digitized locally by member libraries. The name came from the Hindi word for elephant, as in "an elephant never forgets."

♦ As of January 2011 over 50 academic research libraries were members of the HathiTrust. Its website published the following statistics:

7,909,950 total volumes; 4,057,969 book titles; 189,013 serial titles; 2,768,482,500 pages; 355 terabytes; 94 miles; 6,427 tons; 1,972,865 volumes (~25% of total) in the public domain.

♦ In March 2012 the HathiTrust website published the following statistics:

10,109,695 total volumes; 5,371,919 book titles; 266,508 serial titles; 3,538,393,250 pages; 453 terabytes; 120 miles; 8,214 tons; 2,802,045 volumes (~28% of total) in the public domain.
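
The two snapshots above, taken roughly fourteen months apart, imply a steady rate of growth; the small calculation below makes it explicit (the month count is approximate).

```python
# Growth implied by the two HathiTrust snapshots above (Jan 2011 and Mar 2012).
vol_jan_2011, vol_mar_2012 = 7_909_950, 10_109_695
months = 14                                   # approximate interval between the snapshots

ratio = vol_mar_2012 / vol_jan_2011           # ~1.28 over the period
annualized = ratio ** (12 / months) - 1       # ~23% growth per year
print(f"Growth over the period: {ratio - 1:.1%}")
print(f"Annualized growth rate: {annualized:.1%}")
```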

View Map + Bookmark Entry

Raphael's Madonna of the Goldfinch Restored 450 Years after it was Nearly Destroyed October 30, 2008

The Italian Renaissance painter Raphael's masterpiece, Madonna of the Goldfinch, which survived the collapse of a palace and more than four centuries of decay, emerged from a ten-year restoration process, and on October 30, 2008 was returned to the Uffizi Gallery with a strengthened panel and its colors restored to their original radiance.

"Raphael painted this work around 1505 for the wedding of his friend Lorenzo Nasi, a rich merchant in Florence. When Nasi’s palace collapsed in 1548, the painting was shredded into 17 pieces. The work was first put together with pieces of wood and long nails. The work later developed a yellowish opaque color. Restorers feared touching it because it was very fragile."

"The painting features a seated Mary with John the Baptist passing on a goldfinch to Jesus as a forewarning of his violent death. The bird has been associated in art with Christ's crucifixion.

"The restoration work began in 1999 using X-rays, microscopes, and lasers to find and seal the ancient fractures."

View Map + Bookmark Entry

Discovery of a Previously Unknown Self-Portrait of Leonardo February 28, 2009

On February 28, 2009 Italian researchers reported the discovery of a previously unknown self-portrait by Leonardo da Vinci drawn when the artist was a young man. The faint pencil sketch was recognized underneath writing on a sheet of the “Codex on the Flight of Birds”, written between 1490 and 1505 and preserved in the Biblioteca Reale, Torino, Italy.

Piero Angela, an Italian science journalist, noticed while studying the document the faint outline of a human nose hidden underneath lines of ink handwriting. It struck him as similar in shape and drawing style to a later self-portrait of Leonardo. It is thought that Leonardo first made the drawing during the 1480s and reused the sheet for his manuscript on bird flight.

"Over months of micro-pixel work, graphic designers gradually 'removed' the text by making it white instead of black, revealing the drawing beneath. "What emerged was the face of a young to middle-aged man with long hair, a short beard and a pensive gaze.

"Mr Angela was struck by similarities to a famous self-portrait of Leonardo, made when the artist was an old man around 1512. The portrait, in red chalk, is kept in Turin’s Biblioteca Reale, or Royal Library.

"The research team used criminal investigation techniques to digitally correlate the newly-discovered sketch with the well-known portrait.

"They employed facial reconfiguration technology to age the drawing of the younger man, hollowing the cheeks, darkening the eyes and furrowing the brow.

"The two portraits were so similar 'that we may regard the hypothesis that the images portray the same person as reasonable', police photo-fit experts declared.

"To make doubly sure, the ageing process was reversed, with researchers using a digital 'facelift' to rejuvenate the older self-portrait.

"After removing the older Leonardo’s wrinkles and filling out his cheeks, the image that emerged was almost identical to the newly discovered sketch.

" 'When I actually tried to age the face [of the newly discovered portrait], and to put the hair and the beard of the famous self-portrait around it, a shiver ran down my spine,' said Mr Angelo. 'It resembled Leonardo like a twin brother. To uncover a new Leonardo drawing was astonishing.'

"The similarities were also studied by a facial reconstruction surgeon in Rome. '[He] said the two faces could well belong to the same man at different times in his life', said Mr Angelo.

"A world expert on Leonardo, Carlo Pedretti from the University of California, described the sketch as 'one of the most important acquisitions in the study of Leonardo, in the study of his image, and in the study of his thought too' (http://www.telegraph.co.uk/news/worldnews/europe/italy/4884789/Leonardo-da-Vinci-self-portrait-discovered-hidden-in-manuscript.html, accessed 02-28-2009).

View Map + Bookmark Entry

The WARC Format as an International File Preservation Standard June 1, 2009

On June 1, 2009 the International Internet Preservation Consortium (IIPC, netpreserve.org) published the WARC file format as an international standard: ISO 28500:2009, Information and documentation—WARC file format.

"For many years, heritage organizations have tried to find the most appropriate ways to collect and keep track of World Wide Web material using web-scale tools such as web crawlers. At the same time, these organizations were concerned with the requirement to archive very large numbers of born-digital and digitized files. A need was for a container format that permits one file simply and safely to carry a very large number of constituent data objects (of unrestricted type, including many binary types) for the purpose of storage, management, and exchange. Another requirement was that the container need only minimal knowledge of the nature of the objects.

"The WARC format is expected to be a standard way to structure, manage and store billions of resources collected from the web and elsewhere. It is an extension of the ARC format , which has been used since 1996 to store files harvested on the web. WARC format offers new possibilities, notably the recording of HTTP request headers, the recording of arbitrary metadata, the allocation of an identifier for every contained file, the management of duplicates and of migrated records, and the segmentation of the records. WARC files are intended to store every type of digital content, either retrieved by HTTP or another protocol" (http://netpreserve.org/press/pr20090601.php).

View Map + Bookmark Entry

Costs of Managed Archiving versus Passive Archiving of Data June 4, 2009

"Regarding storage costs -- again its unhelpful to be vague, but equally unhelpful to be too specific. The cost of a 1 TB [terabyte] hard drive from the local IT hyperstore is NOT a useful number for estimating cost of reliable storage. Unfortunately the 'price of reliability' is equally hard to determine.

"The 'rule of thumb' most quoted now is 'one million dollars per year per petabyte' for 'managed server' storage eg disc-based storage from a well-run data centre that does good redundancy and backups. That means of course one thousand dollars per terabyte (per year) and that's a good estimate, in my view, to use for funding request and planning purposes. It can be done more cheaply -- up to ten times cheaper -- but that introduces various risks and requirements that you may or may not want to get into. In the BBC where we know that archive content is, on average, used once per four years, we're happy to put datatape on shelves and go for a much lower cost per terabyte" (Richard Wright, Sr Research Engineer, Research & Development, BBC Future Media & Technology, from: owner-dcc-associates@lists.ed.ac.uk, 06-04-2009).

View Map + Bookmark Entry

"Material Degradomics," or The Sniff Test for a Book's Physical State September 17, 2009

In a paper entitled "Material Degradomics: On the Smell of Old Books," published in the journal Analytical Chemistry in September 2009, Matija Strlic at University College London and associates at the Tate art museum (U.K.), the University of Ljubljana, and Morana RTD in Ivančna Gorica (both in Slovenia) introduced a new method for linking a book's physical state to its pattern of volatile organic compound (VOC) emissions. The goal was to “diagnose” decomposing historical documents noninvasively as a step toward protecting them.

“Ordinarily, traditional analytical methods are used to test paper samples that have been ripped out,” Strlic says. “The advantage of our method is that it’s nondestructive" (http://pubs.acs.org/doi/full/10.1021/ac902143z?cookieSet=1).

"The test is based on detecting the levels of volatile organic compounds. These are released by paper as it ages and produce the familiar 'old book smell'.

"The international research team, led by Matija Strlic from University College London's Centre for Sustainable Heritage, describes that smell as 'a combination of grassy notes with a tang of acids and a hint of vanilla over an underlying mustiness'. 

" 'This unmistakable smell is as much part of the book as its contents,' they wrote in the journal article. Dr Strlic told BBC News that the idea for new test came from observing museum conservators as they worked.

" 'I often noticed that conservators smelled paper during their assessment,' he recalled.  'I thought, if there was a way we could smell paper and tell how degraded it is from the compounds it emits, that would be great.'

"The test does just that. It pinpoints ingredients contained within the blend of volatile compounds emanating from the paper.

"That mixture, the researchers say, 'is dependent on the original composition of the... paper substrate, applied media, and binding' " (http://news.bbc.co.uk/2/hi/science/nature/8355888.stm)

View Map + Bookmark Entry

" A Library to Last Forever" ?? October 9, 2009

On October 9, 2009 Sergey Brin, co-founder and technology president of Google, published an Op-Ed piece regarding the Google Book Search program in The New York Times entitled, perhaps overly optimistically, "A Library to Last Forever," from which I quote without implied endorsement:

".  . .the vast majority of books ever written are not accessible to anyone except the most tenacious researchers at premier academic libraries. Books written after 1923 quickly disappear into a literary black hole. With rare exceptions, one can buy them only for the small number of years they are in print. After that, they are found only in a vanishing number of libraries and used book stores. As the years pass, contracts get lost and forgotten, authors and publishers disappear, the rights holders become impossible to track down.

"Inevitably, the few remaining copies of the books are left to deteriorate slowly or are lost to fires, floods and other disasters. While I was at Stanford in 1998, floods damaged or destroyed tens of thousands of books. Unfortunately, such events are not uncommon - a similar flood happened at Stanford just 20 years prior. You could read about it in The Stanford-Lockheed Meyer Library Flood Report, published in 1980, but this book itself is no longer available.

"Because books are such an important part of the world's collective knowledge and cultural heritage, Larry Page, the co-founder of Google, first proposed that we digitize all books a decade ago, when we were a fledgling startup. At the time, it was viewed as so ambitious and challenging a project that we were unable to attract anyone to work on it. But five years later, in 2004, Google Books (then called Google Print) was born, allowing users to search hundreds of thousands of books. Today, they number over 10 million and counting.

"The next year we were sued by the Authors Guild and the Association of American Publishers over the project. While we have had disagreements, we have a common goal - to unlock the wisdom held in the enormous number of out-of-print books, while fairly compensating the rights holders. As a result, we were able to work together to devise a settlement that accomplishes our shared vision. While this settlement is a win-win for authors, publishers and Google, the real winners are the readers who will now have access to a greatly expanded world of books.

"There has been some debate about the settlement, and many groups have offered their opinions, both for and against. I would like to take this opportunity to dispel some myths about the agreement and to share why I am proud of this undertaking. This agreement aims to make millions of out-of-print but in-copyright books available either for a fee or for free with ad support, with the majority of the revenue flowing back to the rights holders, be they authors or publishers.

"Some have claimed that this agreement is a form of compulsory license because, as in most class action settlements, it applies to all members of the class who do not opt out by a certain date. The reality is that rights holders can at any time set pricing and access rights for their works or withdraw them from Google Books altogether. For those books whose rights holders have not yet come forward, reasonable default pricing and access policies are assumed. This allows access to the many orphan works whose owners have not yet been found and accumulates revenue for the rights holders, giving them an incentive to step forward.

"Others have questioned the impact of the agreement on competition, or asserted that it would limit consumer choice with respect to out-of-print books. In reality, nothing in this agreement precludes any other company or organization from pursuing their own similar effort. The agreement limits consumer choice in out-of-print books about as much as it limits consumer choice in unicorns. Today, if you want to access a typical out-of-print book, you have only one choice - fly to one of a handful of leading libraries in the country and hope to find it in the stacks." (http://www.nytimes.com/2009/10/09/opinion/09brin.html?scp=2&sq=sergey%20brin&st=cse, accessed 10-09-2009).

View Map + Bookmark Entry

2010 – 2012

Biological Journals to Require Data-Archiving January 2010

"To promote the preservation and fuller use of data, The American Naturalist, Evolution, the Journal of Evolutionary Biology, Molecular Ecology, Heredity, and other key journals in evolution and ecology will soon introduce a new data‐archiving policy. The policy has been enacted by the Executive Councils of the societies owning or sponsoring the journals. For example, the policy of The American Naturalist will state:  

"This journal requires, as a condition for publication, that data supporting the results in the paper should be archived in an appropriate public archive, such as GenBank, TreeBASE, Dryad, or the Knowledge Network for Biocomplexity. Data are important products of the scientific enterprise, and they should be preserved and usable for decades in the future. Authors may elect to have the data publicly available at time of publication, or, if the technology of the archive allows, may opt to embargo access to the data for a period up to a year after publication. Exceptions may be granted at the discretion of the editor, especially for sensitive information such as human subject data or the location of endangered species.  

"This policy will be introduced approximately a year from now, after a period when authors are encouraged to voluntarily place their data in a public archive. Data that have an established standard repository, such as DNA sequences, should continue to be archived in the appropriate repository, such as GenBank. For more idiosyncratic data, the data can be placed in a more flexible digital data library such as the National Science Foundation–sponsored Dryad archive at http://datadryad.org"  (http://www.journals.uchicago.edu/doi/full/10.1086/650340, accessed 01-22-2010).

View Map + Bookmark Entry

The Vatican Library Plans the Scanning of all its Manuscripts into the FITS Document Format March 24, 2010

"An initiative of the Vatican Library Digital manuscripts

"by Cesare Pasini  

"The digitization of 80,000 manuscripts of the Vatican Library, it should be realized, is not a light-hearted project. Even with only a rough calculation one can foresee the need to reproduce 40 million pages with a mountain of computer data, to the order of 45 petabytes (that is, 45 million billion bytes). This obviously means pages variously written and illustrated or annotated, to be photographed with the highest definition, to include the greatest amount of data and avoid having to repeat the immense undertaking in the future.  

"And these are delicate manuscripts, to be treated with care, without causing them damage of any kind. A great undertaking for the benefit of culture and in particular for the preservation and conservation of the patrimony entrusted to the Apostolic Library, in the tradition of a cultural service that the Holy See continues to express and develop through the centuries, adapting its commitment and energy to the possibilities offered by new technologies.  

"The technological project of digitization with its various aspects is now ready. In the past two years, a technical feasibility study has been prepared with the contribution of the best experts, internal, external and also international. This resulted in a project of a great and innovative value from various points of view: the realization of the photography, the electronic formats for conservation, the guaranteed stability of photographs over time, the maintenance and management of the archives, and so forth.  

"This project may be achieved over a span of 10 years divided into three phases, with possible intervals between them. In a preliminary phase the involvement of 60 people is planned, including photographers and conservator-verifiers, in the second and third phases at least 120. Before being able to initiate an undertaking of this kind, which is causing some anxiety to those in charge of the library (and not only to them!), naturally it will be necessary to find the funds. Moves have already been made in this direction with some positive results.  

"The second announcement is that some weeks ago the “test bed” was set up; in other words the “bench test” that will make it possible to try out and examine the whole structure of the important project that has been studied and formulated so as to guarantee that it will function properly when undertaken in its full breadth.  

"The work of reproduction uses two different machines, depending on the different types of material to be reproduced: one is a Metis Systems scanner, kindly lent to us free of charge by the manufacturers, and a 50 megapixel Hasselblad digital camera. Digitized images will be converted to the Flexible Image Transport System (FITS), a non-proprietary format, is extremely simple, was developed a few decades ago by NASA. It has been used for more than 40 years for the conservation of data concerning spatial missions and, in the past decade, in astrophysics and nuclear medicine. It permits the conservation of images with neither technical nor financial problems in the future, since it is systematically updated by the international scientific community.  

"In addition to the servers that collect the images in FITS format accumulated by the two machines mentioned, another two servers have been installed to process the data to make it possible to search for images both by the shelf mark and the manuscript's descriptive elements, and also and above all by a graphic pattern, that is, by looking for similar images (graphic or figurative) in the entire digital memory.  

"The latter instrument, truly innovative and certainly interesting for all who intend to undertake research on the Vatican's manuscripts – only think of when it will be possible to do such research on the entire patrimony of manuscripts in the Library! – was developed from the technology of the Autonomy Systems company, a leading English firm in the field of computer science, to which, moreover, we owe the entire funding of the “test bed”.  

"For this “bench test”, set up in these weeks, 23 manuscripts are being used for a total of 7,500 digitized and indexed pages, with a mountain of computer data of about 5 terabytes (about 5,000 billion bytes).

"The image of the mustard seed springs to mind: the “text bed” is not much more in comparison with the immensity of the overall project. But we know well that this seed contains an immense energy that will enable it to grow, to become far larger than the other plants and to give hospitality to the birds of the air. In accepting the promise guaranteed in the parable, let us also give hope of it to those who await the results of this project's realization" (http://www.vaticanlibrary.va/home.php?, pag=newsletter_art_00087&BC=11, accessed 03-24-2010).

View Map + Bookmark Entry

The Library of Congress Will Preserve All "Tweets" April 14, 2010

On April 14, 2010 Twitter announced in its blog that it would donate to the Library of Congress its archive of 10,000,000,000 text messages (tweets) accumulated since the company's founding in 2006:

"The Library of Congress is the oldest federal cultural institution in the United States and it is the largest library in the world. The Library's primary mission is research and it receives copies of every book, pamphlet, map, print, and piece of music registered in the United States. Recently, the Library of Congress signaled to us that the public tweets we have all been creating over the years are important and worthy of preservation.

"Since Twitter began, billions of tweets have been created. Today, fifty-five million tweets a day are sent to Twitter and that number is climbing sharply. A tiny percentage of accounts are protected but most of these tweets are created with the intent that they will be publicly available. Over the years, tweets have become part of significant global events around the world—from historic elections to devastating disasters.  

"It is our pleasure to donate access to the entire archive of public Tweets to the Library of Congress for preservation and research. It's very exciting that tweets are becoming part of history. It should be noted that there are some specifics regarding this arrangement. Only after a six-month delay can the Tweets be used for internal library use, for non-commercial research, public display by the library itself, and preservation.

"The open exchange of information can have a positive global impact. This is something we firmly believe and it has driven many of our decisions regarding openness. Today we are also excited to share the news that Google has created a wonderful new way to revisit tweets related to historic events. They call it Google Replay because it lets you relive a real time search from specific moments in time.  

"Google Replay currently only goes back a few months but eventually it will reach back to the very first Tweets ever created. Feel free to give Replay a try—if you want to understand the popular contemporaneous reaction to the retirement of Justice Stevens, the health care bill, or Justin Bieber's latest album, you can virtually time travel and replay the Tweets. The future seems bright for innovation on the Twitter platform and so it seems, does the past!"

View Map + Bookmark Entry

Using the Twitter Archive for Historical Research April 30, 2010

On April 30, 2010 The New York Times published "When History is Compiled 140 Characters at a Time," from which I quote:

“ 'Twitter is tens of millions of active users. There is no archive with tens of millions of diaries,' said Daniel J. Cohen, an associate professor of history at George Mason University and co-author of a 2006 book, 'Digital History.' What’s more, he said, 'Twitter is of the moment; it’s where people are the most honest.'  

"Last month, Twitter announced that it would donate its archive of public messages to the Library of Congress, and supply it with continuous updates.  

"Several historians said the bequest had tremendous potential. 'My initial reaction was, ‘When you look at it Tweet by Tweet, it looks like junk,’ said Amy Murrell Taylor, an associate professor of history at the State University of New York, Albany. 'But it could be really valuable if looked through collectively.' Ms. Taylor is working on a book about slave runaways during the Civil War; the project involves mountains of paper documents. 'I don’t have a search engine to sift through it,' she said.  

"The Twitter archive, which was 'born digital,' as archivists say, will be easily searchable by machine — unlike family letters and diaries gathering dust in attics.  

"As a written record, Tweets are very close to the originating thoughts. 'Most of our sources are written after the fact, mediated by memory — sometimes false memory,' Ms. Taylor said. 'And newspapers are mediated by editors. Tweets take you right into the moment in a way that no other sources do. That’s what is so exciting.'  

"Twitter messages preserve witness accounts of an extraordinary variety of events all over the planet. 'In the past, some people were able on site to write about, or sketch, as a witness to an event like the hanging of John Brown,' said William G. Thomas III, a professor of history at the University of Nebraska-Lincoln. 'But that’s a very rare, exceptional historical record.'  

"Ten billion Twitter messages take up little storage space: about five terabytes of data. (A two-terabyte hard drive can be found for less than $150.) And Twitter says the archive will be a bit smaller when it is sent to the library. Before transferring it, the company will remove the messages of users who opted to designate their account 'protected,' so that only people who obtain their explicit permission can follow them.

"A Twitter user can also elect to use a pseudonym and not share any personally identifying information. Twitter does not add identity tags that match its users to real people.  

"Each message is accompanied by some tidbits of supplemental information, like the number of followers that the author had at the time and how many users the author was following. While Mr. Cohen said it would be useful for a historian to know who the followers and the followed are, this information is not included in the Tweet itself.  

"But there’s nothing private about who follows whom among users of Twitter’s unprotected, public accounts. This information is displayed both at Twitter’s own site and in applications developed by third parties whom Twitter welcomes to tap its database.  

"Alexander Macgillivray, Twitter’s general counsel, said, 'From the beginning, Twitter has been a public and open service.' Twitter’s privacy policy states: 'Our services are primarily designed to help you share information with the world. Most of the information you provide to us is information you are asking us to make public.  

"Mr. Macgillivray added, 'That’s why, when we were revising our privacy policy, we toyed with the idea of calling it our ‘public policy.’ ' He said the company would have done so but California law required that it have a 'privacy policy' labeled as such.  

"Even though public Tweets were always intended for everyone’s eyes, the Library of Congress is skittish about stepping anywhere in the vicinity of a controversy. Martha Anderson, director of the National Digital Information Infrastructure and Preservation Program at the library, said, 'There’s concern about privacy issues in the near term and we’re sensitive to these concerns.'  

"The library will embargo messages for six months after their original transmission. If that is not enough to put privacy issues to rest, she said, 'We may have to filter certain things or wait longer to make them available.' The library plans to dole out its access to its Twitter archive only to those whom Ms. Anderson called “qualified researchers.”  

"BUT the library’ s restrictions on access will not matter. Mr. Macgillivray at Twitter said his company would be turning over copies of its public archive to Google, Yahoo and Microsoft, too. These companies already receive instantaneously the stream of current Twitter messages. When the archive of older Tweets is added to their data storehouses, they will have a complete, constantly updated, set, and users won’t encounter a six-month embargo.  

"Google already offers its users Replay, the option of restricting a keyword search only to Tweets and to particular periods. It’s quickly reached from a search results page. (Click on 'Show options,' then 'Updates,' then a particular place on the timeline.)  

"A tool like Google Replay is helpful in focusing on one topic. But it displays only 10 Tweets at a time. To browse 10 billion — let’s see, figuring six seconds for a quick scan of each screen — would require about 190 sleepless years.  

"Mr. Cohen encourages historians to find new tools and methods for mining the 'staggeringly large historical record' of Tweets. This will require a different approach, he said, one that lets go of straightforward 'anecdotal history.' " (http://www.nytimes.com/2010/05/02/business/02digi.html?scp=1&sq=twitter%20+%20history&st=cse, accessed 05-06-2010).

View Map + Bookmark Entry

Universal Music Group Donates a "Mile of Music" to the Library of Congress January 10, 2011

The Universal Music Group, headquartered in Santa Monica, California, which traces its origins to 1898, donated its archive of recorded music, consisting of circa 200,000 metal, glass and lacquer master discs, recorded from 1926 to 1948, to the Library of Congress.  The agreement called for the Library of Congress to own and preserve the music and to convert it to digital form for usability and long-term data preservation. Universal Music Group retained the right to commercialize the digital files.

"Under the agreement negotiated during discussions that began two years ago the Library of Congress has been granted ownership of the physical discs and plans to preserve and digitize them. But Universal, a subsidiary of the French media conglomerate Vivendi that was formerly known as the Music Corporation of America, or MCA, retains both the copyright to the music recorded on the discs and the right to commercialize that music after it has been digitized.  

“The thinking behind this is that we have a very complementary relationship,” said Vinnie Freda, executive vice president for digital logistics and business services at Universal Music Logistics. “I’ve been trying to figure out a way to economically preserve these masters in a digital format, and the library is interested in making historically important material available. So they will preserve the physical masters for us and make them available to academics and anyone who goes to the library, and Universal retains the right to commercially exploit the masters.”  

"The agreement will also permit the Web site of the Library of Congress to stream some of the recordings for listeners around the world once they are cataloged and digitized, a process that Mr. DeAnna said could take five years or more, depending on government appropriations. But both sides said it had not yet been determined which songs would be made available, a process that could be complicated by Universal’s plans to sell some of the digitized material through iTunes.  

"Universal’s bequest is the second time in recent months that a historic archive of popular music has been handed over to a nonprofit institution dedicated to preserving America’s recorded musical heritage. Last spring the National Jazz Museum in Harlem acquired nearly 1,000 discs, transcribed from radio broadcasts in the late 1930s and early 1940s by the recording engineer William Savory, featuring some of the biggest names in jazz" (http://www.nytimes.com/2011/01/10/arts/music/10masters.html?hp, accessed 01-10-2011).

View Map + Bookmark Entry

The Expanding Digital Universe: Surpassing 1.8 Zettabytes June 2011

In June 2011 John F. Gantz and David Reinsel of International Data Corporation (IDC) published a summary of their annual study of the digital universe, on the study's fifth anniversary:

"We always knew it was big – in 2010 cracking the zettabyte barrier. In 2011, the amount of information created and replicated will surpass 1.8 zettabytes (1.8 trillion gigabytes) - growing by a factor of 9 in just five years.

"But, as digital universe cosmologists, we have also uncovered a number of other things — some predictable, some astounding, and some just plain disturbing.

"While 75% of the information in the digital universe is generated by individuals, enterprises have some liability for 80% of information in the digital universe at some point in its digital life. The number of "files," or containers that encapsulate the information in the digital universe, is growing even faster than the information itself as more and more embedded systems pump their bits into the digital cosmos. In the next five years, these files will grow by a factor of 8, while the pool of IT staff available to manage them will grow only slightly. Less than a third of the information in the digital universe can be said to have at least minimal security or protection; only about half the information that should be protected is protected.

"The amount of information individuals create themselves — writing documents, taking pictures, downloading music, etc. — is far less than the amount of information being created about them in the digital universe.

"The growth of the digital universe continues to outpace the growth of storage capacity. But keep in mind that a gigabyte of stored content can generate a petabyte or more of transient data that we typically don't store (e.g., digital TV signals we watch but don't record, voice calls that are made digital in the network backbone for the duration of a call).  

"So, like our physical universe, the digital universe is something to behold — 1.8 trillion gigabytes in 500 quadrillion "files" — and more than doubling every two years. That's nearly as many bits of information in the digital universe as stars in our physical universe" (http://idcdocserv.com/1142, accessed 08-09-2011).

♦ In August 2011 a video presentation of John Gantz delivering his summary speech was available at this link: http://www.emc.com/collateral/demos/microsites/emc-digital-universe-2011/index.htm

View Map + Bookmark Entry

"Physical Archiving is Still an Important Function in the Digital Era."The Internet Archive Builds an Archive of Physical Books June 6, 2011

In one of the more ironic developments of the Internet era, the Internet Archive is creating a Physical Archive in Richmond, California, of all books they scanned that they did not have to return to institutional libraries, and of other physical books as well. Their goal is to collect "one copy of every book." Their purposes in doing this are that the physical books are authentic and original versions that can be used in the future, and that "If there is ever a controversy about the digital version, the original can be examined." The physical books are being stored in the most compact archival fashion, in environmentally controlled shipping containers placed in warehouses—not in the way an institutional library would store them if it had to provide regular access.

Brewster Kahle, founder of the Internet Archive explained the Physical Archive of the Internet Archive:

"Digital technologies are changing both how library materials are accessed and increasingly how library materials are preserved. After the Internet Archive digitizes a book from a library in order to provide free public access to people world-wide, these books go back on the shelves of the library. We noticed an increasing number of books from these libraries moving books to 'off site repositories'  to make space in central buildings for more meeting spaces and work spaces. These repositories have filled quickly and sometimes prompt the de-accessioning of books. A library that would prefer to not be named was found to be thinning their collections and throwing out books based on what had been digitized by Google. While we understand the need to manage physical holdings, we believe this should be done thoughtfully and well.  

"Two of the corporations involved in major book scanning have sawed off the bindings of modern books to speed the digitizing process. Many have a negative visceral reaction to the “butchering” of books, but is this a reasonable reaction?  

"A reason to preserve the physical book that has been digitized is that it is the authentic and original version that can be used as a reference in the future. If there is ever a controversy about the digital version, the original can be examined. A seed bank such as the Svalbard Global Seed Vault is seen as an authoritative and safe version of crops we are growing. Saving physical copies of digitized books might at least be seen in a similar light as an authoritative and safe copy that may be called upon in the future.  

"As the Internet Archive has digitized collections and placed them on our computer disks, we have found that the digital versions have more and more in common with physical versions. The computer hard disks, while holding digital data, are still physical objects. As such we archive them as they retire after their 3-5 year lifetime. Similarly, we also archive microfilm, which was a previous generation’s access format. So hard drives are just another physical format that stores information. This connection showed us that physical archiving is still an important function in a digital era.  

"There is also a connection between digitized collections and physical collections. The libraries we scan in, rarely want more digital books than the digital versions that we scan from their collections. This struck us as strange until we better understood the craftsmanship required in putting together great collections of books, whether physical or digital. As we are archiving the books, we are carefully recording with the physical book what the identifier for the virtual version, and attaching information to the digital version of where the physical version resides. 

"Therefore we have determined that we will keep a copy of the books we digitize if they are not returned to another library. Since we are interested in scanning one copy of every book ever published, we are starting to collect as many books as we can" (http://blog.archive.org/2011/06/06/why-preserve-books-the-new-physical-archive-of-the-internet-archive/, accessed 06-09-2011).

"Mr. Kahle had the idea for the physical archive while working on the Internet Archive, which has digitized two million books. With a deep dedication to traditional printing — one of his sons is named Caslon, after the 18th-century type designer — he abhorred the notion of throwing out a book once it had been scanned. The volume that yielded the digital copy was special.  

"And perhaps essential. What if, for example, digitization improves and we need to copy the books again?  

“ 'Microfilm and microfiche were once a utopian vision of access to all information,' Mr. Kahle noted, 'but it turned out we were very glad we kept the books' " (http://www.nytimes.com/2012/03/04/technology/internet-archives-repository-collects-thousands-of-books.html?nl=todaysheadlines&emc=tha25, accessed 03-30-2012).


View Map + Bookmark Entry

Sheikh Dr. Sultan Al-Qasimi Pledges to Restore the Library of the Institut d'Égypte December 20, 2011

Sheikh Sultan bin Mohammed Al-Qasimi III (in Arabic: سلطان بن محمد القاسمي), ruler of the UAE's emirate of Sharjah and a widely published writer and scholar, pledged to restore the library of the Institut d'Égypte damaged by fire, and to replace books destroyed or damaged beyond repair.

" 'All the original documents in my private library I am giving as a gift to the Egyptian Scientific Complex,' Qassemi said in a phone interview from Paris with the independent Egyptian satellite Channel Dream TV. 'I have a rare collection that is not to be found anywhere else.'Qassemi added that he asked for a complete list of all the books that were damaged or lost during the fire and that he would do his best to look for other original copies and give them to the library, known for its collection of priceless books, maps, and manuscripts.  

“ 'What is happening in Egypt is happening to all of us and what I am doing is just a small token of gratitude that all of us, especially people from Sharjah, feel.  Egyptian [institutions] taught us  a lot and we were students in Egyptian universities and no matter what we do, it will not be enough to pay them back,' he said.  

"Qassemi added that he is overseeing the construction of a documents center in Cairo to house all the documents that are now kept in the Egyptian cabinet building, a place seen as unsafe at the moment because of clashes in Tahrir Square and surrounding areas.  

“ 'We will make sure that the documents are safely transferred before more acts of sabotage take place. We have been given the green light by the Egyptian government to do that.

"He added that he would do his best to preserve Egypt’s heritage as Egypt had always preserved the Arab world.  

“ 'Egypt has always been offering sacrifices and we will never forget what Egyptians did to liberate Kuwait. This alone is invaluable,' Qassemi said.  

"The Egyptian minister of antiquities, Mohamed Ibrahim, said he appreciates Qassemi’s initiative.  

“ 'Sheikh Qassemi has always supported the library and Egypt.'

"Ibrahim added that the French government has also offered to salvage what it can from the Scientific Complex.  

"Among the documents in Qassemi’s possession is a copy of Description de l'Égypte, written at the time of the French expedition to Egypt (1798-1801) and published between 1809 and 1822. The book, which contains a detailed description of Egypt, was a main cause for the uproar that accompanied the fire at the Scientific Complex.  

"According to sources at the Egyptian Ministry of Culture, around 20,000 books and manuscripts were saved from the fire and are currently kept in the cabinet and parliament buildings"  (http://english.alarabiya.net/articles/2011/12/20/183601.html, accessed 12-20-2011).


2012 – 2016

Gelatin and Calcium in the Earliest Paper Were Responsible for Its Longevity January 2012

Research on Paper Through Time by a University of Iowa team led by Timothy Barrett, director of papermaking facilities at the UI Center for the Book, showed that the earliest paper tended to be the most durable over time because of the high quantities of gelatin and calcium used in its manufacture. Over three years the team analyzed 1,578 historical papers made between the 14th and the 19th centuries. Barrett and his colleagues devised methods to determine the papers' chemical composition without destroying samples in the process, a limitation that had hampered past research.

“This is news to many of us in the fields of papermaking history and rare book and art conservation,” says Barrett. “The research results will impact the manufacture of modern paper intended for archival applications, and the care and conservation of historical works on paper.”  

Barrett says one possible explanation for the higher quality of the paper in the older samples is that papermakers at the time were attempting to compete with parchment, a tough, enduring material normally made from animal skins. In doing so, they made their papers thick and white and dipped the finished sheets into a dilute, warm gelatin solution to toughen them.

“Calcium compounds were used in making parchment, and they were also used in making paper,” Barrett says. “Turns out they helped prevent the paper from becoming acidic, adding a lot to its longevity.”


Digitizing the Oldest Monastic Library & The Oldest Continuously Operating Library May 2012 – March 31, 2016

"St. Catherine’s Monastery is going digital. The monastery that claims to be the oldest in the world ­— not destroyed, not abandoned in 17 centuries — has begun digitizing its ancient manuscripts for the use of scholars. A new library to facilitate the process is about five years away.  

"The librarian, Father Justin, says the monastery’s library will grow an internet database of first-millennium manuscripts, which up until now have been kept under lock and key. Should a scholar want a manuscript, they need only email Father Justin.  

“ 'And if I don’t have a book but see a reference, I can email a friend in Oxford. They can scan and send it the next day,' he says.

"Still, as natural and inevitable as it sounds, that’s quite the sea change. Just 10 years ago, bad phone lines made it hard to connect a call with the monastery. One hundred years ago, it took 10 days to travel from Suez with a caravan of camels. And when I arrive unheralded, having not even called ahead, a monk shades his eyes, shakes head and — at first — says he will not introduce me to Father Justin.  

“ 'What if we said 'yes' to every reporter and scholar that came here? Everyone wants our time. But what about our own work?' he asks.  

"Not many of the 25 monks cloistered at the Sacred and Imperial Monastery of the God-Trodden Mount of Sinai have email addresses, or operate Mac G5 computers, or know their megapixel from their leviathan. Father Justin Sinaites is a native of Texas. He wears a black habit and a beard to his chest, and ties his long hair back in a ponytail. He is over six feet tall. When he stands, he keeps his arms ramrod straight at his sides.

"Every morning he attends the 4:30 am service — which has not changed its liturgy since AD 550 — and then climbs six flights of stairs to his office in the east wing of the three-story administrative building forming the back wall of St. Catherine’s Monastery. He powers up the G5 and passes the morning making digital photographs of scripture written on papyrus, written on animal hide and written with ink made from oak tree galls.  

“ 'It’s amazing, the juxtaposition,' is how he puts it.  

"A page that may have taken a bent-backed monk weeks to illuminate is clamped under the bellows of the 48MP CCD camera. Snap. Next page. It takes three or four days to do a whole book. There are about 3,300 manuscripts. . . . "(http://www.egyptindependent.com/news/st-catherine-monastery-seeks-permanence-through-technology, accessed 05-29-2012

By Kathy Brown on Mar 31, 2016

"The Ahmanson Foundation has awarded a major grant to the UCLA Library to fund key aspects of the Sinai Library Digitization Project.  This major project – initiated by the fathers of St. Catherine’s Monastery of the Sinai, Egypt, and made possible through the participation of the UCLA Library and the Early Manuscripts Electronic Library (EMEL) – will create digital copies of some 1,100 rare and unique Syriac and Arabic manuscripts dating from the fourth to the seventeenth centuries.

"A UNESCO World Heritage site located in a region of the Sinai Peninsula sacred to three world religions – Christianity, Islam, and Judaism - St. Catherine’s Monastery houses a collection of ancient and medieval manuscripts second only to that of the Vatican Library. Access to these remarkable materials has often been difficult, and now all the more so due to security concerns in the Sinai Peninsula.

“The manuscripts at St. Catherine’s are critical to our understanding of the history of the Middle East, and every effort must be made to digitally preserve them in this time of volatility. The Ahmanson Foundation’s visionary support honors the careful stewardship of St. Catherine’s Monastery over the centuries and ensures that these invaluable documents are not only accessible, but preserved in digital copies,” said UCLA University Librarian Ginny Steel.

“We are deeply grateful to The Ahmanson Foundation for its generous investment in this important project, and for its longstanding partnership with the UCLA Library,” Steel concluded.

“St. Catherine’s Monastery proposed a program to digitize its unparalleled manuscript collection, and an international team was assembled to help digitally preserve the ancient pages,” said Michael Phelps, EMEL director.  “EMEL is collaborating with the monastery to install world-class digitization systems, and the UCLA Library will host the images online on behalf of the monastery. The three-year project will digitize the monastery’s extensive collection of Syriac and Arabic manuscripts.” 

"Built in the sixth century, St. Catherine’s Monastery holds the oldest continually operating library in the world. The library’s manuscripts cover subjects ranging from history and philosophy to medicine and spirituality, making them of interest to scholars and learners across a wide range of disciplines. Among the monastery’s most important Syriac and Arabic manuscripts are a fifth century copy of the Gospels in Syriac; a Syriac copy of the Lives of Women Saints dated 779 CE; the Syriac version of the Apology of Aristides, of which the Greek original has been lost; and numerous Arabic manuscripts from the ninth and tenth centuries, when Middle Eastern Christians first began to use Arabic as a literary language"


The Secret Race to Save Manuscripts in Timbuktu and Djenne December 27, 2012

By GEOFFREY YORK, The Globe and Mail, Dec. 27 2012

"As rebels searched the bags of the truck passengers at a checkpoint near Timbuktu, one man was trying to hide his nervousness.

"Mohamed Diagayete, an owlish scholar with an eager smile, was silently praying that the rebels would not discover his laptop computer. Buried in his laptop bag was an external hard drive with a cache of thousands of valuable images and documents from Timbuktu’s greatest cultural treasure: its ancient scholarly manuscripts.  

"Radical Islamist rebels in northern Mali have repeatedly attacked the fabled city’s heritage, taking pickaxes to the tombs of local saints and smashing down a door in a 15th century mosque. They demolished several more mausoleums this week and vowed to destroy the rest, despite strong protests from UNESCO, the United Nations cultural agency.  

"With the tombs demolished, Timbuktu’s most priceless remaining legacy is its vast libraries of crumbling Arabic and African manuscripts, written in ornate calligraphy over the past eight centuries, proof of a historic African intellectual tradition. Some experts consider them as significant as the Dead Sea Scrolls – and an implicit rebuke to the harsh narrow views of the Islamist radicals.  

"But now the manuscripts, too, could be under threat. And so a covert operation is under way to save them.  

"That’s why Mr. Diagayete was so anxious to smuggle his hard drive out of Timbuktu. For years, he’s been helping preserve the manuscripts by digitizing them. But the project was halted when the Islamists seized Timbuktu in April. A few months later, Mr. Diagayete made an undercover visit to Timbuktu and brought back as many of the digital images as he could.  

"The quest to save the documents rarely leaves his thoughts. 'What will happen to the manuscripts?' he asks from the safety of Mali’s capital, Bamako, where he fled after the fall of Timbuktu.  

“ 'I’m always asking myself thousands of questions about the manuscripts,' he says. 'When we lose them, we have no other copy. It’s forever.'

"Mr. Diagayete is a researcher at the Ahmed Baba Institute, which has been digitizing the manuscripts for nearly a decade with support from foreign governments. But because of technical delays, and the huge number of manuscripts in the city (up to 700,000 by some estimates), only a tiny fraction has been copied so far.

"The manuscripts, dating back to the 13th century, are evidence of ancient African and Islamist written scholarship, contradicting the myth of a purely oral tradition on the continent.  

"Many of the manuscripts are religious documents, but others are intellectual treatises on medicine, astronomy, literature, mathematics, chemistry, judicial law and philosophy. Many were brought to Timbuktu in camel caravans by scholars from Cairo, Baghdad and Persia who trekked to the city when it was one of the world’s greatest centres of Islamic learning. In the Middle Ages, when Europe was stagnating, the African city had 180 religious schools and a university with 20,000 students.  

"Timbuktu fell into decline after Moroccan invasions and French colonization, but its ancient gold-lettered manuscripts were preserved by dozens of owners, mostly private citizens, who kept them in wooden trunks or in their own libraries.  

"Today, under the occupation of the radical jihadists, the manuscripts face a range of threats. Conservation experts have fled the city, so the documents could be damaged by insects, mice, sand, dust or extreme temperatures. Or the Islamist militants could decide to raise money by looting and selling the documents.  

"There’s also a risk that the militants could simply destroy the manuscripts, since some are written by African mystics or moderate Sufis, regarded by the Islamist rebels as ideological enemies. Another threat is the planned Western-backed military campaign against the rebels, which could lead to house-to-house fighting in Timbuktu, further endangering the manuscripts.  

"The government-run Ahmed Baba Institute holds nearly 40,000 manuscripts in two main buildings, including a headquarters built with South African assistance in 2009. But the Islamist rebels have seized the institute, looting its computers and using its new building as a sleeping quarters.  

“ 'It’s a big setback for the institute,' said Susana Molins Lliteras, a researcher at a South African-based project to protect the Timbuktu manuscripts.  

“ 'It’s very possible that things have been lost,' she said. 'We haven’t even had a chance to research the manuscripts – we haven’t scratched the surface. So if they are lost, we won’t even know what is lost.'

"Since the rebel takeover, the private owners have scrambled to protect the manuscripts. Nobody knows exactly what they have done, but it is believed that some owners have hidden the manuscripts, buried them in the sand, or smuggled them to villages.  

"This, too, is dangerous, since the ancient texts can easily be damaged when they are moved. 'They are very fragile,' Mr. Diagayete said. 'The choice is difficult: Either we lose them all or we lose part of them. Everyone is trying to find a way to protect their manuscripts.'

"Adama Diarra, a Malian journalist, saw three owners piling their manuscripts into 50-kilogram rice bags in April, shortly after the Islamists seized Timbuktu, apparently in an effort to move them to safer places. 'The pages were falling out,' he said.  

"Mohamed Galla Dicko, director of the Ahmed Baba Institute for 17 years before leaving the institute this year, says the threat to the manuscripts is serious. 'The old pages can be damaged just by touching them,' he said. 'And the people who are moving them are not specialists in handling them.'

"While the Timbuktu manuscripts are in trouble, there is better news from another ancient Malian town, Djenne, south of the rebel-controlled territory. With help from the British Library, researchers are digitizing thousands of Djenne’s historic manuscripts – some nearly 500 years old.  

"Even when fuel and electricity were rationed after the rebel advances, dedicated workers kept toiling on the project at Djenne’s manuscript library. 'We’ve saved a large number of the manuscripts,' said Sophie Sarin, a Swedish hotel owner in Djenne.  

"The project aims to collect 200,000 images by next July. After the rebels captured northern Mali this year, Ms. Sarin travelled to London with a hard drive containing 80,000 digital images of the Djenne manuscripts. She brought them to specialists at the British Library, who were very relieved to see them, she said."


"Born Digital: Guidance for Donors, Dealers, and Archival Repositories" January 2013

In January 2013 archivists and curators at six institutions, including Michael Forstrom at the Beinecke Library, Yale; Susan Thomas at the Bodleian Library, Oxford; Jeremy Leighton John at the British Library; Megan Barnard at the Harry Ransom Center, The University of Texas at Austin; Kate Donovan at the Manuscript, Archives and Rare Book Library (MARBL), Emory University; and Will Hansen and Seth Shaw at the Rubenstein Library, Duke University, published Born Digital: Guidance for Donors, Dealers, and Archival Repositories through Media Commons Press.


The Library of Congress Has Archived 170 Billion Tweets January 4, 2013

On January 4, 2013 Gayle Osterberg, Director of Communications at the Library of Congress, reported in the Library of Congress Blog:

"An element of our mission at the Library of Congress is to collect the story of America and to acquire collections that will have research value. So when the Library had the opportunity to acquire an archive from the popular social media service Twitter, we decided this was a collection that should be here.  

"In April 2010, the Library and Twitter [based in San Francisco] signed an agreement providing the Library the public tweets from the company’s inception through the date of the agreement, an archive of tweets from 2006 through April 2010. Additionally, the Library and Twitter agreed that Twitter would provide all public tweets on an ongoing basis under the same terms.

"The Library’s first objectives were to acquire and preserve the 2006-10 archive; to establish a secure, sustainable process for receiving and preserving a daily, ongoing stream of tweets through the present day; and to create a structure for organizing the entire archive by date.

"This month, all those objectives will be completed. We now have an archive of approximately 170 billion tweets and growing. The volume of tweets the Library receives each day has grown from 140 million beginning in February 2011 to nearly half a billion tweets each day as of October 2012.  

"The Library’s focus now is on addressing the significant technology challenges to making the archive accessible to researchers in a comprehensive, useful way. These efforts are ongoing and a priority for the Library.  

"Twitter is a new kind of collection for the Library of Congress but an important one to its mission. As society turns to social media as a primary method of communication and creative expression, social media is supplementing, and in some cases supplanting, letters, journals, serial publications and other sources routinely collected by research libraries.  [Bold face is my addition, JN.]

"Although the Library has been building and stabilizing the archive and has not yet offered researchers access, we have nevertheless received approximately 400 inquiries from researchers all over the world. Some broad topics of interest expressed by researchers run from patterns in the rise of citizen journalism and elected officials’ communications to tracking vaccination rates and predicting stock market activity.

"Attached is a white paper [PDF] that summarizes the Library’s work to date and outlines present-day progress and challenges."

————

♦♦ To which James Gleick, author of The Information, responded in the New York Review of Books on January 16, 2013 in a blog entry titled Librarians of the Twitterverse, from which I quote this selection:

"For a brief time in the 1850s the telegraph companies of England and the United States thought that they could (and should) preserve every message that passed through their wires. Millions of telegrams—in fireproof safes. Imagine the possibilities for history!  

“ 'Fancy some future Macaulay rummaging among such a store, and painting therefrom the salient features of the social and commercial life of England in the nineteenth century,' wrote Andrew Wynter in 1854. (Wynter was what we would now call a popular-science writer; in his day job he practiced medicine, specializing in 'lunatics.') 'What might not be gathered some day in the twenty-first century from a record of the correspondence of an entire people?'

"Remind you of anything?  

"Here in the twenty-first century, the Library of Congress is now stockpiling the entire Twitterverse, or Tweetosphere, or whatever we’ll end up calling it—anyway, the corpus of all public tweets. There are a lot. The library embarked on this project in April 2010, when Jack Dorsey’s microblogging service was four years old, and four years of tweeting had produced 21 billion messages. Since then Twitter has grown, as these things do, and 21 billion tweets represents not much more than a month’s worth. As of December, the library had received 170 billion—each one a 140-character capsule garbed in metadata with the who-when-where. . . . "


Part of the Library of the Ahmed Baba Institute in Timbuktu is Burned January 28 – January 30, 2013

On January 28, 2013 it was widely reported that the Ahmed Baba Institute of Higher Learning and Islamic Research (CEDRAB) in Timbuktu (Tombouctou), Mali, the repository of 30,000 historic manuscripts from the ancient Muslim world, was set aflame by Islamist fighters.

On the same day Vivienne Walt reported on Time.com that the loss from the fire was far less than total:

"In interviews with TIME on Monday, preservationists said that in a large-scale rescue operation early last year, shortly before the militants seized control of Timbuktu, thousands of manuscripts were hauled out of the Ahmed Baba Institute to a safe house elsewhere. Realizing that the documents might be prime targets for pillaging or vindictive attacks from Islamic extremists, staff left behind just a small portion of them, perhaps out of haste, but also to conceal the fact that the center had been deliberately emptied. “The documents which had been there are safe, they were not burned,” said Mahmoud Zouber, Mali’s presidential aide on Islamic affairs, a title he retains despite the overthrow of the former President, his boss, in a military coup a year ago; preserving Timbuktu’s manuscripts was a key project of his office. By phone from Bamako on Monday night, Zouber told TIME, “They were put in a very safe place. I can guarantee you. The manuscripts are in total security.”

"In a second interview from Bamako, a preservationist who did not want to be named confirmed that the center’s collection had been hidden out of reach from the militants. Neither of those interviewed wanted the location of the manuscripts named in print, for fear that remnants of the al-Qaeda occupiers might return to destroy them.

"That was confirmed too by Shamil Jeppie, director of the Timbuktu Manuscripts Project at the University of Cape Town, who told TIME on Monday night that “there were a few items in the Ahmed Baba library, but the rest were kept away.” The center, financed by the South African government as a favored project by then President Thabo Mbeki, who championed reviving Africa’s historical culture, housed state-of-the-art equipment to preserve and photograph hundreds of thousands of pages, some of which had gold illumination, astrological charts and sophisticated mathematical formulas. Jeppie said he had been enraged by the television footage on Monday of the building trashed, and blamed in part Mali’s government, which he said had done little to ensure the center’s security. “It is really sad and disturbing,” he said.

"When TIME reached Timbuktu’s Mayor Cissé in Bamako late Monday night, he tempered the remarks he had made to journalists earlier in the day, conceding in an interview that, indeed, residents had worked to rescue the center’s manuscripts before al-Qaeda occupied the city last March. Still, he said that while many of the manuscripts had been saved, “they did not move all the manuscripts.” He said he had fled earlier this month after living through months of the Islamists’ rule, a situation he described as a “true catastrophe” and “very, very hard.” He said he expects to fly back home by the weekend on a French military jet. By then, perhaps, the state of Timbuktu’s astonishing historic libraries might be clearer."

On January 30, 2013 an article in Liberation.fr stated that "more than 90%" of the manuscripts at the Ahmed Baba Institute in Timbuktu were saved from destruction.


The Historic Vatican Library to be Digitized in 2.8 Petabytes March 7, 2013

On March 7, 2013 EMC Corporation, headquartered in Hopkinton, MA, announced that it would support the Vatican Apostolic Library in digitizing its collection of 80,000 historic manuscripts and 8,900 incunabula as part of EMC’s Information Heritage Initiative. The project will result in 40 million pages of digital reproductions. "The first phase of the nine-year project will provision 2.8 petabytes of storage, utilizing a range of industry-leading solutions from EMC including Atmos®, Data Domain®, EMC Isilon®, NetWorker® and VNX®."
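
The storage figure can be sanity-checked against the page count. A minimal sketch in Python (treating the petabyte as a decimal unit is my assumption):

    # Back-of-envelope check: 2.8 petabytes spread over 40 million page images.
    PETABYTE = 10 ** 15                       # decimal petabyte, as storage vendors count
    total_bytes = 2.8 * PETABYTE
    pages = 40_000_000
    mb_per_page = total_bytes / pages / 10 ** 6
    print(f"Average per page image: {mb_per_page:.0f} MB")    # ~70 MB

Roughly 70 MB per page is plausible for uncompressed, high-resolution archival master images.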


Introduction of "Arches": an Open-source, Web-based, Geospatial Information System for Cultural Heritage Inventory and Management December 4, 2013

On December 4, 2013 the Getty Conservation Institute (GCI), World Monuments Fund (WMF), and Farallon Geographics announced the public release of Arches 1.0—an open-source, web-based geospatial information system (GIS) for cultural heritage inventory and management, built specifically to help heritage organizations safeguard cultural heritage sites worldwide.

"By incorporating a broad range of international standards, Arches meets a critical need in terms of gathering, making accessible and preserving key information about cultural heritage. “Knowing what you have is the critical first step in the conservation process. Inventorying heritage assets is a major task and a major investment,” said Bonnie Burnham, President and CEO of World Monuments Fund.

"Cultural heritage inventories are difficult to establish and maintain. Agencies often rely on costly proprietary software that is frequently a mismatch for the needs of the heritage field or they create custom information systems from scratch. Both approaches remain problematic and many national and local authorities around the world are struggling to find resources to address these challenges. The GCI and WMF have responded to this need by partnering to create Arches, which is available at no cost. Arches can present its user interface in any language or in multiple languages, and is configurable to any geographic location or region. It is web-based to provide for the widest access and requires minimal training.

"The system is freely available for download from the Internet so that institutions may install it at any location in the world. “Our hope is that by creating Arches we can help reduce the need for heritage institutions to expend scarce resources on creating systems from the ground up, and also alleviate the need for them to engage in the complexities and constantly changing world of software development,” said Tim Whalen, Director of the Getty Conservation Institute in Los Angeles. In developing Arches, the GCI and WMF consulted international best practices and standards, engaging nearly 20 national, regional, and local government heritage authorities from the US, England, Belgium, France, and the Middle East, as well as information technology experts from the US and Europe. The contributions of English Heritage and the Flanders Heritage Agency have played a particularly important role during the development process. Data provided by English Heritage has been valuable for system development, and it is incorporated as a sample data set within the demonstration version of Arches.

"The careful integration of standards in Arches also will encourage the creation and management of data using best practices. This makes the exchange and comparison of data between Arches and other information systems easier, both within the heritage community and related fields, and it will ultimately support the longevity of important information related to cultural sites. Once the Arches system is installed, institutions implementing it can control the degree of visibility of their data. They may choose to have the system and its data totally open to online access, partially open, accessible with a log-in, not accessible at all, or somewhere in between" (http://artdaily.com/news/66701/Getty-and-World-Monuments-Fund-release-Arches-Software-to-help-safeguard-cultural-heritage-sites-#.UqM80_SIBcY, accessed 12-07-2013).
