4406 entries. 94 themes. Last updated December 26, 2016.

2005 to 2010 Timeline

Use of the Internet in China 2005

By Spring of 2005 it was estimated that over 100,000,000 people in China used the Internet.

Attempting to Use an Ink-Jet Printer to Print Living Tissue. . . . 2005

In 2005 The National Science Foundation funded research headed by Gabor Forgacs at the University of Missouri-Columbia on what was called "Organ Printing," to "further advance our understanding of self-assembly during the organization of cells and tissues into functional organ modules."

From ABC News 2-10-2006:

"In what could be the first step toward human immortality, scientists say they've found a way to do all of these things and more with the use of a technology found in many American homes: an ink-jet printer.

"Researchers around the world say that by using the technology, they can actually 'print' living human tissue and one day will be able to print entire organs.

" 'The promise of tissue engineering and the promise of 'organ printing' is very clear: We want to print living, three-dimensional human organs,' Dr. Vladimir Mironov said. 'That's our goal, and that's our mission.' "

"Though the field is young, it already has a multitude of names.

" 'Some people call this 'bio-printing.' Some people call this 'organ printing.' Some people call this 'computer-aided tissue engineering.' Some people call this 'bio-manufacturing,' said Mironov, associate professor at the Medical University of South Carolina and one of the leading researchers in the field."

The World's Fastest Newspaper Offset Press 2005

Tall as a four-story building, in 2005 the Mitsubishi DIAMONDSTAR 90, produced by Mitsubishi Heavy Industries Printing & Packaging Machinery Ltd., Hiroshima, Japan, was the world's fastest double width newspaper offset press, with a printing speed of 90,000 full color, 96-page broadsheet copies per hour. 

Digitizing the Matthew Parker Library Begins 2005

In 2005 the Parker Library on the Web project began the process of digitizing one of the greatest collections of medieval manuscripts, formed in the sixteenth century by Archbishop Matthew Parker. It was:

"a multi-year undertaking of Corpus Christi College, the Stanford University Libraries and the Cambridge University Library, to produce a high-resolution digital copy of every imageable page in the 538 manuscripts described in M. R. James Descriptive Catalogue of the Manuscripts in the Library of Corpus Christi College, Cambridge (Cambridge University Press, 1912), and to build an interactive web application in which the manuscript page images can be used by scholars and students in the context of editions, translations and secondary sources" (Parker Library on the Web site, accessed 11-27-2008).

The project was expected to be completed in 2009.

Kosmix.com 2005

"With the vision of connecting people to information that makes a difference in their lives,"in 2005 Venky Harinarayan and Anand Rajaraman founded Kosmix.com in Mountain View, California.

"From Gutenberg to the Internet" 2005

In 2005 the author/editor of this database, Jeremy Norman, issued From Gutenberg to the Internet: A Sourcebook on the History of Information Technology.

This printed book was the first anthology of original publications, reflecting the origins of the various technologies that converged to form the Internet. Each reading is introduced by the editor.

Google Earth is Launched 2005

In 2005 Google launched Google Earth, a virtual globe, map and geographical information program, which mapped the Earth by the superimposition of images obtained by satellite. The program, which Google acquired when it purchased Keyhole, Inc., was originally called EarthViewer 3D. 

"Last Child in the Woods" : Exploration of Nature Versus Exposure to Media in Childhood 2005

In 2005 American journalist and non-fiction writer Richard Louv published Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder. In this book Louv studied the relationship of children and the natural world in current and historical contexts, coining the term “nature-deficit disorder” to describe possible negative consequences to individual health and the social fabric as children move indoors, immersed in television, the Internet, and computer games, and away from physical contact with the natural world – particularly unstructured, solitary experience.

Louv cited research pointing to attention disorders, obesity, a dampening of creativity and depression as problems associated with a nature-deficient childhood. He amassed information on the subject from practitioners of many disciplines to make his case, and is commonly credited with helping to inspire an international movement to reintroduce children to nature.

I first learned about Louv's book in a lecture by paleontologist, educator, and television broadcaster Scott D. Sampson held at Marin Academy in San Rafael, California on October 26, 2011. Sampson's lecture was the first in a science lecture series organized by my son, Max, in his junior year in high school. An extremely engaging speaker, Sampson uses electronic media to promote disengagement from media and active exploration of nature, especially in childhood. He also promotes the use of social media to encourage individual scientific exploration of nature in each person's locality.

NarusInsight Supercomputer Network Monitoring Software 2005

In 2005 the FBI replaced Carnivore with commercially available network monitoring software such as NarusInsight, produced by Narus, a subsidiary of Boeing, headquartered in Sunnyvale, California.

"Features of NarusInsight include:

"♦ Scalability to support surveillance of large, complex IP networks (such as the Internet)

"♦ High-speed Packet processing performance, which enables it to sift through the vast quantities of information that travel over the Internet.

"♦ Normalization, Correlation, Aggregation and Analysis provide a model of user, element, protocol, application and network behaviors, in real-time. That is it can track individual users, monitor which applications they are using (e.g. web browsers, instant messaging applications, email) and what they are doing with those applications (e.g. which web sites they have visited, what they have written in their emails/IM conversations), and see how users' activities are connected to each other (e.g. compiling lists of people who visit a certain type of web site or use certain words or phrases in their emails).

"♦ High reliability from data collection to data processing and analysis.

"♦ NarusInsight's functionality can be configured to feed a particular activity or IP service such as security, lawful intercept or even Skype detection and blocking.

"♦ Compliance with CALEA and ETSI.

"♦ Certified by Telecommunication Engineering Center (TEC) in India for lawful intercept and monitoring systems for ISPs.

"The intercepted data flows into NarusInsight Intercept Suite. This data is stored and analyzed for surveillance and forensic analysis purposes.

"Other capabilities include playback of streaming media (i.e. VoIP), rendering of web pages, examination of e-mail and the ability to analyze the payload/attachments of e-mail or file transfer protocols. Narus partner products, such as Pen-Link, offer the ability to quickly analyze information collected by the Directed Analysis or Lawful Intercept modules.

"A single NarusInsight machine can monitor traffic equal to the maximum capacity (10 Gbit/s) of around 39,000 DSL lines or 195,000 telephone modems. But, in practical terms, since individual internet connections are not continually filled to capacity, the 10 Gbit/s capacity of one NarusInsight installation enables it to monitor the combined traffic of several million broadband users.

"According to a company press release, the latest version of NarusInsight Intercept Suite (NIS) is "the industry's only network traffic intelligence system that supports real-time precision targeting, capturing and reconstruction of webmail traffic... including Google Gmail, MSN Hotmail, Yahoo! Mail, and Gawab Mail (English and Arabic versions)."

"It can also perform semantic analysis of the same traffic as it is happening, in other words analyze the content, meaning, structure and significance of traffic in real time. The exact use of this data is not fully documented, as the public is not authorized to see what types of activities and ideas are being monitored" (Wikipedia article on Narus [company], accessed 01-14-2012). 

Reflections on the Transition from Hot Metal to Digital Typesetting 2005

"Nothing is more striking, over the years covered by this survey, than the progressive dematerialization of the means by which texts are prepared for reproduction. At one extreme, in 1915, are the thousand pages of hand-set type for Fortescue's History, waiting to be printed at R & R Clark's works in Edinburgh. At the other are the resources used to produce this book, where the single concrete realization of the completed text that existed before printing was begun was the output from a laser imagesetter. In betwen are the disappearance of three-dimensional punches, matrices and type that came with direct-photography photocomposition, and the disappearance of the photographic matrix with the electronic technologies that followed.

"Over the same period the means used to produce the types with which text is composed have followed a similar course. The ranks of drawing desks or pantographs receding into the distance at Salfords date from the great days of the Monotype Corporation between the wars; but even in 1918 Rudolf Koch's Die Schriftgeisserei im Schattenbild shows 27 men, two women, two boys and two horses at work on the manufacture and despatch of foundry type. By contrast, the team that worked on the Colorado project, which in two years after 1995 produced all the type used for residential and business entries in telephone directories for most of the western United States, was made up at its largest of six people. The work was done in three different countries; the only concrete objects exchanged between the participants were character drawings and photocomposed proofs of type.

"In some ways the end of the twentieth century has brought the business of type manufacture back almost to where it began. Claude Garamont cut the punches for the grecs du Roy himself and had the matrices justified by Paterne Robelot, whom he chose for the task because he was clever at it. In the last couple of decades the development of computer-based typemaking tools and the world-wide web have meant that designers can now make and distribute type entirely on their own; though unless, like Garamont or the Colorado group, they are fulfilling a specific commission, marketing their work is still a problem.

"For the manufacture and composition of printer's type, paradoxically enough, the first decade of the twenty-first century is a period of relative technological calm. The basic tools - PostScript, TrueType, networked personal computers, page makeup programs and desktop laser printers - all appeared in the whirlwind of the 1980s. Increasing computing power has meant that more can now be done with them, and done more quickly; but the processes of type design, and the fundamental tehcnologies that underlie them, are very much the same today as they were for Sumner Stone in the 1980s.

"If typemaking tools changed beyond recognition in the early 1960s and again in the 1980s, there has been no change at all since 1445 or so in the task that types for composing text - or rather, the character images the types give rise to- are required to perform. The first objective in the design, manufacture, composition and reproduction of text types remains the same as it has always been: to put legible character images, legibly arranged, before the reader's eyes. It is the means of doing this that changed during the twentieth century, not the objective itself. The second objective - to give the type a voice of its own to speak with - has also remained the same, altthough changes in rendering techniques have had more effect on the difficulty of achieving this" (Southall, Printer's type in the twentieth century. Manufacturing and design methods [2005] 223-24).

The "Selfie" Social Media Phenomenon Circa 2005

"In the early 2000s, before Facebook became the dominant online social network, self-taken photographs were particularly common on MySpace. However, writer Kate Losse recounts that between 2006 and 2009 (when Facebook became more popular than MySpace), the "MySpace pic" (typically "an amateurish, flash-blinded self-portrait, often taken in front of a bathroom mirror") became an indication of bad taste for users of the newer Facebook social network. Early Facebook portraits, in contrast, were usually well-focused and more formal, taken by others from distance. In 2009 in the image hosting and video hosting website Flickr, Flickr users used 'selfies' to describe seemingly endless self-portraits posted by teenage girls. According to Losse, improvements in design—especially the front-facing camera copied by the iPhone 4 (2010) from Korean and Japanese mobile phones, mobile photo apps such as Instagram, and selfie sites such as ItisMee—led to the resurgence of selfies in the early 2010s.

"Initially popular with young people, selfies gained wider popularity over time. By the end of 2012, Time magazine considered selfie one of the "top 10 buzzwords" of that year; although selfies had existed long before, it was in 2012 that the term "really hit the big time". According to a 2013 survey, two-thirds of Australian women age 18–35 take selfies—the most common purpose for which is posting on Facebook. A poll commissioned by smartphone and camera maker Samsung found that selfies make up 30% of the photos taken by people aged 18–24.

"By 2013, the word "selfie" had become commonplace enough to be monitored for inclusion in the online version of the Oxford English Dictionary. In November 2013, the word "selfie" was announced as being the "word of the year" by the Oxford English Dictionary, which gave the word itself an Australian origin.

"Selfies have also taken beyond the earth. A space selfie is a selfie that is taken in space. This include selfies taken by astronauts, machines and by an indirect method to have self-portrait photograph on earth retaken in space" (Wikipedia article on Selfie, accessed 02-27-2014).

"Broadcast Yourself" : YouTube is Founded February 2005

In February 2005 three former employees of PayPal — Steve Chen, Chad Hurley, and Jawed Karim — founded the video sharing website, YouTube. Its first headquarters were above a pizzeria and Japanese restaurant in San Mateo, California. Most of the content on YouTube is uploaded by individuals, but media corporations including CBS, the BBC, Vevo, Hulu, and other organizations offer some of their material via YouTube, as part of the YouTube partnership program.

The European Library is Launched March 17, 2005

On March 17, 2005 The European Library, a free service that offered access to the resources of the 48 national libraries of Europe in 20 languages, was launched from its headquarters at Koninklijke Bibliotheek, Den Haag (The Hague), Netherlands. Resources included both digital and physical items (books, posters, maps, sound recordings, videos, etc.).

"Currently The European Library gives access to 150 million entries across Europe. The amount of referenced digital collections is constantly increasing. Quality and reliability are guaranteed by the 48 collaborating national libraries of Europe. The European Library is a non-commercial organisation" (European Library website, accessed 11-21-2008).

Sony's PlayStation and PSone Are the First Game Console to Sell 100 Million Units March 31, 2005

On March 31, 2005 Sony's PlayStation and PSone reached "a combined total of 102.49 million units shipped", becoming the first video game console to reach the 100 million mark.

Development and State Control of the Chinese Internet April 14, 2005

On April 14, 2005 the U.S.-China Economic and Security Review Commission (USCC.gov) issued the report of Xiao Qiang, University of California, Berkeley, on The Development and the State Control of the Chinese Internet.

The First Video is Uploaded to YouTube April 23, 2005

The first video uploaded to YouTube—on April 23, 2005— was shot by Yakov Lapitsky at the San Diego Zoo. It showed co-founder Jawed Karim in front of the elephant enclosure "going on about long trunks."

By February 2011 this brief video had been viewed 4,282,497 times.

AOL Buys The Huffington Post May 9, 2005 – February 7, 2011

The Huffington Post, which launched on May 9, 2005 with a meager $1 million investment and grew into one of the most heavily visited news sites in the country, announced on February 7, 2011 that it would be acquired by AOL for $315 million, $300 million of it in cash and the rest in stock.

"Arianna Huffington, the cable talk show pundit, author and doyenne of the political left, will take control of all of AOL’s editorial content as president and editor in chief of a newly created Huffington Post Media Group. The arrangement will give her oversight not only of AOL’s national, local and financial news operations, but also of the company’s other media enterprises like MapQuest and Moviefonea' (http://www.nytimes.com/2011/02/07/business/media/07aol.html?_r=1&hp).

"The company that brought dial-up Internet to millions of people is dead. In its place is a massive media empire that refuses to be ignored.  

"With its blockbuster acquisition of The Huffington Post, AOL has catapulted itself back into relevancy. It has sent a clear signal to the rest of the world that it is a media company and it is in this game to win.  

"AOL has been on a content acquisition spree recently, not only acquiring the technology blog network TechCrunch, but also snagging up Thing Labs, Brizzly and most recently About.me in the past few months" (http://mashable.com/2011/02/07/aol-huffington-post/?utm_source=feedburner&utm_medium=email&utm_campaign=Feed:+Mashable+(Mashable), accessed 02-07-2010).

"Brain Age: Train Your Brain in Minutes a Day!," the First Commercial NeuroGame May 19, 2005 – April 16, 2006

On May 19, 2005 Nintendo, headquartered in Kyoto, Japan, released Brain Age: Train Your Brain in Minutes a Day for the Nintendo DS dual-screen handheld gaming console in Japan. The game, which was also known as Dr. Kawashima's Brain Training: How Old is Your Brain, was released for the Nintendo DS in the United States on April 16, 2006. Though loosely based on research by Japanese neuroscientist Ryuta Kawashima, Nintendo made no claims for the scientific validation of the game. Brain Age may be considered the earliest commercial NeuroGame.

"Brain Age features a variety of puzzles, including stroop testsmathematical questions, and Sudoku puzzles, all designed to help keep certain parts of the brain active. It was included in the Touch! Generations series of video games, a series which features games for a more casual gaming audience. Brain Age uses the touch screen and microphone for many puzzles. It has received both commercial and critical success, selling 19.00 million copies worldwide (as of March 31, 2013) and has received multiple awards for its quality and innovation. There has been controversy over the game's scientific effectiveness" (Wikipedia article on Brain Age: Train Your Brain in Minutes a Day, accessed 07-31-2014).

Reddit is Founded June 2005

In June 2005 Steve Huffman and Alexis Ohanian founded the social news and entertainment website Reddit.com in Medford, Massachusetts.

Proposal for a World Digital Library June 6, 2005

At the Plenary Session of the U.S. National Commission for UNESCO held at Georgetown University on June 6, 2005 Librarian of Congress James H. Billington offered a Proposal for a World Digital Library.

"The invention of the printing press with movable type fanned religious wars in the 16th century. The onset of telegraphy, photography, and the power-driven printing press in the 19th century created mass journalism that fulminated nationalistic passions and world wars in the 20th century. The arrival in the late 20th century of instantaneous, networked, global communication may well have facilitated the targeted propaganda, recruitment, and two-way communication of transnational terrorist organizations more than it has helped combat them.

"We are now discovering—painfully and much too slowly—that deep conflict between cultures is in many ways being fired up rather than cooled down by this revolution in communications, as was the case in the 16th and 19th centuries. Whenever new technology suddenly brings different peoples closer together and makes them aware of certain commonalities, it seems simultaneously to create a compensatory psychological need for the different peoples to define—and even assert aggressively—what is unique and distinctive about their own historic cultures."

The BBC May be the First Mainstream Industrial Medium to Adopt User-Generated Content July 7, 2005

In the wake of the July 7, 2005 London bombings and the Buncefield oil depot fire, the British Broadcasting Corporation (BBC) expanded its user-generated content team, established in April 2005. After the Buncefield disaster the BBC received over 5,000 photos from viewers. This may mark the beginning of the adoption of citizen-generated journalism by mainstream industrial media.

In 2006 CNN launched CNN iReport, a project designed to bring user generated news content to CNN. Its rival Fox News Channel launched its project to bring in user-generated news, similarly titled "uReport". This was typical of major television news organizations during 2005 and 2006. They realized, particularly in the wake of the London 7 July bombings, that citizen journalism could now become a significant part of broadcast news.

"Peer to Patent" July 14, 2005

On July 14, 2005 Beth Noveck, director of New York Law School's Institute for Information Law and Policy, issued “Peer to Patent” (PtoP): A Modest Proposal in her blog. The proposal "would shift the patent-application process away from individual examiners to an internet-based, peer-review method."

Wikimania!: The First International Wikimedia Conference Takes Place August 4 – August 8, 2005

Wikimania 2005: The First International Wikimedia Conference was held in Frankfurt am Main from August 4-8, 2005.

The First Intelligible Word from an Extinct South American Civilization? August 12, 2005

On August 12, 2005 anthropologists Gary Urton and Carrie Brezine published "Khipu Accounting in Ancient Peru," Science 309 (2005): 1065-1067.

"Khipu [quipu] are knotted-string devices that were used for bureaucratic recording and communication in the Inka [Inca] Empire. We recently undertook a computer analysis of 21 khipu from the Inka administrative center of Puruchuco, on the central coast of Peru. Results indicate that this khipu archive exemplifies the way in which census and tribute data were synthesized, manipulated, and transferred between different accounting levels in the Inka administrative system" (Science).

"Researchers in the US believe they have come closer to solving a centuries-old mystery - by deciphering knotted string used by the ancient Incas.

"Experts say one bunch of knots appears to identify a city, marking the first intelligible word from the extinct South American civilisation.

"The coloured, knotted pieces of string,known as khipu, are believed to have been used for accounting information.

"The researchers say the finding could unlock the meaning of other khipu.

"Harvard University researchers Gary Urton and Carrie Brezine used computers to analyse 21 khipu.

"They found a three-knot pattern in some of the strings which they believe identifies the bunch as coming from the city of Puruchuco, the site of an Inca palace.

" 'We hypothesize that the arrangement of three figure-eight knots at the start of these khipu represented the place identifier, or toponym, Puruchuco,' they wrote in their report, published in the journal Science.

" 'We suggest that any khipu moving within the state administrative system bearing an initial arrangement of three figure-eight knots would have been immediately recognisable to Inca administrators as an account pertaining to the palace of Puruchuco.' (http://news.bbc.co.uk/2/hi/americas/4143968.stm, accessed 04-28-2009).

The Million Dollar Homepage August 25, 2005 – January 11, 2006

On August 25, 2005 Alex Tew, a student from Wiltshire, England, launched The Million Dollar Homepage to pay for his university education.

"The home page consists of a million pixels arranged in a 1000 × 1000 pixel grid; the image-based links on it were sold for $1 per pixel in 10 × 10 blocks. The purchasers of these pixel blocks provided tiny images to be displayed on them, a Uniform Resource Locator (URL) to which the images were linked, and a slogan to be displayed when hovering a cursor over the link. The aim of the site was to sell all of the pixels in the image, thus generating a million dollars of income for the creator. The Wall Street Journal has commented that the site inspired other websites that sell pixels.

"Launched on 26 August 2005, the website became an Internet phenomenon. The Alexa ranking of web traffic peaked at around 127; as of 18 February 2009 (2009 -02-18)[update], it is 42,735. On 1 January 2006, the final 1,000 pixels were put up for auction on eBay. The auction closed on 11 January with a winning bid of $38,100 that brought the final tally to $1,037,100 in gross income" (Wikipedia article on The Million Dollar Homepage, accessed 05-08-2009).

LibraryThing is Founded August 29, 2005

On August 29, 2005 Tim Spalding, of Portland, Maine, made LibraryThing operational. LibraryThing is a social cataloging website for storing and sharing personal and historic library catalogs and book lists.

"By its one-year anniversary in August 2006, LibraryThing had attracted more than 73,000 registered users who had cataloged 5.1 million individual books, representing nearly 1.2 million unique works; in May 2008 they reached over 400,000 users and 27 million books" (Wikipedia article on LibraryThing, accessed 12-15-2008).

A University Library Intended to Contain Very Few Physical Books September 6, 2005

Classes began at the University of California, Merced on September 6, 2005. At the opening of this new campus focused on math, science, and engineering, the library included approximately 10,000 journal subscriptions—all available online, with no print journals. This "21st century research library" contained a limited collection of about 30,000 physical books, and offered interlibrary loans from other University of California libraries. It emphasized providing access to digital books and the "deep web"—databases available by subscription:

"The Internet is wide-ranging, but the bulk of the information needed for scholarly study and research is not freely available and cannot be found in a Google search. The UC Merced Library acquires and manages subscriptions to millions of scholarly articles in electronic journals, tens of thousands of electronic books, and hundreds of databases. Thanks to the Library, UC Merced students and faculty can access these scholarly electronic resources at any time with a connection to the Internet.

"The collection has what you want.

"The Library has many books and DVD movies on the shelves to support study in the areas of UC Merced specialization and to also provide a break from study with recreational reading and viewing. If what you need is not in the building, then use the University of California systemwide library catalog to request free, overnight courier delivery for any of the 32 million volumes at the other UC campuses" (UC Merced Library website, accessed 01-28-09).

Electronic Records Archives System September 8, 2005

On September 8, 2005 the National Archives and Records Administration (NARA) selected Lockheed Martin Corporation to build the Electronic Records Archives (ERA) system, a permanent electronic archives system to preserve, manage, and make accessible the electronic records created by the federal government. The ERA system would capture electronic information – regardless of its format – save it permanently, and make it accessible on whatever hardware and software might be in use at the time. Development of the system would continue over the next six years, and cost $308,000,000.

Second International Conference on the Preservation of Digital Objects September 15 – September 16, 2005

On September 15-16, 2005 the second International Conference on the Preservation of Digital Objects took place in Göttingen, Germany. (The first international conference in this series took place in 2004 in Beijing.)

Connectomes: Elements of Connections Forming the Human Brain September 30, 2005

On September 30, 2005 neuroscientists Olaf Sporns of Indiana University, Giulio Tononi of the University of Wisconsin, and Rolf Kötter of Heinrich Heine University, Düsseldorf, Germany, published "The Human Connectome: A Structural Description of the Human Brain," PLoS Computational Biology 1 (4). This paper and the PhD thesis of Patric Hagmann from the Université de Lausanne, From diffusion MRI to brain connectomics, coined the term connectome:

In their 2005 paper Sporns et al. wrote:

"To understand the functioning of a network, one must know its elements and their interconnections. The purpose of this article is to discuss research strategies aimed at a comprehensive structural description of the network of elements and connections forming the human brain. We propose to call this dataset the human 'connectome,' and we argue that it is fundamentally important in cognitive neuroscience and neuropsychology. The connectome will significantly increase our understanding of how functional brain states emerge from their underlying structural substrate, and will provide new mechanistic insights into how brain function is affected if this structural substrate is disrupted."

In his 2005 Ph.D. thesis, From diffusion MRI to brain connectomics, Hagmann wrote:

"It is clear that, like the genome, which is much more than just a juxtaposition of genes, the set of all neuronal connections in the brain is much more than the sum of their individual components. The genome is an entity it-self, as it is from the subtle gene interaction that [life] emerges. In a similar manner, one could consider the brain connectome, set of all neuronal connections, as one single entity, thus emphasizing the fact that the huge brain neuronal communication capacity and computational power critically relies on this subtle and incredibly complex connectivity architecture" (Wikipedia article on Connectome, accessed 12-28-2010).

Google Print Morphs in Two October 2005

In October 2005 Google Print morphed into the Google Print Publisher Program and the Google Print Library Program.

300,000,000 Printed Copies of Harry Potter October 5, 2005

On October 5, 2005 global sales of J. K. Rowling's Harry Potter book series surpassed 300,000,000 printed copies.

The Genetic Code of the 1918 Avian-Derived Flu Virus H1N1 is Deciphered October 5, 2005

On October 5, 2005 scientists at the Armed Forces Institute of Pathology announced that they had deciphered the genetic code of the 1918 avian-derived influenza virus, an H1N1 strain, which killed as many as 50,000,000 people worldwide, from a victim exhumed in 1997 from the Alaskan permafrost. The scientists reconstructed the virus in the laboratory and published the genetic sequence.

It Could Take 300 Years to Index All the World's Information October 8, 2005

Google CEO Eric Schmidt speculated on October 8, 2005 that it may take three hundred years to index all the world's information and make it searchable.

" 'We did a math exercise and the answer was 300 years,' Schmidt said in response to an audience question asking for a projection of how long the company's mission will take. 'The answer is it's going to be a very long time.'

"Of the approximately 5 million terabytes of information out in the world, only about 170 terabytes have been indexed, he said earlier during his speech."

Decoding Printer Tracking Dots October 19, 2005

In October 2005 the Electronic Frontier Foundation decoded the tracking dots produced by Xerox DocuColor color laser printers: a faint, repeating grid of yellow dots added to every printed page, encoding the date and time of printing and the printer's serial number.
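
EFF described the pattern as a binary grid: a printed dot reads as 1, an absent dot as 0, with columns encoding values such as the time of printing and the serial number, plus parity for error checking. The sketch below illustrates only that general scheme; the real DocuColor grid size and column assignments differ and are not reproduced here:

    # Illustrative sketch, NOT the actual DocuColor layout: each column of
    # dot/no-dot cells is read as one binary value, most significant bit first.
    def decode_column(bits):
        value = 0
        for bit in bits:
            value = (value << 1) | bit
        return value

    # A made-up two-column grid (1 = dot present):
    grid = [(0, 1, 1, 0, 1, 0), (0, 0, 1, 0, 1, 0)]
    print([decode_column(col) for col in grid])  # [26, 10] -> e.g. minute 26, hour 10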

The Open Content Alliance is Founded October 25, 2005

Microsoft announced on October 25, 2005 that it was joining the Open Content Alliance founded by Brewster Kahle of the Internet Archive. The Open Content Alliance was formed partly in response to Google Print, renamed Google Books.

The Amazon Mechanical Turk November 2, 2005

Alluding to Wolfgang von Kempelen's eighteenth-century automaton, The Turk, which purported to automate chess playing when this was impossible, on November 2, 2005 Amazon.com launched the Amazon Mechanical Turk:

"a crowdsourcing marketplace that enables computer programs to co-ordinate the use of human intelligence to perform tasks which computers are unable to do."

This was the first business application using Collaborative Human Interpreter, a programming language "designed for collecting and making use of human intelligence in a computer program. One typical usage is implementing impossible-to-automate functions."
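
The underlying pattern is a program that posts tasks it cannot automate and then consumes answers produced by people. A toy model in Python; the names and flow are illustrative assumptions, not Amazon's actual API:

    import queue

    tasks = queue.Queue()
    results = {}

    def post_hit(task_id, question, reward_usd):
        """Publish a Human Intelligence Task (HIT) for workers to pick up."""
        tasks.put((task_id, question, reward_usd))

    def worker():
        """Stand-in for a human worker answering one queued task."""
        task_id, question, _ = tasks.get()
        results[task_id] = f"human answer to: {question}"

    post_hit("hit-1", "Does this photo contain a storefront?", 0.05)
    worker()                    # in reality, a person on mturk.com does this step
    print(results["hit-1"])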

Massively Distributed Collaboration November 9, 2005

At the UC Berkeley School of Information, on November 9, 2005 Mitchell Kapor delivered an address entitled Content Creation by Massively Distributed Collaboration.

"The sudden and unexpected importance of the Wikipedia, a free online encyclopedia created by tens of thousands of volunteers and coordinated in a deeply decentralized fashion, represents a radical new modality of content creation by massively distributed collaboration. This talk will discuss the unique principles and values which have enabled the Wikipedia community to succeed and will examine the intriguing prospects for application of these methods to a broad spectrum of intellectual endeavors."

A Plan to Create a World Digital Library November 11, 2005

On November 11, 2005 the Library of Congress announced a plan to create the World Digital Library of works in the public domain. Google donated $3,000,000 toward the costs of planning this project.

Google Books December 2005

In December 2005 the Google Print project morphed into Google Books.

Maybe the World's Largest Physical Library December 2005

The British Library, with about 150,000,000 physical items on 625 km of shelves, might have been the world's largest physical library in 2005, though the U.S. Library of Congress also made this claim. The British Library added about 3,000,000 physical items per year, which occupied about 12 km of new shelving. At the end of 2005 the Library of Congress held about 130,000,000 physical items and had more than 8,000,000 digital items online.

The Heritage Health Index Report on the State of America's Collections December 2005

In December 2005 Heritage Preservation (formerly the U.S. National Institute for Conservation) and the Institute of Museum and Library Services published The Heritage Health Index Report on the State of America's Collections. Among the conclusions of this report were that there were 4.8 billion cultural heritage materials in the U.S. and that over 1.3 billion of those items were at risk. Forty percent of the surveyed institutions that housed those items reported no budget allocated for preservation, while 80% of the institutions had no disaster plan.

The Wikipedia is Rated Nearly as Accurate as Encyclopaedia Britannica December 14, 2005

Academics were often critical of the Wikipedia because the quality of its articles varied greatly. However, on December 14, 2005 the journal Nature, published in London, reported a peer-review comparison of selected science articles in the printed Encyclopaedia Britannica, published in Chicago, which contains 65,000 articles by 4,000 contributors, and in the online user-edited Wikipedia. The reviewers rated the Wikipedia nearly as accurate as Britannica.

Pixar at MOMA December 14, 2005

On December 14, 2005 the Museum of Modern Art (MoMA), New York, opened PIXAR: 20 Years of Animation:

"The Most Extensive Gallery Exhibition that MoMA has ever devoted to Animation along with a Retrospective of Pixar Features and Shorts."

Notably, MoMA found it unnecessary to characterize the exhibition as "computer animation," since by this time virtually all animation was done by computer. The museum published a 175-page printed catalogue of the exhibition.

Data Curation as a Profession 2006

In 2006 the Center for Informatics Research in Science and Scholarship (CIRSS), formerly the Library Research Center (LRC), of the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign, launched the Data Curation Education Program (DCEP).

"Data curation is the active and on-going management of data through its lifecycle of interest and usefulness to scholarly and educational activities across the sciences, social sciences, and the humanities. Data curation activities enable data discovery and retrieval, maintain data quality, add value, and provide for re-use over time. This new field includes representation, archiving, authentication, management, preservation, retrieval, and use. Our program offers a focus on data collection and management, knowledge representation, digital preservation and archiving, data standards, and policy, providing the theory and skills necessary to work directly with academic and industry researchers who need data curation expertise. To this end, DCEP has established a number of educational collaborations with premier science, social science, and humanities data centers across the country to prepare a new generation of library and information science professionals to curate materials from databases and other formats. We anticipate that our graduates will be employed across a range of information-oriented institutions, including museums, data centers, libraries, institutional repositories, archives, and private industry."

The program began with a focus on "data curation curriculum and best practices for the LIS and scientific communities. IMLS provided additional funding in 2008 to extend the curriculum to include humanities data" (Data Curation Education Program website, accessed 01-28-2009).

Springer Published 50,000 eBooks 2006 – January 19, 2012

Springer, which initiated its eBook program in 2006, announced the publication of its 50,000th eBook on January 19, 2012, available through SpringerLink.com. Springer also stated that it would digitize nearly all books it had published since its foundation in 1842. This would increase the number of titles to over 100,000. Based on the number of titles available in January 2012, Springer claimed to be the "largest eBook publisher."

The National Endowment for the Humanities "Office of Digital Humanities" Begins 2006

In 2006 the National Endowment for the Humanities (NEH), the federal granting agency for scholarship in the humanities, launched the Digital Humanities Initiative; this was renamed the Office of Digital Humanities in 2008.

A More Efficient Way to Teach Individual Layers of Neurons for Deep Learning 2006

In the mid-1980s, British-born computer scientist and psychologist Geoffrey Hinton and others helped revive research interest in neural networks with so-called “deep” models that made better use of many layers of software neurons. But the technique still required major human intervention: programmers had to label data before feeding it to the network, and complex speech or image recognition required more computer power than was available.

During the first decade of the 21st century Hinton and colleagues at the University of Toronto made some fundamental conceptual breakthroughs that have led to advances in unsupervised learning procedures for neural networks with rich sensory input.

"In 2006, Hinton developed a more efficient way to teach individual layers of neurons. The first layer learns primitive features, like an edge in an image or the tiniest unit of speech sound. It does this by finding combinations of digitized pixels or sound waves that occur more often than they should by chance. Once that layer accurately recognizes those features, they’re fed to the next layer, which trains itself to recognize more complex features, like a corner or a combination of speech sounds. The process is repeated in successive layers until the system can reliably recognize phonemes or objects" (Robert D. Hof, "Deep Learning," MIY Technology Review, April 23, 2013, accessed 11-10-2014).

Hinton, G. E.; Osindero, S.; Teh, Y., "A fast learning algorithm for deep belief nets", Neural Computation 18 #7 (2006) 1527–1554.
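
A minimal sketch of the greedy layer-wise idea in Python with NumPy: train one layer to reconstruct its input, freeze it, then train the next layer on the features the first produces. For simplicity this uses a tied-weight autoencoder per layer instead of the restricted Boltzmann machines of Hinton's paper; sizes, learning rate, and data are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    def train_layer(X, n_hidden, lr=0.1, epochs=200):
        """Train a one-layer tied-weight autoencoder on X; return encoder weights."""
        W = rng.normal(0, 0.1, (X.shape[1], n_hidden))
        for _ in range(epochs):
            H = np.tanh(X @ W)             # encode
            err = H @ W.T - X              # reconstruction error (decode - input)
            dH = (err @ W) * (1 - H**2)    # backprop through the encoder
            W -= lr * (X.T @ dH + err.T @ H) / len(X)   # tied-weight gradient
        return W

    X = rng.normal(size=(256, 16))         # toy unlabeled data
    W1 = train_layer(X, 8)                 # layer 1 learns low-level features
    H1 = np.tanh(X @ W1)
    W2 = train_layer(H1, 4)                # layer 2 learns features of features
    print(W1.shape, W2.shape)              # (16, 8) (8, 4)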

The Highest Price Paid for a Domain Name January 16, 2006

Having initially registered the domain name for free, after which he temporarily lost it to a con man, Gary Kremen won a lawsuit and sold Sex.com to Boston-based Escom LLC for $14,000,000, or "$15 million in cash and stock." This was the highest price obtained for a domain name at the time. Maybe ever?

File-Sharing Exceeds Sales of Digital Music Downloads January 22, 2006

In 2006 free file-sharing of digital music on the web exceeded the sale of digital music downloads many times over:

"Total music sales - including online - are off some 20 percent from five years ago. Songs traded freely over unlicensed Internet sites swamp the number of legal sales by thousands to one."

Disney Acquires Pixar January 24, 2006

On January 24, 2006 The Walt Disney Company, born in the days of manual animation, acquired Pixar, the computer animation company, making Steve Jobs the largest Disney stockholder.

Using Currency Movements to Predict the Spread of Infectious Disease January 26, 2006

On January 26, 2006 Dirk Brockmann, a theoretical physicist and computational epidemiologist at Northwestern University in Evanston, Illinois, L. Hufnagel, and T. Geisel published "The scaling laws of human travel," Nature 439 (2006): 462-465.

Using statistical data from the American currency tracking website, Where's George?, the paper described statistical laws of human travel in the United States, and developed a mathematical model of the spread of infectious disease.

[By January 31, 2009, Where's George? tracked over 149 million bills totaling more than $810 million. (Wikipedia).]
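
The paper's central empirical finding was that the probability of a bank note traveling a distance r within a few days falls off as a power law, approximately P(r) ∝ r^(-1.59). A sketch in Python of recovering such an exponent from distance data (synthetic here; this is not the paper's actual analysis pipeline):

    import numpy as np

    rng = np.random.default_rng(1)

    # Draw synthetic travel distances (km) from a Pareto-type power law.
    exponent = 1.59
    r = 10 * (1 - rng.random(50_000)) ** (-1 / (exponent - 1))

    # Log-binned histogram, then a least-squares fit of slope on log-log axes.
    hist, edges = np.histogram(r, bins=np.logspace(1, 3.5, 30), density=True)
    centers = np.sqrt(edges[:-1] * edges[1:])
    mask = hist > 0
    slope, _ = np.polyfit(np.log(centers[mask]), np.log(hist[mask]), 1)
    print(f"fitted exponent ~ {-slope:.2f}")   # close to 1.59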

Western Union Discontinues Telegram Transmissions January 27, 2006

On January 27, 2006 Western Union, headquartered in Englewood, Colorado, which once had a virtual monopoly on telegraph transmission in the United States, discontinued its Telegram and Commercial Messaging services, acknowledging that they had been almost entirely replaced by email.

"This ended the era of telegrams which began in 1851 with the founding of the New York and Mississippi Valley Printing Telegraph Company, and which spanned 155 years of continuous service. Western Union reported that telegrams sent had fallen to a total of 20,000 a year, because of competition from other communication services such as email" (Wikipedia article on Western Union, accessed 08-25-2013). 

Nevertheless, a new company, iTelegram (International Telegram), continued telegram services in a limited manner.

College-Level Lectures Via Podcasts January 28, 2006

On January 28, 2006 Apple launched iTunes U, a service that offered college-level lectures via podcasts.

Hepting v. AT&T January 31, 2006

On January 31, 2006 The Electronic Frontier Foundation (EFF) filed a class-action lawsuit against AT&T accusing the telecom giant of violating the law and the privacy of its customers by collaborating with the National Security Agency (NSA) in "its massive illegal program to wiretap and data-mine Americans' communications."

"In Hepting v. AT&T, EFF sued the telecommunications giant on behalf of its customers for violating privacy law by collaborating with the NSA in the massive, illegal program to wiretap and data-mine Americans’ communications.  

"Evidence in the case includes undisputed evidenceprovided by former AT&T telecommunications technician Mark Klein showing AT&T has routed copies of Internet traffic to a secret room in San Francisco controlled by the NSA.  

"In June of 2009, a federal judge dismissed Hepting and dozens of other lawsuits against telecoms, ruling that the companies had immunity from liability under the controversial FISA Amendments Act (FISAAA), which was enacted in response to our court victories in Hepting. Signed by President Bush in 2008, the FISAAA allows the Attorney General to require the dismissal of the lawsuits over the telecoms' participation in the warrantless surveillance program if the government secretly certifies to the court that the surveillance did not occur, was legal, or was authorized by the president -- certification that was filed in September of 2008. EFF is planning to appeal the decision to the 9th U.S. Circuit Court of Appeals, primarily arguing that FISAAA is unconstitutional in granting to the president broad discretion to block the courts from considering the core constitutional privacy claims of millions of Americans" (https://www.eff.org/nsa/hepting, accessed 01-14-2014).

A Research Library Based on Historical Collections of the Internet Archive February 2006

In the February 2006 issue of D-Lib Magazine researchers at Cornell University from the departments of Computer Science, Information Science, and the Cornell Theory Center described plans for A Research Library Based on the Historical Collections of the Internet Archive. The library, a super-computing application consisting of 10 billion web pages, was intended to be used by social scientists.

92% of Cameras Sold are Digital February 2006

By some estimates 92 percent of all cameras sold in 2006 were digital.

"The Greatest 200 Videogames of Their Time" February 2, 2006

In February 2006, as part of the celebration of its 200th issue, Electronic Gaming Monthly ranked, in ascending order of importance, "The Greatest 200 Videogames of their Time."

The "Cyber Storm" War Game February 6 – February 10, 2006

From February 6-10, 2006 vital US infrastructure, including power grids and banking systems, was put under simulated attack in a week-long security exercise called Cyber Storm.

FROM THE U.S. GOVERNMENT'S PUBLISHED INTERPRETATION OF THE RESULTS

"The U.S. Department of Homeland Security’s (DHS) National Cyber Security Division (NCSD) successfully executed Cyber Storm, the first national cyber exercise Feb. 6 thru Feb. 10, 2006. The exercise was the first government-led, full-scale cyber security exercise of its kind. NCSD, a division within the department’s Preparedness Directorate, provides the federal government with a centralized cyber security coordination and preparedness function called for in the National Strategy for Homeland Security, the National Strategy to Secure Cyberspace and Homeland Security Presidential Directive 7. NCSD is the focal point for the federal government’s interaction with state and local government, the private sector and the international community concerning cyberspace vulnerability reduction efforts."

"The Scenario

"The exercise simulated a sophisticated cyber attack campaign through a series of scenarios directed at several critical infrastructure sectors. The intent of these scenarios was to highlight the interconnectedness of cyber systems with physical infrastructure and to exercise coordination and communication between the public and private sectors. Each scenario was developed with the assistance of industry experts and was executed in a closed and secure environment.

"Cyber Storm scenarios had three major adversarial objectives:

"* To disrupt specifically targeted critical infrastructure through cyber attacks

"* To hinder the governments' ability to respond to the cyber attacks

"* To undermine public confidence in the governments' ability to provide and protect service" (http://www.dhs.gov/xnews/releases/pr_1158340980371.shtm, accessed 08-09-2009).

The Department of Homeland Security published further information on Cyber Storm I.

♦ A LESS OPTIMISTIC INTERPRETATION FROM THE WIKIPEDIA

"The Cyber Storm exercise was a simulated exercise overseen by the Department of Homeland Security that took place February 6 through February 10, 2006 with the purpose of testing the nations defenses against digital espionage. The simulation was targeted primarily at American security organizations but officials from Britain, Canada, Australia and New Zealand participated as well.

"Simulation

"The exercise simulated a large scale attack on critical digital infrastructure such as communications, transportation, and energy production. The simulation took place a series of incidents which included.

" * Washington's metro trains mysteriously shutting down.

" * Bloggers revealing locations of railcars containing hazardous materials. * The airport control towers of Philadelphia and Chicago mysteriously shutting down.

" * A mysterious liquid appearing on a London subway.

" * Significant numbers of people on "no fly" lists suddenly appearing at airports all over the nation.

" * Planes flying too close to the White House. * Water utilities in Los Angeles getting compromised.

"Internal difficulties

"During the exercise the computers running the simulation came under attack by the players themselves. Heavily censored files released to the Associated Press reveal that at some time during the exercise the organizers sent every one involved an e-mail marked "IMPORTANT!" telling the participants in the simulation not to attack the game's control computers.

"Performance of participants

"The Cyber Storm exercise highlighted the gaps and shortcomings of the nation's cyber defenses. The cyber storm exercise report found that institutions under attack had a hard time getting the bigger picture and instead focused on single incidents treating them as 'individual and discrete.'

"In light of the test the Department of Homeland Security raised concern that the relatively modest resources assigned to cyber-defense would be 'overwhelmed in a real attack' (Wikipedia article on Cyber Storm Exercise, accessed 08-09-2009).

 

Zillow.com is Launched February 8, 2006

On February 8, 2006 Rich Barton and Lloyd Frink, former Microsoft executives and founders of Expedia, launched the online real estate service company, Zillow.com in Seattle, Washington.

"Zillow allows users to see the value of millions of homes across the United States, not just those up for sale. In addition to giving value estimates of homes, it offers several unique features including value changes of each home in a given time frame (such as 1, 5, or 10 years), aerial views of homes, and prices of homes in the area. Where it can access appropriate data, it also provides basic information on a given home, such as square footage and the number of bedrooms and bathrooms. Users can also get current estimates of homes if there was a significant change made, such as a recently remodeled kitchen. Zillow provides an application programming interface (API) and developer support network.

"As a part of its API, Zillow assigns a numerical integer to each of the 70 million homes in its database, which is plainly visible as CGI parameters to the URLs to individual entries on its website. The identifier is not obfuscated and is assigned in sequence for each house or condo on the side of a street. Zillow reports on individual units, such as providing street address, latitude and longitude. When integrated with the features of a typical online reverse telephone directory and wiki-mapping services such as WikiMapia, it allows for nationwide "seating assignments" of U.S. neighborhoods for each house that has a listed phone number with a real human name" (Wikipedia article on Zillow.com.)

View Map + Bookmark Entry

Making Handwritten Manuscripts Searchable February 9, 2006

Professor Alan Smeaton of DCU, who worked on the Google funded research project

Using object detection technology, in February 2006 researchers at the University at Buffalo, the University of Massachusetts at Amherst, and the Adaptive Information Cluster at Dublin City University, in association with Google, developed software for scanning historical manuscripts that recognized handwriting, making electronic texts of these manuscripts searchable.

View Map + Bookmark Entry

Over One Billion iTunes Downloads February 22, 2006

Steve Jobs speaking about the one billionth iTunes download

The countdown on the iTunes homepage as the one billionth download drew near

On February 22, 2006 the Apple iTunes Store surpassed one billion iTunes downloads.

View Map + Bookmark Entry

Access to Nearly One Million Archive Collection Descriptions March 2006

A screenshot of the ArchiveGrid homepage

In March 2006 RLG opened ArchiveGrid, a new search engine providing access to nearly a million archive collection descriptions in thousands of libraries, museums, and archives.

View Map + Bookmark Entry

World Wide Web History Center is Founded March 2006

The World Wide Web History Center logo

William B. Pickett

In March 2006 Marc Weber and William B. Pickett founded the World Wide Web History Center.

View Map + Bookmark Entry

Studies on Digital Library Evolution March 2006

In March 2006 D-Lib Magazine, produced by the Corporation for National Research Initiatives, Reston, Virginia, published a special issue on "Digital Library Evolution."

View Map + Bookmark Entry

The Changing Nature of the Catalogue. . . . March 17, 2006

Karen Calhoun

Reflecting the influence of the Internet on physical library access and usage, on March 17, 2006 the Library of Congress published The Changing Nature of the Catalogue and its Integration with Other Discovery Tools by Karen Calhoun.

View Map + Bookmark Entry

Damage to Codex Atlanticus Caused by Efforts at Preservation April 2006

A self-portrait by Leonardo da Vinci in red chalk

A reconstruction, made by Mario Taddei in 2007, of the Codex Atlanticus as it was in the 1600s, when Pompeo Leoni bound the loose pages into a single volume

In April 2006 Carmen Bambach of the Metropolitan Museum of Art, New York discovered an extensive invasion of molds of various colors, "including black, red, and purple, along with swelling of pages" on the priceless manuscripts of Leonardo da Vinci's Codex Atlanticus, preserved in the Biblioteca Ambrosiana in Milan. 

In 2008 the Opificio delle Pietre Dure in Florence "determined that the colors found on the pages weren't the product of mold, but instead caused by mercury salts added to protect the Codex from mold."

View Map + Bookmark Entry

The Most Viewed Video on YouTube as of 2009 April 2006 – May 9, 2009

Judson Laipply

In April 2006 American motivational speaker, inspirational comedian, and dancer Judson Laipply from Bucyrus, Ohio, posted the video clip Evolution of Dance on YouTube.

By May 9, 2009 the video had been viewed 119,378,381 times. At that date it was the Most Viewed (All Time) Video, the Most Favorited (All Time) Video, and the eighth Most Discussed (All Time) Video on YouTube.

View Map + Bookmark Entry

The Espresso "On Demand" Book Machine April 2006

The Espresso Book Machine

In April 2006 the first experimental beta Espresso Book Machine was installed at the World Bank InfoShop in Washington, D.C. to print and bind World Bank publications on demand.

"In September 2006 ODB installed a second beta machine at The Library of Alexandria, Egypt, to print books in Arabic. The first EBM Version 1.5 was introduced for ninety days at the New York Public Library during the summer of 2007."

In September 2008 the first Espresso Book Machine in a retail commercial setting was installed at Angus & Robertson in Melbourne, Australia.

Link to the PDF brochure for Espresso Book Machine 2.0 at ondemandbooks.com, accessed 08-31-2009.

♦ In November 2012 it was my pleasure to see the Espresso Book Machine in operation at the privately owned Harvard Book Store in Cambridge, Massachusetts. Humorously nicknamed "Paige M. Gutenborg," the machine produced remarkably high quality paperback books at a rate of around five minutes per book. Customers supplied fully formatted black and white text as a PDF plus a separate PDF containing their design for a full color cover. The machine combined a double-sided xerographic laser printer with an ingenious binding and trimming mechanism. It printed the text on regular book paper and the color cover on coated cover stock. Since the binding machine was enclosed in plexiglass it was possible to observe the various binding processes, concluding with the machine dropping each finished copy out of a small chute. When I watched the machine in operation it was being observed by a human operator. My impression was that the machine required certain adjustments and worked best when "supervised" by a human.

View Map + Bookmark Entry

A Critical Review at the Library of Congress April 3, 2006

On April 3, 2006, representing the Library of Congress Professional Guild, Thomas Mann published A Critical Review of Karen Calhoun's paper of March 17. The review rebutted various assertions in the Calhoun report.

View Map + Bookmark Entry

The Biggest Music Retailer in the World: Apple's iTunes Store April 23, 2006

On April 23, 2006 Apple's iTunes Store was acknowledged as the biggest music retailer in the world, able to dictate its 99 cent per track retail price to music wholesalers.

View Map + Bookmark Entry

"The entire works of humankind, from the beginning of recorded history, in all languages" would amount to 50 petabytes of data. May 14, 2006

Kevin Kelly

In the New York Times Magazine on May 14, 2006 Kevin Kelly of Pacifica, California, published Scan this Book!—an account of developments leading to the "universal" digital library on the Internet.

"From the days of Sumerian clay tablets till now, humans have "published" at least 32 million books, 750 million articles and essays, 25 million songs, 500 million images, 500,000 movies, 3 million videos, TV shows and short films and 100 billion public Web pages. All this material is currently contained in all the libraries and archives of the world. When fully digitized, the whole lot could be compressed (at current technological rates) onto 50 petabyte hard disks. Today you need a building about the size of a small-town library to house 50 petabytes. With tomorrow's technology, it will all fit onto your iPod. When that happens, the library of all libraries will ride in your purse or wallet — if it doesn't plug directly into your brain with thin white cords. Some people alive today are surely hoping that they die before such things happen, and others, mostly the young, want to know what's taking so long. (Could we get it up and running by next week? They have a history project due.)"

View Map + Bookmark Entry

The Word Crowdsourcing is Coined June 2006

Jeff Howe

Cover art for Crowdsourcing by Jeff Howe

In an article published in Wired in June 2006 Jeff Howe coined the term crowdsourcing

"for the act of taking a job traditionally performed by an employee or contractor, and outsourcing it to an undefined, generally large group of people, in the form of an open call. For example, the public may be invited to develop a new technology, carry out a design task, refine an algorithm or help analyze large amounts of data."

View Map + Bookmark Entry

Like Teleporting in Star Trek June 2006

John Chambers

A Telepresence session between residents in Ghana, Africa, and Newark, NJ

In June 2006 John Chambers, chairman of Cisco Systems, compared telepresence to teleporting in Star Trek, and said it could potentially become a billion-dollar market.

View Map + Bookmark Entry

The "Print Clock" Method for Dating Printing June 20, 2006

S. Blair Hedges

Borrowing a technique from genetics, on June 20, 2006 S. Blair Hedges, professor of biology at Penn State, University Park, Pennsylvania, published "A method for dating early books and prints using image analysis," Proc. R. Soc. Lond. A: Mathematical, Physical, and Engineering Sciences 462 (2006) 3555-3573, describing the "print clock" method for dating examples of printing, including books and copperplates, issued from hand-operated presses. A supplementary appendix was available from Hedges' website.
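
♦ The "print clock" rests on the observation that breaks and other defects in a woodblock or copperplate accumulate at a roughly constant rate, so defect counts measured in dated impressions can calibrate a regression line that is then used to date undated ones. A toy illustration of that logic (all data values here are invented; Hedges' actual method works from statistical image analysis):

    # Toy "print clock": fit defect counts from dated impressions, then
    # date an undated impression. All numbers are invented for illustration.
    dated = [(1475, 12), (1490, 31), (1505, 48), (1520, 70)]  # (year, breaks)

    n = len(dated)
    mx = sum(y for y, _ in dated) / n
    my = sum(b for _, b in dated) / n
    slope = (sum((y - mx) * (b - my) for y, b in dated)
             / sum((y - mx) ** 2 for y, _ in dated))
    intercept = my - slope * mx

    undated_breaks = 55
    print(f"estimated year: {(undated_breaks - intercept) / slope:.0f}")  # ~1509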

View Map + Bookmark Entry

OCLC Merges with RLG July 1, 2006

The OCLC logo

The RLG logo

On July 1, 2006 OCLC merged with RLG. The combination of programs and services was expected to "advance offerings and drive efficiencies for libraries, archives, museums and other research organizations worldwide."

View Map + Bookmark Entry

Reborn Digital: The First Fully Digital University Press: A 3 Year Experiment in the United States July 13, 2006 – September 30, 2010

The Rice University Press logo

The Connexions logo

On July 13, 2006 Rice University Press, which had shut down in 1996, announced that it was re-opening as an entirely digital operation:

"As money-strapped university presses shut down nationwide, Rice University is turning to technology to bring its press back to life as the first fully digital university press in the United States.  

"Using the open-source e-publishing platform Connexions, Rice University Press is returning from a decade-long hiatus to explore models of peer-reviewed scholarship for the 21st century. The technology offers authors a way to use multimedia -- audio files, live hyperlinks or moving images -- to craft dynamic scholarly arguments, and to publish on-demand original works in fields of study that are increasingly constrained by print publishing.  

" 'Rice University Press is using Rice's strength in technology to innovatively overcome increasingly common obstacles to publication of scholarly works,' Rice University President David Leebron said. 'The nation's first fully digital academic press provides not only a solution for scholars -- particularly those in the humanities -- who are limited by the dearth of university presses, but also a venue for publishing multimedia essays, articles, books and scholarly narratives.'

"Charles Henry, Rice University vice provost, university librarian and publisher of Rice University Press during the startup phase, said, 'Our decision to revive Rice's press as a digital enterprise is based on both economics and on new ways of thinking about scholarly publishing. On the one hand, university presses are losing money at unprecedented rates, and technology offers us ways to decrease production costs and provide a nearly ubiquitous delivery system, the Internet. We avoid costs associated with backlogs, large inventories and unsold physical volumes, and we greatly speed the editorial process.  

" 'We don't have a precise figure for our startup costs yet, but it's safe to say our startup costs and annual operating expenses will be at least 10 times less than what we'd expect to pay if we were using a traditional publishing model,' Henry said.  

"The digital press will operate just as a traditional press, up to a point. Manuscripts will be solicited, reviewed, edited and resubmitted for final approval by an editorial board of prominent scholars. But rather than waiting for months for a printer to make a bound book, Rice University Press's digital files will instead be run through Connexions for automatic formatting, indexing and population with high-resolution images, audio and video and Web links.  

" 'We don't print anything,' Henry explained. 'It will go online as a Rice University Press publication in a matter of days and be available for sale as a digital book.' Users will be able to view the content online for free or purchase a copy of the book for download through the Rice University Press Web site. Alternatively, thanks to Connexions' partnership with on-demand printer QOOP, users will be able to order printed books if they want, in every style from softbound black-and-white on inexpensive paper to leather-bound full-color hardbacks on high-gloss paper.  

"As with a traditional press, our publications will be peer-reviewed, professionally vetted and very high quality,' Henry said. 'But the choice to have a printed copy will be up to the customer.'

"Authors published by Rice University Press will retain the copyrights for their works, in accordance with Connexions' licensing agreement with Creative Commons. Additionally, because Connexions is open-source, authors will be able to update or amend their work, easily creating a revised edition of their book. W. Joseph King, executive director of Connexions and co-director of the Rice University Press project, said, 'Connexions' mission is to support open education in all forms, including the publication of original scholarly works. We believe that Connexions has the ability to change the university press at Rice and in general.'

"In the coming months, Rice University Press will name its board of directors and appoint an editorial board in one or two academic disciplines that are especially constrained by the current print model. Over time, Rice University Press will focus on:

"1. Putting out original scholarly work in fields particularly impacted by the high costs and distribution models of the printed book. One such field is art history, in which printing costs are exceptionally high. Over the years, many university presses have slashed the number of art history titles, severely limiting younger scholars' prospects of publication, Henry said. Rice University Press has identified art history as a field that would benefit immediately and therefore it will be the press's first area of major effort.  

"2. Fostering new models of scholarship: With the rise of digital environments, scholars are increasingly attempting to write book-length studies that use new media -- images, video, audio and Web links -- as part of their arguments. Rice University Press will easily accommodate these new forms of scholarship, Henry said.

"3. Providing more affordable publishing for scholarly societies and centers: Often disciplinary societies and smaller centers, especially in the humanities, publish annual reports, reflections on their field of study or original research resulting from grants. For smaller organizations, the printing costs of these publications are prohibitive. Rice University Press will partner with organizations to provide more affordable publishing.  

"4. Partnering with large university presses: In the wake of rising production costs and overhead, many university presses have closed or reduced the number of titles they publish, especially in the humanities and social sciences. As a result many peer-reviewed, high quality books are waiting on backlog. Rice University Press will work with selected university publishers to inexpensively publish approved works. Henry said two major university presses have already expressed an interest in working with Rice University Press to reduce backlogged titles. Rice University Press plans to partner with these and other presses to produce such works as dual publications.  

" 'Technological innovations suffuse academia, but institutional innovation often seems more challenging. The initiative to resuscitate Rice University Press as a fully digital university press is thus doubly exciting,' said Steve Wheatley, vice president of the American Council of Learned Societies, an umbrella organization of 70 scholarly societies in the humanities and social sciences. 'It is particularly encouraging to note that the revived press will give special attention to scholarship that is born digital. Equally commendable -- and perhaps even more important -- is the commitment of the university to support this initiative at this crucial phase for scholarly publishing " (http://media.rice.edu/media/NewsBot.asp?MODE=VIEW&ID=8654, accessed 05-23-2010)/

♦ "Rice University Press ceased operations on September 30, 2010. Certain publications continue to be available on Connexions."

View Map + Bookmark Entry

Molecular Animation July 30 – August 3, 2006

At SIGGRAPH 2006, held in Boston, Massachusetts from July 30 to August 3, 2006, BioVisions, a scientific visualization program at Harvard's Department of Molecular and Cellular Biology, and Xvivo, a Connecticut-based scientific animation company, introduced the three-minute molecular animation video, The Inner Life of the Cell.

The film depicted marauding white blood cells attacking infections in the body. 

View Map + Bookmark Entry

Google Apps are Introduced August 2006

The Google Apps logo, including a diagram of some of the applications offered

In August 2006 Google began introducing its web-based Google Apps productivity software.

View Map + Bookmark Entry

100,000,000 Users Within Three Years August 9, 2006

The Myspace login page layout from 2006

On August 9, 2006 MySpace, founded in August 2003, reached 100,000,000 user accounts.

View Map + Bookmark Entry

"The Document in the Digital Era" by Web-Footed September 2006

Le Document à la lumière du numérique (The Document in the Digital Era) was published in print in September 2006 by a collaborating group of information researchers under the collective pseudonym of Roger T. Pédauque. The surname of the pseudonym meant "web-footed."

View Map + Bookmark Entry

The Sony Reader PRS-500 is Introduced Circa September – October 2006

The Sony Reader PRS-500

In September or October 2006 Sony announced the Sony Reader PRS-500 — another attempt to provide an acceptable e-book reader. 

A feature of the PRS-500 was that its electronic-paper display used power only when a page was turned. Thus, theoretically, 7,500 pages could be read on the device with one battery charge.
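
♦ The arithmetic behind such a pages-per-charge claim is simple: if the display draws power only while redrawing the page, battery life is just a budget of page turns. A sketch with illustrative numbers (both figures are assumptions, not Sony's published specifications):

    # Rough pages-per-charge arithmetic for a display that draws power only
    # when the page is redrawn. Both figures are illustrative assumptions.
    battery_mwh = 3000.0   # assumed usable battery energy, milliwatt-hours
    turn_mwh = 0.4         # assumed energy consumed per page refresh
    print(int(battery_mwh / turn_mwh), "page turns per charge")  # 7500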

View Map + Bookmark Entry

Nature Announces Peer to Peer Review September 14, 2006

The journal Nature announced on September 14, 2006 that it was opening the peer review process to comments online in the form of a blog.

View Map + Bookmark Entry

Publishing Patent Filings on the Web September 26, 2006

The IBM logo

IBM, the largest patent holder in the U.S., announced on September 26, 2006 that it would publish its patent filings on the Web for public review, as part of a new policy that the company hoped would be a model for others.

View Map + Bookmark Entry

Twitter: "What Are You Doing?" October 2006

The Twitter logo

An example of a "tweet"

In October 2006 the start-up company Obvious, in San Francisco, launched the social networking and micro-blogging service Twitter, whose prompt asked "What are you doing?" Twitter "allows its users to send and read other users' updates (otherwise known as tweets), which are text-based posts of up to 140 characters in length." This was under the 160-character limit of the SMS communication protocol for mobile phones.
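
♦ The 140-character ceiling is usually explained as the 160-character SMS payload minus 20 characters reserved for the sender's username and a separator, so that any tweet could be delivered in a single text message. A minimal sketch of that constraint (the 20-character reservation is the commonly cited rationale, not an official specification):

    # Check that "username: tweet" fits one 160-character SMS message.
    SMS_LIMIT, TWEET_LIMIT = 160, 140   # 20 characters reserved for the sender

    def fits_in_one_sms(username: str, tweet: str) -> bool:
        return len(tweet) <= TWEET_LIMIT and len(f"{username}: {tweet}") <= SMS_LIMIT

    print(fits_in_one_sms("jack", "just setting up my twttr"))  # True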

View Map + Bookmark Entry

Will it Blend?: Viral Marketing October 2006

The "Will It Blend?" campaign logo

In October 2006 Tom Dickson, the founder of Blendtec, a blender manufacturer in Orem, Utah, began the Will it Blend? viral marketing campaign on the Internet. Between downloads on YouTube and on the Will it Blend? website, the advertising program became one of the most successful Internet marketing campaigns, surpassing 100,000,000 hits by May 2009. Ads featured the blending of many absurd items, such as an iPhone. Many of the bizarre ads were listed and linked-to in the Wikipedia article on Will it Blend?.

View Map + Bookmark Entry

The Royal Society Digital Journal Archive October 29, 2006

The entrance to the Royal Society of London

On October 29, 2006 The Royal Society of London announced that The Royal Society Digital Journal Archive, dating back to 1665 and containing the full text and illustrations of more than 60,000 articles published in the Philosophical Transactions of the Royal Society, was available online.

View Map + Bookmark Entry

More than 100,000,000 Websites November 1, 2006

The Netcraft logo

In November 2006 there were more than 100 million websites on the Internet; according to Netcraft.com, the count was 101,435,253. Between January and November of that year 27.4 million sites were added to the web.

View Map + Bookmark Entry

Google's AdWords to Place Ads in Print Newspapers November 6, 2006

Tom Phillips, head of print operations at Google

The New York Times logo

On November 6, 2006 Google and various print newspapers, including The New York Times, announced that they would test a modified version of Google's AdWords program to place advertisements in print newspapers.

View Map + Bookmark Entry

Google Buys YouTube November 6, 2006

Youtube co-founders Chad Hurley and Steve Chen

On November 6, 2006 Google completed the purchase of YouTube for $1.65 billion in Google stock. YouTube co-founders Chad Hurley and Steve Chen posted a video to YouTube about the purchase.

 

View Map + Bookmark Entry

Newspaper Advertising in Partnership with Yahoo November 20, 2006

Terry T. Semel, left, the Chief of Yahoo at the time. Dean Singleton, right, the chief of MediaNews.

The homepage for Yahoo's HotJobs before it was acquired by Monster

On November 20, 2006 "A consortium of seven newspaper chains representing 176 daily papers across the country is announcing a broad partnership with Yahoo to share content, advertising and technology . . . . In the first phase of the deal, the newspaper companies will begin posting their employment classified ads on Yahoo’s classified jobs site, HotJobs, and start using HotJobs technology to run their own online career ads.

"But the long-term goal of the alliance with Yahoo, according to one senior executive at a participating newspaper company, is to be able to have the content of these newspapers tagged and optimized for searching and indexing by Yahoo."

View Map + Bookmark Entry

The EPA Begins to Close its Scientific Libraries November 20, 2006

The Environmental Protection Agency seal

The Boston Globe reported on November 20, 2006 that the Environmental Protection Agency (EPA) had begun to close its nationwide network of scientific libraries, effectively preventing EPA scientists and the public from accessing vast amounts of data and information on issues from toxicology to pollution. Several libraries had already been dismantled, with their contents either destroyed or shipped to repositories where they were uncataloged and inaccessible.

View Map + Bookmark Entry

"Anshe Chung Becomes First Virtual World Millionaire" November 26, 2006

Ailin Graef

Ailin Graef's character, Anshe Chung

Available virtual real estate for purchase from Anshe Chung

The cover of businessweek featuring Anshe Chung

On November 26, 2006 it was announced that "Anshe Chung [Real life: Ailin Graef] has become the first online personality to achieve a net worth exceeding one million US dollars from profits entirely earned inside a virtual world.

"Recently featured on the cover of Business Week Magazine, Anshe Chung is a resident in the virtual world Second Life. Inside Second Life, Anshe buys and develops virtual real-estate in an official currency, known as Linden Dollars, which is convertible to US Dollars. There is also a liquid market in virtual real estate, making it possible to assess the value of her total holdings using publicly available statistics. 

"The fortune Anshe Chung commands in Second Life includes virtual real estate that is equivalent to 36 square kilometers of land – this property is supported by 550 servers or land "simulators". In addition to her virtual real estate holdings, Anshe has 'cash' holdings of several million Linden Dollars, several virtual shopping malls, virtual store chains, and she has established several virtual brands in Second Life. She also has significant virtual stock market investments in Second Life companies.

"Anshe Chung's achievement is all the more remarkable because the fortune was developed over a period of two and a half years from an initial investment of $9.95 for a Second Life account by Anshe's creator, Ailin Graef. Anshe/Ailin achieved her fortune by beginning with small scale purchases of virtual real estate which she then subdivided and developed with landscaping and themed architectural builds for rental and resale. Her operations have since grown to include the development and sale of properties for large scale real world corporations, and have led to a real life "spin off" corporation called Anshe Chung Studios, which develops immersive 3D environments for applications ranging from education to business conferencing and product prototyping.

"Ailin Graef was born and raised in Hubei, China, but is currently a citizen of Germany. She runs Anshe Chung Studios with her husband Guntram Graef, who serves as CEO of the company. Anshe Chung Studios has offices in Wuhan, China and is currently seeking to expand its workforce from 25 to 50" (http://www.anshechung.com/include/press/press_release251106.html, accessed 01-27-2010).

View Map + Bookmark Entry

Demanding that the U.S. EPA Desist from Destroying its Libraries November 30, 2006

Stephen Johnson

On November 30, 2006 ranking members of congressional committees wrote to Stephen Johnson, Administrator of the U.S. Environmental Protection Agency, demanding that the agency desist from destroying its libraries:

"Over the past 36 years, EPA's libraries have accumulated a vast and invaluable trove of public health and environmental information, including at least 504,000 books and reports, 3,500 journal titles, 25,000 maps, and 3.6 million information objects on microfilm, according to the report issued in 2004: Business Case for Information Services: EPA's Regional Libraries and Centers prepared for the Agency by Stratus Consulting. Each one of EPA's libraries also had information experts who helped EPA staff and the public access and use the Agency's library collection and information held in other library collections outside of the Agency. It now appears that EPA officials are dismantling what is likely one of our country's most comprehensive and accessible collections of environmental materials.
"The press has reported on the concerns over the library reorganization plan voiced by EPA professional staff of the Office of Enforcement and Compliance Assurance (OECA), 16 local union Presidents representing EPA employees, and the American Library Association. In response to our request of September 19, 2006 (attached), the Government Accountability Office has initiated an investigation of EPA's plan to close its libraries. Eighteen Senators sent a letter on November 3, 2006, to leaders of the Senate Appropriations Committee asking them to direct EPA 'to restore and maintain public access and onsite library collections and services at EPA's headquarters, regional, laboratory and specialized program libraries while the Agency solicits and considers public input on its plan to drastically cut its library budget and services' (attached). Yet, despite the lack of Congressional approval and the concerns expressed over this plan, your Agency continues to move forward with dismantling the EPA libraries.

"It is imperative that the valuable government information maintained by EPA's libraries be preserved. We ask that you please confirm in writing by no later than Monday, December 4, 2006, that the destruction or disposition of all library holdings immediately ceased upon the Agency's receipt of this letter and that all records of library holdings and dispersed materials are being maintained."

View Map + Bookmark Entry

U.S. Publishers Sell 3.1 Billion Books Circa December 2006

In 2006 publishers in the U.S. sold 3.1 billion books. This was up just 0.5 percent from the 3.09 billion sold in 2005. Of the 3.1 billion, 263.4 million were religious books, then the fastest growing category in U.S. book publishing.

View Map + Bookmark Entry

"An Uncensorable System for Mass Document Leaking" December 2006

Julian Assange

The Wikileaks logo

In December 2006 Julian Assange and others founded Wikileaks, a website, with no official headquarters, that published anonymous submissions and leaks of sensitive governmental, corporate, or religious documents, while attempting to preserve the anonymity and untraceability of its contributors. Within one year of its foundation the site grew to 1,200,000 documents.

"The site states that it was 'founded by Chinese dissidents, journalists, mathematicians and startup company technologists, from the US, Taiwan, Europe, Australia and South Africa". The creators of Wikileaks were unidentified as of January 2007, although it has been represented in public since January 2007 by non-anonymous speakers such as Julian Assange, who had described himself as a member of Wikileaks' advisory board and was later referred to as the 'founder of Wikileaks.' "

"Wikileaks describes itself as 'an uncensorable system for untraceable mass document leaking'. Wikileaks is hosted by PRQ, a Sweden-based company providing 'highly secure, no-questions-asked hosting services'. PRQ is said to have 'almost no information about its clientele and maintains few if any of its own logs'. PRQ is owned by Gottfrid Svartholm and Fredrik Neij who, through their involvement in The Pirate Bay, have significant experience in withstanding legal challenges from authorities. Being hosted by PRQ makes it difficult to take Wikileaks offline. Furthermore, 'Wikileaks maintains its own servers at undisclosed locations, keeps no logs and uses military-grade encryption to protect sources and other confidential information.' Such arrangements have been called 'bulletproof hosting' (Wikipedia article on Wikileaks, accessed 11-25-2009).

"WikiLeaks was originally launched as a user-editable wiki site, but has progressively moved towards a more traditional publication model, and no longer accepts either user comments or edits. The site is available on multiple online servers and different domain names following a number of denial-of-service attacks and its severance from different Domain Name System (DNS) providers" (Wikipedia article on Wikileaks, accessed 12-08-2010).

View Map + Bookmark Entry

Yahoo and Reuters Found "YouWitnessNews" December 5, 2006

The Reuters logo

The You Witness News logo

On December 5, 2006 Yahoo and Reuters introduced programs to place photographs and videos of news events submitted by the public, including cell phone photos and videos, throughout Reuters.com and Yahoo's new service entitled YouWitnessNews. Reuters said that in 2007 it would also start to distribute some of the submissions to the thousands of print, online and broadcast media outlets that subscribed to its news service. Reuters also said that it hoped to develop a service devoted entirely to user-submitted photographs and video.

View Map + Bookmark Entry

Journalistic Acknowledgment of the Significance of Social Networking on the Internet December 16, 2006

The cover of Time Magazine when the magazine named "You" as the person of the year

In its Person of the Year issue, announced on December 16, 2006, Time Magazine named "You" as the Person of the Year, reflecting the growing importance of social networking on the Internet:

"The "Great Man" theory of history is usually attributed to the Scottish philosopher Thomas Carlyle, who wrote that 'the history of the world is but the biography of great men.' He believed that it is the few, the powerful and the famous who shape our collective destiny as a species. That theory took a serious beating this year.

"To be sure, there are individuals we could blame for the many painful and disturbing things that happened in 2006. The conflict in Iraq only got bloodier and more entrenched. A vicious skirmish erupted between Israel and Lebanon. A war dragged on in Sudan. A tin-pot dictator in North Korea got the Bomb, and the President of Iran wants to go nuclear too. Meanwhile nobody fixed global warming, and Sony didn't make enough PlayStation3s.

"But look at 2006 through a different lens and you'll see another story, one that isn't about conflict or great men. It's a story about community and collaboration on a scale never seen before. It's about the cosmic compendium of knowledge Wikipedia and the million-channel people's network YouTube and the online metropolis MySpace. It's about the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes."

View Map + Bookmark Entry

Wikileaks Manifesto December 31, 2006

Julian Assange

The Wikileaks logo

Shortly after the foundation of Wikileaks, Julian Assange published a kind of informal, awkwardly written Wikileaks manifesto on the Internet: 

"The non linear effects of leaks on unjust systems of governance

"You may want to read The Road to Hanoi or Conspiracy as Governance [second essay following]; an obscure motivational document, almost useless in light of its decontextualization and perhaps even then. But if you read this latter document while thinking about how different structures of power are differentially affected by leaks (the defection of the inner to the outer) its motivations may become clearer.

"The more secretive or unjust an organization is, the more leaks induce fear and paranoia in its leadership and planning coterie. This must result in minimization of efficient internal communications mechanisms (an increase in cognitive "secrecy tax") and consequent system-wide cognitive decline resulting in decreased ability to hold onto power as the environment demands adaption.

"Hence in a world where leaking is easy, secretive or unjust systems are nonlinearly hit relative to open, just systems. Since unjust systems, by their nature induce opponents, and in many places barely have the upper hand, mass leaking leaves them exquisitely vulnerable to those who seek to replace them with more open forms of governance.

"Only revealed injustice can be answered; for man to do anything intelligent he has to know what's actually going on" (http://cryptome.org/0002/ja-conspiracies.pdf, accessed 12-08-2010).

View Map + Bookmark Entry

A Printed Book on Preserving Digital Information 2007

The cover art of Preserving Digital Information by Henry M. Gladney

In 2007 Henry M. Gladney, Saratoga, California, issued his monograph, Preserving Digital Information, as a printed book.

View Map + Bookmark Entry

The Universal Digital Library has Scanned over 1,000,000 Books 2007

The Universal Digital Library announces that it surpassed its original goal of one million books

By 2007 the Universal Digital Library at Carnegie Mellon University and partners had scanned over 1,000,000 books, surpassing its original goal set in 2001.

View Map + Bookmark Entry

More than 4.7 Billion Bibles Had Been Printed Between 1455 and 2007 2007

In 2007 it was estimated that more than 4.7 billion Bibles (in whole or in part) had been printed since the publication of the Gutenberg Bible in 1455-56.

4.7 billion was more than five times the estimated number of 900 million printed copies of Quotations from Chairman Mao Zedong, the enormous distribution of which occurred in the second half of the 20th century because it was "an unofficial requirement for every Chinese citizen to own, read and carry it at all times during the latter half of Mao's rule, and especially during the Cultural Revolution."

View Map + Bookmark Entry

No More than 10,000,000 Unique Editions before 1900 2007

In 2007 the Universal Digital Library estimated that there were "no more than 10,000,000 unique book and document editions before the year 1900, and perhaps 300 million since the beginning of recorded history."

View Map + Bookmark Entry

Sales of Books in America in 2007 2007

The Book Industry Study Group logo

The Association of American Publishers logo

According to the Book Industry Study Group, 3,200,000,000 books were sold in the United States in 2007. According to the Association of American Publishers, net book sales in the U.S. were $25,000,000,000, an increase of 2.5 percent over 2006.

View Map + Bookmark Entry

976,000 New Book Titles Published in 2007 2007

Robert Darnton

The Bowker logo

According to Bowker, as cited by Robert Darnton in Publishers Weekly, 976,000 new book titles were published worldwide in 2007. This represented a significant increase over the 859,000 published in 2003, and the 700,000 published in 1998.

View Map + Bookmark Entry

IBM Begins Development of Watson, the First Cognitive Computer 2007

David Ferrucci

The Watson deep question answering computing system lab

The Watson Research Center

In 2007 David Ferrucci, leader of the Semantic Analysis and Integration Department at IBM's Watson Research Center, Yorktown Heights, New York, and his team began development of Watson, a special-purpose computer system designed to push the envelope on deep question answering, deep analytics, and the computer's understanding of natural language. Watson became the first cognitive computer, combining machine learning and artificial intelligence.

View Map + Bookmark Entry

The Robot Operating System ROS 2007

In 2007 the Stanford Artificial Intelligence Laboratory developed the Robot Operating System (ROS) in support of the STanford Artificial Intelligence Robot (STAIR). "ROS is a software framework for robot software development, providing operating system-like functionality on a heterogeneous computer cluster."

In 2008 development of ROS continued primarily at Willow Garage, a robotics research institute/incubator in Menlo Park, California, with more than twenty institutions collaborating in a federated development model.
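
♦ The flavor of ROS's "operating system-like functionality" is easiest to see in its publish/subscribe messaging. A minimal talker node using the ROS 1 Python client library rospy (runnable only inside a ROS installation with a roscore running; the node and topic names here are arbitrary choices for illustration):

    # Minimal ROS 1 publisher node: broadcasts a string on the "chatter" topic.
    import rospy
    from std_msgs.msg import String

    rospy.init_node("talker")                                # register with the ROS master
    pub = rospy.Publisher("chatter", String, queue_size=10)  # advertise the topic
    rate = rospy.Rate(1)                                     # publish at 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data="hello from the talker node"))
        rate.sleep()

Any other node in the cluster, on any machine, can then subscribe to "chatter" without knowing where the talker runs; the ROS master brokers the connection.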

View Map + Bookmark Entry

Goodreads is Founded January 2007

In January 2007 Otis and Elizabeth Chandler launched the "social cataloguing" website Goodreads in San Francisco.

"The website allows individuals to freely search Goodreads' extensive user-populated database of books, annotations, and reviews. Users can sign up and register books to generate library catalogs and reading lists. They can also create their own groups of book suggestions and discussions" (Wikipedia article on Goodreads, accessed 10-29-2013).

In March 2013 Goodreads was acquired by Amazon.com.

View Map + Bookmark Entry

The Oldest Continuously Published Newspaper Moves to the Web January 1, 2007

Post- och Inrikes Tidningar no. 15, 9 April 1645

Front page from the 19 January 2009 edition of Post- och Inrikes Tidningar online

On January 1, 2007 the government newspaper and gazette of Sweden, Post- och Inrikes Tidningar (Post and Domestic Newspaper) of Stockholm, the oldest continuously published newspaper in the world, published on paper without interruption since 1645, switched to web publication exclusively.

View Map + Bookmark Entry

The First One Terabyte Hard Disk Drive January 4, 2007

The Deskstar 7k1000, the first 1-terabyte hard disk drive sold by Hitachi Global Storage Technologies

The Hitachi Global Storage Technologies logo

On January 4, 2007 Hitachi Global Storage Technologies [San Jose, California] announced the first 1-terabyte hard disk drive. 

"According to Hitachi, the drive ships in the first quarter of 2007, and will cost $399--less than the price of two individual 500GB hard drives today. The drive, called the Deskstar 7K1000, will be shown this weekend in Las Vegas at the 2007 International CES, also known as the Consumer Electronics Show, as well as at the Storage Visions storage conference" (http://www.pcworld.com/article/128400/hitachi_introduces_1terabyte_hard_drive.html, accessed 06-04-2009).

View Map + Bookmark Entry

Information is Expanding 10X Faster than Any Product on this Planet February 2007

Kevin Kelly

In February 2007 Kevin Kelly wrote in Wired Magazine:

"Information is expanding 10 times faster than any product on this planet - manufactured or natural. According to Hal Varian, an economist at UC Berkeley and a consultant to Google, worldwide information is increasing at 66 percent per year - approaching the rate of Moore's Law - while the most prolific manufactured stuff - paper, let’s say, or steel - averages only as much as 7 percent annually."

View Map + Bookmark Entry

Is the Universe Made of Information? February 2007

The February 2007 issue of Wired

James Gleick

In the February 2007 issue of Wired James Gleick wrote:

"Is the universe actually made of information? Humans have talked about atoms since the time of the ancients, and ever-smaller fundamental particles of matter followed. But no one even conceived of bits until the middle of the 20th century. The bit is a fundamental particle, too, but of different stuff altogether: information. It is not just tiny, it is abstract - a flip-flop, a yes-or-no. Now that scientists are finally starting to understand information, they wonder whether it’s more fundamental than matter itself. Perhaps the bit is the irreducible kernel of existence; if so, we have entered the information age in more ways than one."

View Map + Bookmark Entry

In 2007 There Were 12,000,000 U.S. Blogs February 2007

The Pew Internet and American Life Project logo

The Pew Research Center logo

According to the Pew Internet and American Life Project, a product of the Pew Research Center, Washington, D.C., in February 2007 about 12 million Americans maintained a blog.

View Map + Bookmark Entry

My.BarackObama.com February 11, 2007

Barack Obama, the 44th president of the United States

An example of a homepage on my.BarackObama.com

An iPhone app used as another social networking tool during the election by Barack Obama

On February 11, 2007 presidential candidate Barack Obama launched my.barackobama.com on his main website, barackobama.com. This social networking site built an online community of over a million members before the presidential election.

View Map + Bookmark Entry

Data-Storing Bacteria Could Last Thousands of Years February 27, 2007

The Keio University crest

Bacillius Subtilis, the bacteria on which the data was stored

A technology developed at Keio University, Tokyo, Japan, and announced on February 27, 2007, carried with it the possibility that bacterial DNA could be used as a medium for storing digital information long-term—potentially thousands of years.

"Keio University Institute for Advanced Biosciences and Keio University Shonan Fujisawa Campus announced the development of the new technology, which creates an artificial DNA that carries up to more than 100 bits of data within the genome sequence, according to the JCN Newswire. The universities said they successfully encoded "e= mc2 1905!" -- Einstein's theory of relativity and the year he enunciated it -- on the common soil bacteria,  Bacillius subtilis."

View Map + Bookmark Entry

Photosynth Demonstrated March 2007

Blaise Agüera y Arcas

A Seadragon browser demo

The Photosynth interface

In March 2007 physicist and software engineer Blaise Agüera y Arcas, architect of Seadragon and co-creator of Photosynth, demonstrated Photosynth in a video downloadable at the TED website at this link.

Using techniques of computational bibliography, in collaboration with Paul Needham at Princeton's Scheide Library, Agüera y Arcas also did significant original research in the technology of the earliest printing from movable type.

View Map + Bookmark Entry

It Could Take 1800 Years to Convert the Paper Records . . . . March 10, 2007

Bookshelves inside the Library of Congress

On March 10, 2007 the U.S. National Archives estimated that at the current rate of digitization of its 9 billion text records, it could take 1800 years to convert the paper text records in the National Archives to digital form. This estimate came from an article in The New York Times entitled History Digitized (and Abridged), which pointed out that economic and copyright considerations required the digitization of library and archival collections to be very selective. 

View Map + Bookmark Entry

Checkers is "Solved" April 29, 2007

Jonathan Schaeffer with a checkers board after "solving" the game of checkers

The University of Alberta seal

Jonathan Schaeffer and his team at the University of Alberta announced on April 29, 2007 that the game of checkers was "solved": perfect play by both sides leads to a draw.

"The crucial part of Schaeffer's computer proof involved playing out every possible endgame involving fewer than 10 pieces. The result is an endgame database of 39 trillion positions. By contrast, there are only 19 different opening moves in draughts. Schaeffer's proof shows that each of these leads to a draw in the endgame database, providing neither player makes a mistake.  

"Schaeffer was able to get his result by searching only a subset of board positions rather than all of them, since some of them can be considered equivalent. He carried out a mere 1014 calculations to complete the proof in under two decades. 'This pushes the envelope as far as artificial intelligence is concerned,' he says.  

"At its peak, Schaeffer had 200 desktop computers working on the problem full time, although in later years he reduced this to 50 or so. 'The problem is such that if I made a mistake 10 years ago, all the work from then on would be wrong,' says Schaeffer. 'So I've been fanatical about checking for errors.' " (http://www.newscientist.com/article/dn12296-checkers-solved-after-years-of-number-crunching.html, accessed 01-24-2010).

Based on this proof, Schaeffer's checkers-playing program Chinook could no longer be beaten. The best an opponent could hope for was a draw.
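
♦ "Solving" a game means exhaustively proving the game-theoretic value of its starting position. The flavor of such a proof can be shown on a game vastly smaller than checkers, such as single-pile Nim, where memoized search labels every position as won or lost under perfect play:

    # Solve single-pile Nim (take 1-3 stones; taking the last stone wins).
    # Illustrates exhaustive game solving; checkers needed forward search
    # plus a 39-trillion-position endgame database, not this brute force.
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def wins(stones: int) -> bool:
        """True if the player to move can force a win with perfect play."""
        return any(not wins(stones - take) for take in (1, 2, 3) if take <= stones)

    print([n for n in range(1, 13) if not wins(n)])  # losing positions: [4, 8, 12]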

View Map + Bookmark Entry

Google Introduces Street View in Google Maps May 25, 2007 – May 12, 2008

Google Street View image of St Johns Street in Manchester UK showing 8 different possible views

An example of blurred faces in Google Street View

One of the vehicles used to record the images for Google Street View

On May 25, 2007 Google introduced the Street View feature of Google Maps in the United States.  It provided panoramic views from positions along many streets, eventually including even views of the very small road on which I live in Novato, California, suggesting that coverage of many parts of the United States became extremely comprehensive.  

On April 16, 2008, Google fully integrated Street View into Google Earth 4.3.

In response to complaints about privacy, on May 12, 2008 Google announced in its "latlong" blog that it had introduced face-blurring technology for its images of Manhattan. It eventually applied the technology to all locations.
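
♦ Google's face-blurring pipeline is proprietary, but the general technique of detecting faces and blurring them is straightforward to sketch with the open-source OpenCV library (the file names here are placeholders):

    # Detect faces and blur them, in the spirit of Street View's privacy filter.
    import cv2

    img = cv2.imread("street.jpg")
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        img[y:y + h, x:x + w] = cv2.GaussianBlur(img[y:y + h, x:x + w], (51, 51), 0)
    cv2.imwrite("street_blurred.jpg", img)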

View Map + Bookmark Entry

The First Embassy of a Real Country in a Virtual World May 30, 2007

Carl Bildt

The Second House of Sweden, the Swedish embassy in the virtual world of Second Life

In a real-world announcement, on May 30, 2007 Carl Bildt, Foreign Minister of Sweden, opened the Second House of Sweden, an embassy in the virtual world of Second Life. A replica of the Swedish Embassy to the United States, this was the first embassy of a real country in a virtual world.

View Map + Bookmark Entry

Watson's Genome is Sequenced May 31, 2007

James D. Watson

An example of DNA sequencing

On May 31, 2007 the genome of James D. Watson, co-discoverer of the double-helical structure of DNA, was sequenced and presented to Watson. It was the second individual human genome to be sequenced; the first was that of J. Craig Venter. The Human Genome Project's first working draft, assembled from multiple anonymous donors, had been completed and published in February 2001.

View Map + Bookmark Entry

Steve Jobs Introduces the iPhone June 29, 2007

The iPhone 3G

On June 29, 2007 Apple released the iPhone, an Internet-connected multimedia smartphone with a virtual on-screen keyboard, which Steve Jobs had introduced in January 2007.

View Map + Bookmark Entry

Second Life is Used for Teaching Foreign Languages July 2007

A virtual classroom in Second Life where players can learn new languages, among other studies

According to an article in LeMonde.fr in July 2007, the virtual world Second Life was being used for teaching foreign languages.

View Map + Bookmark Entry

The CNN/ YouTube Presidential Debates: The First Internet to Television Debate Partnership July 23 – November 28, 2007

The CNN/YouTube presidential debates, the first web-to-television debate partnership, were a series of televised debates in which United States presidential hopefuls fielded questions submitted through the video sharing site YouTube. They were conceived by David Bohrman, then Washington Bureau Chief of CNN, and Steve Grove, then Head of News and Politics at YouTube. YouTube was then a new platform on the political scene, having risen to prominence in the 2006 midterm elections after Senator George Allen's "Macaca" controversy, in which the Senator was captured on video calling a campaign worker for his opponent Jim Webb a "Macaca"; the video went viral on YouTube and damaged a campaign that narrowly lost at the polls. Media companies were looking for new ways to harness the possibilities of web video, and YouTube was looking for opportunities to give its users access to the national political stage, so Bohrman and Grove formed a unique partnership in the CNN/YouTube debates. The Democratic Party installment took place in Charleston, South Carolina and aired on July 23, 2007. The Republican Party installment took place in St. Petersburg, Florida and aired on November 28, 2007. 

View Map + Bookmark Entry

The World Wide Telecom Web for Illiterate Populations August 2007

Arun Kumar

A diagram of the World Wide Telecom Web, also known as "Spoken Web"

In August 2007 Arun Kumar and others at IBM Research - India, New Delhi, published "WWTW: The World Wide Telecom Web," a proposal for a voice-driven Internet designed for illiterate populations:

"our vision of a voice-driven ecosystem parallel to that of the WWW. WWTW is a network of interconnected voice sites that are voice driven applications created by users and hosted in the network. It has the potential to enable the underprivileged population to become a part of the next generation converged networked world. We present a whole gamut of existing technology enablers for our vision as well as present research directions and open challenges that need to be solved to not only realize a WWTW but also to enable the two Webs to cross leverage each other."

View Map + Bookmark Entry

The First Healthcare Course Taught in Second Life September 2007

A player's avatar stands in front of the virtual campus of Coventry University in Second Life

In September 2007 England's Coventry University developed an MSc course in clinical management that held problem-based learning groups for students in Second Life. The course trained students in managing healthcare facilities, and was the first healthcare course to use Second Life as a learning platform.

View Map + Bookmark Entry

The PRISM Surveillance Program September 11, 2007 – June 6, 2013

A BoundlessInformant global heat map of data collection. The color scheme ranges from green (areas least subjected to surveillance) through yellow and orange to red (areas most subjected to surveillance).

The PRISM program logo

On September 11, 2007, U.S. President George W. Bush signed the Protect America Act of 2007, allowing the National Security Agency (NSA) to start a massive domestic surveillance data-collection program known officially by the SIGAD US-984XN, code name PRISM.

"The program is operated under the supervision of the U.S. Foreign Intelligence Surveillance Court (FISC) pursuant to the Foreign Intelligence Surveillance Act (FISA). Its existence was leaked five years later by NSA contractor Edward Snowden, who claimed the extent of mass data collection was far greater than the public knew, and included 'dangerous' and 'criminal' activities in law. The disclosures were published by [by Glenn Greenwald in] The Guardian and The Washington Post on June 6, 2013.

"A document included in the leak indicated that PRISM was 'the number one source of raw intelligence used for NSA analytic reports.' The leaked information came to light one day after the revelation that the FISC had been ordering a business unit of the telecommunications company Verizon Communications to turn over to the NSA logs tracking all of its customers' telephone calls on an ongoing daily basis" (Wikipedia article on PRISM (surveillance program), accessed 07-07-2013).

Here is the link to Glenn Greenwald's article in www.guardian.co.uk publishing the first of Snowden's disclosures.  When I linked to this on July 7, 2013 it had been friended on Facebook by 141,922 people.

♦ A more general survey of the extent of what was characterized as the "2013 mass surveillance scandal," with a summary of NSA spying programs, was available from the Wikipedia in August 2013.

♦ On August 13, 2013 The New York Times published an article by Peter Maass regarding the work of the documentary film maker Laura Poitras, telling how she and lawyer / journalist Glenn Greenwald helped Edward Snowden publish his secrets.

View Map + Bookmark Entry

DROID, an Archives Analysis and Identification Tool September 27, 2007

The National Archives in London

"An innovative tool to analyse and identify computer file formats has won the 2007 Digital Preservation Award. DROID, developed by The National Archives in London, can examine any mystery file and identify its format. The tool works by gathering clues from the internal 'signatures' hidden inside every computer file, as well as more familiar elements such as the filename extension (.jpg, for example), to generate a highly accurate 'guess' about the software that will be needed to read the file. . . .

"Now, by using DROID and its big brother, the unique file format database known as PRONOM, experts at the National Archives are well on their way to cracking the problem. Once DROID has labelled a mystery file, PRONOM's extensive catalogue of software tools can advise curators on how best to preserve the file in a readable format. The database includes crucial information on software and hardware lifecycles, helping to avoid the obsolescence problem. And it will alert users if the program needed to read a file is no longer supported by manufacturers.

"PRONOM's system of identifiers has been adopted by the UK government and is the only nationally-recognised standard in its field."

View Map + Bookmark Entry

28,578,000 Copies Printed Semi-Monthly November 2007

A cover of Watchtower

In November 2007 The Watchtower had an average semi-monthly printing on paper of 28,578,000 copies in 161 languages. This may have been the largest and most linguistically diverse circulation printed on paper of any periodical at that time.

View Map + Bookmark Entry

Brainbow: A Colorful Technique to Visualize Brain Circuitry November 2007

Jeff W. Lichtman

Joshua R. Sanes

Three brainbows of mouse neurons from Lichtman and Sanes, 2008.

A. A motor nerve innervating ear muscle

B. An axon tract in the brain stem

C. The hippocampal dentate gyrus

In November 2007 Jeff W. Lichtman and Joshua R. Sanes, both professors of Molecular & Cellular Biology in the Department of Neurobiology at Harvard Medical School, and their colleagues published "Transgenic strategies for combinatorial expression of fluorescent proteins in the nervous system," Nature 450 (7166): 56–62. doi:10.1038/nature06293, describing the visualization process they called "Brainbow."

"Detailed analysis of neuronal network architecture requires the development of new methods. Here we present strategies to visualize synaptic circuits by genetically labelling neurons with multiple, distinct colours. In Brainbow transgenes, Cre/lox recombination is used to create a stochastic choice of expression between three or more fluorescent proteins (XFPs). Integration of tandem Brainbow copies in transgenic mice yielded combinatorial XFP expression, and thus many colours, thereby providing a way to distinguish adjacent neurons and visualize other cellular interactions. As a demonstration, we reconstructed hundreds of neighbouring axons and multiple synaptic contacts in one small volume of a cerebellar lobe exhibiting approximately 90 colours. The expression in some lines also allowed us to map glial territories and follow glial cells and neurons over time in vivo. The ability of the Brainbow system to label uniquely many individual cells within a population may facilitate the analysis of neuronal circuitry on a large scale." (From the Nature abstract).

View Map + Bookmark Entry

Anthony Grafton's "Codex in Crisis" November 5, 2007 – 2008

Anthony Grafton

The cover of Codex in Crisis

On November 5, 2007 historian Anthony Grafton of Princeton University published "Future Reading. Digitization and its Discontents" in The New Yorker Magazine. This was revised and reissued as a small book entitled Codex in Crisis (2008). It was reprinted as the last chapter in Grafton's, Worlds Made by Words. Scholarship and Community in the Modern West (2009).

On December 18, 2008 Grafton spoke about Codex in Crisis at Google, Mountain View, in the Authors@Google series.

View Map + Bookmark Entry

The Amazon Kindle is Introduced November 19, 2007

The Amazon logo

A first generation Amazon Kindle

Amazon.com introduced the Kindle on November 19, 2007. This unconventionally-named ebook reader differed from other ebook readers because it incorporated a wireless service for purchasing and delivering electronic texts from Amazon.com without a computer. Its 6-inch (diagonal) electronic-paper screen was limited to grayscale at 167 ppi resolution. At its introduction 90,000 titles were available for download to the 10 oz. device, which could store about 200 books.

View Map + Bookmark Entry

Kindle Direct Publishing Introduced November 19, 2007

Concurrently with the Kindle ebook reader, on November 19, 2007 Amazon launched Kindle Direct Publishing, a platform for authors and publishers to publish their books directly to Kindle and Kindle Apps worldwide. The platform was in open beta testing as of late 2007.

"Authors can upload documents in several formats for delivery via Whispernet and charge between $0.99 and $200.00 per download.

"In a December 5, 2009 interview with The New York Times, CEO Jeff Bezos revealed that Amazon.com keeps 65% of the revenue from all ebook sales for the Kindle. The remaining 35% is split between the book author and publisher. After numerous commentators observed that Apple's popular App Store offers 70% of royalties to the publisher, Amazon began a program that offers 70% royalties to Kindle publishers who agree to certain conditions.

"Other criticisms involve the business model behind Amazon's implementation and distribution of e-books. Amazon introduced a software application allowing Kindle books to be read on an iPhone or iPod Touch. Amazon soon followed with an application called "Kindle for PCs" that can be run on a Windows PC. Due to the book publisher's DRM policies, Amazon claims that there is no right of first sale with e-books. Amazon states they are licensed, not purchased; so unlike paper books, buyers do not actually own their e-books according to Amazon. This has however never been tested in the courts and the outcome of any action by Amazon is by no means certain. The law is in a state of flux in jurisdictions around the world " (Wikipedia article on Amazon Kindle, accessed 12-29-2011).

View Map + Bookmark Entry

Drama in the Context of a Telephone Exchange (1928) 2008

In Changeling, an American historical drama film set in Los Angeles in 1928, the central figure, played by Angelina Jolie, worked as a supervisor in a telephone exchange, then a manual operation. In the film the operation of the exchange appeared to be accurately depicted. Based on a true story, the drama focused on police corruption and the covering up of police incompetence in the context of heart-wrenching child abductions and murders. Produced and directed by Clint Eastwood, written by J. Michael Straczynski, and starring Angelina Jolie and John Malkovich, the film was released in 2008.

"Later exchanges consisted of one to several hundred plug boards staffed by telephone operators. Each operator sat in front of a vertical panel containing banks of ¼-inch tip-ring-sleeve (5-conductor) jacks, each of which was the local termination of a subscriber's telephone line. In front of the jack panel lay a horizontal panel containing two rows of patch cords, each pair connected to a cord circuit. When a calling party lifted the receiver, a signal lamp near the jack would light. The operator would plug one of the cords (the "answering cord") into the subscriber's jack and switch her headset into the circuit to ask, "number please?" Depending upon the answer, the operator might plug the other cord of the pair (the "ringing cord") into the called party's local jack and start the ringing cycle, or plug into a trunk circuit to start what might be a long distance call handled by subsequent operators in another bank of boards or in another building miles away. In 1918, the average time to complete the connection for a long-distance call was 15 minutes. In the ringdown method, the originating operator called another intermediate operator who would call the called subscriber, or passed it on to another intermediate operator. This chain of intermediate operators could complete the call only if intermediate trunk lines were available between all the centers at the same time. In 1943 when military calls had priority, a cross-country US call might take as long as 2 hours to request and schedule in cities that used manual switchboards for toll calls" (Wikipedia article on Telephone exchange, accessed 04-26-2009).

(This entry was last revised on 04-30-2014.)

View Map + Bookmark Entry

The Bayerische Staatsbibliothek in "Second Life" 2008

On its 450th anniversary in 2008 the Bayerische Staatsbibliothek offered selected web services and highlights of its unique collections, as well as a communication forum for library users in Second Life.

"The virtual presence of the Bayerische Staatsbibliothek is a reproduction of the famous library building in Ludwigstrasse, Munich, which is almost true to the original. The floor plan and the outside facades are true to the scale of the original building that was erected from 1832 to 1843. On the inside of the building the historical staircase, the Fürstensaal (prince's hall), the Friedrich von Gärtner hall and the marble hall were reproduced in accordance with the originals by means of state-of-the-art 3D technology. The reproduction of the staircase is particularly impressive in that it is accurate in every detail.

"The Fürstensaal contains an exhibition of a number of valuable manuscripts and historical printed works from the collections of the Bayerische Staatsbibliothek, which can even be browsed virtually. Further virtual exhibits can be seen in the major reading room, which was also reproduced taking account of all details of the original room in Ludwigstrasse. A virtual exhibition of presentation boards informs about the eventful history of the Staatsbibliothek since its foundation in the year 1558. //However, the presence in Second Life primarily offers in-world access to the most frequently used web services of the Bayerische Staatsbibliothek: access to the online catalogue and the web site, a link to the virtual information service "question point" and the comprehensive offer of digital texts of the "Munich Digitisation Centre". Moreover, every avatar can directly access the "Bayerische Landesbibliothek Online" offering a broad variety of information and digital sources on Bavarian culture and history.

"Just like its role model in real life, the virtual Staatsbibliothek is intended to become a place of communication and interaction. Therefore the virtual inner courtyards offer a conference centre for virtual specialized and information events and a coffeehouse inviting visitors to interact casually. Regular in-world events are planned, which will introduce, among others, the multifaceted offers and services of the Bayerische Staatsbibliothek" (http://www.bsb-muenchen.de/Virtual-Services-in-Second-Lif.2264+M57d0acf4f16.0.html, accessed 01-03-2010)

View Map + Bookmark Entry

ImageNet, an Image Database and Ontology 2008

In 2008 principal investigators Li Fei-Fei of the Stanford Vision Lab and Kai Li of the Department of Computer Science at Princeton, together with associates, advisors and friends, began building ImageNet, an image database and ontology, through a crowdsourcing process. By October 2013 the database contained 14,197,122 images, with 21,841 synsets indexed.

The ImageNet database is organized according to the WordNet hierarchy.

"Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a 'synonym set' or 'synset'. There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). In ImageNet, we aim to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated. In its completion, we hope ImageNet will offer tens of millions of cleanly sorted images for most of the concepts in the WordNet hierarchy."

Among its many applications, ImageNet provides a standard by which the accuracy of image recognition software can be measured.
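
A minimal sketch of how such a benchmark is used to score a classifier, with invented labels standing in for real synsets:

```python
# Top-k accuracy over ranked predictions, the standard way a labelled
# collection like ImageNet is used to score image recognition software.
# The labels below are invented for illustration.

def top_k_accuracy(ranked_predictions, true_labels, k=5):
    hits = sum(true in ranked[:k]
               for ranked, true in zip(ranked_predictions, true_labels))
    return hits / len(true_labels)

preds = [["tabby cat", "tiger cat", "lynx"],
         ["sports car", "convertible", "pickup truck"]]
truth = ["tiger cat", "pickup truck"]
print(top_k_accuracy(preds, truth, k=1))  # 0.0 -- neither top-1 guess is right
print(top_k_accuracy(preds, truth, k=3))  # 1.0 -- both appear in the top 3
```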

View Map + Bookmark Entry

The SyNAPSE Neuromorphic Machine Technology Project Begins 2008

Traditional stored-program von Neumann computers are constrained by physical limits, and require humans to program how computers interact with their environments. In contrast, the human brain processes information autonomously, and learns from its environment. Neuromorphic electronic machines—computers that function more like a brain—may enable autonomous computational solutions for real-world problems with many complex variables. In 2008 DARPA awarded the first funding to HRL Laboratories, Hewlett-Packard and IBM Research for SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics)—an attempt to build a new kind of cognitive computer with form, function and architecture similar to the mammalian brain. The program sought to create electronic systems inspired by the human brain that could understand, adapt and respond to information in ways fundamentally different from traditional computers.

"The initial phase of the SyNAPSE program developed nanometer scale electronic synaptic components capable of adapting the connection strength between two neurons in a manner analogous to that seen in biological systems (Hebbian learning), and simulated the utility of these synaptic components in core microcircuits that support the overall system architecture" (Wikipedia article on SyNAPSE, accessed 10-20-2013).

View Map + Bookmark Entry

Using Remote Presence Telemedicine Robots in Medicine 2008 – 2013

Beginning around 2008, physicians used remote presence telemedicine robots to "beam" themselves into hospitals to diagnose patients and offer medical advice during emergencies. By 2013 a growing number of hospitals in California and other states were using telepresence robots to expand access to medical specialists, especially in rural areas with a shortage of doctors.

These mobile video-conferencing machines, such as the RP-VITA Remote Presence Robot produced by InTouch Health of Santa Barbara, California, moved on wheels and typically stood about 5 feet high, with a large screen that projected a doctor's face. They featured cameras, microphones and speakers that allowed physicians and patients to see and talk to each other.

"Dignity Health, which runs Arizona, California and Nevada hospitals, began using the telemedicine machines five years ago to diagnose patients suspected of suffering strokes — when every minute is crucial to prevent serious brain damage.

"The San Francisco-based health care provider now uses the telemedicine robots in emergency rooms and intensive-care units at about 20 California hospitals, giving them access to specialists in areas such as neurology, cardiology, neonatology, pediatrics and mental health.

" 'Regardless of where the patient is located, we can be at their bedside in several minutes,' said Dr. Alan Shatzel, medical director of the Mercy Telehealth Network. 'Literally, we compress time and space with this technology. No longer does distance affect a person's ability to access the best care possible.'....

"Nearly 1,000 hospitals in the U.S. and abroad have installed InTouch telemedicine devices, including about 50 RP-VITA robots launched in May, according to company officials. The company rents out the RP-VITA for $5,000 per month.

"When a doctor is needed at a remote hospital location, he can log into the RP-VITA on-site by using a computer, laptop or iPad. The robot has an auto-drive function that allows it to navigate its way to the patient's room, using sensors to avoid bumping into things or people.

"Once inside the hospital room, the doctor can see, hear and speak to the patient, and have access to clinical data and medical images. The physician can't touch the patient, but there is always a nurse or medical assistant on-site to assist.

"On a recent morning, Dr. Asad Chaudhary, a stroke specialist at Dignity Health, beamed into a robot at the neuro-intensive care unit at Mercy San Juan Medical Center in Carmichael to evaluate Linda Frisk, a patient who recently suffered a stroke.

"With his face projected on the robot screen, Chaudhary asked Frisk to smile, open and close her eyes, make a fist and lift her arms and legs — common prompts to test a patient's neurological functioning.

" 'If you develop any weakness, any numbness, any problem with your speech or anything else, let us know right away,' Chaudhary told Frisk before the robot turned around and left the room.

" 'It's just like being with the patient in the room,' Chaudhary said. 'Of course, nothing can replace seeing these patients in person, but it's the next best thing' " (http://www.nytimes.com/aponline/2013/11/17/us/ap-us-telemedicine-robots.html?hp, accessed 11-17-2013).

View Map + Bookmark Entry

The First Virtual Currency Has a Real Value of 7 Billion Dollars 2008 – November 18, 2013

On November 13, 2013 The New York Times reported that Bitcoin, a peer-to-peer digital currency, and the first virtual currency or cryptocurrency, had a real value of more than seven billion dollars. This was the financial markets' response to acknowledgement by U.S. federal officials in a Senate hearing that virtual financial networks offered real benefits for the financial system, even as they acknowledged that new forms of digital currency had been used for money laundering and other illegal activity.

Bitcoin originated in November 2008 when a paper was posted on the Internet under the pseudonym Satoshi Nakamoto entitled Bitcoin: A Peer-to-Peer Electronic Cash System. This paper detailed methods of using a peer-to-peer network to generate what was described as "a system for electronic transactions without relying on trust". In January 2009 the Bitcoin network became operational with the release of the first open source Bitcoin client and the issuance of the first bitcoins, with Satoshi Nakamoto mining the first block of bitcoins ever, known as the "genesis block".
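
A drastically simplified sketch of the proof-of-work idea described in the Nakamoto paper: vary a nonce until the double-SHA256 hash of the block data meets a difficulty target. Real block headers, targets and serialization are considerably more involved than this.

```python
import hashlib

# Toy proof-of-work: increment a nonce until the double-SHA256 digest of
# the block data starts with `difficulty` zero hex digits. At difficulty 4
# this takes ~65,000 attempts on average.

def mine(block_data: bytes, difficulty: int = 4):
    target = "0" * difficulty
    nonce = 0
    while True:
        payload = block_data + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine(b"toy genesis block")
print(f"nonce {nonce} -> {digest}")
```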

"Investigations into the real identity of Satoshi Nakamoto have been attempted by The New Yorker and Fast Company. Fast Company's investigation brought up circumstantial evidence linking an encryption patent application filed by Neal King, Vladimir Oksman and Charles Bry on 15 August 2008, and the bitcoin.org domain name which was registered 72 hours later. The patent application (#20100042841) contained networking and encryption technologies similar to Bitcoin's, and textual analysis revealed that the phrase "...computationally impractical to reverse" appeared in both the patent application and bitcoin's whitepaper. All three inventors explicitly denied being Satoshi Nakamoto...." (Wikipedia article on History of Bitcoin, accessed 11-18-2013).

View Map + Bookmark Entry

The World's Oldest Oil Paintings Restored After Taliban Dynamite February 19, 2008

"The oldest known oil painting, dating from 650 A.D., has been found in caves in Afghanistan's Bamiyan Valley, according to a team of Japanese, European and U.S. scientists.

"The discovery reverses a common perception that the oil painting, considered a typically Western art, originated in Europe, where the earliest examples date to the early 12th century A.D.

"Famous for its 1,500-year-old massive Buddha statues, which were destroyed by the Taliban in 2001, the Bamiyan Valley features several caves painted with Buddhist images.

"Damaged by the severe natural environment and Taliban dynamite, the cave murals have been restored and studied by the National Research Institute for Cultural Properties in Tokyo, as a UNESCO/Japanese Fund-in-Trust project.

"Since most of the paintings have been lost, looted or deteriorated, we are trying to conserve the intact portions and also try to understand the constituent materials and painting techniques," Yoko Taniguchi, a researcher at the National Research Institute for Cultural Properties in Tokyo, told Discovery News.

" 'It was during such analysis that we discovered oily and resinous components in a group of wall paintings.'

"Painted in the mid-7th century A.D., the murals have varying artistic influences and show scenes with knotty-haired Buddhas in vermilion robes sitting cross-legged amid palm leaves and mythical creatures.

"Most likely, the paintings are the work of artists who traveled on the Silk Road, the ancient trade route between China, across Central Asia's desert to the West" (http://dsc.discovery.com/news/2008/02/19/oldest-oil-painting.html, accessed 07-11-2009).

View Map + Bookmark Entry

Game-Based Learning for Virtual Patients March 2008

In March 2008 Imperial College Medical School, London, announced the development of Phase I - Game-based learning for Virtual Patients in Second Life.

"The four-dimensional framework described by De Freitas and Martin (2006), plus the learning types described by Helmer (2007), as well as the different aspects of emergent narrative described by Murray (1997) have provided the basis for the design of these game-based learning activities for virtual patients under two different categories: context and learner specification, and narrative and modes of representation. Phase I of this project focused on the delivery of a virtual patient in the area of Respiratory Medicine following a game-based learning model in Second Life."

In December 2013 a video of Phase I was available from YouTube at this link.

View Map + Bookmark Entry

Statistical Analysis Correctly Forecasts the Election of Obama March 3, 2008

On March 3, 2008 statistical analyst and "sabermetrician" Nate Silver of Brooklyn, New York, founded fivethirtyeight.com. Roughly eight months before the election, on March 7, 2008, Silver correctly predicted that Barack Obama would be elected President of the United States.
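
Silver's actual models were proprietary and far more elaborate, but the general style of poll aggregation fivethirtyeight.com popularized can be sketched as a weighted average, weighting each poll by sample size and recency; all numbers here are invented.

```python
import math

# Weighted poll aggregation in caricature: larger and more recent polls
# count for more. Not Silver's model; a minimal sketch of the idea.

def weighted_poll_average(polls, halflife_days=30.0):
    """polls: iterable of (candidate_share, sample_size, age_in_days)."""
    num = den = 0.0
    for share, n, age in polls:
        weight = math.sqrt(n) * 0.5 ** (age / halflife_days)
        num += weight * share
        den += weight
    return num / den

polls = [(0.51, 1000, 2), (0.49, 600, 12), (0.53, 1500, 28)]
print(f"aggregate: {weighted_poll_average(polls):.3f}")  # ~0.510
```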

View Map + Bookmark Entry

Cyber Storm II March 10 – March 14, 2008

"The U.S. Department of Homeland Security (DHS) is conducting the largest cyber security exercise ever organized. Cyber Storm II is being held from March 10-14 in Washington, D.C. and brings together participants from federal, state and local governments, the private sector, and the international community.

"Cyber Storm II is the second in a series of congressionally mandated exercises that will examine the nation’s cyber security preparedness and response capabilities. The exercise will simulate a coordinated cyber attack on information technology, communications, chemical, and transportation systems and assets.

" 'Securing cyberspace is vital to maintaining America’s strategic interests, public safety, and economic prosperity,' said Greg Garcia, Homeland Security Assistant Secretary for Cyber Security and Communications. 'Exercises like Cyber Storm II help to ensure that the public and private sectors are prepared for an effective response to attacks against our critical systems and networks.'

"Cyber Storm II will include 18 federal departments and agencies, nine states (Calif., Colo., Del., Ill., Mich., N.C., Pa., Texas and Va.), five countries (United States, Australia, Canada, New Zealand and the United Kingdom), and more than 40 private sector companies. They include ABB, Inc., Air Products, Cisco, Dow Chemical Company Inc., Harris Corporation, Juniper Networks, McAfee, Microsoft, NeuStar, PPG Industries, and Wachovia" (http://www.dhs.gov/xnews/releases/pr_1205180340404.shtm, accessed 08-09-2009).

View Map + Bookmark Entry

About 200 Million People in the U.S. Have Broadband Connections May 2008

By 2008 broadband technologies had spread to more than 90% of all residential Internet connections in the United States.

"When one considers a Nielsen’s study conducted in June 2008, which estimated the number of U.S. Internet users as 220,141,969, one can calculate that there are presently about 199 million people in the United States utilizing broadband technologies to surf the Web" (Wikipedia article on Internet marketing, accessed 05-10-2009).

View Map + Bookmark Entry

Encyclopaedia Britannica Will Include Wiki-Style Collaboration June 2008

Encyclopaedia Britannica, first published in 3 volumes in 1771, announced in its blog in June 2008 that it would include wiki-style collaboration from users in its online edition. At Britannica,

“readers and users will also be invited into an online community where they can work and publish at Britannica’s site under their own names.”

The core encyclopedia itself

"will continue to be edited according to the most rigorous standards and will bear the imprimatur ‘Britannica Checked’ to distinguish it from material on the site for which Britannica editors are not responsible.”

View Map + Bookmark Entry

The Effect of Decay Fungi on Wood Used in the Production of Violins June 28, 2008

On June 28, 2008 Francis W. M. R. Schwarze of the Section of Wood Protection and Biotechnology, Wood Laboratory, Swiss Federal Laboratories for Materials Testing and Research (Empa), St. Gallen, together with Melanie Spycher and Siegfried Fink, published "Superior wood for violins – wood decay fungi as a substitute for cold climate," New Phytologist 179 (2008) 1095-1104.

ABSTRACT 

"• Violins produced by Antonio Stradivari during the late 17th and early 18th centuries are reputed to have superior tonal qualities. Dendrochronological studies show that Stradivari used Norway spruce that had grown mostly during the Maunder Minimum, a period of reduced solar activity when relatively low temperatures caused trees to lay down wood with narrow annual rings, resulting in a high modulus of elasticity and low density. 

"• The main objective was to determine whether wood can be processed using selected decay fungi so that it becomes acoustically similar to the wood of trees that have grown in a cold climate (i.e. reduced density and unchanged modulus of elasticity). 

"• This was investigated by incubating resonance wood specimens of Norway spruce (Picea abies) and sycamore (Acer pseudoplatanus) with fungal species that can reduce wood density, but lack the ability to degrade the compound middle lamellae, at least in the earlier stages of decay. 

"• Microscopic assessment of the incubated specimens and measurement of five physical properties (density, modulus of elasticity, speed of sound, radiation ratio, and the damping factor) using resonance frequency revealed that in the wood of both species there was a reduction in density, accompanied by relatively little change in the speed of sound. Thus, radiation ratio was increased from 'poor' to 'good', on a par with 'superior' resonance wood grown in a cold climate."

View Map + Bookmark Entry

21.9% of the World's People Use the Internet June 30, 2008

According to World Internet Stats, in June 2008 1,463,632,361 people used the Internet out of a total world population of 6,676,120,288, or about 21.9%.

View Map + Bookmark Entry

Over One Trillion Unique URLs Indexed July 2008

In July 2008 Google announced in its blog that it was indexing over one trillion (1,000,000,000,000) unique URLs.

View Map + Bookmark Entry

Opening of the iTunes App Store: the First App Distribution Service July 10, 2008

On July 10, 2008 Apple opened its online iTunes App Store. It was the first app distribution service. At launch it contained 522 Apps for the iPhone, including 135 free programs.


View Map + Bookmark Entry

Toward a World Digital Mathematics Library July 27, 2008

Petr Sojka of the Department of Computer Graphics and Design of the Faculty of Informatics, Masaryk University, Czech Republic, organized the first conference entitled DML 2008: Towards a Digital Mathematics Library. Held at the University of Birmingham on July 27, 2008, it was part of the Conferences on Intelligent Computer Mathematics (CICM) and Mathematics Knowledge Management (MKM).

"Mathematicians dream of a digital archive containing all peer-reviewed mathematical literature ever published, properly linked and validated/verified. It is estimated that the entire corpus of mathematical knowledge published over the centuries does not exceed 100,000,000 pages, an amount easily manageable by current information technologies.

"The workshop's objectives are to formulate the strategy and goals of a global mathematical digital library and to summarize the current successes and failures of ongoing technologies and related projects, asking such questions as:

"* What technologies, standards, algorithms and formats should be used and what metadata should be shared?

"* What business models are suitable for publishers of mathematical literature, authors and funders of their projects and institutions?

"* Is there a model of sustainable, interoperable, and extensible mathematical library that mathematicians can use in their everyday work?

"* What is the best practice for

"o retrodigitized mathematics (from images via OCR to MathML and/or TeX);

"o retro-born-digital mathematics (from existing electronic copy in DVI, PS or PDF to MathML and/or TeX);

"o born-digital mathematics (how to make needed metadata and file formats available as a side effect of publishing workflow [CEDRAM model])?"

View Map + Bookmark Entry

Sirius XM Satellite Radio is Formed July 29, 2008

On July 29, 2008 Sirius Satellite Radio and XM Radio merged to form Sirius XM Radio.

View Map + Bookmark Entry

"Computer Criminal Number One" August 5, 2008 – March 26, 2010

On August 6, 2008 Albert Gonzalez, a/k/a cumbajohny, a/k/a cj, a/k/a UIN 20167996, a/k/a UIN 476747, a/k/a soupnazi, a/k/a segvec, a/k/a klngchilli, a/k/a stanozololz, was indicted in the United States District Court for the District of Massachusetts in Boston for masterminding a crime ring that used malware to steal and sell more than 170,000,000 credit card and ATM numbers from retail stores between 2005 and 2007.

"On August 28, 2009, his [Gonzalez's] attorney filed papers with the United States District Court for the District of Massachusetts in Boston indicating that he would plead guilty to all 19 charges in the U.S. v. Albert Gonzalez, 08-CR-10223, case (the TJ Maxx case). According to reports this plea bargain would "resolve" issues with the New York case of U.S. v. Yastremskiy, 08-CR-00160 in United States District Court for the Eastern District of New York (the Dave and Busters case).

"Gonzalez could serve a term of 15 years to 25 years. He would forfeit more than $1.65 million, a condominium in Miami, a blue 2006 BMW 330i automobile, IBM and Toshiba laptop computers, a Glock 27 firearm, a Nokia cell phone, a Tiffany diamond ring and three Rolex watches. "

"His sentence would run concurrent with whatever comes out of the case in the United States District Court for the District of New Jersey (meaning that he would serve the longest of the sentences he receives)" (Wikipedia article on Albert Gonzalez, accessed 01-18-2010).

On March 26, 2010 U.S. District Court Judge Douglas P. Woodlock sentenced Gonzalez to twenty years in prison, with three twenty-year sentences running concurrently.

"The sentence imposed by U.S. District Court Judge Douglas P. Woodlock was for Gonzalez's role in a hacking ring that broke into computer networks of Heartland Payment Systems, which processed credit and debit card transactions for Visa and American Express, Hannaford Supermarkets and 7-Eleven. The sentence is actually 20 years and one day, owing to the need to deal with peculiarities in sentencing statutes, because Woodlock had to take into account that Gonzalez was on pretrial release for an unrelated crime when he took up with the international network of hackers responsible for the security breaches. He was at the time supposed to be serving as an informant for the U.S. Secret Service, but he double-crossed the agency, supplying a co-conspirator with information obtained as part of those investigations" (http://www.sfgate.com/cgi-bin/article.cgi?f=/g/a/2010/03/26/urnidgns852573C400693880002576EF004839D0.DTL, accessed 03-27-2010).

View Map + Bookmark Entry

181,277,835 Active Websites September 2008

According to a Netcraft survey in September 2008 there were 181,277,835 active websites on the Internet.

View Map + Bookmark Entry

Craigslist Becomes the Leading Classified Advertising Service Worldwide September 2008

By September 2008 Craigslist was the leading classified advertising service worldwide. It provided free local classifieds and forums for more than 550 cities in over 50 countries, generating more than 12 billion page views per month and used by more than 50 million people each month. Craigslist users self-published more than 30 million new classified ads each month and more than 2 million new job listings each month. Each month Craigslist also posted more than 100 million user postings in more than 100 topical forums. All of this it did with only 25 employees.

Because Craigslist did not charge for classified advertising it replaced a large portion of the classified advertising that historically was placed in print newspapers. By doing so it substantially reduced the significant revenue that print newspapers historically generated from classified advertising. This contributed to an overall reduction of profits for many print newspapers. Similarly, Craigslist's policy of charging below-market rates for job listings impacted that traditional source of newspaper revenue, and impacted profits at physical employment agencies, and the more expensive online employment agencies.

View Map + Bookmark Entry

The First Android Phone is Introduced September 23, 2008

On September 23, 2008 T-Mobile, headquartered in Bonn, Germany, announced the first cell phone powered by the Android operating system, developed by Google in association with the Open Handset Alliance.

View Map + Bookmark Entry

Viewing the Illustrations of a Journal Article in Three Dimensions September 30, 2008

On September 30, 2008 the Optical Society and the National Library of Medicine announced Interactive Science Publishing.

" 'ISP' represents a new direction for OSA publications. The ISP articles, which appear in OSA journals, link out to large 2D and 3D datasets—such as a CT scan of the human chest—that can be viewed interactively with special software developed by OSA in cooperation with Kitware, Inc., and the National Library of Medicine."

View Map + Bookmark Entry

The Largest Atlas Ever Published as a Printed Book October 2008 – March 2012

In 2008 Gordon Cheers of Millennium House, North Narrabeen, Australia, published a world atlas called Earth. The World Atlas. Containing 576 pages with 154 maps and 800 photographs, the volume measured 610 x 469 millimeters and weighed over 30 kilos. The publishers described it as the largest atlas ever published as a printed book.

"The book also includes four monster-sized gatefolds which, unfurled, measure six x four feet (1.82 x 1.21 meters) and reveal pinpoint sharp satellite images including shots of the earth and sky at night" (http://www.cnn.com/2008/TECH/science/10/16/earth.atlas/index.html#cnnSTCText, accessed 12-05-2008).

In December 2013 a virtual tour of a few pages of the atlas was available from the Millennium House website at this link.

The book was offered for sale in two versions: "Royal Blue," limited to 2000 copies and available in bookstores, and "Imperial Gold," limited to 1000 copies and for sale only by Millennium House. In October 2009 Amazon.com offered a copy of an unspecified version for about $7200 plus $3.99 shipping and handling. There was also a regular trade edition, called Earth Condensed, available in a 325 x 250 mm format.

When I revisited the Millennium House website in March 2012 I noticed that the publishers had surpassed their previous size record by producing the Platinum edition of their atlas in an enormous 6 foot x 4.5 foot format (1.8m x 1.4m), in an edition limited to 31 copies at the price of $100,000 USD per copy. As they stated:

"Once in a lifetime, the opportunity comes along to acquire something truly exquisite and unique—a piece of history, a rare collectible, a masterpiece... EARTH Platinum Edition is such an acquisition. With only 31 individually numbered copies of this immense, limited edition atlas available, this beautifully presented book will be sought after by fine institutions and discerning collectors. Superb cartography is displayed on the massive pages when opened: each spread measures a breathtaking 6 feet x 9 feet (1.8m x 2.7m), presenting an unsurpassed view of the world...." 

Though I was unsure whether the original 2008 version of Earth. The World Atlas was, as the publishers claimed, "the largest atlas ever published as a printed book," we can safely say that the enormous Platinum edition knocks out any previous competition in the size category.

View Map + Bookmark Entry

Creation of the HathiTrust Digital Library October 2008 – March 2012

In October 2008 thirteen universities in the Committee on Institutional Cooperation and the University of California founded the HathiTrust, a very large scale collaborative repository of digital content from research libraries, including content digitized via the Google Books project and Internet Archive digitization initiatives, as well as content digitized locally by member libraries. The name came from the Hindi word for elephant, as in "an elephant never forgets."

♦ As of January 2011 over 50 academic research libraries were members of the HathiTrust. Its website published the following statistics:

7,909,950 total volumes; 4,057,969 book titles; 189,013 serial titles; 2,768,482,500 pages; 355 terabytes; 94 miles; 6,427 tons; 1,972,865 volumes (~25% of total) in the public domain.

♦ In March 2012 the HathiTrust website published the following statistics:

10,109,695 total volumes; 5,371,919 book titles; 266,508 serial titles; 3,538,393,250 pages; 453 terabytes; 120 miles; 8,214 tons; 2,802,045 volumes (~28% of total) in the public domain.

View Map + Bookmark Entry

The Obama-Biden Campaign Launches Facebook Connect Integration on My.BarackObama.com October 20, 2008

"This morning [October 20, 2008], the Obama-Biden campaign announced that has launched Facebook Connect integration at My.BarackObama.com, the grassroots organizing social network set up by the Obama campaign many months ago. The integration will allow users to find their Facebook friends who are also on the site, and will automatically publish users’ activity on the site (like signing up for a campaign event or to make phone calls) on their Facebook wall feed. In some ways it comes as no surprise that the Obama campaign would launch Facebook Connect support early on, as Facebook co-founder Chris Hughes now runs many of Obama’s social media efforts. It will be interesting to see how much of an impact the integration will have in the final 2 weeks of the campaign season, and potentially beyond" (InsideFacebook.com).

View Map + Bookmark Entry

Old Wine in New Bottles? October 24, 2008

The conversion of the old format of the From Gutenberg to the Internet Timeline, begun in 2005, to this interactive database format was completed on October 24, 2008. Reflecting its coverage of the history of information since the beginning of records, I renamed it From Cave-Paintings to the Internet.

By the end of the conversion there were 1535 timeline entries, nearly all of which had one or more hyperlinks to reference sources. There were also more than sixty themes by which the timeline could be searched. Timeline items were indexed by up to six themes.

In the process of converting from the old list format to the new interactive database I checked all hyperlinks, corrected mistakes, added new hyperlinks, and added numerous new entries.

The database remained a work in progress.

View Map + Bookmark Entry

An Encyclopedia with More than Ten Million Articles October 27, 2008

In 2008 Wikipedia attracted at least 684 million visitors annually.

"There are more than 75,000 active contributors working on more than 10,000,000 articles in more than 250 languages. As of today, there are 2,603,373 articles in English; every day hundreds of thousands of visitors from around the world make tens of thousands of edits and create thousands of new articles to enhance the knowledge held by the Wikipedia encyclopedia."

View Map + Bookmark Entry

The First National Newspaper to Shift From a Daily Print Format to an Online Publication October 28, 2008

On October 28, 2008, after 100 years of publishing in print, The Christian Science Monitor announced in Boston that in April 2009 it would become "the first newspaper with a national audience to shift from a daily print format to an online publication that is updated continuously each day.

"The changes at the Monitor will include enhancing the content on CSMonitor.com, starting weekly print and daily e-mail editions, and discontinuing the current daily print format."

View Map + Bookmark Entry

Authors, Publishers and Google Reach "Landmark Settlement" October 28, 2008

On October 28, 2008 The Authors Guild, the Association of American Publishers (AAP), and Google announced a groundbreaking settlement agreement

"on behalf of a broad class of authors and publishers worldwide that would expand online access to millions of in-copyright books and other written materials in the U.S. from the collections of a number of major U.S. libraries participating in Google Book Search. The agreement, reached after two years of negotiations, would resolve a class-action lawsuit brought by book authors and the Authors Guild, as well as a separate lawsuit filed by five large publishers as representatives of the AAP’s membership. The class action is subject to approval by the U.S. District Court for the Southern District of New York.

"If approved by the court, the agreement would provide:

  • More Access to Out-of-Print Books – Generating greater exposure for millions of in-copyright works, including hard-to-find out-of-print books, by enabling readers in the U.S. to search these works and preview them online;
  • Additional Ways to Purchase Copyrighted Books – Building off publishers’ and authors’ current efforts and further expanding the electronic market for copyrighted books in the U.S., by offering users the ability to purchase online access to many in-copyright books;
  • Institutional Subscriptions to Millions of Books Online – Offering a means for U.S. colleges, universities and other organizations to obtain subscriptions for online access to collections from some of the world’s most renowned libraries;
  • Free Access From U.S. Libraries – Providing free, full-text, online viewing of millions of out-of-print books at designated computers in U.S. public and university libraries; and
  • Compensation to Authors and Publishers and Control Over Access to Their Works – Distributing payments earned from online access provided by Google and, prospectively, from similar programs that may be established by other providers, through a newly created independent, not-for-profit Book Rights Registry that will also locate rightsholders, collect and maintain accurate rightsholder information, and provide a way for rightsholders to request inclusion in or exclusion from the project."
View Map + Bookmark Entry

Raphael's Madonna of the Goldfinch Restored 460 Years after it was Nearly Destroyed October 30, 2008

The Italian Renaissance painter Raphael's masterpiece Madonna of the Goldfinch, which survived the collapse of a palace and more than four centuries of decay, emerged from a ten-year restoration process, and on October 30, 2008 was returned to the Uffizi gallery with a strengthened panel and its colors restored to their original radiance.

"Raphael painted this work around 1505 for the wedding of his friend Lorenzo Nasi, a rich merchant in Florence. When Nasi’s palace collapsed in 1548, the painting was shredded into 17 pieces. The work was first put together with pieces of wood and long nails. The work later developed a yellowish opaque color. Restorers feared touching it because it was very fragile."

"The painting features a seated Mary with John the Baptist passing on a goldfinch to Jesus as a forewarning of his violent death. The bird has been associated in art with Christ's crucifixion.

"The restoration work began in 1999 using X-rays, microscopes, and lasers to find and seal the ancient fractures."

View Map + Bookmark Entry

An Election Reported Interactively in Real Time November 4, 2008

Apart from the historic election of Barack Obama, the first African American President of the United States, one element of this election and the campaign that preceded it stands out from the standpoint of the history of information and media: the blending of its coverage by broadcast media with the rapidly evolving interactive media on the Internet. Television networks repeatedly referred viewers to their websites for interactive news stories and additional information. While we watched the election on television or listened to radio we received information in emails, from websites, and from blogging and microblogging sites like Twitter.

Within minutes after the election was decided I received an email from the Obama campaign signed by Barack Obama. Online newspapers updated election results in real time. Perhaps most remarkably, even the Wikipedia article on the United States presidential election 2008 was updated in real time on the web as election results were available. This I learned from reading a blog in The New York Times online—an online newspaper blogging about an article in an online encyclopedia. From the standpoint of the history of media this represents a blurring or blending of the historic distinctions that evolved over centuries between news media writing about the moment, and traditionally more static works of reference such as encyclopedias.

An email from info@barackobama.com received 11-04-08 8:18 PM PST, 18 minutes after polls closed on the West coast and news media computers declared an Obama victory. Presumably this email was sent to the millions of people who donated to Obama's campaign:

"Jeremy --


I'm about to head to Grant Park to talk to everyone gathered there, but I wanted to write to you first.
We just made history.
And I don't want you to forget how we did it.
You made history every single day during this campaign -- every day you knocked on doors, made a donation, or talked to your family, friends, and neighbors about why you believe it's time for change.
I want to thank all of you who gave your time, talent, and passion to this campaign.
We have a lot of work to do to get our country back on track, and I'll be in touch soon about what comes next.
But I want to be very clear about one thing...
All of this happened because of you.
Thank you,

Barack"

View Map + Bookmark Entry

CNN Introduces "Hologram-Like" Images in Coverage of the 2008 Presidential Election November 4, 2008

On November 4, 2008, during coverage of the presidential election, CNN executive producer David Bohrman arranged for hologram-like live images of Jessica Yellin in Chicago to be "beamed" onto Wolf Blitzer's broadcast stage in New York so that she appeared to have a face-to-face conversation with Blitzer.

Yellin appeared somewhat fuzzy, and her image, apparently projected a few feet in front of Blitzer, appeared to glow around the edges. Yellin explained that her image was being filmed in Chicago by 35 high-definition cameras set in a ring inside a special tent, whose output was processed by 20 computers and synchronized with the cameras in the New York studio. The network, which made use of three-dimensional imaging technology produced by Norway-based Vizrt and SportVu, billed the interview as a first for television. Later that day CNN aired a second "hologram" interview between anchor Anderson Cooper and rapper Will.I.Am, who was also in Chicago.

View Map + Bookmark Entry

Change.gov is Founded November 5, 2008

On November 5, 2008, the day after the presidential election, President-Elect Barack Obama launched the website Change.gov to communicate details of the transition to the presidency.

View Map + Bookmark Entry

Year After Year People Will Share Twice as Much Information November 5, 2008

In 2008, at the Web 2.0 Summit in San Francisco, Mark Zuckerberg, founder and chief executive of Facebook, made the following statement in a conversation with John Battelle:

"Four years ago, when Facebook was getting started, most people didn’t want to put up any information about themselves on the Internet. Right? So, we got people through this really big hurdle of wanting to put up their full name, or real picture, mobile phone number …

"I would expect that, you know, next year, people will share twice as much information as they are this year. And then, the year after that, they’ll share twice as much information as they are next year …

"… as long as the stream of information is just constantly increasing, and we’re doing our job, and, and our, and our role, and kind of like pushing that forward, then I think that, you know, that’s, that’s just been the best strategy for us" (http://chronicle.com/blogs/wiredcampus/the-zuckerberg-files-new-scholarly-archive-scrutinizes-facebook-ceo/47777?cid=wc&utm_source=wc&utm_medium=en, accessed 10-29-2013).

View Map + Bookmark Entry

Discovery of a Set of Mutations that Might Have Caused a Cancer November 6, 2008

On November 6, 2008 Timothy J. Ley and numerous collaborators from different countries published in the journal Nature "DNA sequencing of a cytogenetically normal acute myeloid leukaemia genome". This was the first time that researchers decoded all the genes of a person with cancer and found a set of mutations that might have caused the disease or aided its progression. The New York Times online reported:

"Using cells donated by a woman in her 50s who died of leukemia, the scientists sequenced all the DNA from her cancer cells and compared it to the DNA from her own normal, healthy skin cells. Then they zeroed in on 10 mutations that occurred only in the cancer cells, apparently spurring abnormal growth, preventing the cells from suppressing that growth and enabling them to fight off chemotherapy.

"The findings will not help patients immediately, but researchers say they could lead to new therapies and would almost certainly help doctors make better choices among existing treatments, based on a more detailed genetic picture of each patient’s cancer. Though the research involved leukemia, the same techniques can also be used to study other cancers."

View Map + Bookmark Entry

Analysis of Web Search Queries Tracks the Spread of Flu Faster than Traditional Surveillance Methods November 11, 2008

On November 11, 2008 Google.org unveiled Google Flu Trends, using aggregated Google search data to estimate flu activity up to two weeks faster than traditional flu surveillance systems.

"Each week, millions of users around the world search for online health information. As you might expect, there are more flu-related searches during flu season, more allergy-related searches during allergy season, and more sunburn-related searches during the summer. You can explore all of these phenomena using Google Trends. But can search query trends provide an accurate, reliable model of real-world phenomena?

"We have found a close relationship between how many people search for flu-related topics and how many people actually have flu symptoms. Of course, not every person who searches for "flu" is actually sick, but a pattern emerges when all the flu-related search queries from each state and region are added together. We compared our query counts with data from a surveillance system managed by the U.S. Centers for Disease Control and Prevention (CDC) and discovered that some search queries tend to be popular exactly when flu season is happening. By counting how often we see these search queries, we can estimate how much flu is circulating in various regions of the United States.

"During the 2007-2008 flu season, an early version of Google Flu Trends was used to share results each week with the Epidemiology and Prevention Branch of the Influenza Division at CDC. Across each of the nine surveillance regions of the United States, we were able to accurately estimate current flu levels one to two weeks faster than published CDC reports" (Google Flu Trends website).

View Map + Bookmark Entry

First Images of Extra-Solar Planets Taken from the Visible Spectrum: Planets Located 130 Light-Years from Earth November 13, 2008

On November 13, 2008 NASA and the Lawrence Livermore National Laboratory announced the first-ever visible-light images of extrasolar planets. The images were glimpsed by the Gemini North and Keck telescopes on the Mauna Kea mountaintop in Hawaii.

"British and American researchers snapped the first ever visible-light pictures of three extrasolar planets orbiting the star HR8799.  HR8799 is about 1.5 times the size of the sun, located 130 light-years away in the Pegasus constellation.  Observers can probably see this star through binoculars, scientists said.

"To identify the planets, researchers compared images of the system, known to contain planets HF8799b, HF8799c, and HF8799d.  In each image faint objects were detected, and by comparing images from over the years, it was confirmed that these were the planets in their expected positions and that they orbit their star in a counterclockwise direction.

"NASA's Hubble Space Telescope at about the same time picked up images of a fourth planet, somewhat unexpectedly.  The new planet, Fomalhaut b orbits the bright southern star Fomalhaut, part of the constellation Piscis Australis (Southern Fish) and is relatively massive -- about three times the size of Jupiter.  The planet orbits 10.7 billion miles from its home star and is approximately 25 light-years from Earth."  (quoations from Daily Tech November 16, 2008).

View Map + Bookmark Entry

"Tango with Cows" : A Virtual Exhibition November 18, 2008

Cover of Tango with Cows, Vasily Kamensky.

On November 18, 2008 The Getty Museum opened an exhibition entitled Tango with Cows: Book Art of the Russian Avant-Garde 1910-1917. On the website of the show you could turn the pages of virtual copies of rare art books exhibited, view English translations, and hear readings of the text in Russian. (I last accessed the website in December 2013).

View Map + Bookmark Entry

PC Magazine Becomes an Online-Only Publication November 19, 2008

On November 19, 2008 PC Magazine announced that the January 2009 issue (Volume 28, Issue 1) would be the last printed edition of this "venerable publication," after which it moved to an online-only format.

"While most magazines make most of their money from print advertising, PC Magazine derives most of its profit from its Web site. More than 80 percent of the profit and about 70 percent of the revenue come from the digital business, Mr. Young said, and all of the writers and editors have been counted as part of the digital budget for two years" (NY Times online 11-19-08).

View Map + Bookmark Entry

Europeana, the European Digital Library, Museum and Archive is Launched November 20, 2008

Europeana, the European digital library, museum and archive, was launched on November 20, 2008, giving users direct access to some two million digital objects, including film material, photos, paintings, sounds, maps, manuscripts, books, newspapers and archival papers.

"The digital content will be selected from that which is already digitised and available in Europe's museums, libraries, archives, and audio-visual collections. The prototype aims to have representative content from all four of these cultural heritage domains, and also to have a broad range of content from across Europe."

"We launched the European.eu site on 20 November and huge use - 10 million hits an hour - meant it crashed. We are doing our best to reopen Europeana.eu in a more robust version" (Europeana website accessed 11-21-2008).

Note: the site re-opened on or before January 1, 2009 after quadrupling server capacity.

View Map + Bookmark Entry

"From Book Fluency to Screen Fluency, from Literacy to Visuality" November 21, 2008

On November 21, 2008 Kevin Kelly of Pacifica, California, published an article in The New York Times Magazine entitled "Becoming Screen Literate," arguing that literacy was undergoing a paradigm shift from "book fluency to screen fluency." Summarizing historical developments in the book that led to what he called "book fluency," Kelly argued that digital tools making authorship of films accessible to everyone were changing the nature of both the production of films and scholarship about films, much as they had changed the nature of authorship and scholarship of books. Sections are quoted below:

"When technology shifts, it bends the culture. Once, long ago, culture revolved around the spoken word. The oral skills of memorization, recitation and rhetoric instilled in societies a reverence for the past, the ambiguous, the ornate and the subjective. Then, about 500 years ago, orality was overthrown by technology. Gutenberg’s invention of metallic movable type elevated writing into a central position in the culture. By the means of cheap and perfect copies, text became the engine of change and the foundation of stability. From printing came journalism, science and the mathematics of libraries and law. The distribution-and-display device that we call printing instilled in society a reverence for precision (of black ink on white paper), an appreciation for linear logic (in a sentence), a passion for objectivity (of printed fact) and an allegiance to authority (via authors), whose truth was as fixed and final as a book. In the West, we became people of the book.

"Now invention is again overthrowing the dominant media. A new distribution-and-display technology is nudging the book aside and catapulting images, and especially moving images, to the center of the culture. We are becoming people of the screen. The fluid and fleeting symbols on a screen pull us away from the classical notions of monumental authors and authority. On the screen, the subjective again trumps the objective. The past is a rush of data streams cut and rearranged into a new mashup, while truth is something you assemble yourself on your own screen as you jump from link to link. We are now in the middle of a second Gutenberg shift — from book fluency to screen fluency, from literacy to visuality.

"The overthrow of the book would have happened long ago but for the great user asymmetry inherent in all media. It is easier to read a book than to write one; easier to listen to a song than to compose one; easier to attend a play than to produce one. But movies in particular suffer from this user asymmetry. The intensely collaborative work needed to coddle chemically treated film and paste together its strips into movies meant that it was vastly easier to watch a movie than to make one. A Hollywood blockbuster can take a million person-hours to produce and only two hours to consume. But now, cheap and universal tools of creation (megapixel phone cameras, Photoshop, iMovie) are quickly reducing the effort needed to create moving images. To the utter bafflement of the experts who confidently claimed that viewers would never rise from their reclining passivity, tens of millions of people have in recent years spent uncountable hours making movies of their own design. Having a ready and reachable audience of potential millions helps, as does the choice of multiple modes in which to create. Because of new consumer gadgets, community training, peer encouragement and fiendishly clever software, the ease of making video now approaches the ease of writing.

"But merely producing movies with ease is not enough for screen fluency, just as producing books with ease on Gutenberg’s press did not fully unleash text. Literacy also required a long list of innovations and techniques that permit ordinary readers and writers to manipulate text in ways that make it useful. For instance, quotation symbols make it simple to indicate where one has borrowed text from another writer. Once you have a large document, you need a table of contents to find your way through it. That requires page numbers. Somebody invented them (in the 13th century). Longer texts require an alphabetic index, devised by the Greeks and later developed for libraries of books. Footnotes, invented in about the 12th century, allow tangential information to be displayed outside the linear argument of the main text. And bibliographic citations (invented in the mid-1500s) enable scholars and skeptics to systematically consult sources. These days, of course, we have hyperlinks, which connect one piece of text to another, and tags, which categorize a selected word or phrase for later sorting.

"All these inventions (and more) permit any literate person to cut and paste ideas, annotate them with her own thoughts, link them to related ideas, search through vast libraries of work, browse subjects quickly, resequence texts, refind material, quote experts and sample bits of beloved artists. These tools, more than just reading, are the foundations of literacy."

View Map + Bookmark Entry

Downloads Trump CDs November 25, 2008

On November 25, 2008 Atlantic Records, a unit of Warner Music Group, New York, reported that more than half of its revenue came from downloads and ringtones sold over the Internet rather than from CDs. Atlantic was the first major record label to report this shift.

View Map + Bookmark Entry

Over 5,000,000 Articles Posted on the HighWire Press e-Publishing Platform December 2, 2008

On December 2, 2008 Stanford University Libraries' HighWire Press announced over the DIGLIB newsgroup that it 

"reached a significant milestone this week with the posting of the five millionth article on its e-Publishing platform.  HighWire, a division of the Stanford University Libraries, provides technology and customized online services to 140 publishing partners ranging from independent non-profit societies and associations, to university presses and large commercial publishers.

"The milestone occurred while loading a substantial amount of journal backfiles on behalf of the American Medical Association. Bringing the HighWire total article count over the 5 million mark was an article dating from 1884, “Dermatitis Herpetiformis” by Louis A. Duhring, MD1, published in JAMA: The Journal of the American Medical Association. The JAMA & Archives Journals Backfiles Collection will ensure that 125 years of high quality medical research will be available online at the journals’ Web sites on the HighWire platform."

At this time HighWire Press

"hosts the largest repository of high impact, peer-reviewed content, with 1186 journals and 5,006,835 full text articles from over 140 scholarly publishers. HighWire-hosted publishers have collectively made 2,015,269 articles free. With our partner publishers we produce 71 of the 200 most-frequently-cited journals."

View Map + Bookmark Entry

Probably the Most Expensive Single Volume Printed Edition Ever Published December 2, 2008

On December 2, 2008, the day after the U.S. government officially declared the U.S. in recession, visitors to the New York Public Library viewed the book, Michelangelo: La Dotta Mano (The Learned Hand) published by FMR (Franco Maria Ricci), Milan, Italy, and donated to the library by the FMR Foundation.

Limited to 99 copies on hand-made paper, with a cover incorporating a marble relief, and offered at a list price of 100,000 euros per copy, this may be the most expensive, and also possibly the most over-priced, single-volume printed edition ever issued. According to The New York Times online, 33 copies had been produced by this date, of which 20 were sold.

In January 2009 you could take a virtual tour of the book at FMR online. This site appeared to be down in March 2012. 

View Map + Bookmark Entry

"Securing Cyberspace for the 44th Presidency" December 8, 2008

On December 8, 2008 the Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency issued the report by James Andrew Lewis entitled Securing Cyberspace for the 44th Presidency.

"The Commission’s three major findings are: cybersecurity is now one of the major national security problems facing the United States; decisions and actions must respect American values related to privacy and civil liberties; and only a comprehensive national security strategy that embraces both the domestic and international aspects of cybersecurity will improve the situation."

According to the New York Times online:

"A government and technology industry panel on cyber-security is recommending that the federal government end its reliance on passwords and enforce what the industry describes as “strong authentication.”

"Such an approach would probably mean that all government computer users would have to hold a device to gain access to a network computer or online service. The commission is also encouraging all nongovernmental commercial services use such a device.

“'We need to move away from passwords,' said Tom Kellermann, vice president for security awareness at Core Security Technologies and a member of the commission that created the report." (http://www.nytimes.com/2008/12/09/technology/09security.html?_r=1, accessed 12-09-2008).

View Map + Bookmark Entry

Pulitzer Prizes Will be Awarded for Online Journalism December 8, 2008

On December 8, 2008 pulitzer.org announced that "The Pulitzer Prizes in journalism, which honor the work of American newspapers appearing in print, have been expanded to include many text-based newspapers and news organizations that publish only on the Internet." 

The Board also decided to allow entries made up entirely of online content to be submitted in all 14 Pulitzer journalism categories. 

View Map + Bookmark Entry

The First Reported Case of ZZZ-Mailing December 15, 2008

"A WOMAN in a deep sleep sent emails to friends asking them over for wine and caviar in what doctors believe is the first reported case of 'zzz-mailing' - using the internet while asleep.

"The case of the 44-year-old woman is reported by researchers from the University of Toledo in the latest edition of the medical journal Sleep Medicine.

"They said the woman went to bed about 10pm but got up two hours later and walked to her computer in the next room, Britain's Daily Mail newspaper reports.

"She turned it on, connected to the internet, and logged on before composing and sending three emails.

"Each was in a random mix of upper and lower cases, not well formatted and written in strange language, the researchers said.

"One read: "Come tomorrow and sort this hell hole out. Dinner and drinks, 4pm,. Bring wine and caviar only."

"Another said simply, "What the…".

"The new variation of sleepwalking has been described as "zzz-mailing".

"We believe writing an email after turning the computer on, connecting to the internet and remembering the password displayed by our patient is novel," the researchers said.

"To our knowledge this type of complex behaviour requiring co-ordinated movements has not been reported before in sleepwalking" (http://www.news.com.au/technology/story/0,28348,24802639-5014239,00.html, accessed 12-30-2008)

View Map + Bookmark Entry

"The Future of Learning Institutions in a Digital Age" 2009

In 2009 American educators Cathy N. Davidson of Duke University and David Theo Goldberg of the University of California, Irvine, with the support of the John D. and Catherine T. MacArthur Foundation's grant-making initiative on Digital Media and Learning, published The Future of Learning Institutions in a Digital Age.

View Map + Bookmark Entry

"Readability" is Launched 2009

In 2009 Readability was launched by Arc90 in New York. This service automatically stripped websites of advertising and other distractions, providing a customized reading view and a method of saving articles for future reading.
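
Arc90 released its readability heuristic as open-source JavaScript; the core idea is to score blocks of a page by how much plain text they hold relative to link-heavy navigation and ads. The following is a toy, stdlib-only Python sketch of that idea, not Arc90's actual code; the function name, thresholds, and regex approach are assumptions for illustration.

    import re

    TAG = re.compile(r'(?s)<[^>]+>')

    def extract_readable(html: str) -> str:
        # Drop scripts, styles and HTML comments outright.
        html = re.sub(r'(?is)<(script|style)[^>]*>.*?</\1>', ' ', html)
        html = re.sub(r'(?s)<!--.*?-->', ' ', html)
        kept = []
        for para in re.findall(r'(?is)<p[^>]*>(.*?)</p>', html):
            text = TAG.sub('', para).strip()
            if not text:
                continue
            # Link-dense fragments tend to be navigation or ads;
            # article prose is long and mostly plain text.
            link_text = sum(len(TAG.sub('', a)) for a in
                            re.findall(r'(?is)<a[^>]*>.*?</a>', para))
            if len(text) > 80 and link_text / len(text) < 0.3:
                kept.append(text)
        return '\n\n'.join(kept)

Saving articles for later reading then reduces to storing the extracted text keyed by URL.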

View Map + Bookmark Entry

U.S. Households Received and Sent 176 Billion Pieces of Physical Mail in 2009 2009

According to the United States Postal Service’s Household Diary Study (HDS) for Fiscal Year (FY) 2009, U.S. households sent and received 176 billion pieces of physical mail in 2009, not including international mail:

"Table E.1: Mail Received and Sent by Households (Billions of Pieces)

Classification                                 Received     Sent
First-Class Mail                                   53.1     18.3
Standard Regular Mail                              58.2       —
Standard Nonprofit Mail                            12.5       —
Periodicals                                         6.0       —
Package & Shipping Services                         1.8      0.5
Total                                             131.6     18.8
Household to Household                                       5.4
Total Mail Received and Sent by Households        145.0
FY 2009 RPW Total*                                176.3
Non-household to Non-household Residual            31.3
Unaddressed                                         1.6

"Source: HDS Diary Sample, FY 2009. *Does not include international mail."

The figures reconcile: 131.6 billion received plus 18.8 billion sent counts the 5.4 billion household-to-household pieces twice, giving 145.0 billion pieces that touch a household; adding the 31.3 billion pieces that neither originate nor terminate at a household yields the FY 2009 RPW total of 176.3 billion.

View Map + Bookmark Entry

The First iPhone and iPad Apps for the Visually Impaired 2009 – 2010

Because of the convenience of carrying smart phones it was probably inevitable that their features would be applied to support the visually impaired. iBlink Radio, introduced in July 2010 by Serotek Corporation of Minneapolis, Minnesota, calls itself the first iOS application for the visually impaired. It provides access to radio stations, podcasts and reading services of special interest to blind and visually impaired persons, as well as their friends, family, caregivers and those wanting to know what life is like without eyesight.

SayText, also introduced in 2010 by Haave, Inc. of Vantaa, Finland, reads out loud text that is photographed by a cell phone camera.

VisionHunt, by VI Scientific of Nicosia, Cyprus, introduced in 2009, is a vision aid tool for the blind and the visually impaired that uses the phone’s camera to detect colors, paper money and light sources. VisionHunt identifies about 30 colors. It also detects 1, 5, 10, 20, 50 US Dollar bills. Finally, VisionHunt detects sources of light, such as switched-on lamps or televisions. VisionHunt is fully accessible to the blind and the visually impaired through VoiceOver or Zoom.

Numerous other apps for the visually impaired were introduced after the above three.

View Map + Bookmark Entry

Social TV Begins: Rebuilding TV Audiences Through Social Networks Circa 2009 – 2013

Though viewership for live television broadcasts was declining, around 2010 it was observed that events such as the Winter Olympics and the Grammys were drawing more viewers and more buzz because of the phenomenon of social television. The rebound happened at least partly because of new viewing habits: while people watched TV they used smart phones or laptops to swap texts, tweets, and status updates about celebrities, characters, and even commercials.

"The MIT Media Lab has held a graduate class on Social TV since 2009. In 2012, faculty at the Wharton School of Business launched a Social TV Lab to study the link between what is said on television and what is shared simultaneously with the public on social media about shows and advertisements. Other research organizations active in Social TV include British Telecom, Motorola Research and Microsoft Research" (Wikipedia article on Social Television, accessed 10-07-2013).

"Marie-José Montpetit, an invited scientist at MIT's Research Lab for Electronics, has been working for several years on social TV--a way to seamlessly combine the social networks that are boosting TV ratings with the more passive experience of traditional TV viewing. Her goal is to make watching television something that viewers in different places can share and discuss--and to make it easier to find something to watch.

"Carriers, networks, and content producers hope that making it easier for viewers to link up with friends will help them hold on to their audiences rather than losing them to services like Hulu, which stream shows over the Internet. And opening TV to social networking could make it easier for companies to provide personalized programming" (http://www2.technologyreview.com/article/418541/tr10-social-tv/, accessed 10-07-2013).

On October 6, 2013 The New York Times published an article announcing that Nielsen had begun measuring Twitter posts about television, providing a more complete view of social TV for advertisers.

View Map + Bookmark Entry

Apple Eliminates Anticopying Restrictions from iTunes January 6, 2009

Having sold over a billion songs through the iTunes store in 2008, Apple announced on January 6, 2009 that it had reached agreements with record companies to remove anticopying restrictions on all tunes in the iTunes store. It also allowed record companies to set a range of prices for the songs.

View Map + Bookmark Entry

In 2008 China Becomes the Top User of the Internet January 14, 2009

"BEIJING, China (CNN) January 14, 2009 -- China surpassed the United States in 2008 as the world's top user of the Internet, according to a government-backed research group.

"The number of Web surfers in the country grew by nearly 42 percent to 298 million, according to the China Internet Network Information Center's January report. And there's plenty of room for growth, as only about 1 in every 4 Chinese has Internet access.  

"The rapid growth in China's Internet use can be tied to its swift economic gains and the government's push for the construction of telephone and broadband lines in the country's vast rural areas, the report says.  

"The Chinese government wants phone and broadband access in each village by 2010.

"Nearly 91 percent of China's Internet users are surfing the Web with a broadband connection -- an increase of 100 million from 2007. Mobile phone Internet users totaled 118 million by the end of 2008" (http://www.cnn.com/2009/TECH/01/14/china.internet/index.html, accessed 01-13-2010).

View Map + Bookmark Entry

The BBC Intends to Place 200,000 Oil Paintings on the Internet January 28, 2009

"The BBC is to put every one of the 200,000 oil paintings in public ownership in the UK on the internet as well as opening up the Arts Council's vast film archive online as part of a range of initiatives that it has pledged will give it a 'deeper commitment to arts and music'."

"A partnership with the Public Catalogue Foundation charity will see all the UK's publicly owned oil paintings – 80% of which are not on public display – placed on the internet by 2012. 'The BBC said it wanted to establish a new section of its bbc.co.uk website, called Your Paintings, where users could view and find information on the UK's national collection.

"The Public Catalogue Foundation, launched in 2003, is 30% of the way through cataloguing the UK's collection of oil paintings.

"In addition the BBC said it was talking to the Arts Council about giving the public free online access to its archive for the first time, including its wide-ranging film collection dating back to the 1950s" (quotations from http://www.guardian.co.uk/media/2009/jan/28/bbc-digitalmedia)

View Map + Bookmark Entry

BitTorrent was Responsible for 27-55% of All Internet Traffic February 2009

Ipoque, based in Leipzig, Germany, estimated that in February 2009 the BitTorrent protocol (whose namesake company, BitTorrent, Inc., is based in San Francisco, California) accounted for 45-78% of all P2P traffic and 27-55% of all Internet traffic, depending on geographical location.

View Map + Bookmark Entry

Google Earth Incorporates Historical Imagery February 2, 2009

On February 2, 2009 Google launched Google Earth 5.0. Among the most significant features were Historical Imagery, Touring, and 3D Mars.

" ♦ Historical Imagery: Until today, Google Earth displayed only one image of a given place at a given time. With this new feature, you can now move back and forth in time to reveal imagery from years and even decades past, revealing changes over time. Try flying south of San Francisco in Google Earth and turning on the new time slider (click the "clock" icon in the toolbar) to witness the transformation of Silicon Valley from a farming community to the tech capital of the world over the past 50 years or so.  

" ♦ Touring: One of the key challenges we have faced in developing Google Earth has been making it easier for people to tell stories. People have created wonderful layers to share with the world, but they have often asked for a way to guide others through them. The Touring feature makes it simple to create an easily sharable, narrated, fly-through tour just by clicking the record button and navigating through your tour destinations.

" ♦ 3D Mars: This is the latest stop in our virtual tour of the galaxies, made possible by a collaboration with NASA. By selecting "Mars" from the toolbar in Google Earth, you can access a 3D map of the Red Planet featuring the latest high-resolution imagery, 3D terrain, and annotations showing landing sites and lots of other interesting features" (Official Google Blog, http://googleblog.blogspot.com/2009/02/dive-into-new-google-earth.html, accessed 11-29-2010).

View Map + Bookmark Entry

"Google and the Future of Books" February 12, 2009

On February 12, 2009 cultural historian, book historian, and librarian Robert Darnton of Harvard University published an insightful and critical article, "Google and the Future of Books," in the New York Review of Books. From it I quote:

"How can we navigate through the information landscape that is only beginning to come into view? The question is more urgent than ever following the recent settlement between Google and the authors and publishers who were suing it for alleged breach of copyright. For the last four years, Google has been digitizing millions of books, including many covered by copyright, from the collections of major research libraries, and making the texts searchable online. The authors and publishers objected that digitizing constituted a violation of their copyrights. After lengthy negotiations, the plaintiffs and Google agreed on a settlement, which will have a profound effect on the way books reach readers for the foreseeable future. What will that future be?

"No one knows, because the settlement is so complex that it is difficult to perceive the legal and economic contours in the new lay of the land. But those of us who are responsible for research libraries have a clear view of a common goal: we want to open up our collections and make them available to readers everywhere. How to get there? The only workable tactic may be vigilance: see as far ahead as you can; and while you keep your eye on the road, remember to look in the rearview mirror." (quotations from the beginning of Darnton's longish article, accessed 01-28-2009).

View Map + Bookmark Entry

Discovery of a Previously Unknown Self- Portrait of Leonardo February 28, 2009

On February 28, 2009 Italian researchers reported the discovery of a previously unknown self-portrait by Leonardo da Vinci drawn when the artist was a young man. The faint pencil sketch was recognized underneath writing on a sheet of the “Codex on the Flight of Birds”, written between 1490 and 1505 and preserved in the Biblioteca Reale, Torino, Italy.

Piero Angela, an Italian scientific journalist, studying the document noticed the faint outline of a human nose hidden underneath lines of ink handwriting. It struck him as being similar in shape and drawing style to a later self-portrait of Leonardo. It is thought that Leonardo first made the drawing during the 1480s and reused the sheet for his manuscript on bird flight.

"Over months of micro-pixel work, graphic designers gradually 'removed' the text by making it white instead of black, revealing the drawing beneath. "What emerged was the face of a young to middle-aged man with long hair, a short beard and a pensive gaze.

"Mr Angela was struck by similarities to a famous self-portrait of Leonardo, made when the artist was an old man around 1512. The portrait, in red chalk, is kept in Turin’s Biblioteca Reale, or Royal Library.

"The research team used criminal investigation techniques to digitally correlate the newly-discovered sketch with the well-known portrait.

"They employed facial reconfiguration technology to age the drawing of the younger man, hollowing the cheeks, darkening the eyes and furrowing the brow.

"The two portraits were so similar 'that we may regard the hypothesis that the images portray the same person as reasonable', police photo-fit experts declared.

"To make doubly sure, the ageing process was reversed, with researchers using a digital 'facelift' to rejuvenate the older self-portrait.

"After removing the older Leonardo’s wrinkles and filling out his cheeks, the image that emerged was almost identical to the newly discovered sketch.

" 'When I actually tried to age the face [of the newly discovered portrait], and to put the hair and the beard of the famous self-portrait around it, a shiver ran down my spine,' said Mr Angelo. 'It resembled Leonardo like a twin brother. To uncover a new Leonardo drawing was astonishing.'

"The similarities were also studied by a facial reconstruction surgeon in Rome. '[He] said the two faces could well belong to the same man at different times in his life', said Mr Angelo.

"A world expert on Leonardo, Carlo Pedretti from the University of California, described the sketch as 'one of the most important acquisitions in the study of Leonardo, in the study of his image, and in the study of his thought too' (http://www.telegraph.co.uk/news/worldnews/europe/italy/4884789/Leonardo-da-Vinci-self-portrait-discovered-hidden-in-manuscript.html, accessed 02-28-2009).

View Map + Bookmark Entry

The Largest Municipal Archive in Germany Collapses During Underground Construction March 3, 2009

On March 3, 2009 the building containing the Historic Archive of the City of Cologne (Historisches Archiv der Stadt Köln) collapsed in a pile of rubble. The building was apparently constructed in 1971.

"Fortunately, staffers, researchers, and onsite construction workers inside the building were alarmed by strange noises and left immediately before the structure collapsed earlier today. However, at the time of this writing, three [people who were in buildings adjacent to the archives are still missing.

"At present, the cause of the building's collapse is unknown. A new subway line is being built under the street in front of the facility, but the section of the tunnel adjacent to the building is apparently complete. The building may also have had structural problems.

"Until today, the repository in Cologne was the largest municipal archives in Germany. It held 500,000 photographs and 65,000 documents dating back to 922, including manuscripts by Karl Marx and Friedrich Engels and materials relating to 20th-century writer Heinrich Böll. Government officials have promised to help salvage the archives' records, but street-level and aerial photographs of the building's remains suggest that many of the records are beyond recovery" (http://larchivista.blogspot.com/2009/03/collapse-of-historic-archive-of-city-of.html).

As of March 4, 2009 it was thought that two people from an adjacent building were missing; the Historic Archive of the City of Cologne was successfully evacuated before the building collapsed.

News stories were referenced at http://archiv.twoday.net/stories/5558898/. 

In December 2013 a detailed story in Spiegel Online International was available at this link.

View Map + Bookmark Entry

A Higher Resolution Map of Knowledge Than Can be Produced from Citation Analysis March 11, 2009

On March 11, 2009 Johan Bollen of Los Alamos National Laboratory and six co-authors published "Clickstream Data Yields High Resolution Maps of Science" in the open access online journal PLoS ONE. The map was based on clickstream data collected when online readers switched from one journal to another, allowing the collection of about one billion data points—a far greater number, and presumably more reflective of actual reading patterns, than citation analysis, the prior method developed by the Institute for Scientific Information (now Thomson Scientific's Web of Science), which traces the relationships among references cited in scholarly journals.

"Maps of science derived from citation data visualize the relationships among scholarly publications or disciplines. They are valuable instruments for exploring the structure and evolution of scholarly activity. Much like early world charts, these maps of science provide an overall visual perspective of science as well as a reference system that stimulates further exploration. However, these maps are also significantly biased due to the nature of the citation data from which they are derived: existing citation databases overrepresent the natural sciences; substantial delays typical of journal publication yield insights in science past, not present; and connections between scientific disciplines are tracked in a manner that ignores informal cross-fertilization.

"Scientific publications are now predominantly accessed online. Scholarly web portals provide access to publications in the natural sciences, social sciences and humanities. They routinely log the interactions of users with their collections. The resulting log datasets have a set of attractive characteristics when compared to citation datasets. First, the number of logged interactions now greatly surpasses the volume of all existing citations. This is illustrated by Elsevier's announcement, in 2006, of 1 billion (1×109) article downloads since the launch of its Science Direct portal in April 1999. In contrast, around the time of Elsevier's announcement, the total number of citations in Thomson Scientific's Web of Science from the year 1900 to the present does not surpass 600 million (6×108). Second, log datasets reflect the activities of a larger community as they record the interactions of all users of scholarly portals, including scientific authors, practitioners of science, and the informed public. In contrast, citation datasets only reflect the activities of scholarly authors. Third, log datasets reflect scholarly dynamics in real-time because web portals record user interactions as soon as an article becomes available at the time of its online publication. In contrast, a published article faces significant delays before it eventually appears in citation datasets: it first needs to be cited in a new article that itself faces publication delays, and subsequently those citations need to be picked up by citation databases.

"Given the aforementioned characteristics of scholarly log data, we investigated a methodological issue: can valid, high resolution maps of science be derived from clickstream data and can clickstream data be leveraged to yield meaningful insights in the structure and dynamics of scholarly behavior? To do this we first aggregated log datasets from a variety of scholarly web portals, created and analyzed a clickstream model of journal relationships from the aggregate log dataset, and finally visualized these journal relationships in a first-ever map of science derived from scholarly log data" (http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0004803#pone.0004803-Brody1, accessed 03-19-2009).

View Map + Bookmark Entry

The Largest U.S. Newspaper to Become an Internet-Only News Source March 17, 2009

The Seattle Post-Intelligencer newspaper issued its last printed edition on March 17, 2009, and became an Internet-only news source, seattlepi.com. This was the largest U.S. newspaper to date to convert entirely to digital distribution. 

View Map + Bookmark Entry

"Computers vs. Brains" April 1, 2009

According to the article referenced below, the entire archived content of the Internet occupied three petabytes (3 x 1000 terabytes) in April 2009. 

It is thought that one human brain may store roughly one petabyte. Any similarity between the Internet and the brain, however, ends with storage capacity: people and computers store and process information in fundamentally different ways.

Sandra Aamodt and Sam Wang, "Guest Column: Computers vs. Brains," New York Times Blogs, 03-31-2009.

View Map + Bookmark Entry

Using Automation to Find "Fundamental Laws of Nature" April 3, 2009

On April 3, 2009 Michael Schmidt and Hod Lipson of Cornell University published "Distilling Free-Form Natural Laws from Experimental Data," Science 324, no. 5923 (3 April 2009): 81-85, DOI: 10.1126/science.1165893. The paper described a computer program that sifted raw and imperfect data to uncover fundamental laws of nature.

"For centuries, scientists have attempted to identify and document analytical laws that underlie physical phenomena in nature. Despite the prevalence of computing power, the process of finding natural laws and their corresponding equations has resisted automation. A key challenge to finding analytic relations automatically is defining algorithmically what makes a correlation in observed data important and insightful. We propose a principle for the identification of nontriviality. We demonstrated this approach by automatically searching motion-tracking data captured from various physical systems, ranging from simple harmonic oscillators to chaotic double-pendula. Without any prior knowledge about physics, kinematics, or geometry, the algorithm discovered Hamiltonians, Lagrangians, and other laws of geometric and momentum conservation. The discovery rate accelerated as laws found for simpler systems were used to bootstrap explanations for more complex systems, gradually uncovering the "alphabet" used to describe those systems" (Abstract from Science)

View Map + Bookmark Entry

Robot Scientist becomes the First Machine to Discover New Scientific Knowledge April 3, 2009

Ross D. King, Jem Rowland and 11 co-authors from the Department of Computer Science at Aberystwyth University, Aberystwyth, Wales, and the University of Cambridge published "The Automation of Science," Science 324, no. 5923 (3 April 2009): 85-89, DOI: 10.1126/science.1165620. In this paper they described a Robot Scientist which the researchers believed was the first machine to have independently discovered new scientific knowledge. The robot, called Adam, was a computer system that fully automated the scientific process. 

"Prof Ross King, who led the research at Aberystwyth University, said: 'Ultimately we hope to have teams of human and robot scientists working together in laboratories'. The scientists at Aberystwyth University and the University of Cambridge designed Adam to carry out each stage of the scientific process automatically without the need for further human intervention. The robot has discovered simple but new scientific knowledge about the genomics of the baker's yeast Saccharomyces cerevisiae, an organism that scientists use to model more complex life systems. The researchers have used separate manual experiments to confirm that Adam's hypotheses were both novel and correct" (http://www.eurekalert.org/pub_releases/2009-04/babs-rsb032709.php).

"The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge" (Abstract in Science).

View Map + Bookmark Entry

Australia to Build National Fiber Optic 100 Megabit Network April 7, 2009

According to the New York Times, on April 7, 2009 the government of Australia, Canberra, said that it

"would create a publicly owned company to build a national high- speed broadband network, spending 43 billion Australian dollars in one of the largest state-sponsored Internet infrastructure upgrades in the world. 

"Prime Minister Kevin Rudd said the eight-year, $31 billion project would create up to 37,000 jobs at the peak of construction, giving a lift to the economy as retail spending slumps and mining companies cut workers amid weakening demand for Australian metals. The plan is 'the most ambitious, far-reaching and long-term nation-building infrastructure project ever undertaken by an Australian government,' Mr. Rudd told reporters.

"The government’s announcement was a surprise rebuff to five private telecommunications firms, including Optus of Singapore and Axia NetMedia of Canada, that had been bidding to build a slower, less expensive network, with fiber-optic cables reaching as far as local nodes, worth around 10 billion dollars.

"But Mr. Rudd scrapped those proposals in favor of a superior but more expensive network that will deliver broadband speeds of up to 100 megabits per second — fast enough to download multiple movies simultaneously — to 90 percent of Australian buildings through fiber-optic cables that extend directly to the premises. The remaining 10 percent will receive upgraded wireless access."

View Map + Bookmark Entry

The First Collaborative Online Orchestra April 15, 2009

On April 15, 2009 the YouTube Symphony Orchestra, under the direction of San Francisco Symphony conductor Michael Tilson Thomas, debuted at Carnegie Hall in New York. Considered the first collaborative online orchestra, it was promoted on YouTube, auditioned entirely through YouTube videos, and sponsored by Google, the owner of YouTube:

"The YouTube Symphony Orchestra's show features soloists, chamber groups, chamber orchestra, large orchestra, electronica and multi-media, and samples diverse periods and styles of classical music, including works by Gabrieli, Bach, Mozart, Brahms, Villa-Lobos, John Cage and Tan Dun’s Internet Symphony No. 1 'Eroica.'

"It could be described as something between a summit conference, scout jamboree or musical get-together. It'll be the first time that people from so many different countries will have had a chance to discover one another online and then actually meet up and make music together." - Michael Tilson Thomas on NPR’s All Things Considered" (Carnegie Hall website, accessed 04-11-2009).

View Map + Bookmark Entry

The World Digital Library Launches April 21, 2009

On April 21, 2009 UNESCO, Paris, France, and 32 partner institutions launched the World Digital Library, a web site that featured unique cultural materials from libraries and archives around the world. The site included manuscripts, maps, rare books, films, sound recordings, and prints and photographs.

"The WDL will function in Arabic, Chinese, English, French, Portuguese, Russian and Spanish, and will include content in a great many other languages. Browse and search features will facilitate cross-cultural and cross-temporal exploration on the site. Descriptions of each item and videos with expert curators speaking about selected items will provide context for users, and are intended to spark curiosity and encourage both students and the general public to learn more about the cultural heritage of all countries. The WDL was developed by a team at the Library of Congress. Technical assistance was provided by the Bibliotheca Alexandrina of Alexandria, Egypt. Institutions contributing content and expertise to the WDL include national libraries and cultural and educational institutions in Brazil, Egypt, China, France, Iraq, Israel, Japan, Mali, Mexico, Morocco, the Netherlands, Qatar, the Russian Federation, Saudi Arabia, Serbia, Slovakia, Sweden, Uganda, the United Kingdom, and the United States" (http://portal.unesco.org/ci/en/ev.php-URL_ID=28484&URL_DO=DO_TOPIC&URL_SECTION=201.html)

View Map + Bookmark Entry

The TV Show "Jeopardy" Provides a Good Model of the Semantic Analysis and Integration Problem April 22, 2009

On April 22, 2009 David Ferrucci, leader of the Semantic Analysis and Integration Department at IBM's T. J. Watson Research Center, Eric Nyberg, and several co-authors published the IBM Research Report: Towards the Open Advancement of Question Answering Systems.

Section 4.2.3 of the report included an analysis of why the television game show Jeopardy! provided a good model of the semantic analysis and integration problem.

View Map + Bookmark Entry

IBM's Watson Question Answering System Challenges Humans at "Jeopardy" April 27, 2009

On April 27, 2009 IBM announced that its Watson Question Answering (QA) System would challenge humans in the television quiz show Jeopardy!

"IBM is working to build a computing system that can understand and answer complex questions with enough precision and speed to compete against some of the best Jeopardy! contestants out there.

"This challenge is much more than a game. Jeopardy! demands knowledge of a broad range of topics including history, literature, politics, film, pop culture and science. What's more, Jeopardy! clues involve irony, riddles, analyzing subtle meaning and other complexities at which humans excel and computers traditionally do not. This, along with the speed at which contestants have to answer, makes Jeopardy! an enormous challenge for computing systems. Code-named "Watson" after IBM founder Thomas J. Watson, the IBM computing system is designed to rival the human mind's ability to understand the actual meaning behind words, distinguish between relevant and irrelevant content, and ultimately, demonstrate confidence to deliver precise final answers.

"Known as a Question Answering (QA) system among computer scientists, Watson has been under development for more than three years. According to Dr. David Ferrucci, leader of the project team, 'The confidence processing ability is key to winning at Jeopardy! and is critical to implementing useful business applications of Question Answering.

"Watson will also incorporate massively parallel analytical capabilities and, just like human competitors, Watson will not be connected to the Internet, or have any other outside assistance.  

"If we can teach a computer to play Jeopardy!, what could it mean for science, finance, healthcare and business? By drastically advancing the field of automatic question answering, the Watson project's ultimate success will be measured not by daily doubles, but by what it means for society" (http://www.research.ibm.com/deepqa/index.shtml, accessed 06-16-2010).

On June 16, 2010 The New York Times Magazine published a long article by Clive Thompson on Watson's challenge of humans in Jeopardy! entitled, in the answer-as-question language of Jeopardy!, "What Is I.B.M.'s Watson?"

♦ In December 2013 answers to frequently asked questions concerning Watson and Jeopardy! were available from IBM's website at this link.

View Map + Bookmark Entry

Kickstarter.com is Launched April 28, 2009

On April 28, 2009 Perry Chen, Yancey Strickler, and Charles Adler launched Kickstarter.com, originally under the URL KickStartr.com. The company was based in New York City.

"One of a number of fundraising platforms dubbed 'crowd funding,' Kickstarter facilitates gathering monetary resources from the general public, a model which circumvents many traditional avenues of investment. Project creators choose a deadline and a goal minimum of funds to raise. If the chosen goal is not gathered by the deadline, no funds are collected (this is known as a provision point mechanism). Money pledged by donors is collected using Amazon Payments. The platform is open to backers from anywhere in the world and to creators from the US or the UK.

"Kickstarter takes 5% of the funds raised. Amazon charges an additional 3–5%. Unlike many forums for fundraising or investment, Kickstarter claims no ownership over the projects and the work they produce. However, projects launched on the site are permanently archived and accessible to the public. After funding is completed, projects and uploaded media cannot be edited or removed from the site" (Wikipedia article on Kickstarter, accessed 02-21-2013).

View Map + Bookmark Entry

Using YouTube Videos to Study the Origins of Music in Societies April 30, 2009

On April 30, 2009 psychologist Adena Schachner of Harvard University and co-authors published "Spontaneous Motor Entrainment to Music in Multiple Vocal Mimicking Species," Current Biology (30 April 2009).

Basing their research on the examination of more than 1000 YouTube videos of dancing animals, the researchers found 14 parrot species and one elephant genuinely capable of keeping time, showing that "an ability to appreciate music and keep a rhythm is not unique to humans.

"Schachner analyzed the videos frame-by-frame, comparing the animals' movements with the speed of the music and the alignment of individual beats. The group also studied another bird, Alex, an African grey parrot, which had exhibited similar abilities to Snowball, nodding its head appreciatively to a series of drum tracks.

" 'Our analyses showed that these birds' movements were more lined up with the musical beat than we'd expect by chance,' says Schachner. 'We found strong evidence that they were synchronizing with the beat, something that has not been seen before in other species.' 

"Aniruddh Patel of The Neurosciences Institute in San Diego, who led another study of Snowball's performance, said that the bird had demonstrated an ability to adjust the tempo of his dancing to stay synchronized to the beat.

"Scientists had previously thought that 'moving to a musical beat might be a uniquely human ability because animals are not commonly seen moving rhythmically in the wild,' Patel said.

"Schachner said there was no evidence to suggest that animals such as apes, dogs or cats could recognize music, despite their extensive experience of humans. That leads researchers to believe that an ability to process musical sounds may be linked to an ability to mimic sounds -- something that each of the parrots studied by researchers was able to do excellently, she said.

"Other 'vocal-learning species' include dolphins, elephants, seals and walruses.

" 'A natural question about these results is whether they generalize to other parrots, or more broadly, to other vocal-learning species,' Schachner said.

"Researchers believe a possible link between vocal mimicry and an ability to hear music may explain the development of music in human societies. advertisement

" 'The question of why music is found in every known human culture is a longstanding puzzle. Many argue that it is an adaptive behaviour that helped our species to evolve. But equally plausible is the possibility that it emerged as a by-product of other abilities -- such as vocal learning,' music psychologist Lauren Stewart of Goldsmiths, University of London told CNN.

" 'Parrots and humans both have the ability to imitate sounds that they hear, unlike our closer simian relatives. Once a species has the neural machinery in place for coupling the perception and production of vocal sounds, it may be only a small step to use the same circuits for synchronizing movements to a beat.' " (http://www.cnn.com/2009/TECH/science/05/01/dancing.parrots/?iref=hpmostpop#cnnSTCText )

View Map + Bookmark Entry

Using Air Traffic and Currency Tracking Data in Epidemiology May 3, 2009

In May 2009 Dirk Brockmann and the epidemic modeling team at the Northwestern Institute on Complex Systems used air-traffic and commuter-traffic patterns for the entire country, together with data from the American currency-tracking website Where's George?, to predict the spread of the H1N1 "swine flu" across the United States.
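
Mobility-driven forecasts of this kind typically couple local outbreaks through a matrix of travel flows estimated from sources like air traffic or dollar-bill movements. The following is a toy metapopulation SIR model in Python with invented numbers; it illustrates the general technique, not Brockmann's actual model.

    import numpy as np

    M = np.array([[0.95, 0.05],     # row i: where residents of city i spend time
                  [0.02, 0.98]])
    S = np.array([999_000.0, 500_000.0])   # susceptible
    I = np.array([1_000.0, 0.0])           # infected (seeded in city 0)
    R = np.zeros(2)                        # recovered
    beta, gamma = 0.35, 0.14               # per-day infection/recovery rates

    for day in range(120):
        present = M.T @ (S + I + R)        # people physically in each city
        risk = beta * (M.T @ I) / present  # infection pressure where people mix
        new_inf = S * (M @ risk)           # residents import risk as they travel
        new_rec = gamma * I
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

    print(I.round())   # the outbreak seeded in city 0 has reached city 1

Replacing the 2x2 matrix with flows inferred from real air-traffic and bill-tracking data is what turns such a sketch into a national forecast.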

View Map + Bookmark Entry

Increasing Sales of Digital Books (eBooks) May 5, 2009

In an article entitled "The Future of the Book Turns a Page" published on May 5, 2009 The Christian Science Monitor reported:

"By most measurements, digital books are a mere page in the novel of publishing, which hovers annually around $25 billion. But in the last year, what was a budding niche market has had a major growth spurt.

"The Association of American Publishers (AAP), the industry’s primary trade group, has tracked digital book sales since 2003, when wholesale revenues amounted to $20 million. By 2007, that number had ambled up to $67 million. But in 2008, the figure nearly doubled to some $113 million.

"This year is off to an equally heady start, says Ed McCoyd, director of digital policy for AAP, pointing to the whopping 173 percent jump in sales from January 2008."

View Map + Bookmark Entry

Larger Version of the Amazon Kindle Introduced May 6, 2009

On May 6, 2009 Jeff Bezos of Amazon.com unveiled a larger version of the Amazon Kindle called the Kindle DX (for Deluxe). The larger model had a 

"9.7-inch display with auto-rotation, high-speed wireless access to 275,000 books, 3.3 gigabytes of storage, or room for up to 3,500 books. Native support for PDF documents, with no panning, zooming or scrolling necessary" (http://bits.blogs.nytimes.com/2009/05/06/live-blogging-the-kindle-fest/).

The initial list price of the DX was $489, or $130 more than the previous model, the Kindle 2. The DX was available for sale in the summer of 2009.

View Map + Bookmark Entry

Wolfram/Alpha is Launched May 16, 2009

On May 16, 2009 Stephen Wolfram and Wolfram Research, Champaign, Illinois, launched Wolfram|Alpha, a computational knowledge engine with a new approach to knowledge extraction, based on natural language processing, a large library of algorithms, and an NKS (New Kind of Science) approach to answering queries.

The Wolfram|Alpha engine differed from traditional search engines in that it did not simply return a list of results based on a query, but instead computed an answer.
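
Wolfram|Alpha's computed answers were later also exposed programmatically through an HTTP API that returns results grouped into "pods." A minimal query sketch in Python using only the standard library; "DEMO-APPID" is a placeholder for a real AppID, and the response shape is as documented by Wolfram rather than reproduced here.

    from urllib.parse import urlencode
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    params = urlencode({'appid': 'DEMO-APPID',   # placeholder, not a real key
                        'input': 'distance from Earth to Moon in light seconds'})
    with urlopen('http://api.wolframalpha.com/v2/query?' + params) as resp:
        tree = ET.parse(resp)
    for pod in tree.iter('pod'):   # each pod is one computed result section
        print(pod.get('title'))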

View Map + Bookmark Entry

"Green Dam Youth Escort" May 19, 2009

On May 19, 2009 the Ministry of Industry and Information Technology of the People's Republic of China issued a directive that, as of July 1, 2009, Green Dam Youth Escort (simplified Chinese: 绿坝-花季护航) must be pre-installed on, or shipped on a compact disc with, all personal computers sold in the mainland of the People's Republic of China, including those imported from abroad.

Using the Golden Shield Project, sometimes called the "Great Firewall of China," China regularly restricted access to certain Internet sites and information that the government deemed sensitive.

"Critics fear this new software could be used by the government to enhance internet censorship. The Computer and Communications Industry Association said the development was 'very unfortunate'. Ed Black, CCIA president criticised the move as 'clearly an escalation of attempts to limit access and the freedom of the internet, [...with] economic and trade as well as cultural and social ramifications.' Black said the Chinese were attempting to 'not only control their own citizens' access to the internet but to force everybody into being complicit and participate in a level of censorship'.

"On 8 June, Microsoft said that appropriate parental control tools was 'an important societal consideration'. However, 'we agree with others in industry and around the world that important issues such as freedom of expression, privacy, system reliability and security need to be properly addressed.'

"A spokesman for the Foreign ministry said the software would filter out pornography or violence. "The Chinese government pushes forward the healthy development of the internet. But it lawfully manages the internet," he added.

"On 11 June, a BBC News article reported that potential faults in the software could lead to a large-scale disaster: The report included comments by Isaac Mao, who said that there were 'a series of software flaws', including the unencrypted communications between the software and the company's servers, which could allow hackers access to people's private data or place malicious script on machines on the network to "affect [a] large scale disaster' " (Wikipedia article on Green Dam Youth Escort, accessed 06-11-2009).

View Map + Bookmark Entry

Changing the Advertising Model for General News Reporting May 21, 2009

In an interview in the Financial Times on May 21, 2009, Google CEO Eric Schmidt revealed

"that Google seriously considered either buying a newspaper as a for-profit enterprise or hiring a pack of smart lawyers to reconfigure the paper as a nonprofit venture. He doesn't name which paper, of course, but the Financial Times reporters pointedly remind their readers that the hedge fund Harbinger Capital Partners offered Google its twenty percent stake in the New York Times. Ultimately, however, the company decided that going so far as owning an outlet that actually produced copy, rather than simply aggregating and organizing it, would be 'crossing the line' between a content company and a technology company. Wall Street Journal writer Jessica Vascellaro argues that this position is growing increasingly flimsy. After all, she writes, both YouTube and Google's Book Search project are awfully close to resembling content production.

"The real reason may be twofold. First, as Schmidt readily concedes, the targeted papers are either far too expensive or burdened with too much debt and liabilities. Second, the advertising model for general news reporting is obsolete, and Google's execs have decided instead to work with papers such as the Washington Post . . .to come up with a new model that can subsidize serious general news gathering. The days when general display ads would float on the page, contextually disconnected from the substance of the stories, are over. But who wants their ads tied to stories of Gitmo torture? Unless the business model radically changes, there will be no revenue stream that props up the most serious and important news stories.

"So what does Schmidt have in mind for the Washington Post? 'It seems to me that the newspaper that I read online should remember what I read. It should allow me to go deeper into the stories. It's that kind of a discussion that we're having.' In other words, the paper will store and archive a catalogue of the stories you read, steer more stories along those lines to your eyeballs, and keep you coming back for more by knowing what you're most interested in. Google already remembers what you search for, in order to more accurately match ads to your search screen. Now, it seems, Schmidt would like to apply this technique to news gathering." 

View Map + Bookmark Entry

Google Will Sell eBooks May 31, 2009

At the BookExpo convention in New York on May 31, 2009, Google announced its intention to sell ebooks (e-books) directly to consumers through its Google Books service. In contrast to Amazon, which sold new ebook releases at a fixed price of $9.99 per title and only through its proprietary Kindle platform, Google would allow publishers to set the price of ebook titles and make them available across browsers, cell phones, and other platforms.

View Map + Bookmark Entry

The WARC Format as an International File Preservation Standard June 1, 2009

On June 1, 2009 the International Internet Preservation Consortium (IIPC), netpreserve.org, announced publication of the WARC file format as an international standard: ISO 28500:2009, Information and documentation—WARC file format.

"For many years, heritage organizations have tried to find the most appropriate ways to collect and keep track of World Wide Web material using web-scale tools such as web crawlers. At the same time, these organizations were concerned with the requirement to archive very large numbers of born-digital and digitized files. A need was for a container format that permits one file simply and safely to carry a very large number of constituent data objects (of unrestricted type, including many binary types) for the purpose of storage, management, and exchange. Another requirement was that the container need only minimal knowledge of the nature of the objects.

"The WARC format is expected to be a standard way to structure, manage and store billions of resources collected from the web and elsewhere. It is an extension of the ARC format , which has been used since 1996 to store files harvested on the web. WARC format offers new possibilities, notably the recording of HTTP request headers, the recording of arbitrary metadata, the allocation of an identifier for every contained file, the management of duplicates and of migrated records, and the segmentation of the records. WARC files are intended to store every type of digital content, either retrieved by HTTP or another protocol" (http://netpreserve.org/press/pr20090601.php).

View Map + Bookmark Entry

The First Magazine Cover Created as iPhone Art June 1, 2009

Artist Jorge Colombo's cover art for the New Yorker magazine of June 1, 2009, drawn entirely on an iPhone using the Brushes app, was the first iPhone art published as the cover of a major magazine.

"It has been widely reported that my drawings are now made on an iPhone... Considering all the sketches and watercolors and photographs I have done in the USA for the past twenty years, my output in the Brushes app since I bought a G3 last February is still rather small. It has attracted more attention than anything else I have done: it seems people can't resist a nice tech story. But it's a happy affair. As much as I enjoy and admire other media, drawing on a screen that's always bright even on a dark street, with no paint to carry, no brushes to wash, and countless levels of "undo", seems to agree with me. I always work on location, drawing everything from scratch, with no use of photography whatsoever. (The app churns out Quicktime movies that detail each brushtroke, as seen in The New Yorker's website; it mercifully ignores all the trial-and-errors and failed attempts, making my progression look uncannily flawless. That's so not true.) I could carry a pad or even an easel around. But drawing on a phone is so discreet, so casual" (http://www.drawger.com/jorgecolombo/?section=articles&article_id=9154, accessed 01-07-2010).

♦ On January 7, 2010 you could watch a series of Quicktime movies of Jorge Colombo creating iPhone paintings on the New Yorker website at this link: http://www.newyorker.com/video?videoID=40951183001.

View Map + Bookmark Entry

Costs of Managed Archiving versus Passive Archiving of Data June 4, 2009

"Regarding storage costs -- again its unhelpful to be vague, but equally unhelpful to be too specific. The cost of a 1 TB [terabyte] hard drive from the local IT hyperstore is NOT a useful number for estimating cost of reliable storage. Unfortunately the 'price of reliability' is equally hard to determine.

"The 'rule of thumb' most quoted now is 'one million dollars per year per petabyte' for 'managed server' storage eg disc-based storage from a well-run data centre that does good redundancy and backups. That means of course one thousand dollars per terabyte (per year) and that's a good estimate, in my view, to use for funding request and planning purposes. It can be done more cheaply -- up to ten times cheaper -- but that introduces various risks and requirements that you may or may not want to get into. In the BBC where we know that archive content is, on average, used once per four years, we're happy to put datatape on shelves and go for a much lower cost per terabyte" (Richard Wright, Sr Research Engineer, Research & Development, BBC Future Media & Technology, from: owner-dcc-associates@lists.ed.ac.uk, 06-04-2009).

View Map + Bookmark Entry

"Revenue at Craigslist is Said to Top $100,000,000" as Classified Advertising in Newspapers Declines June 9, 2009

"SAN FRANCISCO — As the newspaper industry and its classified advertising business wither, one company appears to be doing extraordinarily well: Craigslist.

"The Internet classified ads company, which promotes its “relatively noncommercial nature” and “service mission” on its site, is projected to bring in more than $100 million in revenue this year, according to a new study from Classified Intelligence Report, a publication of AIM Group, a media and Web consultant firm in Orlando, Fla.

"That is a 23 percent jump over the revenue the firm estimated for 2008 and a huge increase since 2004, when the site was projected to bring in just $9 million. 'This is a down-market for just about everyone else but Craigslist,' said Jim Townsend, editorial director of AIM Group. The firm counted the number of paid ads on the site for a month and extrapolated an annual figure. It said its projections were conservative.

"By contrast, classified advertising in newspapers in the United States declined by 29 percent last year, its worst drop in history, according to the Newspaper Association of America" (http://www.nytimes.com/2009/06/10/technology/internet/10craig.html?hpw, accessed 06-10-2009).

View Map + Bookmark Entry

The U.S. Converts from Analog to Digital TV Broadcasting June 12, 2009

On June 12, 2009 the United States converted from analog to digital television broadcasting.

"The switch from analog to digital broadcast television is referred to as the digital TV (DTV) transition. In 1996, the U.S. Congress authorized the distribution of an additional broadcast channel to each broadcast TV station so that they could start a digital broadcast channel while simultaneously continuing their analog broadcast channel. Later, Congress set June 12, 2009 as the final date that full power television stations can broadcast analog signals. As of June 13, 2009, full power television stations will only broadcast digital, over-the-air signals. Your local broadcasters may make the transition before then, and some already have.

"The digital transition is underway. Prepare now! On Feb. 17, some full-power broadcast television stations in the United States may stop broadcasting on analog airwaves and begin broadcasting only in digital. The remaining stations may stop broadcasting analog sometime between April 16 and June 12. June 12 is the final deadline for terminating analog broadcasts under legislation passed by Congress.

"Why are we switching to DTV?

"An important benefit of the switch to all-digital broadcasting is that it will free up parts of the valuable broadcast spectrum for public safety communications (such as police, fire departments, and rescue squads). Also, some of the spectrum will be auctioned to companies that will be able to provide consumers with more advanced wireless services (such as wireless broadband).

"Consumers also benefit because digital broadcasting allows stations to offer improved picture and sound quality, and digital is much more efficient than analog. For example, rather than being limited to providing one analog program, a broadcaster is able to offer a super sharp “high definition” (HD) digital program or multiple “standard definition” (SD) digital programs simultaneously through a process called “multicasting.” Multicasting allows broadcast stations to offer several channels of digital programming at the same time, using the same amount of spectrum required for one analog program. So, for example, while a station broadcasting in analog on channel 7 is only able to offer viewers one program, a station broadcasting in digital on channel 7 can offer viewers one digital program on channel 7-1, a second digital program on channel 7-2, a third digital program on channel 7-3, and so on. This means more programming choices for viewers. Further, DTV can provide interactive video and data services that are not possible with analog technology" (http://dtv.gov/whatisdtv.html, accessed 06-12-2009).

View Map + Bookmark Entry

Piracy of Internet Filtering Software? June 13, 2009

Solid Oak Software Inc., developer of CyberSitter, alleged on June 13, 2009 that Green Dam Youth Escort, an Internet-filtering program produced in China and mandated by the Chinese government, contained stolen portions of the company's code.

"Solid Oak Software, the developer of CyberSitter, claims that the look and feel of the GUI used by Green Dam mimics the style of CyberSitter. But more damning, chief executive Brian Milburn said, was the fact that the Green Dam code uses DLLs identified with the CyberSitter name, and even makes calls back to Solid Oak's servers for updates" (http://www.pcmag.com/article2/0,2817,2348705,00.asp, accessed 06-13-2009).

Solid Oak said it would try to stop PC makers from shipping computers with the software.

"Solid Oak said Friday that it found pieces of its CyberSitter filtering software in the Chinese program, including a list of terms to be blocked, instructions for updating the software, and an old news bulletin promoting CyberSitter. Researchers at the University of Michigan who have been studying the Chinese program also said they found components of CyberSitter, including the blacklist of terms.

"Jinhui Computer System Engineering Co., the Chinese company that made the filtering software, denied stealing anything. "That's impossible," said Bryan Zhang, Jinhui's founder, in response to Solid Oak's charges.

"The allegations come as PC makers such as Dell Inc. and Hewlett-Packard Co. are sorting through a mandate by the Chinese government requiring that all PCs sold in China as of July come with the filtering software. Representatives of the two big U.S. companies said they are working with trade associations to monitor new developments related to the Chinese software" (http://online.wsj.com/article/SB124486910756712249.html, accessed 06-13-2009).

View Map + Bookmark Entry

Employment in the Field of Simulation June 14, 2009

"As employment headlines go from grim to grimmer, it’s appropriate that one job category with expanding demand involves helping people avoid reality. Designers of computer simulations are sought in many fields to help understand complex, multifaceted phenomena that are too expensive or perilous to study in real life."

Bill Waite, chairman of the AEgis Technologies Group, a Huntsville, Ala., company that creates simulations for various military and civilian applications, "estimates that 400,000 people make a living in the United States in one aspect or another of simulation" (http://www.nytimes.com/2009/06/14/jobs/14starts.html?8dpc, accessed 06-22-2009).

View Map + Bookmark Entry

"The Web Pries Lid off Iranian Censorship" June 23, 2009

"At one time, authoritarian regimes could draw a shroud around the events in their countries by simply snipping the long-distance phone lines and restricting a few foreigners. But this is the new arena of censorship in the 21st century, a world where cellphone cameras, Twitter accounts and all the trappings of the World Wide Web have changed the ancient calculus of how much power governments actually have to sequester their nations from the eyes of the world and make it difficult for their own people to gather, dissent and rebel.

"Iran’s sometimes faltering attempts to come to grips with this new reality are providing a laboratory for what can and cannot be done in this new media age — and providing lessons to other governments, watching with calculated interest from afar, about what they may be able to get away with should their own citizens take to the streets.

"One early lesson is that it is easier for Iranian authorities to limit images and information within their own country than it is to stop them from spreading rapidly to the outside world. While Iran has severely restricted Internet access, a loose worldwide network of sympathizers has risen up to help keep activists and spontaneous filmmakers connected.

"The pervasiveness of the Web makes censorship 'a much more complicated job,' said John Palfrey, a co-director of Harvard’s Berkman Center for Internet and Society.

"The Berkman Center estimates that about three dozen governments — as widely disparate as China, Cuba and Uzbekistan — extensively control their citizens’ access to the Internet. Of those, Iran is one of the most aggressive. Mr. Palfrey said the trend during this decade has been toward more, not less, censorship. 'It’s almost impossible for the censor to win in an Internet world, but they’re putting up a good fight,' he said.

"Since the advent of the digital age, governments and rebels have dueled over attempts to censor communications. Text messaging was used to rally supporters in a popular political uprising in Ukraine in 2004 and to threaten activists in Belarus in 2006. When Myanmar sought to silence demonstrators in 2007, it switched off the country’s Internet network for six weeks. Earlier this month, China blocked sites like YouTube to coincide with the 20th anniversary of the Tiananmen Square crackdown.

"In Iran, the censorship has been more sophisticated, amounting to an extraordinary cyberduel. It feels at times as if communications within the country are being strained through a sieve, as the government slows down Web access and uses the latest spying technology to pinpoint opponents. But at least in limited ways, users are still able to send Twitter messages, or tweets, and transmit video to one another and to a world of online spectators.

"Because of the determination of those users, hundreds of amateur videos from Tehran and other cities have been uploaded to YouTube in recent days, providing television networks with hours of raw — but unverified — video from the protests. 

"The Internet has 'certainly broken 30 years of state control over what is seen and is unseen, what is visible versus invisible,'  said Navtej Dhillon, an analyst with the Brookings Institution" (http://www.nytimes.com/2009/06/23/world/middleeast/23censor.html?hp).

View Map + Bookmark Entry

The Death of Michael Jackson Impacts the Internet June 25, 2009

The death of American entertainer Michael Jackson on June 25, 2009 had a remarkably dramatic impact on the Internet:

"The news of Jackson's death spread quickly online, causing websites to crash and slow down from user overload. Both TMZ and the Los Angeles Times, two websites that were the first to confirm the news, suffered outages. Google believed the millions of people searching 'Michael Jackson' meant it was under attack. Twitter reported a crash, as did Wikipedia at 3:15 PDT. The Wikimedia Foundation reported nearly one million visitors to the article Michael Jackson within one hour, which they said may be the most visitors in a one-hour period to any article in Wikipedia's history. AOL Instant Messenger collapsed for 40 minutes. AOL called it a seminal moment in Internet history,' adding, 'We've never seen anything like it in terms of scope or depth.' Around 15 percent of Twitter posts (or 5,000 tweets per minute) mentioned Jackson when the news broke, compared to topics such as the 2009 Iranian election and swine flu, which never rose above 5 percent of total tweets. Overall, web traffic was 11 percent higher than normal" (Wikipedia article on Death of Michael Jackson, accessed 07-04-2009).

View Map + Bookmark Entry

The Human Connectome Project July 2009

The Human Connectome Project, a five-year project sponsored by sixteen components of the National Institutes of Health (NIH) and split between two consortia of research institutions, was launched as the first of three Grand Challenges of the NIH's Blueprint for Neuroscience Research.

The project was described as "an ambitious effort to map the neural pathways that underlie human brain function. The overarching purpose of the Project is to acquire and share data about the structural and functional connectivity of the human brain. It will greatly advance the capabilities for imaging and analyzing brain connections, resulting in improved sensitivity, resolution, and utility, thereby accelerating progress in the emerging field of human connectomics. Altogether, the Human Connectome Project will lead to major advances in our understanding of what makes us uniquely human and will set the stage for future studies of abnormal brain circuits in many neurological and psychiatric disorders" (http://www.humanconnectome.org/consortia/, accessed 12-28-2010).

View Map + Bookmark Entry

Virtual Reunification of the Codex Sinaiticus July 6, 2009

"To mark the online launch of the reunited Codex Sinaiticus, the British Library is staging an exhibition, From Parchment to Pixel: The Virtual reunification of Codex Sinaiticus, which runs from Monday 6 July until Monday 7 September, 2009 in the Folio Society Gallery at the Library's St Pancras building. Visitors will be able to view a range of historic items and artefacts that tell the story of the Codex and its virtual reunification, along with spectacular interactive representations of the manuscript and a digital reconstruction of the changes to a specific page over the centuries. In addition, they will see on display in the Treasures Gallery, for the very first time, both volumes of Codex Sinaiticus held at the British Library.

"The virtual reunification of Codex Sinaiticus is the culmination of a four-year collaboration between the British Library, Leipzig University Library, the Monastery of St Catherine (Mount Sinai, Egypt), and the National Library of Russia (St Petersburg), each of which hold different parts of the physical manuscript.

"By bringing together the digitised pages online, the project will enable scholars worldwide to research in depth the Greek text, which is fully transcribed and cross-referenced, including the transcription of numerous revisions and corrections. It will also allow researchers into the history of the book as a physical object to examine in detail aspects of its fabric and manufacture: pages can be viewed either with standard light or with raking light which, by illuminating each page at an angle, highlights the physical texture and features of the parchment.

" 'The Codex Sinaiticus is one of the world's greatest written treasures,' said Dr Scot McKendrick, Head of Western Manuscripts at the British Library. “This 1600-year-old manuscript offers a window into the development of early Christianity and first-hand evidence of how the text of the bible was transmitted from generation to generation. The project has uncovered evidence that a fourth scribe – along with the three already recognised – worked on the text; the availability of the virtual manuscript for study by scholars around the world creates opportunities for collaborative research that would not have been possible just a few years ago.'

"The Codex Sinaiticus Project was launched in 2005, when a partnership agreement was signed by the four partner organisations that hold extant pages and fragments. A central objective of the project is the publication of new research into the history of the Codex. Other key aims of the project were to undertake the preservation, digitisation and transcription of the Codex and thereby reunite the pages, which have been kept in separate locations for over 150 years.

"Professor David Parker from the University of Birmingham's Department of Theology, who directed the team funded by the UK's Arts and Humanities Research Council (AHRC), which made the electronic transcription of the manuscript said: 'The process of deciphering and transcribing the fragile pages of an ancient text containing over 650,000 words is a huge challenge, which has taken nearly four years.

" 'The transcription includes pages of the Codex which were found in a blocked-off room at the Monastery of St Catherine in 1975, some of which were in poor condition,' added Professor Parker. 'This is the first time that they have been published. The digital images of the virtual manuscript show the beauty of the original and readers are even able to see the difference in handwriting between the different scribes who copied the text. We have even devised a unique alignment system which allows users to link the images with the transcription. This project has made a wonderful book accessible to a global audience.' To mark the successful completion of the project, the British Library is hosting an academic conference on 6-7 July 2009 entitled 'Codex Sinaiticus: text, Bible, book'. A number of leading experts will give presentations on the history, text, conservation, palaeography and codicology of the manuscript. See: http://www.codexsinaiticus.org/en/project/conference.aspx" http://www.artdaily.org/index.asp?int_sec=2&int_new=31895, accessed 07-07-2009)

View Map + Bookmark Entry

Amazon Sends Orwell eBooks Down the "Memory Hole" July 16, 2009

"In George Orwell’s '1984,' government censors erase all traces of news articles embarrassing to Big Brother by sending them down an incineration chute called the 'memory hole.'

"On Friday, it was '1984' and another Orwell book, 'Animal Farm,' that were dropped down the memory hole — by Amazon.com.

"In a move that angered customers and generated waves of online pique, Amazon remotely deleted some digital editions of the books from the Kindle devices of readers who had bought them.

"An Amazon spokesman, Drew Herdener, said in an e-mail message that the books were added to the Kindle store by a company that did not have rights to them, using a self-service function. 'When we were notified of this by the rights holder, we removed the illegal copies from our systems and from customers’ devices, and refunded customers,' he said.

"Amazon effectively acknowledged that the deletions were a bad idea. 'We are changing our systems so that in the future we will not remove books from customers’ devices in these circumstances,' Mr. Herdener said" (http://www.nytimes.com/2009/07/18/technology/companies/18amazon.html, accessed 07-25-2009).

"Books in the real world are covered by a notion of copyright called the 'first sale' doctrine, which allows a purchaser to do pretty much whatever he or she wants with the book–including reselling it or lending it to a friend.

"But digital books–especially if they’re sold as part of access to a networked system such as Amazon’s Kindle Store and Google’s online books collection–don’t necessarily fall under those same rules. 'We have not matured our understanding of copyright to work in a digital environment in way that provides a set of protections and meets people’s expectations for how we use digital content,' said Brantley" (http://blogs.wsj.com/digits/2009/07/17/an-orwellian-moment-for-amazons-kindle/, accessed 07-25-2009).

View Map + Bookmark Entry

USA Today Adds eBook Sales to its Bestsellers List July 22, 2009

On July 22, 2009 USA Today announced that it would add Amazon Kindle e-book sales to its weekly Best-Selling Books list in its Best-Selling Books Database:

"Starting today, USA TODAY's Best-Selling Books list becomes the first major list to include Amazon Kindle e-book sales. The move reflects both the growth of e-book sales and Kindle's role in that market. 'Since 1993, USA TODAY's Best-Selling Books list has always evolved to reflect the ways our readers buy books,' says Susan Weiss, managing editor of the Life section. 'Adding Kindle to our group of contributors makes sense given the growth in the e-book platform.' E-books, for all devices, claimed 4.9% of sales in May, according to book audience research firm Codex-Group. That's up from 3.7% in March. This week, Barnes & Noble announced the launch of its own eBookstore with 700,000 titles."

View Map + Bookmark Entry

The Overlap of Innovation and Tradition in the 15th Century Media Revolution August 2009

In August 2009 Bettina Wagner and the Bayerische Staatsbibliothek München published in print Als die Lettern laufen lernten. Medienwandel im 15. Jahrhundert (When Letters Became Mobile: The Transition of Media in the 15th Century), accompanying an exhibition which the library described as follows:

"The invention of printing with movable letters by Johann Gutenberg is frequently described as a „media revolution“ and compared to the effects of the „electronic revolution“ of the past decades. While both events had far-reaching consequences on the production and distribution of texts, the exhibition intends to demonstrate that a gradual transition rather than a sudden turnover took place in the second half of the 15th century. Increasingly, printing techniques were employed for the production of books, but the oldest printed books, traditionally referred to as incunabula, still show many individual features which were created by hand. Thus, innovation and tradition overlap in many respects: the modern techniques for multiplication of texts and images in print only gradually superseded handwriting, and for a long time, printed books continued to be corrected by hand and to be decorated with coloured headlines and painted illustrations.

"About 90 items are displayed from the rich holdings of incunabula in the Bayerische Staatsbibliothek, which ranks first among all libraries world-wide with holdings of more than 20,000 15th-century books. The most famous incunabula are on show in the „Schatzkammer“ (treasury), including the Gutenberg-Bible and the ‚Türkenkalender’ of 1454, the earliest printed book in German, which survives in a single copy held at the Bayerische Staatsbibliothek. In addition to illustrated manuscripts and blockbooks, incunabula with painted miniatures and outstanding examples of 15th-century woodcuts can be seen, among them the report by the Mainz canon Bernhard von Breydenbach about his journey to Palestine, Hartmann Schedel’s personal copy of his ‚Nuremberg Chronicle’ and Sebastian Brant’s ‚Ship of Fools’, for which Albrecht Dürer may have designed illustrations. Apart from woodcuts, examples of other techniques for printing illustrations are presented, like copper engravings, metal cuts and printing with colour and gold – still at an experimental stage in the 15th century.

"In the second part of the exhibition, a range of very diverse incunabula give insight into the production and distribution of printed books – starting with the manuscript copy text used for typesetting and ending with the book arriving in the hands of a buyer and reader. Proof-sheets and printed tables of rubrics reveal how early printers organized the production of books. In the first decades of printing, modern conventions of book design like title-pages developed. Texts printed in non-Latin alphabets and unusual formats as well as evidence for 15th-century print-runs demonstrate the effectiveness and capability of early printing workshops. The new medium of the broadside reached entirely new groups of readers. In the printing press, posters and handbills could be produced in large numbers and thus served to disseminate all manners of texts – from pious songs over medical advice up to current news. Early printers also used broadsides to advertise their products in order to achieve financial success. This, however, led to a rapid decrease in book prices: The exhibition ends with a note added to an incunable in 1494 by a buyer who marvels at the low cost of the book. Forty years after Gutenberg published his Bible, the technology of printing finally prevailed over older, competing forms of text reproduction. While conservative circles continued to plead for copying texts by hand, the printed book’s triumph proved unstoppable, even though some readers, like Sebastian Brant’s ‚foolish reader’ could not cope with the massive number of books available" (https://www.bsb-muenchen.de/Detailed-information.403+M56017d4e158.0.html, accessed 09-18-2009).

View Map + Bookmark Entry

"What's a Big City Without a Newspaper?" August 9, 2009

In "What's a Big City Without a Newspaper?" published in The New York Times Magazine on August 9, 2009 Michael Sokelove wrote:

"Many working journalists in the country regularly check a Web site known to most as “Romenesko” (after its creator, Jim Romenesko), which aggregates industry news and these days consists mainly of layoffs and other dire news. It can be excruciating to read. Just this year, The Rocky Mountain News perished. The Seattle Post-Intelligencer became a Web-only publication with a tiny staff. Detroit’s daily newspapers are now delivered just three days a week. The Boston Globe, owned by the New York Times Company, and The San Francisco Chronicle, owned by Hearst, each went through near-death experiences as their owners won labor concessions after threatening to shutter the papers.

"Smaller newspapers, those with circulations under 50,000, are considered the healthiest part of the industry. “They’re not making 30 percent profit margins like they once did, but most of them are doing fine,” John Morton, a newspaper analyst who has followed the industry for decades, told me. Most analysts predict that the papers with a national profile and brand — The New York Times, The Washington Post, The Wall Street Journal and USA Today — will find a way to survive and stay in print. (It must be noted that few can say exactly how this will happen.)"

View Map + Bookmark Entry

The Cost of Decoding a Human Genome Drops to $50,000 August 10, 2009

In August 2009 it was announced that bioengineer Stephen R. Quake of Stanford University had invented a new technology for decoding DNA that could sequence a human genome at a cost of $50,000.

"Dr. Quake’s machine, the Heliscope Single Molecule Sequencer, can decode or sequence a human genome in four weeks with a staff of three people. The machine is made by a company he founded, Helicos Biosciences, and costs 'about $1 million, depending on how hard you bargain,' he said.

"Only seven human genomes have been fully sequenced. They are those of J. Craig Venter, a pioneer of DNA decoding; James D. Watson, the co-discoverer of the DNA double helix; two Koreans; a Chinese; a Yoruban; and a leukemia victim. Dr. Quake’s seems to be the eighth full genome, not counting the mosaic of individuals whose genomes were deciphered in the Human Genome Project."

"For many years DNA was sequenced by a method that was developed by Frederick Sanger in 1975 and used to sequence the first human genome in 2003, at a probable cost of at least $500 million. A handful of next-generation sequencing technologies are now being developed and constantly improved each year. Dr. Quake’s technology is a new entry in that horse race.

"Dr. Quake calculates that the most recently sequenced human genome cost $250,000 to decode, and that his machine brings the cost to less than a fifth of that.

“ 'There are four commercial technologies, nothing is static and all the platforms are improving by a factor of two each year,' he said. 'We are about to see the floodgates opened and many human genomes sequenced.'

"He said the much-discussed goal of the $1,000 genome could be attained in two or three years. That is the cost, experts have long predicted, at which genome sequencing could start to become a routine part of medical practice" (Nicholas Wade, NY Times, http://www.nytimes.com/2009/08/11/science, /11gene.html?8dpc).

View Map + Bookmark Entry

Displaying Crowdsourced Road Congestion Data on Google Maps August 25, 2009

Google announced on its blog on August 25, 2009 that it was displaying crowdsourced congestion data from GPS-enabled cell phones on Google Maps.

". . . When you choose to enable Google Maps with My Location, your phone sends anonymous bits of data back to Google describing how fast you're moving. When we combine your speed with the speed of other phones on the road, across thousands of phones moving around a city at any given time, we can get a pretty good picture of live traffic conditions. We continuously combine this data and send it back to you for free in the Google Maps traffic layers. It takes almost zero effort on your part — just turn on Google Maps for mobile before starting your car — and the more people that participate, the better the resulting traffic reports get for everybody.

"This week we're expanding our traffic layer to cover all U.S. highways and arterials when data is available. We're able to do this thanks in no small part to the data contributed by our users. This is exactly the kind of technology that we love at Google because it's so easy for a single person to help out, but can be incredibly powerful when a lot of people use it together. Imagine if you knew the exact traffic speed on every road in the city — every intersection, backstreet and freeway on-ramp — and how that would affect the way you drive, help the environment and impact the way our government makes road planning decisions. This idea, which we geeks call 'crowdsourcing,' isn't new. Ever since GPS location started coming to mainstream devices, people have been thinking of ways to use it to figure out how fast the traffic is moving. But for us to really make it work, we had to solve problems of scale (because you can't get useful traffic results until you have a LOT of devices reporting their speeds) and privacy (because we don't want anybody to be able to analyze Google's traffic data to see the movement of a particular phone, even when that phone is completely anonymous)" (http://googleblog.blogspot.com/2009/08/bright-side-of-sitting-in-traffic.html, accessed 12-18-2011).

View Map + Bookmark Entry

Imaging a Molecule One Million Times Smaller Than a Grain of Sand August 28, 2009

On August 28, 2009 IBM Research – Zurich scientists Leo Gross, Fabian Mohn, Nikolaj Moll and Gerhard Meyer, in collaboration with Peter Liljeroth of Utrecht University, published "The Chemical Structure of a Molecule Resolved by Atomic Force Microscopy," Science 325, no. 5944 (2009): 1110, doi: 10.1126/science.1176210.

Using an atomic force microscope operated in an ultrahigh vacuum and at very low temperatures (–268°C or –451°F), the scientists imaged the chemical structure of individual pentacene molecules. For the first time ever, they were able to look through the electron cloud and see the atomic backbone of an individual molecule.

The abstract of the article is:

"Resolving individual atoms has always been the ultimate goal of surface microscopy. The scanning tunneling microscope images atomic-scale features on surfaces, but resolving single atoms within an adsorbed molecule remains a great challenge because the tunneling current is primarily sensitive to the local electron density of states close to the Fermi level. We demonstrate imaging of molecules with unprecedented atomic resolution by probing the short-range chemical forces with use of noncontact atomic force microscopy. The key step is functionalizing the microscope’s tip apex with suitable, atomically well-defined terminations, such as CO molecules. Our experimental findings are corroborated by ab initio density functional theory calculations. Comparison with theory shows that Pauli repulsion is the source of the atomic resolution, whereas van der Waals and electrostatic forces only add a diffuse attractive background."

♦ In December 2013 a video of the scientists discussing and explaining this discovery at IBM's Press Room was available at this link.

View Map + Bookmark Entry

Confirmation that Fungally-Treated Wood Enables Great Violin Sound September 2009

In September 2009 Swiss scientist Francis Schwarze of Empa, St. Gallen, and the Swiss violin maker Michael Rhonheimer of Baden received confirmation that the violin they had created using wood treated with a specially selected fungus compared favorably in a blind test against an instrument made in 1711 by the master violin maker of Cremona, Antonio Stradivari.

"In the test, the British star violinist Matthew Trusler played five different instruments behind a curtain, so that the audience did not know which was being played. One of the violins Trusler played was his own strad, worth two million dollars. The other four were all made by Rhonheimer – two with fungally-treated wood, the other two with untreated wood. A jury of experts, together with the conference participants, judged the tone quality of the violins. Of the more than 180 attendees, an overwhelming number – 90 persons – felt the tone of the fungally treated violin "Opus 58" to be the best. Trusler’s stradivarius reached second place with 39 votes, but amazingly enough 113 members of the audience thought that "Opus 58" was actually the strad! "Opus 58" is made from wood which had been treated with fungus for the longest time, nine months.

"Skepticism before the blind test

"Judging the tone quality of a musical instrument in a blind test is, of course, an extremely subjective matter, since it is a question of pleasing the human senses. Empa scientist Schwarze is fully aware of this, and as he says, 'There is no unambiguous scientific way of measuring tone quality.' He was therefore, understandably, rather nervous before the test. Since the beginning of the 19th century violins made by Stradivarius have been compared to instruments made by others in so called blind tests, the most serious of all probably being that organized by the BBC in 1974. In that test the world famous violinists Isaac Stern and Pinchas Zukerman together with the English violin dealer Charles Beare were challenged to identify blind the 'Chaconne' Stradivarius made in 1725, a "Guarneri del Gesu" of 1739, a 'Vuillaume' of 1846 and a modern instrument made by the English master violin maker Roland Praill. The result was rather sobering – none of the experts was able to correctly identify more than two of the four instruments, and in fact two of the jurors thought that the modern instrument was actually the "Chaconne" stradivarius.

"Biotech wood, a revolution in the art of violin making

"Violins made by the Italian master Antonio Giacomo Stradivarius are regarded as being of unparalleled quality even today, with enthusiasts being prepared to pay millions for a single example. Stradivarius himself knew nothing of fungi which attack wood, but he received inadvertent help from the 'Little Ice Age' which occurred from 1645 to 1715. During this period Central Europe suffered long winters and cool summers which caused trees to grow slowly and uniformly – ideal conditions in fact for producing wood with excellent acoustic qualities.

"Horst Heger of the Osnabruck City Conservatory is convinced that the success of the 'fungus violin' represents a revolution in the field of classical music. 'In the future even talented young musicians will be able to afford a violin with the same tonal quality as an impossibly expensive Stradivarius,' he believes. In his opinion, the most important factor in determining the tone of a violin is the quality of the wood used in its manufacture. This has now been confirmed by the results of the blind test in Osnabruck. The fungal attack changes the cell structure of the wood, reducing its density and simultaneously increasing its homogeneity. 'Compared to a conventional instrument, a violin made of wood treated with the fungus has a warmer, more rounded sound,' explains Francis Schwarze" (http://www.sciencedaily.com/releases/2009/09/090914111418.htm, accessed 10-08-2009).

View Map + Bookmark Entry

The First College Journalism Course Focused on Twitter September 1, 2009

"This fall, DePaul University journalism alumnus Craig Kanalley will teach what is believed to be the first college-level journalism course focused solely on Twitter and its applications. Kanalley is a digital intern at the Chicago Tribune.

"It is one of several innovative courses offered by DePaul’s College of Communication to help prepare students to work in the burgeoning digital landscape. Other journalism courses include niche journalism, reporting for converged newsrooms, backpack reporting and entrepreneurial journalism.

"Kanalley said his course, 'Digital Editing: From Breaking News to Tweets, is really about learning how to make sense of the clutter of the Web, particularly in situations of breaking news or major developing stories, and how to evaluate and verify the authenticity of reports by citizen journalists.'

“ 'Thousands share information about these stories and how they’re affected through Twitter every day, and there’s a need to sift through this data to find relevant information that provides story tips and additional context for these events,' Kanalley said.

"Students will especially focus on the social networking platform Twitter and apply concepts discussed in class to Kanalley’s live journalism Web site Breaking Tweets ( www.breakingtweets.com ), which integrates news and relevant Twitter feedback to create a one-of-a-kind Web experience for readers by providing eyewitness accounts of breaking news stories from around the world" (http://media-newswire.com/release_1098001.html, accessed 09-01-2009).

View Map + Bookmark Entry

An Algorithm to Decipher Ancient Texts September 2, 2009

"Researchers in Israel say they have developed a computer program that can decipher previously unreadable ancient texts and possibly lead the way to a Google-like search engine for historical documents.

"The program uses a pattern recognition algorithm similar to those law enforcement agencies have adopted to identify and compare fingerprints.

"But in this case, the program identifies letters, words and even handwriting styles, saving historians and liturgists hours of sitting and studying each manuscript.

"By recognizing such patterns, the computer can recreate with high accuracy portions of texts that faded over time or even those written over by later scribes, said Itay Bar-Yosef, one of the researchers from Ben-Gurion University of the Negev.

" 'The more texts the program analyses, the smarter and more accurate it gets,' Bar-Yosef said.

"The computer works with digital copies of the texts, assigning number values to each pixel of writing depending on how dark it is. It separates the writing from the background and then identifies individual lines, letters and words.

"It also analyses the handwriting and writing style, so it can 'fill in the blanks' of smeared or faded characters that are otherwise indiscernible, Bar-Yosef said.

"The team has focused their work on ancient Hebrew texts, but they say it can be used with other languages, as well. The team published its work, which is being further developed, most recently in the academic journal Pattern Recognition due out in December but already available online. A program for all academics could be ready in two years, Bar-Yosef said. And as libraries across the world move to digitize their collections, they say the program can drive an engine to search instantaneously any digital database of handwritten documents. Uri Ehrlich, an expert in ancient prayer texts who works with Bar-Yosef's team of computer scientists, said that with the help of the program, years of research could be done within a matter of minutes. 'When enough texts have been digitized, it will manage to combine fragments of books that have been scattered all over the world,' Ehrlich said" (http://www.reuters.com/article/newsOne/idUSTRE58141O20090902, accessed 09-02-2009).

View Map + Bookmark Entry

Darnton's Case for Books: Past, Present and Future September 14, 2009

"In The Case for Books: Past, Present, and Future, Robert Darnton, a pioneer in the field of the history of the book, offers an in-depth examination of the book from its earliest beginnings to its changing—some even say threatened—place in culture, commerce and the academy. But to predict the death of the book is to ignore its centuries-long history of survival. The following are some of Darnton's observations.

"1. The Future. Whatever the future may be, it will be digital. The present is a time of transition, when printed and digital modes of communication coexist and new technology soon becomes obsolete. Already we are witnessing the disappearance of familiar objects: the typewriter, now consigned to antique shops; the postcard, a curiosity; the handwritten letter, beyond the capacity of most young people, who cannot write in cursive script; the daily newspaper, extinct in many cities; the local bookshop, replaced by chains, which themselves are threatened by Internet distributors like Amazon. And the library? It can look like the most archaic institution of all. Yet its past bodes well for its future, because libraries were never warehouses of books. They have always been and always will be centers of learning. Their central position in the world of learning makes them ideally suited to mediate between the printed and the digital modes of communication. Books, too, can accommodate both modes. Whether printed on paper or stored in servers, they embody knowledge, and their authority derives from a great deal more than the technology that went into them.

"2. Preservation. Bits become degraded over time. Documents may get lost in cyberspace, owing to the obsolescence of the medium in which they are encoded. Hardware and software become extinct at a distressing rate. Unless the vexatious problem of digital preservation is solved, all texts “born digital” belong to an endangered species. The obsession with developing new media has inhibited efforts to preserve the old. We have lost 80% of all silent films and 50% of all films made before World War II. Nothing preserves texts better than ink imbedded in paper, especially paper manufactured before the 19th century, except texts written in parchment or engraved in stone. The best preservation system ever invented was the old-fashioned, pre-modern book.

"3. Reading… and Writing. Time was when readers kept commonplace books. Whenever they came across a pithy passage, they copied it into a notebook under an appropriate heading, adding observations made in the course of daily life. The practice spread everywhere in early modern England, among ordinary readers as well as famous writers like Francis Bacon, Ben Jonson, John Milton, and John Locke. It involved a special way of taking in the printed word. Unlike modern readers, who follow the flow of a narrative from beginning to end (unless they are digital natives and click through texts on machines), early modern Englishmen read in fits and starts and jumped from book to book. They broke texts into fragments and assembled them into new patterns by transcribing them in different sections of their notebooks. Then they reread the copies and rearranged the patterns while adding more excerpts. Reading and writing were therefore inseparable activities. They belonged to a continuous effort to make sense of things, for the world was full of signs: you could read your way through it, and by keeping an account of your readings, you made a book of your own, one stamped with your personality. 

"4. Piracy. Voltaire toyed with his texts so much that booksellers complained. As soon as they sold one edition of a work, another would appear, featuring additions and corrections by the author. Customers protested. Some even said that they would not buy an edition of Voltaire's complete works—and there were many, each different from the others—until he died, an event eagerly anticipated by retailers throughout the book trade. Piracy was so pervasive in early modern Europe that bestsellers could not be blockbusters as they are today. Instead of being produced in huge numbers by one publisher, they were printed simultaneously in many small editions by many publishers, each racing to make the most of a market unconstrained by copyright. Few pirates attempted to produce accurate counterfeits of the original editions. They abridged, expanded, and reworked texts as they pleased, without worrying about the authors' intentions. 

"5. E-Books. I want to write an electronic book. Here is how my fantasy takes shape. An “e-book,” unlike a printed codex, can contain many layers arranged in the shape of a pyramid. Readers can download the text and skim the topmost layer, which will be written like an ordinary monograph. If it satisfies them, they can print it out, bind it (binding machines can now be attached to computers and printers), and study it at their convenience in the form of a custom-made paperback. If they come upon something that especially interests them, they can click down a layer to a supplementary essay or appendix. They can continue deeper through the book, through bodies of documents, bibliography, historiography, iconography, background music, everything I can provide to give the fullest possible understanding of my subject. In the end, they will make the subject theirs, because they will find their own paths through it, reading horizontally, vertically, or diagonally, wherever the electronic links may lead. 

"6. Authorship. Despite the proliferation of biographies of great writers, the basic conditions of authorship remain obscure for most periods of history. At what point did writers free themselves from the patronage of wealthy noblemen and the state in order to live by their pens? What was the nature of a literary career, and how was it pursued? How did writers deal with publishers, printers, booksellers, reviewers, and one another? Until those questions are answered, we will not have a full understanding of the transmission of texts. Voltaire was able to manipulate secret alliances with pirate publishers because he did not depend on writing for a living. A century later, Zola proclaimed that a writer's independence came from selling his prose to the highest bidder. How did this transformation take place?

"7. The Book Trade. It may seem hopeless to conceive of book history as a single subject, to be studied from a comparative perspective across the whole range of historical disciplines. But books themselves do not respect limits either linguistic or national. They have often been written by authors who belonged to an international republic of letters, composed by printers who did not work in their native tongue, sold by booksellers who operated across national boundaries, and read in one language by readers who spoke another. Books also refuse to be contained within the confines of a single discipline when treated as objects of study. Neither history nor literature nor economics nor sociology nor bibliography can do justice to all aspects of the life of a book. By its very nature, therefore, the history of books must be international in scale and interdisciplinary in method. But it need not lack conceptual coherence, because books belong to circuits of communication that operate in consistent patterns, however complex they may be. By unearthing those circuits, historians can show that books do not merely recount history; they make it.(http://www.publishersweekly.com/article/CA6696290.html)"

View Map + Bookmark Entry

'Material Degradomics" or, The Sniff Test for a Book's Physical State September 17, 2009

In a paper entitled "Material Degradomics: On the Smell of Old Books," published in the journal Analytical Chemistry in September 2009, Matija Strlic of University College London and associates at the Tate art museum (U.K.), the University of Ljubljana, and Morana RTD in Ivančna Gorica (the latter two in Slovenia) introduced a new method for linking a book's physical state to its pattern of volatile organic compound (VOC) emissions. The goal was to "diagnose" decomposing historical documents noninvasively as a step toward protecting them.

“Ordinarily, traditional analytical methods are used to test paper samples that have been ripped out,” Strlic says. “The advantage of our method is that it’s nondestructive" (http://pubs.acs.org/doi/full/10.1021/ac902143z?cookieSet=1).

"The test is based on detecting the levels of volatile organic compounds. These are released by paper as it ages and produce the familiar 'old book smell'.

"The international research team, led by Matija Strlic from University College London's Centre for Sustainable Heritage, describes that smell as 'a combination of grassy notes with a tang of acids and a hint of vanilla over an underlying mustiness'. 

" 'This unmistakable smell is as much part of the book as its contents,' they wrote in the journal article. Dr Strlic told BBC News that the idea for new test came from observing museum conservators as they worked.

" 'I often noticed that conservators smelled paper during their assessment,' he recalled.  'I thought, if there was a way we could smell paper and tell how degraded it is from the compounds it emits, that would be great.'

"The test does just that. It pinpoints ingredients contained within the blend of volatile compounds emanating from the paper.

"That mixture, the researchers say, 'is dependent on the original composition of the... paper substrate, applied media, and binding' " (http://news.bbc.co.uk/2/hi/science/nature/8355888.stm)

View Map + Bookmark Entry

1.7 Billion Internet Users September 30, 2009

According to Internetworldstats.com there were about 1,733,993,000 Internet users on September 30, 2009, compared with about 360,985,000 on December 31, 2000.

View Map + Bookmark Entry

The First Historical Thesaurus in Any Language October 2009

In October 2009 Oxford University Press published as a printed book the Historical Thesaurus of the Oxford English Dictionary with Additional Material from A Thesaurus of Old English, edited by Christian Kay, Jane Roberts, Michael Samuels, and Irene Wotherspoon.

Forty years in the making, this 4448-page work was the first historical thesaurus to be compiled for any language, and the first to include almost the entire vocabulary of English, from Old English to the present. It was also the largest thesaurus resource in the world, covering more than 920,000 words and meanings, based on the Oxford English Dictionary.

The Historical Thesaurus listed synonyms with dates of first recorded use in English, in chronological order with the earliest synonyms first. For obsolete words, it also included the last recorded use of each word.

The work used a specially devised thematic system of classification. Its comprehensive index enabled complete cross-referencing of nearly one million words and meanings. It contained a comprehensive sense inventory of Old English and a fold-out color chart which showed the top levels of the classification structure. 

View Map + Bookmark Entry

Google CEO Eric Schmidt On Newspapers & Journalism October 3, 2009

The following are quotations from Google CEO Eric Schmidt, selected from his interview on October 3, 2009 with Danny Sullivan of searchengineland.com, representing Schmidt's view of present problems and possible future solutions for newspapers and journalism impacted by the Internet:

"The number of readers for newspapers is declining. The market is becoming more specialized. There will always be a market for people who read the newspaper on a train going into New York City. There will always be a market for people who sit in in the afternoon in a cafe in the city and read the newspaper in the sunshine. The term “killing” is a bit over[blown]. Newspapers face a long-term secular decline because of the shift in user habits due to the Internet."

"In the case of the newspapers, they have multiple problems which are hard to solve. If you think about it there are three fundamental problems. One is that the physical cost of things is going up, physical newsprint. Another one has been the loss of classifieds. And a third one has been essentially the difficulty in selling traditional print ads. So, all of them have online solutions. And we’ve come to the conclusion that the right thing to do is to help them with the online."

"We think that over a long enough period of time, most people will have personalized news-reading experiences on mobile-type devices that will largely replace their traditional reading of newspapers. Over a decade or something. And that that kind of news consumption will be very personal, very targeted. It will remember what you know. It will suggest things that you might want to know. It will have advertising. Right? And it will be as convenient and fun as reading a traditional newspaper or magazine.

"So one way one to think about it is that the newspaper or magazine industry do a great job of the convenience of scanning and looking and understanding. And we have to get the web to that point, or whatever the web becomes. So we just announced, the official name is Google Fast Flip. And that’s an example of the kind of thing we’re doing. And we have a lot more coming."

"I specifically am talking about investigative journalism when I talk about this. There’s no lack of bloggers and people who publish their opinions and faux editorial writers and people with an opinion. And I think that one of the great things about the internet is that we can hear them. We can also choose to ignore them. So it’s not correct to say that the internet is decreasing conversation. The internet is clearly increasing conversation at an incredibly rapid pace. The cacophony of voices is overwhelming as you know.

"Well-funded, targeted professionally managed investigative journalism is a necessary precondition in my view to a functioning democracy. And so that’s what we worry about. And as you know, that was always subsidized in the newspaper model by the other things that they did. You know, the story about the scandal in Iraq or Afghanistan was difficult to advertise against. But there was enough revenue that it allowed the newspaper to fulfill its mission" (http://searchengineland.com/google-ceo-eric-schmidt-on-newspapers-journalism-27172)

View Map + Bookmark Entry

eBook Sales Represent 1.6% of Book Sales October 7, 2009

"According to a report being released Wednesday by Forrester Research, Cambridge, Massachusetts, e-reader sales will total an estimated 3 million this year, with Amazon selling 60 percent of them and Sony Corp. 35 percent."

"According to the Association of American Publishers, e-books accounted for just 1.6 percent of all book sales in the first half of the year. But the market is growing fast. E-book sales totaled $81.5 million in the first half, up from $29.8 million in the first six months of 2008.

"And [Jeff] Bezos said Amazon sells 48 Kindle copies for every 100 physical copies of books that it offers in both formats. Five months ago it was selling 35 Kindle copies per 100 physical versions.

"Bezos said that increase is happening faster than he expected.

" 'I think that ultimately we will sell more books in Kindle editions than we do in physical editions,' Bezos said in the interview, which was held in the Cupertino offices of Lab126, the Amazon subsidiary that developed the Kindle" (http://www.nytimes.com/aponline/2009/10/07/business/AP-US-TEC-Amazon-Kindle.html)

View Map + Bookmark Entry

" A Library to Last Forever" ?? October 9, 2009

On October 9, 2009 Sergey Brin, co-founder and president of technology of Google, published an Op-Ed piece regarding the Google Book Search program in The New York Times entitled, perhaps overly optimistically, "A Library to Last Forever," from which I quote without implied endorsement:

".  . .the vast majority of books ever written are not accessible to anyone except the most tenacious researchers at premier academic libraries. Books written after 1923 quickly disappear into a literary black hole. With rare exceptions, one can buy them only for the small number of years they are in print. After that, they are found only in a vanishing number of libraries and used book stores. As the years pass, contracts get lost and forgotten, authors and publishers disappear, the rights holders become impossible to track down.

"Inevitably, the few remaining copies of the books are left to deteriorate slowly or are lost to fires, floods and other disasters. While I was at Stanford in 1998, floods damaged or destroyed tens of thousands of books. Unfortunately, such events are not uncommon - a similar flood happened at Stanford just 20 years prior. You could read about it in The Stanford-Lockheed Meyer Library Flood Report, published in 1980, but this book itself is no longer available.

"Because books are such an important part of the world's collective knowledge and cultural heritage, Larry Page, the co-founder of Google, first proposed that we digitize all books a decade ago, when we were a fledgling startup. At the time, it was viewed as so ambitious and challenging a project that we were unable to attract anyone to work on it. But five years later, in 2004, Google Books (then called Google Print) was born, allowing users to search hundreds of thousands of books. Today, they number over 10 million and counting.

"The next year we were sued by the Authors Guild and the Association of American Publishers over the project. While we have had disagreements, we have a common goal - to unlock the wisdom held in the enormous number of out-of-print books, while fairly compensating the rights holders. As a result, we were able to work together to devise a settlement that accomplishes our shared vision. While this settlement is a win-win for authors, publishers and Google, the real winners are the readers who will now have access to a greatly expanded world of books.

"There has been some debate about the settlement, and many groups have offered their opinions, both for and against. I would like to take this opportunity to dispel some myths about the agreement and to share why I am proud of this undertaking. This agreement aims to make millions of out-of-print but in-copyright books available either for a fee or for free with ad support, with the majority of the revenue flowing back to the rights holders, be they authors or publishers.

"Some have claimed that this agreement is a form of compulsory license because, as in most class action settlements, it applies to all members of the class who do not opt out by a certain date. The reality is that rights holders can at any time set pricing and access rights for their works or withdraw them from Google Books altogether. For those books whose rights holders have not yet come forward, reasonable default pricing and access policies are assumed. This allows access to the many orphan works whose owners have not yet been found and accumulates revenue for the rights holders, giving them an incentive to step forward.

"Others have questioned the impact of the agreement on competition, or asserted that it would limit consumer choice with respect to out-of-print books. In reality, nothing in this agreement precludes any other company or organization from pursuing their own similar effort. The agreement limits consumer choice in out-of-print books about as much as it limits consumer choice in unicorns. Today, if you want to access a typical out-of-print book, you have only one choice - fly to one of a handful of leading libraries in the country and hope to find it in the stacks." (http://www.nytimes.com/2009/10/09/opinion/09brin.html?scp=2&sq=sergey%20brin&st=cse, accessed 10-09-2009).

View Map + Bookmark Entry

Discovery of Unknown Portrait by Leonardo Confirmed by a Fingerprint October 13, 2009

"The ghost of a fingerprint in the top left corner of an obscure portrait appears to have confirmed one of the most extraordinary art discoveries. The 33 x 23cm (13 x 9in) picture, in chalk, pen and ink, appeared at auction at Christie’s, New York, in 1998, catalogued as 'German school, early 19th century'. It sold for $19,000 (£11,400). Now a growing number of leading art experts agree that it is almost certainly by Leonardo da Vinci and worth about £100 million.

"Carbon dating and infra-red analysis of the artist’s technique are consistent with such a conclusion, but the most compelling evidence is that fragment of a fingerprint.

"Peter Paul Biro, a Montreal-based forensic art expert, found it while examining images captured by the revolutionary multispectral camera from the Lumière Technology company, Antiques Trade Gazette reports today.

"Mr Biro has pioneered the use of fingerprint technology to help to resolve art authentication disputes. Multispectral analysis reveals each layer of colour, and enables the pigment mixtures of each pixel to be identified without taking physical samples. The fingerprint corresponds to the tip of the index or middle finger, and is 'highly comparable' to one on Leonardo’s St Jerome in the Vatican. Importantly, St Jerome is an early work from a time when Leonardo was not known to have employed assistants, making it likely that it is his fingerprint.

"Martin Kemp, Emeritus Professor of History of Art at the University of Oxford, is convinced and recently completed a book about the find (as yet unpublished). He said that his first reaction was that 'it sounded too good to be true — after 40 years in the business, I thought I’d seen it all'. But gradually, “all the bits fell into place.”

"Professor Kemp has rechristened the picture, sold as Young Girl in Profile in Renaissance Dress, as La Bella Principessa after identifying her, 'by a process of elimination', as Bianca Sforza, daughter of Ludovico Sforza, Duke of Milan (1452-1508), and his mistress Bernardina de Corradis. He described the profile as 'subtle to an inexpressible degree', as befits the artist best known for the Mona Lisa.

"If it is by Leonardo, it would be the only known work by the artist on vellum although Professor Kemp points out that Leonardo asked the French court painter Jean Perréal about the technique of using coloured chalks on vellum in 1494.

"The picture was bought in 1998 by Kate Ganz, a New York dealer, who sold it for about the same sum to the Canadian-born Europe-based connoisseur Peter Silverman in 2007. Ms Ganz had suggested that the portrait 'may have been made by a German artist studying in Italy ... based on paintings by Leonardo da Vinci'.

"When Mr Silverman first saw it, in a drawer, 'my heart started to beat a million times a minute,' he said. 'I immediately thought this could be a Florentine artist. The idea of Leonardo came to me in a flash.'

"Carbon-14 analysis of the vellum gave a date range of 1440-1650. Infra-red analysis revealed stylistic parallels to Leonardo’s other works, including a palm print in the chalk on the sitter’s neck 'consistent ... to Leonardo’s use of his hands in creating texture and shading', according to Mr Biro" (http://entertainment.timesonline.co.uk/tol/arts_and_entertainment/visual_arts/article6872019.ece, accessed 10-14-2009).

♦Another very useful report on this discovery appeared in Antiques Trade Gazette on October 12, 2009.

♦An interview with Peter Silverman about the purchase appeared in Antiques Trade Gazette on October 26, 2009.

View Map + Bookmark Entry

The Finest Roman Cameo Glass Vase Discovered October 13, 2009

On October 13, 2009 Bonhams auctioneers announced that they had identified a Roman cameo glass vase which may be the most important of its kind in the world. Strikingly similar to the Portland Vase, it is larger, in better condition, and bears superior decoration.

"The vase dates from between late First Century B.C. to early First Century A.D and stands 13in (33.5cm) high. Only 15 other Roman cameo glass vases and plaques are known to exist today. These very rare vessels were highly artistic, luxury items, produced by the Roman Empire’s most skilled craftsmen. They are formed from two layers of glass – cobalt blue with a layer of white on top – which is cut down after cooling to create the cameo-style decoration.  

"Items of this kind were produced, it is thought, within a period of only two generations. They would have been owned by distinguished Roman families.  

"Until now, the most famous example has been the Portland Vase, held by the British Museum. This is smaller, standing at only 9in (24cm) high. It is also missing its base and has been restored three times.

"The recently identified vase is also more complex than others of its kind, being decorated with around 30 figures and a battle scene around the lower register. By comparison, the Portland vase has just seven figures. Bonhams’ experts believe that this magnificent artefact could rewrite the history books on cameo vases. Unlike the Portland Vase, it still has its base and lower register and will therefore add significantly to the archaeological understanding of these vessels.  

"The vase is thought to have resided in a private European collection for some time. The collector is a long-term client of Bonhams.  

"In co-operation with leading experts in the field and with the present owner of the vase, Bonhams say they will be carrying out detailed research over the coming months into the historical background of the vase and its miraculous survival as well as into its more recent history and chain of ownership.  

"The vase was presented publicly for the first time at a the 18th Congress of the International Association for the History of Glass at Thessaloniki in Greece in September, where it was viewed by around 200 of the world’s leading glass specialists" (http://www.antiquestradegazette.com/news/7312.aspx).  

View Map + Bookmark Entry

The Largest Study of Global Internet Traffic Since the Beginning of the Commercial Internet October 19, 2009

On October 19, 2009 Arbor Networks, the University of Michigan, and Merit Network presented the findings of the Internet Observatory Report at the North American Network Operators Group meeting (NANOG47) in Dearborn, Michigan:

"• The report is believed to be the largest study of global Internet traffic since the start of the commercial Internet in the mid-1990s. The report offers analysis of two years worth of detailed traffic statistics from 110 large and geographically diverse cable operators, international transit backbones, regional networks and content providers.

"• At its peak, the study monitored more than 12 terabits-per-second and a total of more than 256 exabytes of Internet traffic over the two-year life of the study.

"• The Internet Observatory Report includes a discussion around significant changes in Internet topology and commercial inter-relationships between providers; analysis of changes in Internet protocols and applications; and a concluding analysis of Internet growth trends and predictions of future trends.

Key Findings:

"• Evolution of the Internet Core: Over the last five years, Internet traffic has migrated away from the traditional Internet core of 10 to 12 Tier-1 international transit providers. Today, the majority of Internet traffic by volume flows directly between large content providers, datacenter / CDNs and consumer networks. Consequently, most Tier-1 networks have evolved their business models away from IP wholesale transit to focus on broader cloud / enterprise services, content hosting and VPNs.

"• Rise of the ‘Hyper Giants’: Five years ago, Internet traffic was proportionally distributed across tens of thousands of enterprise managed web sites and servers around the world. Today, most content has increasingly migrated to a small number of very large hosting, cloud and content providers. Out of the 40,000 routed end sites in the Internet, 30 large companies – “hyper giants” like Limelight, Facebook, Google, Microsoft and YouTube – now generate and consume a disproportionate 30% of all Internet traffic.

"• Applications Migrate to the Web: Historically, Internet applications communicated across a panoply of application specific protocols and communication stacks. Today, the majority of Internet application traffic has migrated to an increasingly small number of web and video protocols, including video over web and Adobe Flash. Other mechanisms for video and application distribution like P2P (peer-to-peer) have declined dramatically in the last two years.

"• A New Internet Ecosystem: Over the last five years, macroeconomic forces have radically transformed the global Internet commercial ecosystem. Economic changes, including the collapse of wholesale IP transit and the dramatic growth in advertisement-supported service, reversed decade-old business dynamics between transit providers, consumer networks and content providers. A wave of innovation is ongoing, with service providers now offering everything from triple play services to managed security services, VPNs and increasingly, CDNs. This change in the Internet business ecosystem has significant ongoing implications for backbone engineering, design of Internet scale applications and research."

View Map + Bookmark Entry

Google Represents 6% of All Internet Traffic October 19, 2009

According to Arbor Networks' 2009 Atlas Observatory Report, Google accounted for 6 percent of all Internet traffic of every type.

"And how many would have heard of a company called Carpathia Hosting? Its MegaUpload, MeaErotik, MegaClick and MegaVideo services have turned it into a company that now accounts for 1 percent of all Internet traffic, says Arbor, and this will doubtless grow. The important takeaway is that few of these companies had even been heard of two years ago, and very few of them are big telcos. To put all this into perspective, in 2007 Arbor found that the overwhelming majority of Internet traffic was accounted for by 30,000 entities, with fifty percent of traffic accounted for by around 10,000 companies.

"Only two years later that same fifty percent now runs through only 150 top 'content delivery networks' (CDNs), an astonishing consolidation made more remarkable by the fact that Internet traffic has grown significantly during that time.

" 'Up to 2007, The Internet meant connecting to lots of servers and data centres around the world,' notes Arbor's chief scientist, Craig Labovitz. Now there are barely 100 companies that matter. Traffic patterns tend to be hidden, mainly because the companies losing out - the traditional telcos and ISPs - don't exactly have an interest in advertising their waning status. The reason for their decline in importance is that Internet traffic is being driven by huge providers with access to content such as video.

" 'For 150 years, they [BT and other telcos] have had the same business model. Now everyone is trying to get away from being a dumb pipe.' Arbor's Atlas Internet Observatory report crunched traffic from 100 of the Internet's largest entities, accounting for 12 Terabytes of peak throughput, equivalent to about a quarter of the Internet's total at any one moment, said Labovitz.The importance of this is not simply that a small number of companies will account for a lot of traffic, but that these companies are increasingly what the Internet actually is. The Internet up to around 2007 was dominated by a hierarchy of companies, co-operating with one another to allow traffic to be passed from one to the other, regardless of size. The new Internet superpowers, in stark contrast, bypass a lot of this and use direct connections from one to the other. If a company is not part of this new core, it could find itself increasingly passed to the 'long tail', a polite way of saying they will be shoved to the fringe.  

"Video, including video that runs over web/http, now accounts for an estimated 10 percent of all Internet traffic, and is one reason all these direct connections between large data centres are now necessary. IPv6 traffic remains tiny at only 0.03 percent of traffic, but is showing sudden and possibly rapid growth in recent months thanks to deployments by named hosters.  

"Interestingly, P2P is in rapid decline, falling from around 3 percent of all traffic in 2007 to only half a percent now. Again, downloaders appear to prefer direct connectivity for downloads, mostly through port 80 and the web" (http://www.thestandard.com/news/2009/10/14/internet-now-dominated-traffic-superpowers)

View Map + Bookmark Entry

David Hockney's iPhone Art October 22, 2009

On October 22, 2009 Lawrence Weschler, director of the New York Institute for the Humanities at New York University, published "David Hockney's iPhone Passion," New York Review of Books LXVI, no. 16, 35.

Hockney had a history of exploiting new technologies in his art:

"Hockney continued to explore other media besides painting, most notably photography. From 1982-86, he created some of his best-known and most iconographic work — his “joiners,” large composite landscapes and portraits made up of hundreds or thousands of individual photographs. Hockney initially used a Polaroid camera for the photos, switching to a 35 mm camera as the works grew larger and more complex. In interviews, Hockney related the “joiners” to cubism, pointing out that they incorporate elements that a traditional photograph does not possess — namely time, space, and narrative.

"Always willing to adopt new techniques, in 1986 Hockney began producing art with color photocopiers. He has also incorporated fax machines (faxing art to an exhibition in Brazil, for example) and computer-generated images (most notably Quantel Paintbox, a computer system often used to make graphics for television shows) into his work" (http://www.pbs.org/wnet/americanmasters/episodes/david-hockney/the-colors-of-music/103/, accessed 01-09-2010).

View Map + Bookmark Entry

ICANN Will Allow Web Addresses in Non-Latin Alphabets October 30, 2009

The Internet Corporation for Assigned Names and Numbers (ICANN) voted on October 30, 2009 to allow Web addresses written completely in Chinese, Arabic, Korean and other languages using non-Latin alphabets.

"The decision is a 'historic move toward the internationalization of the Internet,' said Rod Beckstrom, Icann’s president and chief executive. 'We just made the Internet much more accessible to millions of people in regions such as Asia, the Middle East and Russia.' 

"This change affects domain names — anything that comes after the dot, including .com, .cn or .jp. Domain names have been limited to 37 characters — 26 Latin letters, 10 digits and a hyphen. But starting next year, domain names can consist of characters in any language. In some Web addresses, non-Latin scripts are already used in the portion before the dot. Thus, Icann’s decision Friday makes it possible, for the first time, to write an entire Internet address in a non-Latin alphabet.  

"Initially, the new naming system will affect only Web addresses with 'country codes,' the designators at the end of an address name, like .kr (for Korea) or .ru (for Russia). But eventually, it will be expanded to all types of Internet address names, Icann said.

"Some security experts have warned that allowing internationalized domain names in languages like Arabic, Russian and Chinese could make it more difficult to fight cyberattacks, including malicious redirects and hacking. But Icann said it was ready for the challenge.  'I do not believe that there would be any appreciable difference,' Mr. Beckstrom said in an interview. 'Yes, maybe some additional potential but at the same time, some new security benefits may come too. If you look at the global set of cybersecurity issues, I don’t see this as any significant new threat if you look at it on an isolated basis.'  

"The decision, reached after years of testing and debate, clears the way for Icann to begin accepting applications for non-Latin domain names Nov. 16. People will start seeing them in use around mid-2010, particularly in Arabic, Chinese and other scripts in which demand for the new 'internationalized' domain name system has been among the strongest, Icann officials say. Internet addresses in non-Latin scripts could lead to a sharp increase in the number of global Internet users, eventually allowing people around the globe to navigate much of the online world using their native language scripts, they said.  

"This is a boon especially for users who find it cumbersome to type in Latin characters to access Web pages. Of the 1.6 billion Internet users worldwide, more than half use languages that have scripts that are not based on the Latin alphabet." (http://www.nytimes.com/2009/10/31/technology/31net.html?hp)

View Map + Bookmark Entry

A "Significant Amount" of Water is Discovered on the Moon November 13, 2009

On November 13, 2009 NASA announced that the Lunar CRater Observation and Sensing Satellite (LCROSS), managed by Ames Research Center, Moffett Field, California, and its companion rocket, which impacted in crater Cabeus near the Moon's south pole on October 9, 2009, had revealed a "significant amount" of water in the plume of debris thrown up by the impact.

This discovery had significant implications for the support of a manned base on the moon or for the generation of rocket fuel to further space exploration. 

View Map + Bookmark Entry

U.S. National Text Pager Intercepts from 9/11 Are Released November 25 – November 26, 2009

"From 3AM on Wednesday November 25, 2009, until 3AM the following day (US east coast time), WikiLeaks released half a million US national text pager intercepts. The intercepts cover a 24 hour period surrounding the September 11, 2001 attacks in New York and Washington.

"The messages were broadcasted 'live' to the global community — sychronized to the time of day they were sent. The first message was from 3AM September 11, 2001, five hours before the first attack, and the last, 24 hours later.  

"Text pagers are usualy carried by persons operating in an official capacity. Messages in the archive range from Pentagon, FBI, FEMA and New York Police Department exchanges, to computers reporting faults at investment banks inside the World Trade Center  

"The archive is a completely objective record of the defining moment of our time. We hope that its entrance into the historical record will lead to a nuanced understanding of how this event led to death, opportunism and war" (http://911.wikileaks.org/, accessed 11-26-2009).

According to BBC.com, the number of text messages published may have been as high as 573,000.

View Map + Bookmark Entry

Convergence of Media: Packaging Blu-ray Discs in Books December 2009

Among the numerous things I collect are DVDs and high-definition Blu-ray Discs. Toward the end of 2009 I noticed that certain classic films were being re-issued as Blu-ray discs packaged in the back of short hardcover books concerning the films. These were not books that happened to include a disc as supplementary material. In those cases the electronic data is often secondary to the physical book. What I bought was the Blu-ray disc, packaged inside a full color book of 30 to 50 pages that was issued in the same size as the normal plastic Blu-ray clamshell boxes. The book is clearly secondary to the data—an excellent informative way of packaging and storing the data.

Two Blu-ray discs that I purchased in December 2009 packaged in hardcover books were Robert Redford's film, A River Runs Through It, based on the elegantly written novella by Norman Maclean, and the 50th Anniversary edition of Alfred Hitchcock's North by Northwest. The back of each book contains a thick plastic insert attached to the inside of the rear cover to protect the disc. Both books contain full-color content that is well-presented and informative.

Why do I include these details in this database? To me, selling Blu-ray discs inside a book represents a notable physical example of the convergence of the book and electronic media. To a book collector this format is also superior and of greater interest than the standard Blu-ray plastic clamshell box. 

View Map + Bookmark Entry

Google's Computers in China Come Under Attack, Initiating a Review of the Company's Operations in China December 2009 – January 12, 2010

"Like many other well-known organizations, we face cyber attacks of varying degrees on a regular basis. In mid-December, we detected a highly sophisticated and targeted attack on our corporate infrastructure originating from China that resulted in the theft of intellectual property from Google. However, it soon became clear that what at first appeared to be solely a security incident--albeit a significant one--was something quite different.

"First, this attack was not just on Google. As part of our investigation we have discovered that at least twenty other large companies from a wide range of businesses--including the Internet, finance, technology, media and chemical sectors--have been similarly targeted. We are currently in the process of notifying those companies, and we are also working with the relevant U.S. authorities.  

"Second, we have evidence to suggest that a primary goal of the attackers was accessing the Gmail accounts of Chinese human rights activists. Based on our investigation to date we believe their attack did not achieve that objective. Only two Gmail accounts appear to have been accessed, and that activity was limited to account information (such as the date the account was created) and subject line, rather than the content of emails themselves.

"Third, as part of this investigation but independent of the attack on Google, we have discovered that the accounts of dozens of U.S.-, China- and Europe-based Gmail users who are advocates of human rights in China appear to have been routinely accessed by third parties. These accounts have not been accessed through any security breach at Google, but most likely via phishing scams or malware placed on the users' computers.  

"We have already used information gained from this attack to make infrastructure and architectural improvements that enhance security for Google and for our users. In terms of individual users, we would advise people to deploy reputable anti-virus and anti-spyware programs on their computers, to install patches for their operating systems and to update their web browsers. Always be cautious when clicking on links appearing in instant messages and emails, or when asked to share personal information like passwords online. You can read more here about our cyber-security recommendations. People wanting to learn more about these kinds of attacks can read this Report to Congress (PDF) by the U.S.-China Economic and Security Review Commission (see p. 163-), as well as a related analysis (PDF) prepared for the Commission, Nart Villeneuve's blog and this presentation on the GhostNet spying incident.

 "We have taken the unusual step of sharing information about these attacks with a broad audience not just because of the security and human rights implications of what we have unearthed, but also because this information goes to the heart of a much bigger global debate about freedom of speech. In the last two decades, China's economic reform programs and its citizens' entrepreneurial flair have lifted hundreds of millions of Chinese people out of poverty. Indeed, this great nation is at the heart of much economic progress and development in the world today.  

"We launched Google.cn in January 2006 in the belief that the benefits of increased access to information for people in China and a more open Internet outweighed our discomfort in agreeing to censor some results. At the time we made clear that 'we will carefully monitor conditions in China, including new laws and other restrictions on our services. If we determine that we are unable to achieve the objectives outlined we will not hesitate to reconsider our approach to China.'

"These attacks and the surveillance they have uncovered--combined with the attempts over the past year to further limit free speech on the web--have led us to conclude that we should review the feasibility of our business operations in China. We have decided we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down Google.cn, and potentially our offices in China" (http://googleblog.blogspot.com/2010/01/new-approach-to-china.html, accessed 01-16-2010).

View Map + Bookmark Entry

Google Announces Real-Time Search: "News as it Happens" December 2009

"First, we're introducing new features that bring your search results to life with a dynamic stream of real-time content from across the web. Now, immediately after conducting a search, you can see live updates from people on popular sites like Twitter and FriendFeed, as well as headlines from news and blog posts published just seconds before. When they are relevant, we'll rank these latest results to show the freshest information right on the search results page.  

"Try searching for your favorite TV show, sporting event or the latest development on a recent government bill. Whether it's an eyewitness tweet, a breaking news story or a fresh blog post, you can find it on Google right after it's published on the web. . .

"Our real-time search enables you to discover breaking news the moment it's happening, even if it's not the popular news of the day, and even if you didn't know about it beforehand. For example, in the screen shot, the big story was about GM's stabilizing car sales, which shows under 'News results.' Nonetheless, thanks to our powerful real-time algorithms, the 'Latest results' feature surfaces another important story breaking just seconds before: GM's CEO stepped down.

"Click on 'Latest results' or select 'Latest' from the search options menu to view a full page of live tweets, blogs, news and other web content scrolling right on Google. You can also filter your results to see only 'Updates' from micro-blogs like Twitter, FriendFeed, Jaiku and others. Latest results and the new search options are also designed for iPhone and Android devices when you need them on the go, be it a quick glance at changing information like ski conditions or opening night chatter about a new movie — right when you're in line to buy tickets.

"And, as part of our launch of real-time on Google search, we've added 'hot topics' to Google Trends to show the most common topics people are publishing to the web in real-time. With this improvement and a series of other interface enhancements, Google Trends is graduating from Labs.

"Our real-time search features are based on more than a dozen new search technologies that enable us to monitor more than a billion documents and process hundreds of millions of real-time changes each day. Of course, none of this would be possible without the support of our new partners that we're announcing today: Facebook, MySpace, FriendFeed, Jaiku and Identi.ca — along with Twitter, which we announced a few weeks ago" (http://googleblog.blogspot.com/2009/12/relevance-meets-real-time-web.html, accessed 05-06-2010).

View Map + Bookmark Entry

The Google Living Stories Project Begins December 8, 2009

On December 8, 2009 Google announced the Living Stories project, which provided a new, experimental way to consume news, developed through a partnership between Google, the New York Times, and the Washington Post.

"The announcement of the 'living stories' project shows Google collaborating with newspapers at a time when some major publishers have characterized the company as a threat. Google has also taken steps recently to project an image of itself as a friend to the industry. 

"Living stories is a much-enhanced version of what some newspaper Web sites already do by grouping material by subject matter. In the case of The Times, the paper’s Web site has thousands of “topic pages.” But those efforts have not yielded heavy reader traffic or much advertising.  

"The Google project, presented without ads, is now at livingstories.googlelabs.com, part of Google Labs, where the company tries out experimental products. If it is judged a success, it would eventually reside on the site of any publisher that wanted to use it. Those publishers could also sell ads on those pages.  

"Google’s dominant search engine sells ads alongside search results that often include news articles, leading some newspaper industry leaders — particularly executives of the News Corporation, led by Rupert Murdoch — to cry foul. Other publishers say that, on the contrary, they owe much of their Internet traffic and revenue to search engines.  

"Google executives argue that the tools their company has developed, including search, make them the papers’ ally, a case made by Eric E. Schmidt, Google’s chairman and chief executive, in an opinion piece published last week in The Wall Street Journal. Also last week, Google announced changes in the way its search function interacts with news sites, giving publishers more flexibility in limiting the material readers can see before encountering demands for payment or registration. The changes were relatively minor, but reinforced the message that the company wanted to help news sites.  

" 'There’s been a series of steps to work with and mollify news publishers, to improve the P.R., and you can see the living page in that same vein,' said Ken Doctor, a media analyst with the analysis firm Outsell. The project is a genuine step forward, he said, because 'on most news sites, site search, looking for a lot on one subject, is awful.'

"Google worked for months on the project with journalists and Web staffs at The Times and The Post. For now, it covers just eight broad topics, like health care reform and the Washington Redskins. At the top of each subject page is a summary, a timeline of major events and pictures, followed by the opening sections of a series of articles, in reverse chronological order. A set of buttons allows the reader to narrow the topic.  'It’s an experiment with a different way of telling stories,' said Martin A. Nisenholtz, senior vice president for digital operations of The New York Times Company. 'I think in it, you can see the germ of something quite interesting.'

"A reader can call up an entire article without navigating away from the subject page, reading one piece after another without using the 'forward' and 'back' buttons. Josh Cohen, business product manager for Google News, said that having all the material appear on a single page would help the page rank higher in Internet searches than newspapers’ subject pages do now.  

"In various ways, the experiment duplicates or improves on what can now be done on publishers’ own sites, through a search engine’s news function or even on Wikipedia. Mr. Cohen said that if it worked well, Google would make the software available free to publishers, much as those publishers now use Google Maps and YouTube functions on their sites" (http://www.nytimes.com/2009/12/09/technology/companies/09google.html?hpw).

View Map + Bookmark Entry

Google Introduces Google Goggles December 8, 2009

On December 8, 2009 Google introduced Google Goggles, an image recognition and search technology for the Android mobile device operating system. If you photographed certain types of objects with your mobile phone, the program would recognize them and automatically display links to relevant information on the Internet. If you pointed your phone at a building, the program would locate it by GPS, identify it, and, when you clicked on the building's name, bring up relevant Internet links.
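
Google published no technical details of the underlying pipeline with the launch, but the flow just described is easy to sketch. The Python sketch below is purely hypothetical; every function in it is a placeholder invented for illustration, not part of any actual Google API.

    # Hypothetical sketch of the two recognition paths described above:
    # direct image recognition, with a GPS landmark lookup as the fallback
    # for buildings. All functions here are invented placeholders.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Match:
        name: str            # e.g. a product, artwork, or building name
        links: List[str]     # relevant Internet links to display

    def recognize_object(photo: bytes) -> Optional[Match]:
        """Placeholder for server-side image recognition."""
        return None

    def nearest_landmark(lat: float, lon: float) -> Optional[Match]:
        """Placeholder for a GPS-based landmark lookup."""
        return None

    def goggles_query(photo: bytes, lat: float, lon: float) -> List[str]:
        # First, try to recognize the photographed object directly.
        match = recognize_object(photo)
        # For buildings, fall back to whatever landmark the phone's
        # GPS position points at.
        if match is None:
            match = nearest_landmark(lat, lon)
        return match.links if match else []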

♦ On May 7, 2010 you could watch a video describing the features of Google Goggles at this link:

http://www.google.com/mobile/goggles/#text

View Map + Bookmark Entry

The Film "Avatar" and Visions of Reality, Virtual and Otherwise December 10, 2009

Avatar, an American science fiction epic film written and directed by film director, producer, screenwriter, editor, and inventor James Cameron, and starring Sam Worthington, Zoe Saldana, Sigourney Weaver, Michelle Rodriguez and Stephen Lang, was first released in London on December 10, 2009 by Twentieth Century Fox, headquartered in Century City, Los Angeles.

"The film is set in the year 2154 on Pandora, a moon in the Alpha Centauri star system. Humans are engaged in mining Pandora's reserves of a precious mineral, while the Na'vi—a race of indigenous humanoids—resist the colonists' expansion, which threatens the continued existence of the Na'vi and the Pandoran ecosystem. The film's title refers to the genetically engineered bodies used by the film's characters to interact with the Na'vi.

"Avatar had been in development since 1994 by Cameron, who wrote an 80-page scriptment for the film. Filming was supposed to take place after the completion of Titanic, and the film would have been released in 1999, but according to Cameron, 'technology needed to catch up' with his vision of the film. In early 2006, Cameron developed the script, as well as the language and culture of the Na'vi. He said sequels would be possible if Avatar was successful, and in response to the film's success, confirmed that there will be another two.

"The film was released in traditional 2-D, as well as 3-D, RealD 3D, Dolby 3D, and IMAX 3D formats. Avatar is officially budgeted at $237 million; other estimates put the cost at $280–310 million to produce and $150 million for marketing. The film is being touted as a breakthrough in terms of filmmaking technology, for its development of 3D viewing and stereoscopic filmmaking with cameras that were specially designed for the film's production.

"Avatar premiered in London, UK on December 10, 2009, and was released on December 18, 2009 in the US and Canada to critical acclaim and commercial success. It grossed $27 million on its opening day domestically (in the United States and Canada) and $77 million domestically on its opening weekend. It opened two days earlier internationally and grossed $232 million worldwide in its first five days of international release. Within three weeks of its release, with a worldwide gross of over $1 billion, Avatar became the second highest-grossing film of all time worldwide, exceeded only by Cameron's previous film, Titanic" (Wikipedia article on Avatar (2009 film), accessed 01-16-2010).

♦ From my perspective the most significant aspect of Avatar, apart from its breathtaking computer graphic animation, and the fascinating artificial culture and language of the Na'vi, was the convincing portrayal of a total virtual reality experience, and the interplay between virtual reality, the reality of earth-born humans, some of whom animated the avatars, and the different reality of the Na'vi. The film presented visions of a reality that I could not have imagined before viewing. In its presentation of new views of reality it is reminiscent of the 1982 film, Blade Runner, directed by Ridley Scott.

Another highly timely aspect of the film was its depiction of the conflict between destructive exploitation of natural resources and living in harmony with nature.

View Map + Bookmark Entry

A French Alternative to Google Books is Formed December 17, 2009

Jean-Pierre Gérault, president of i2S, Pessac, France, announced the formation of a French consortium to scan the contents of French libraries. The project is called "Polinum," a French acronym that stands for "Operating Platform for Digital Books."

"French President Nicolas Sarkozy has made catching up on France's digital delay one of the national priorities by earmarking euro750 million of a euro35 billion ($51 billion) spending plan announced earlier this week for digitizing France's libraries, film and music archives and other repositories of the nation's recorded heritage. These funds will mainly go to French libraries, universities and museums, who will use them to develop their own plans for digitizing their holdings.  

"The consortium, meanwhile, intends to be the technological choice for those institutions, Gerault said. He declined to estimate what part of the euro750 million the consortium thinks it can capture. 

"France's culture ministry has been in difficult negotiations with Google, which would like to help digitize France's archives but has met resistance in France over fears of giving the internet search giant too much control over the nation's cultural heritage, as well as over how it would protect the interests of authors and other copyright holders" (http://www.businessweek.com/ap/financialnews/D9CL4M480.htm, accessed 12-17-2009).

View Map + Bookmark Entry

The Amazon Kindle is Hacked; eBook Digital Rights Management Cracked December 23, 2009

On December 23, 2009 it was announced that the Amazon Kindle was hacked, allowing for all purchased content to be transferred off the device via a PDF file. 

"Kindle e-books are sold as .AZW files which have DRM that stops users from transferring the purchased books to other devices that are not Kindles.

"That should no longer be a problem thanks to Israeli hacker "Labba" who has cracked the DRM. A second hacker, 'I

" 'Cabbages' did note that Amazon's DRM process was tough to crack, although ultimately Amazon's work was in vain. 'Amazon actually put a bit of effort behind the DRM obfuscation in their Kindle for PC application. And they seem to have done a reasonable job on the obfuscation. Way to go Amazon! It's good enough that I got bored unwinding it all and just got lazy with the Windows debugging APIs instead,' he said" (http://www.afterdawn.com/news/archive/20989.cfm#comments, accessed 01-02-2010).

Amusingly perhaps, or following the belief that all publicity is good publicity, Amazon.com had two advertisements for the Kindle on the web page publishing the above story.

View Map + Bookmark Entry

eBooks Begin to Outsell Physical Books; 1.49 Million Kindles Sold? December 27, 2009

According to an Amazon.com press release on December 27, 2009, the company sold more Kindle books (ebooks) for Christmas than physical books. At this time the company had over 390,000 titles available for wireless download on the Kindle. The Kindle 2, which weighed 10.2 ounces, could store up to 1,500 books. The larger Kindle DX could store approximately 3,500 non-illustrated books.

Since the company did not give out specific statistics, except to state that the Kindle was its best-selling product in the 2009 Christmas season, it is unclear whether the number of books "sold" included the vast number of free titles available for the Kindle. It is also plausible that, since the Kindle was the company's best-selling product, buyers ordered multiple titles for each Kindle.

"Two interesting factoids emerge from the marketing verbiage: First, Kindle books outsold paper books on Christmas Day, the first time that has ever happened; Second, the Kindle is the 'most gifted item ever in our history,' according to Bezos. The first may not mean much, since Christmas Day isn’t necessarily a normal shopping day, though the volume of Kindle books sold suggests that on that day a lot of new Kindle users started stocking up on e-books. The second, an aggregate figure that appears to reflect all gifted items over all time, may be very significant or mean absolutely nothing at all, as the increase in online shopping and gifting continues to dwarf previous 'record-setting' gift sales by the law of large(r) numbers.  

"Nevertheless, it is clear that this was the Kindle Christmas. During the third quarter of 2009, I estimated that Amazon sold 289,000 Kindles on sales growth of 60 percent year over year. We can assume, given the disappointing availability of most competitors, that Kindle grabbed a very large percentage of e-book reader sales this holiday season. However, it was also a poor Christmas overall, in terms of retails sales, even if Amazon did sell more stuff than ever before.  

"So, how many more Kindles sold between the end of the Q3 and Christmas Day? Extrapolating from previous quarters, and assuming this was a break-out sales season for Kindle, meaning that it more that doubled over the previous quarter, factoring in the sales of Kindle books versus paper books as Christmas gift cards were redeemed yesterday, I estimate Amazon sold 419,000 Kindles in the fourth quarter, or 145 percent of the sales in Q3.

"That would make the total number of Kindles sold to date 1,491,000. Kindle now represents approximately 65 percent of the hardware reader market despite the appearance of Barnes & Noble’s Nook, which may reach 30,000 units in the quarter because of delays" (http://blogs.zdnet.com/Ratcliffe/?p=486, accessed 12-0-2013).

View Map + Bookmark Entry