A: Livermore, California, United States
The ASCI White supercomputer at the Lawrence Livermore National Laboratory in California became operational on June 29, 2000. An IBM system, it covered a space the size of two basketball courts and weighed 106 tons. It contained six terabytes (6 TB) of memory, almost 50,000 times that of the average personal computer at the time, and had more than 160 TB of Serial Disk System storage capacity, enough to hold six times the information stored in the 29 million books in the Library of Congress.
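As a rough back-of-the-envelope check of those comparisons (not a calculation from the original source), the sketch below assumes that an average year-2000 personal computer held about 128 MB of RAM; under that assumption the memory ratio comes out near 49,000, consistent with the "almost 50,000 times" figure, and the Library of Congress comparison implies on the order of one megabyte per book.

```python
# Back-of-the-envelope check of the comparisons quoted above.
# Assumption (not from the source): a typical year-2000 PC held about 128 MB of RAM.

ASCI_WHITE_MEMORY_BYTES = 6 * 2**40      # 6 TB of main memory
TYPICAL_PC_MEMORY_BYTES = 128 * 2**20    # assumed 128 MB average PC

ratio = ASCI_WHITE_MEMORY_BYTES / TYPICAL_PC_MEMORY_BYTES
print(f"Memory ratio vs. an assumed 128 MB PC: about {ratio:,.0f}x")    # ~49,000x

STORAGE_BYTES = 160 * 2**40              # more than 160 TB of disk storage
LIBRARY_OF_CONGRESS_COPIES = 6           # per the quoted comparison
bytes_per_book = STORAGE_BYTES / LIBRARY_OF_CONGRESS_COPIES / 29_000_000
print(f"Implied size per book: about {bytes_per_book / 2**20:.1f} MB")  # ~1 MB per book
```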
♦ In December 2013 I decided that the ASCI White would be the last supercomputer documented in HistoryofInformation.com. Supercomputers are valued chiefly for their ability to perform the most complex calculations, and without the time, the space, and the ability to explain such calculations, describing the ever-advancing magnitudes of supercomputers seemed beyond the scope of this project. Readers can follow the development of supercomputers through the Wikipedia article on the supercomputer and through other websites, such as the TOP500 project's twice-yearly ranking of the world's supercomputers. To review progress to 2000 and a bit afterward, I quote the section on Applications of Supercomputers from the Wikipedia article as it read in December 2013:
"Applications of supercomputers
"The stages of supercomputer application may be summarized in the following table:
| Decade | Uses and computer involved |
|---|---|
| 1970s | Weather forecasting, aerodynamic research (Cray-1). |
| 1980s | Probabilistic analysis, radiation shielding modeling (CDC Cyber). |
| 1990s | Brute force code breaking (EFF DES cracker) |
| 2000s | 3D nuclear test simulations as a substitute for legal conduct Nuclear Non-Proliferation Treaty (ASCI Q). |
| 2010s | Molecular Dynamics Simulation (Tianhe-1A) |
"The IBM Blue Gene/P computer has been used to simulate a number of artificial neurons equivalent to approximately one percent of a human cerebral cortex, containing 1.6 billion neurons with approximately 9 trillion connections. The same research group also succeeded in using a supercomputer to simulate a number of artificial neurons equivalent to the entirety of a rat's brain.
"Modern-day weather forecasting also relies on supercomputers. The National Oceanic and Atmospheric Administration uses supercomputers to crunch hundreds of millions of observations to help make weather forecasts more accurate."