Metadata only
Date
2021
Type
Monograph
ETH Bibliography
yes
Abstract
Supercomputing refers to exclusive processing at the outer limit of computability – controversial, politically attractive, and very expensive.
David Gugerli and Ricky Wichum explore the development of supercomputing in Stuttgart since 1970, and the surprising twists, operational crises, and new technologies it entailed. For example, who would have expected that the expansion of Stuttgart’s computing center in the 1970s would be capped off with the installation of an outdated supercomputer? Or that the spectacular acquisition of the world’s fastest computer in the 1980s would be followed by a years-long quest for users and suitable forms of operation? When, in the 1990s, the Internet made global connectivity possible, Stuttgart was at the forefront, flaunting its dominance in a display of transatlantic experiments. Yet the practical question of what to do with supercomputing was ultimately decided at home, in Germany. Properly managing “users” and extending services to Europe occupied much of the 2000s. By then, previously unanticipated limits to growth had become apparent in the hardware.
Told from a history-of-technology perspective, this study shows that productive supercomputing requires the constant reconfiguring of computers, science, industry, and policy.
Publication status
published
Publisher
Chronos
Organisational unit
03486 - Gugerli, David / Gugerli, David
02803 - Collegium Helveticum / Collegium Helveticum
Related publications and datasets
Is variant form of: http://hdl.handle.net/20.500.11850/466671