93,000,000,000,000,000
calculations per second performed by the world’s fastest
supercomputer TaihuLight (as of November 2017).
Supercomputers
Supercomputers are already lightning-fast analyzers today. Now the next great breakthrough is just around the corner. And the potential is enormous.
The answer to the ultimate question of life, the universe and everything is actually very simple: it is 42. It took Deep Thought, the supercomputer in the science fiction novel “The Hitchhiker’s Guide to the Galaxy,” 7.5 million years to calculate it.
Unlike the machine in a 40-year-old work of fiction, today’s powerful computers provide usable results. In chemistry, for example, they help in molecular simulation for finding new active agents. They make water and energy supplies more efficient and are important helpers in predicting epidemics and earthquakes or in diagnosing illnesses. For example, oncologists in Japan were groping in the dark in the case of a 60-year-old woman until they enlisted IBM’s Watson. This supercomputer required just 10 minutes to compare the data of the sick woman’s diagnosis against millions of cancer studies to find an extremely rare type of leukemia. The doctors adjusted their therapy and the woman was treated successfully with the help of “Dr. Watson.”
Calculations per second (FLOPS)
1941 Konrad Zuse’s Z3, Germany, the world’s first functioning digital computer: 2
1946 ENIAC, USA, the first electronic universal computer: 500
1964 CDC 6600, USA, the first supercomputer: 3,000,000
1984 M-13, Soviet Union, the first computer in the gigaflops range: 24,000,000,000
2017 Sunway TaihuLight, China, fastest computer to date: 93,000,000,000,000,000
(as of November 2017)
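The scale of the leap in the timeline above can be checked with a few lines of Python (a back-of-the-envelope sketch; the values are taken directly from the list):

```python
# FLOPS milestones from the timeline above: from Konrad Zuse's Z3
# (2 FLOPS, 1941) to the Sunway TaihuLight (93 petaflops, 2017).
timeline = {
    1941: 2,                       # Z3
    1946: 500,                     # ENIAC
    1964: 3_000_000,               # CDC 6600
    1984: 24_000_000_000,          # M-13
    2017: 93_000_000_000_000_000,  # Sunway TaihuLight
}

growth = timeline[2017] / timeline[1941]
print(growth)  # → 4.65e+16
```

In 76 years, peak computing speed grew by a factor of roughly 4.65 × 10¹⁶.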
Supercomputers that achieve top computing power with several thousand processors could play a key part in meeting the challenges of the future. “We are facing changes that will prove to be revolutionary,” the U.S. computer science professor and supercomputer expert Thomas Sterling predicts. Thanks to their computing power, Sterling places supercomputers on a par with innovations that have given a decisive impetus to human development, such as the discovery of fire. Competition in the market is correspondingly fierce. China and the United States, in particular, are engaged in a race among high-performance computers.
The world’s first supercomputer came onto the market in 1964 in the United States, in the form of the CDC 6600. The Americans dominated the scene for many years, but recently computers from China have made their way to the top. At 93 petaflops – that’s 93,000,000,000,000,000 calculations per second – the Sunway TaihuLight is by far the fastest supercomputer (as of November 2017). “With its help, complex climate models, for example, can be calculated nearly a hundred times faster than by a computer capable of one petaflops, which would need a year for the task. This adds a whole new dimension to the fight against climate change,” Sterling says. The Sunway is followed by the Tianhe-2, which still has almost twice the computing power of the third-placed Piz Daint from Switzerland. The fastest U.S. computer, Titan, is fifth.
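Sterling’s “nearly a hundred times faster” follows directly from the raw FLOPS figures. A quick sketch of the arithmetic, assuming perfect scaling (which real climate codes never quite achieve):

```python
PETA = 10**15  # one petaflops = 10^15 calculations per second

taihulight_flops = 93 * PETA  # Sunway TaihuLight (November 2017 ranking)
baseline_flops = 1 * PETA     # the 1-petaflops machine from the quote

speedup = taihulight_flops / baseline_flops
print(speedup)  # → 93.0

# One year of work on the 1-petaflops machine would then take roughly:
days = 365 / speedup
print(round(days, 1))  # → 3.9  (days)
```

A year-long climate simulation shrinks to under four days – hence “a whole new dimension” for climate research.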
1951
Marvin Minsky builds the first neurocomputer, SNARC.
1956
Scientists present the first AI program, Logic Theorist.
1972
Introduction of Mycin expert system for the diagnosis and treatment of infectious diseases.
1994
First test of autonomous automobiles on German roads.
1997
The Deep Blue computer beats the reigning world champion Garry Kasparov at chess.
2011
IBM brings the powerful AI computer Watson onto the market.
2017
The Libratus software beats four world-class players at poker.
But performance rankings often involve great simplifications. High computing power alone does not help with every scientific question. The size of the memory also plays a big part – and above all the programming. Nevertheless, computing power is a major requirement for these super brains to exploit their abilities to the full. For this reason, researchers around the world are already working on the next stage of supercomputers: the exascale computer. With a capacity of 1,000 petaflops, it will be able to perform one quintillion – that is, 10 to the power of 18 – computing operations per second. China says it has already begun building a prototype, with the United States following suit. To avoid falling behind, the U.S. Department of Energy announced $258 million this summer to support companies in advancing the exascale computer over the next three years.
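The exascale arithmetic above amounts to a simple unit conversion, which can be verified in a couple of lines (a unit-check sketch, nothing more):

```python
PETA = 10**15  # petaflops: 10^15 operations per second
EXA = 10**18   # exaflops: 10^18 operations per second ("one quintillion")

exascale_flops = 1000 * PETA  # the planned capacity: 1,000 petaflops
assert exascale_flops == EXA

# Relative to the 93-petaflops Sunway TaihuLight:
print(round(exascale_flops / (93 * PETA), 1))  # → 10.8  (times faster)
```

So the exascale goal is roughly an order of magnitude beyond the fastest machine of 2017.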
Meanwhile, the European Union, which has so far been lagging some way behind, is likewise planning to invest heavily in breaking the exascale barrier by 2022, according to Andrus Ansip, the Commissioner for the Digital Single Market. The E.U. estimates that €5 billion will be needed for this. At present, E.U. states are much too reliant on the computing power of supercomputers based in, for example, China and the United States. For instance, as recently as spring 2017, E.U. industry provided only around 5 percent of the power of high-performance computers but used one-third of global resources. Japan, too, is getting involved in this catch-up race and is aiming to top the supercomputer league as early as 2018 with its AI Bridging Cloud Infrastructure.
The digital transformation is making ever greater advances and permeating the value chains of industry.
Here are some examples.
“Especially in the natural sciences, powerful supercomputers are already indispensable for simulating molecular processes one-to-one,” says the German philosopher of science and expert in artificial intelligence (AI), Professor Klaus Mainzer. Out of the many possible combinations of molecular building blocks, they help to single out those that offer the prospect of surprising discoveries and new products. The supercomputer is capable of learning, and it performs an initial selection so that only the most promising substances find their way into the laboratory. Accordingly, since fall 2017 BASF has relied on just such a powerful digital helper to run virtual experiments and answer complex questions, shortening the time taken to obtain usable results from several months to a few days. (see BASF supercomputer QURIOSITY)
“The challenging problems in the field of chemistry could become drivers for supercomputing,” Sterling believes. In his view, they could contribute to investigating the critical boundaries of technology – and how to overcome them. The bottlenecks between processors and memory pose increasing problems to the industry. These become more serious as the masses of data which have to be shifted around, in simulations for example, grow larger. “This bottleneck in traditional von Neumann computer architecture needs to be eliminated,” Sterling says. A new way of thinking is required to bring together computing and memory operations in a smart way. Another technology has already assimilated the elementary logic by which chemical processes work: the quantum computer, which could open up new horizons of knowledge. The next dimension of super brains – thinking in several states at once – is in the starting blocks.
Next part: Quantum computers