Supercomputer to study the devastating U.S. tornadoes

Simulated storm. Illustration from the National Institute for Computational Sciences.

According to estimates by the National Weather Service and the National Oceanic and Atmospheric Administration (NOAA), the tornadoes of April 27-28 killed 344 people in the southeastern United States. The last time such a death toll was recorded over so short a period was in 1936; the material damage exceeded one billion dollars. Researchers from the School of Computer Science at the University of Oklahoma have begun using the Kraken supercomputer to understand the nature of these devastating tornadoes. The project is funded by the U.S. National Science Foundation, reports the website of the National Institute for Computational Sciences at the University of Tennessee.

The announcement of the new project notes that collecting data on real tornadoes is a very risky business. Storm chasers pursuing the elements risk their lives, yet their mobile radars can measure only a few of a tornado's parameters, such as wind speed or rainfall intensity. To understand the nature of tornadoes, and perhaps even learn to predict them, researchers need data that are beyond the reach of storm chasers: atmospheric pressure readings and the three-dimensional structure of the wind. For that, scientists need many more tornadoes than actually occur.

"I do not need three, I need three hundred [tornado]," — says Amy McGovern (Amy McGovern), professor (assistant professor) School of Computer Science of the University of Oklahoma, where a tornado — a frequent occurrence. In his project, scientists use the Kraken supercomputer at the University of Tennessee — the eighth-speed performance in the world. McGovern team uses ground-based observations, and various weather monitoring systems to bring together all of the variables to describe the conditions under which may or may not be a tornado.

The idea for the project arose five years ago. At that time, according to McGovern, the members of her team used 20 years of storm-observation data to identify 40 indicators and reconstructed more than 250 virtual storms at a resolution of 500 meters, meaning that all 40 variables were sampled every 500 meters. The team quickly realized that a much higher resolution would be needed to achieve the required precision. Such calculations became possible thanks to Kraken, a Cray XT5 supercomputer whose components cover more than 200 square meters. It is housed at Oak Ridge National Laboratory, the most powerful computing center in the world, and is managed by the National Institute for Computational Sciences at the University of Tennessee. Kraken's operation is funded by the U.S. National Science Foundation.
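To get a feel for why sharpening the resolution is so expensive, here is a rough back-of-the-envelope sketch. Only the grid spacings (500 m and 75 m) and the 40 variables come from the article; the domain size is an assumption made purely for illustration.

```python
# Rough estimate of the data volume of one virtual-storm snapshot.
# The 500 m / 75 m spacings and 40 variables are from the article;
# the 100 km x 100 km x 20 km domain is an assumed, illustrative size.

def grid_values(domain_km=(100, 100, 20), spacing_m=500, variables=40):
    """Number of stored values for one snapshot of a simulated storm."""
    nx, ny, nz = (int(d * 1000 / spacing_m) for d in domain_km)
    return nx * ny * nz * variables

coarse = grid_values(spacing_m=500)  # the original 500 m runs
fine = grid_values(spacing_m=75)     # the later 75 m runs

print(f"500 m grid: {coarse:,} values per snapshot")
print(f" 75 m grid: {fine:,} values per snapshot")
print(f"ratio: {fine / coarse:.0f}x")
```

Under these assumptions, going from 500 m to 75 m multiplies the data volume by roughly three hundred, which is why such runs were out of reach before Kraken.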

Amy McGovern's team has "produced" 150 storms, possible precursors of tornadoes, at a resolution of just 75 meters. Each computer simulation yields 3.2 hours of simulated "storm" and requires 30 hours on 3,000 of Kraken's more than 112 thousand cores. According to McGovern, about 50-75 of the storms that arise in this virtual world turn into tornadoes, providing researchers with new data that may allow them to solve the mystery of one of the most destructive natural phenomena.
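Taking the article's per-run figures at face value (30 hours on 3,000 cores, 150 planned runs), the total compute budget is simple arithmetic; the aggregation below is our own sketch, not a figure from the project.

```python
# Back-of-the-envelope compute budget for the tornado campaign.
# Per-run figures (30 hours, 3,000 cores, 150 runs) are from the
# article; only the aggregation is added here.

cores_per_run = 3_000
hours_per_run = 30
runs_planned = 150

core_hours_per_run = cores_per_run * hours_per_run
total_core_hours = core_hours_per_run * runs_planned

print(f"{core_hours_per_run:,} core-hours per storm")
print(f"{total_core_hours:,} core-hours for all {runs_planned} storms")
```

That comes to 90,000 core-hours per storm and 13.5 million core-hours for the full campaign, an allocation only a machine of Kraken's class can absorb.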

The researchers hope that their work will help significantly reduce the number of false tornado warnings, which currently stands at almost 75 out of 100, and give people more time to prepare (at present, warnings come only 12-14 minutes before a tornado strikes). "If we can understand how tornadoes form, it may lead to the development of more efficient algorithms that predict tornadoes," McGovern says. So far the scientists have completed 30 of the 150 planned simulations. With Kraken's recent upgrade to a peak performance of 1.17 petaflops (1 petaflop = 1,000 teraflops = one quadrillion, or 10^15, floating-point operations per second), they have been able to move forward quickly. The data-processing algorithms they are developing can be applied in fields ranging from the study of atmospheric turbulence to robotics.
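The unit conversion in the parenthesis can be checked directly; this trivial sketch restates the article's 1.17-petaflop figure in other units.

```python
# Kraken's quoted peak of 1.17 petaflops, expressed in other units.
PETA = 10 ** 15  # 1 petaflop = 10^15 floating-point operations per second
TERA = 10 ** 12  # 1 teraflop = 10^12 floating-point operations per second

peak_flops = 1.17 * PETA
print(f"{peak_flops:.2e} flop/s")
print(f"{peak_flops / TERA:,.0f} teraflop/s")
```

The output confirms the stated equivalence: 1.17 petaflops is 1,170 teraflops.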
