How scientists modeled a deadly tornado with an insanely powerful computer

Supercell thunderstorms are giant tempests with powerful rotating updrafts at their cores, and roughly one in every four or five spawns a tornado. Most of these twisters are small, but some grow fierce. To predict the rare killers, and thus issue more targeted warnings, meteorologists need to better understand how tornadoes form. But simulating a supercell thunderstorm and the tornado it produces involves hundreds of terabytes of data, an amount so vast that Leigh Orf, an atmospheric scientist at the University of Wisconsin at Madison, had to use a supercomputer to make it happen.

Some of that data came from the sheer size of the storm (similar supercells can stretch more than 12 miles high). But Orf needed most of the power to capture all the details and see the whole system at high resolution. To get started, he used observations from an actual storm that raged through central Oklahoma in 2011. Then he created a digital version closely resembling the real thing, spinning up the highest-resolution supercell simulation ever made. “For the first time, we’ve been able to peer into the inner workings of a supercell that produces a tornado, and we’re able to watch that process occur,” Orf says.

Here’s how the simulation breaks down, by the numbers:

Number of data points: 1,839,200,000

To see the digital storm in as high a resolution as possible, Orf divided the virtual space into nearly 2 billion pieces, most of them cubes about 100 feet on a side. In each of these cells, the supercomputer simulated factors like wind speed and direction, temperature, barometric pressure, humidity, and precipitation.
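For a rough sense of why a grid like that demands a supercomputer, here is a minimal back-of-the-envelope sketch in Python. The 32-bit storage format and the count of six variables per grid point are illustrative assumptions, not details of Orf's actual setup:

# Back-of-the-envelope scale of one saved snapshot of the grid.
# Assumptions (not from the article): 32-bit floats, six variables per point.
grid_points = 1_839_200_000   # data points reported for the simulation
bytes_per_value = 4           # one 32-bit float per variable per point
variables = 6                 # e.g., three wind components, temperature, pressure, humidity
snapshot_bytes = grid_points * bytes_per_value * variables
print(f"~{snapshot_bytes / 1e9:.0f} GB per saved time step")   # roughly 44 GB

Saving many such snapshots over the life of the storm is how the output climbs into the hundreds of terabytes.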

Number of supercomputer cores used: 20,000

Simulating all those pieces required a massive amount of computing power, though it was only a small fraction of the roughly 800,000 cores, or processing components, that the University of Illinois’ Blue Waters supercomputer has to offer. Orf used the rough equivalent of 1,250 Mac Pros.
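The arithmetic behind those comparisons is simple; the sketch below assumes a 16-core machine for the Mac Pro comparison, a figure implied by the article's numbers rather than stated outright:

# Quick check on the core-count comparisons.
cores_used = 20_000
cores_available = 800_000
print(f"{cores_used / cores_available:.1%} of Blue Waters")            # 2.5%
cores_per_mac_pro = 16   # assumption implied by 20,000 / 1,250
print(f"~{cores_used // cores_per_mac_pro:,} Mac Pro-class machines")   # ~1,250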

Approximate computing time, in hours: 30

Although the actual calculations took Blue Waters less than a standard workweek, Orf had been working toward a simulation like this one since 2012. The run produced 400 terabytes of data—enough to fill more than 3,000 iPhones. It’s also the most detailed tornado model ever made. “We can see everything going on inside it,” Orf says.
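A quick check of that storage comparison, assuming 128 GB iPhones (the capacity is an assumption, not something the article specifies):

# Sanity check on the "more than 3,000 iPhones" comparison.
data_tb = 400
iphone_capacity_gb = 128   # assumed capacity; not given in the article
print(f"~{data_tb * 1000 // iphone_capacity_gb:,} iPhones")   # ~3,125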

Estimated maximum wind speed, in mph: 210

The twister that inspired this simulation struck on May 24, 2011. Its parent storm began as a supercell thunderstorm, started rotating, and ultimately birthed an EF-5 tornado, the most powerful category. For nearly two hours, the twister carved a path 63 miles long and up to a mile wide. Along the way, it ripped the bark off trees, tossed cars, injured 181 people, and killed nine.

Volume of the model storm, in cubic miles: 69,750

To mimic the conditions of the 2011 storm, Orf set the simulation in a three-dimensional block of virtual space measuring about 75 miles long by 75 miles wide by 12.4 miles high. He kick-started the digital tempest by creating an updraft in the system. From there, the computer followed the laws of physics until a twister formed.
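The headline volume follows directly from those dimensions:

# The stated volume is just the product of the domain dimensions.
length_mi, width_mi, height_mi = 75, 75, 12.4
print(f"{length_mi * width_mi * height_mi:,.0f} cubic miles")   # 69,750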
