Computing Power Grows at Quake Center
Largest grant of supercomputing time from the National Science Foundation will vastly improve the scope of earthquake simulations at USC.

By Carl Marziali
July 1, 2007
Figuring out what the next big quake will do to Los Angeles takes a lot of computing power, and the Southern California Earthquake Center is about to get it in spades.
The National Science Foundation has awarded the center more than 15 million service units, valued at close to $20 million and roughly equivalent to running 2,000 desktop computers continuously for a year.
By comparison, the next largest award in this round totaled four million units.
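The comparison to 2,000 desktop computers can be checked with simple arithmetic, if one assumes a service unit corresponds to roughly one processor-hour (an assumption; the article does not define the unit):

```python
# Back-of-the-envelope check of the award's size, assuming one service unit
# is roughly one processor-hour (an assumption not stated in the article).
desktops = 2_000
hours_per_year = 24 * 365          # 8,760 hours in a year
processor_hours = desktops * hours_per_year
print(f"{processor_hours:,} processor-hours")  # 17,520,000
```

That works out to about 17.5 million processor-hours, in line with the "more than 15 million service units" figure.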
“It’s the biggest allocation, as far as we know, of any group in the country,” said Thomas Jordan, USC University Professor and director of the Southern California Earthquake Center.
The center will use the time, spread among nine networked supercomputers, to improve the detail and scope of its massive simulations of earthquakes in Southern California.
The new simulations will come one step closer to the realism needed to influence building codes and city planning.
“SCEC will be able to simulate the shaking from the largest and potentially most disastrous earthquakes, such as magnitude eight events on the San Andreas Fault that could produce Katrina-scale disasters,” Jordan said.
Those would be twice as powerful as the biggest earthquakes simulated by the center to date.
Current simulations also are limited to low-frequency waves, said Philip Maechling, an information technology architect at the center. Because low-frequency waves are very long, they mainly affect tall structures, such as high-rises.
The waves that wreak the most damage to smaller buildings and homes are many times higher in frequency.
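The link between wave frequency and building size can be illustrated with a common structural-engineering rule of thumb, which puts a building's natural period near 0.1 seconds per story. This heuristic is not from the article; it is a standard approximation used here only to make the point concrete:

```python
# Rule-of-thumb sketch (not from the article): a building's natural period
# is roughly 0.1 s per story, so its resonant frequency is about 10/N Hz.
# Buildings shake hardest when ground motion matches that frequency.
def resonant_frequency_hz(stories: int) -> float:
    period_s = 0.1 * stories   # approximate natural period in seconds
    return 1.0 / period_s

print(resonant_frequency_hz(50))  # 50-story high-rise: ~0.2 Hz (low frequency)
print(resonant_frequency_hz(2))   # 2-story house: ~5 Hz (much higher frequency)
```

By this heuristic, a high-rise responds to the low-frequency waves current simulations capture, while a house responds to frequencies many times higher.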
“We want to raise the shaking frequencies in our simulations so that they can be used to predict what might happen to more and more buildings,” Maechling said.
Simulating such waves will require hundreds to thousands of times more computing power, not to mention a more detailed database of soil densities in the Los Angeles basin.
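The "hundreds to thousands of times" figure is consistent with a standard scaling argument for grid-based wave simulations: resolving waves of twice the frequency requires halving the grid spacing in all three dimensions and halving the timestep, so the work grows roughly as the fourth power of frequency. The exact exponent is an assumption about the simulation method, not a figure from the article:

```python
# Standard scaling sketch (an assumption about the method, not from the
# article): doubling the resolved frequency halves the grid spacing in
# three spatial dimensions and halves the timestep, so compute cost
# scales roughly as frequency ** 4.
def cost_multiplier(freq_ratio: float) -> float:
    return freq_ratio ** 4

print(cost_multiplier(4))  # 4x the frequency -> 256x the work
print(cost_multiplier(6))  # 6x the frequency -> 1,296x the work
```

Raising the simulated frequency by a factor of four to six thus multiplies the cost by hundreds to over a thousand, matching the scale of the new allocation.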
By harnessing the National Science Foundation’s supercomputers and using historical earthquake data to infer ground conditions, researchers at the Southern California Earthquake Center hope to produce simulations that will save lives and property.
“It is a very socially relevant use of the national supercomputer facilities,” Maechling said.
And tailor-made for them: The SCEC won the grant in part because it was able to show that it could make efficient use of the hundreds of millions of dollars in new machines that the NSF plans to buy.
“With these computational resources, we will be able to simulate thousands of possible fault-rupture scenarios in Southern California, including the largest breaks on the San Andreas.”
In addition to the NSF network, the Southern California Earthquake Center also plans to use the resources of the USC Center for High-Performance Computing and Communications, home to the nation’s eleventh-fastest academic supercomputer.