The ground beneath our feet is always rumbling. Even in the world’s most seismically active regions, however, most earthly shudders tend to pass unnoticed.
But now, with the help of a new computer algorithm that finds patterns in existing data, a team of researchers has expanded the number of Southern California earthquakes archived in the past decade by a factor of 10. Their final tally, published today in the journal Science, comes to 1.81 million events over that span, averaging out to an earthquake every 174 seconds.
Many of the new additions are seismic pipsqueaks, shaking with all the might of a quivering coffee table. But bolstering the geologic record could give scientists fresh perspectives on how earthquakes behave and interact.
“This research helps us see all the texture and details in the Earth’s crust that we weren’t able to see before,” says Katherine Scharer, a paleoseismologist at the United States Geological Survey who was not involved in the study. “We’ve just been handed a new set of tools to find out what causes big earthquakes, and how the crust responds to these changes.”
Today, more than 500 different seismic monitoring stations speckle the arid landscape of Southern California, listening for shakes and trembles in the earth below. Between 2008 and 2017, this vast network detected a whopping 180,000 earthquakes rippling through the southern half of the state—some 50 events per day.
But scientists have long known that these are just the tip of the seismic iceberg.
Though high-magnitude earthquakes tend to hog the limelight, these catastrophic events are pretty uncommon compared to milder temblors. And trying to understand what’s going on in the Earth’s crust with just rare, cataclysmic events is as futile as recreating a symphony with only the clash and clang of the orchestra’s loudest cymbals.
But lower-magnitude tremors are, by nature, trickier to detect. Earthquakes operate on a logarithmic scale: Each one-unit decrease in magnitude (for instance, going from 4.0 to 3.0) corresponds to a 10-fold drop in amplitude, or ground motion, as well as a roughly 31-fold dip in energy released. Because the scale is logarithmic rather than starting from zero, quakes can even register at magnitudes of 0 and below.
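That scaling is simple enough to check by hand: the amplitude ratio between two quakes is 10 raised to their magnitude difference, and the energy ratio is 10 raised to 1.5 times that difference, which is where the roughly 31-fold figure comes from. A minimal sketch of the arithmetic:

```python
# Sketch of how earthquake magnitude relates to shaking and energy.
# A one-unit drop in magnitude means 10x less ground motion and
# 10**1.5 (about 31.6x) less radiated energy.

def amplitude_ratio(dm):
    """Ground-motion amplitude ratio for a magnitude difference dm."""
    return 10 ** dm

def energy_ratio(dm):
    """Radiated-energy ratio for a magnitude difference dm."""
    return 10 ** (1.5 * dm)

# Going from magnitude 4.0 down to 3.0 (a difference of 1.0):
print(amplitude_ratio(1.0))            # 10.0
print(round(energy_ratio(1.0), 1))     # 31.6

# A magnitude -1 quake shakes 1,000 times less than a magnitude 2.0 quake:
print(amplitude_ratio(3.0))            # 1000.0
```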
Seismic monitors are pretty good at picking up geologic grumbles that run above a magnitude of about 2.0 (barely enough to wobble a creaky building). Below this level, however, movements in the Earth’s crust get jumbled up with vibrations from other sources, like construction, air traffic, trains, and even the ocean. In a sense, the orchestra’s more modest melodies are still being recorded—they’re just getting drowned out by the din of a chatty crowd.
“You might think that since you don’t feel small earthquakes, they’re not very useful,” says study author Daniel Trugman, a seismologist at the Los Alamos National Laboratory. “But they really tell us a lot about the physical processes that are hidden beneath our feet.”
To identify earthquakes masked by the cacophony, a team of seismologists led by Zachary Ross of Caltech decided to take a computational approach. Even minor earthquakes tend to look and behave in similar ways. And when seismic waves are picked up by monitors, they tend to leave the same telltale traces of their passing. This means earthquakes have something of a seismic fingerprint—one that could be used to tease similarly shaped events out of noisy data.
And so the researchers trained a computer program to do just that. Using templates based on nearly 284,000 past earthquakes, their algorithm scoured 10 years of data banked by the Southern California Seismic Network (SCSN) for pint-sized versions of the same patterns.
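The core idea, known as template matching or matched filtering, can be sketched in a few lines: slide a known earthquake waveform (the template) along a continuous recording and flag the spots where the normalized cross-correlation spikes. The toy version below, built on synthetic data, illustrates the general technique only; it is not the team's actual pipeline, which ran on real seismograms at a vastly larger scale.

```python
import numpy as np

def normalized_cross_correlation(record, template):
    """Slide the template along the record and return a Pearson
    correlation coefficient (between -1 and 1) at each offset."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    ccs = []
    for i in range(len(record) - n + 1):
        window = record[i:i + n]
        ws = window.std()
        if ws == 0:
            ccs.append(0.0)
            continue
        wn = (window - window.mean()) / ws
        ccs.append(float(np.dot(t, wn)) / n)
    return np.array(ccs)

# Synthetic example: a small "quake" wavelet buried in noise.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 6 * np.pi, 50)) * np.hanning(50)
record = rng.normal(0, 0.3, 500)
record[200:250] += template  # hide a copy of the wavelet at sample 200

cc = normalized_cross_correlation(record, template)
print(int(np.argmax(cc)))  # correlation peaks near offset 200
```

A high correlation peak marks a candidate event with the same waveform shape as the template, even when the signal itself sits well below the noise floor.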
Scharer compares the technique to a seismic Shazam app—one that can test a musical snippet against a database of songs, hunting for a perfect match.
With such a vast repository of seismic waves in the archives, the researchers had to wrangle about 100 terabytes of data. Even backed by the computational firepower of a giant computer cluster with 200 processing units, each iteration of the program still took weeks to run its course. “If you make one little mistake, it sets you back a month,” Trugman says.
But the effort paid off: In went the data, and out came more than 1.6 million previously undocumented earthquakes. While the original catalog was rife with gaps below magnitude 2.0, the team’s calculations show their new-and-improved version is nearly complete down to 0.3. Several quakes even registered at values close to -1—that’s “like dropping a bag of groceries on the floor,” Scharer says. “It’s amazing they were able to pull that kind of detail out.”
Eileen Martin, a computational scientist specializing in geophysics at Virginia Tech who was not involved in the study, also calls the more comprehensive catalog impressive. “This really has a big payoff in our ability to observe the world,” she says.
And revising the overall numbers of events was just the beginning. With the subtleties of Southern California’s seismic symphony fleshed out, the team was also able to reassess the nature of past seismic events.
Earthquakes often occur in rapid succession without warning or a clear source—but in one case, the revamped archives unveiled over 30 tiny quakes preceding a recorded swarm. Seismic “breadcrumb trails” like this could someday help researchers figure out how these tectonic collusions are triggered, Scharer says.
And just as earthquakes can be preceded by subdued shaking, they can also be followed by a series of seismic waves—and these echoes can reverberate far and wide. In 2010, a magnitude 7.2 earthquake rocked Baja California, sending ripples through the landscape that were detected more than 100 miles away. But the new algorithm shows that even this was an underestimation: The initial shock’s true reach was actually closer to 200 miles.
Finally, because more seismically active spots tend to match up with fault lines, the more complete catalog also yielded clearer maps of areas like the San Jacinto Fault Zone—one of the most volatile regions in the San Andreas Fault. Delineating these types of geological boundaries could help with everything from city planning to simulating the reach of future earthquakes.
Though the researchers have so far only focused their work on Southern California, there’s no reason the algorithm can’t be applied to other high-stakes places like Japan and Italy. Because of region- and instrument-specific quirks, templates generated in California probably won’t translate over to other locations. But anywhere with a good network of monitors can probably fine-tune the approach to match its own unique seismic soundtrack, Ross says.
At the same time, because the crux of the method rests on a reliable seismic fingerprint, places that lack thorough earthquake archives and instrument coverage probably can’t benefit as much from this technology, says Elizabeth Vanacore, a geophysicist at the University of Puerto Rico at Mayagüez who was not involved in the study.
But those newer to the seismometer scene can potentially use other computational techniques to identify earthquakes concealed in what data they have. For instance, some methods can hunt for aberrations, or unexpected jumps, in instrument readouts—which could even generate the kinds of templates necessary for the new algorithm to run, Martin says.
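One classic detector in that family is the short-term-average/long-term-average (STA/LTA) trigger, which fires when the signal energy in a short recent window jumps well above the running background level. The article doesn't name a specific method, so this is just an illustrative sketch of the general idea:

```python
import numpy as np

def sta_lta(trace, n_short, n_long):
    """Short-term-average / long-term-average detector: the ratio
    spikes when recent signal energy jumps above the background."""
    energy = trace ** 2
    # Running window sums via a cumulative sum.
    c = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (c[n_short:] - c[:-n_short]) / n_short
    lta = (c[n_long:] - c[:-n_long]) / n_long
    # Align so both windows end at the same sample.
    m = min(len(sta), len(lta))
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

# Quiet background noise with a sudden burst of energy at sample 600.
rng = np.random.default_rng(1)
trace = rng.normal(0, 0.1, 1000)
trace[600:650] += rng.normal(0, 1.0, 50)

ratio = sta_lta(trace, n_short=20, n_long=200)
print(ratio.max() > 5)  # True: the detector fires on the burst
```

Unlike template matching, a detector like this needs no prior earthquake catalog, which is why such methods could help bootstrap the templates the new algorithm requires.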
“We’d like to know the conditions that lead to big earthquakes and fatalities and injuries,” Scharer says. “If we can see in greater detail what’s happening in the days or months prior to a large event, maybe we can find signals that help us clue in to what’s important.”
Some of the Earth’s grandest compositions might be quiet. But that doesn’t mean they’re not worth a listen.