Group B, Poster #204, Earthquake Forecasting and Predictability (EFP)
Nowcasting Earthquakes with QuakeGPT: An AI-Enhanced Earthquake Generative Pretrained Transformer
Poster Presentation
2025 SCEC Annual Meeting, Poster #204, SCEC Contribution #14269
Transformers extend deep learning through the adoption of a context-sensitive mechanism called "attention", which is used to tag important sequences of data and to identify relationships among the tagged data. Pretrained transformers are the foundational technology underpinning the new AI models ChatGPT (Generative Pretrained Transformer) from OpenAI and Bard from Google. In our case, we hypothesize that a transformer may be able to learn the sequence of events leading up to a major earthquake. The data sets used to train such models typically contain billions of examples or more, so when applied to earthquake problems these models need data sets of a size that only long numerical earthquake simulations can provide. In this research, we are developing an Earthquake Generative Pretrained Transformer model, "QuakeGPT", in a similar vein. For training, we are using simulation catalogs from "ERAS", a stochastic, physics-informed earthquake simulation model similar to the more common ETAS models. ERAS has only two uncorrelated parameters, which are easily retrieved from the observed catalog. In the future, physics-based simulators such as Virtual Quake could be used as well. The observed data, which is the data to be anticipated by nowcasting, is taken from the USGS online catalog for California. In this presentation, we discuss 1) recent results from our earthquake nowcasting machine learning methods; and 2) the architecture of QuakeGPT, together with first results.
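To make the architecture concrete, below is a minimal PyTorch sketch of the kind of decoder-only, GPT-style model the abstract describes: tokenized catalog events are embedded, passed through causally masked self-attention blocks, and trained on the standard shift-by-one next-token objective. The class name, vocabulary size, and all hyperparameters are illustrative assumptions, not the actual QuakeGPT implementation.

# A minimal sketch (not the authors' implementation) of a decoder-only,
# GPT-style transformer that autoregressively predicts the next event in a
# tokenized earthquake catalog. Assumes events have been discretized into a
# small vocabulary of event codes; all sizes below are illustrative.
import torch
import torch.nn as nn

class QuakeGPTSketch(nn.Module):
    def __init__(self, vocab_size=512, d_model=128, n_heads=4,
                 n_layers=4, max_len=1024):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer event codes
        b, t = tokens.shape
        pos = torch.arange(t, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        # Causal mask: each position attends only to earlier events.
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(tokens.device)
        x = self.blocks(x, mask=mask)
        return self.head(x)  # (batch, seq_len, vocab_size) next-event logits

# Training fragment: shift-by-one next-event prediction, the standard GPT
# pretraining objective, here applied to stand-in simulated catalog sequences.
model = QuakeGPTSketch()
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
seq = torch.randint(0, 512, (8, 129))  # placeholder for tokenized catalogs
logits = model(seq[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), seq[:, 1:].reshape(-1))
opt.zero_grad(); loss.backward(); opt.step()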
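Likewise, a minimal temporal simulator conveys the kind of stochastic catalog generation described above. ERAS itself is not specified in the abstract beyond having two uncorrelated parameters, so this sketch instead uses standard ETAS ingredients (a Poisson background rate, Gutenberg-Richter magnitudes, and Omori-law aftershock triggering via a branching process); every parameter name and value is an illustrative assumption.

# A minimal temporal ETAS-style catalog simulator, included only to
# illustrate the class of stochastic simulation models the abstract names;
# it is not ERAS, whose exact two-parameter form is not given here.
import numpy as np

rng = np.random.default_rng(0)

def sample_magnitude(m0=3.0, b=1.0, size=1):
    # Gutenberg-Richter magnitudes above completeness threshold m0.
    return m0 + rng.exponential(1.0 / (b * np.log(10.0)), size)

def simulate_etas(T=365.0, mu=0.2, K=0.1, alpha=1.0, c=0.01, p=1.2, m0=3.0):
    # Background events: homogeneous Poisson process with rate mu per day.
    n_bg = rng.poisson(mu * T)
    times = list(rng.uniform(0.0, T, n_bg))
    mags = list(sample_magnitude(m0=m0, size=n_bg))
    queue = list(zip(times, mags))
    # Branching: each event spawns aftershocks with Omori-law waiting times.
    while queue:
        t_par, m_par = queue.pop()
        n_aft = rng.poisson(K * 10.0 ** (alpha * (m_par - m0)))
        for _ in range(n_aft):
            # Inverse-CDF sample from the Omori kernel (t + c)^(-p), p > 1.
            u = rng.random()
            dt = c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)
            t_child = t_par + dt
            if t_child < T:
                m_child = float(sample_magnitude(m0=m0)[0])
                times.append(t_child); mags.append(m_child)
                queue.append((t_child, m_child))
    order = np.argsort(times)
    return np.array(times)[order], np.array(mags)[order]

t, m = simulate_etas()
print(f"simulated {t.size} events; largest magnitude {m.max():.2f}")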