
ModelDescription_e

Japan forecast models

Last modified: June 2011

  • ALM Models

D. Schorlemmer submitted an Asperity-based Likelihood Model (ALM) to the 1- and 3-year forecast classes for the "All Japan" region. This model assumes a GR distribution of events at each grid point based on declustered, observed seismicity. In addition to spatially variable a-values, the model also incorporates spatial variability of the b-values where these are well constrained by smaller events. The assumption is that the a- and b-values inferred from microseismicity can and should be extrapolated to predict the rates of larger events, all the way up to M = 9. See also Wiemer and Schorlemmer (2007) and Gulia et al. (2010) for applications to California and Italy, respectively.
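
For illustration only, a minimal Python sketch of how per-cell rates in magnitude bins can be derived from a GR relation with locally estimated a- and b-values; the values, bin width, and function name below are hypothetical placeholders, not ALM's actual implementation:

    import numpy as np

    def gr_bin_rates(a, b, mags, dm=0.1):
        # Expected number of events in each bin [m, m + dm) from the
        # Gutenberg-Richter relation log10 N(>= m) = a - b * m.
        mags = np.asarray(mags)
        return 10.0 ** (a - b * mags) - 10.0 ** (a - b * (mags + dm))

    # One grid cell with hypothetical locally estimated a- and b-values,
    # extrapolated to bins up to M = 9 as described above.
    print(gr_bin_rates(a=3.2, b=0.85, mags=np.arange(5.0, 9.0, 0.1)))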

    • Stefan Wiemer and Danijel Schorlemmer, ALM: An Asperity-based Likelihood Model for California, Seismological Research Letters, 78(1), 134-140, 2007, doi:10.1785/gssrl.78.1.134.
    • Laura Gulia, Stefan Wiemer, Danijel Schorlemmer, Asperity-based earthquake likelihood models for Italy, Annals of Geophysics, 53(3), 63-75, 2010, doi:10.4401/ag-4843.
  • Coulomb model

S. Toda and B. Enescu (2011) submitted a Coulomb stress transfer model, incorporating a rate- and state-dependent friction law, to the 1- and 3-year classes as applied to the "Mainland" region. Their forecast rates are based on data from large earthquakes during the past 120 years. Toda and Enescu's model differs from other statistical earthquake clustering models as follows: (1) off-fault aftershock zones can be modeled not just as a set of point sources but also as a set of finite, fault-shaped zones; (2) the spatial distribution of seismicity to be triggered by Coulomb stresses is determined by taking account of the most likely source mechanisms of future earthquakes; (3) stresses imparted by large earthquakes create stress shadows where fewer earthquakes are predicted to occur. Although the model has its own weaknesses, such as a number of uncertain factors and unknown parameters, it is the first physics-based model to participate in CSEP. With modifications, this model has the potential to be used for short-term forecasting, possibly even quasi-real-time off-fault aftershock forecasting in the immediate aftermath of a large earthquake.
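
As a rough sketch (not Toda and Enescu's code) of the rate- and state-dependent response to a single uniform Coulomb stress step, following the standard Dieterich (1994) seismicity-rate equation; all parameter values are placeholders:

    import numpy as np

    def dieterich_rate(t, dcff, r_back, a_sigma, t_a):
        # Seismicity rate after a Coulomb stress step dCFF (Dieterich, 1994):
        # R(t) = r / [ (exp(-dCFF / A*sigma) - 1) * exp(-t / t_a) + 1 ]
        gamma = (np.exp(-dcff / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
        return r_back / gamma

    t = np.linspace(0.0, 10.0, 101)   # years after the stress step
    print(dieterich_rate(t, dcff=0.1, r_back=1.0, a_sigma=0.04, t_a=10.0))
    # A positive dCFF (MPa) raises the rate; a negative one gives a stress shadow.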

    • S. Toda and B. Enescu, Rate/state Coulomb stress transfer model for the CSEP Japan seismicity forecast, Earth Planets Space, 63(3), 171-185, 2011, doi:10.5047/eps.2011.01.004.
  • DBM model

A. M. Lombardi and W. Marzocchi (2011) presented a 1-year forecast model for "All Japan". Called a Double Branching Model (DBM), it is a stochastic time-dependent model which assumes that every earthquake can generate, or is correlated to, other earthquakes through different physical mechanisms (Marzocchi and Lombardi, 2008). More specifically, it consists of a sequential application of two branching processes in which any earthquake can trigger a family of subsequent events on different space-time scales. The first part of the model is the well-known Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, 1998), which describes the short-term clustering of earthquakes due to coseismic stress transfer. The second branching process works on larger space-time scales than the short-term clustering domains. This step consists of re-applying the branching process to a dataset obtained with an ETAS-type declustering procedure, with the aim of describing the long-term stationary seismic background that is not ascribable to coseismic stress perturbations. See also Lombardi and Marzocchi (2010) for a CSEP model as applied to Italy.
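
A schematic Python sketch of the branching-process intensity that underlies both stages; in a double-branching scheme this form is applied twice, once with short-term aftershock parameters and again, on the declustered catalog, with much longer characteristic scales. Parameter names and values here are illustrative, not the published model:

    import numpy as np

    def branching_intensity(t, past_t, past_m, mu, K, c, p, alpha, m0):
        # Temporal intensity of a generic ETAS-like branching process:
        # lambda(t) = mu + sum_i K * exp(alpha * (m_i - m0)) * (t - t_i + c)**(-p)
        dt = t - np.asarray(past_t)
        trig = K * np.exp(alpha * (np.asarray(past_m) - m0)) * (dt + c) ** (-p)
        return mu + trig[dt > 0].sum()

    print(branching_intensity(t=10.0, past_t=[1.0, 9.5], past_m=[6.1, 5.0],
                              mu=0.02, K=0.05, c=0.01, p=1.1, alpha=1.5, m0=4.0))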

    • Anna Maria Lombardi and Warner Marzocchi, A double-branching model applied to long-term forecasting of Italian seismicity (ML>=5.0) within the CSEP project, Annals of Geophysics, 53(3), 31-39, 2010, doi:10.4401/ag-4762.
    • Anna Maria Lombardi and Warner Marzocchi, The double branching model for earthquake forecast applied to the Japanese seismicity, Earth Planets Space, 63(3), 187-195, 2011, doi:10.5047/eps.2011.02.001.
  • EEPAS and PPE Models

The EEPAS and PPE models were contributed by D. Rhoades (2011) to the 3-month and 1-year testing classes for the "Mainland" region. The "Every Earthquake a Precursor According to Scale" (EEPAS) model is a long-range forecasting method that had previously been applied to a number of regions, including Japan. The model sums contributions to the rate density from past earthquakes on the basis of predictive scaling relations derived from the precursory scale increase phenomenon (Rhoades and Evison, 2004). Two features of the earthquake catalog for Japan's mainland region make it difficult to apply this model, namely the magnitude dependence of the proportion of aftershocks and of the GR b-value. To account for these features, the model was fitted separately to earthquakes in three different target magnitude classes over the period 2000-2009 and made suitable for 3-month testing for M >= 4 and for 1-year testing for M >= 5. The "Proximity to Past Earthquakes" (PPE) model is a spatially smoothed seismicity model that could in principle be applied to any testing class. This model has no predictive elements, but it can play the role of a spatially varying reference model against which the performance of time-varying models can be compared. In retrospective analysis, the mean probability gain of the EEPAS model over the PPE model increases with magnitude; the same trend is expected in prospective testing.
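
A minimal sketch of a PPE-style spatially smoothed rate density, in which every past epicenter contributes a slowly decaying kernel plus a small uniform background; the kernel form, scale, and normalization below are placeholders rather than the published PPE formula:

    import numpy as np

    def ppe_like_density(x, y, eq_x, eq_y, d_km=10.0, background=1e-4):
        # Rate density at (x, y) from proximity to past epicenters:
        # each event adds an inverse-power kernel of scale d_km.
        r2 = (x - np.asarray(eq_x)) ** 2 + (y - np.asarray(eq_y)) ** 2
        return background + np.sum(1.0 / (np.pi * (r2 + d_km ** 2)))

    # Density at one grid point, given two hypothetical past epicenters (km coordinates).
    print(ppe_like_density(100.0, 250.0, eq_x=[95.0, 180.0], eq_y=[240.0, 260.0]))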

    • David A. Rhoades, Application of the EEPAS model to forecasting earthquakes of moderate magnitude in southern California, Seismological Research Letters, 78(1), 110-115, 2007, doi:10.1785/gssrl.78.1.110.
    • David A. Rhoades, Application of a long-range forecasting model to earthquakes in the Japan mainland testing region, Earth Planets Space, 63(3), 197-206, 2011, doi:10.5047/eps.2010.08.002.
  • ERS and ETES Models

M. Murru, R. Console, and G. Falcone submitted the ETES and ERS models to the 1-day forecast class in all three testing regions. The two models consider short-term clustering properties of earthquakes. The first is purely stochastic and is called the Epidemic Type Earthquake Sequence (ETES) model, in which the temporal aftershock decay rate is assumed to be governed by an Omori-Utsu law and the distance decay is assumed to follow a power law. The second kind of short-term forecast (named Epidemic Rate-State: ERS) is constrained physically by the application of Dieterich's rate-and-state constitutive law (Dieterich, 1994) to earthquake clustering. For the computation of earthquake rates, both of these short-term models assume the validity of the GR distribution. The reader is referred to Console et al. (2007) for a California RELM model and Falcone et al. (2010) for a CSEP Italy model.
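
A minimal sketch of the separable space-time triggering kernel described above, with Omori-Utsu decay in time and power-law decay in distance; the constants and their normalization are placeholders:

    import numpy as np

    def etes_like_kernel(dt_days, r_km, K=0.05, c=0.01, p=1.1, d=2.0, q=1.5):
        # Contribution of one past event at time lag dt_days and distance r_km:
        # Omori-Utsu term (dt + c)**(-p) times spatial term (r**2 + d**2)**(-q).
        return K * (dt_days + c) ** (-p) * (r_km ** 2 + d ** 2) ** (-q)

    print(etes_like_kernel(dt_days=0.5, r_km=12.0))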

    • Rodolfo Console, Maura Murru, Flaminia Catalli, and Giuseppe Falcone, Real Time Forecasts through an Earthquake Clustering Model Constrained by the Rate-and-State Constitutive Law: Comparison with a Purely Stochastic ETAS Model, Seismological Research Letters, 78(1), 49-56, 2007, doi:10.1785/gssrl.78.1.57.
    • Giuseppe Falcone, Rodolfo Console, Maura Murru, Short-term and long-term earthquake occurrence models for Italy: ETES, ERS and LTST, Annals of Geophysics, 53(3), 41-50, 2010, doi:10.4401/ag-4760.
  • ETAS model

A variant of the space-time ETAS model, submitted by J. Zhuang (2011) to the 1-day class as applied to "All Japan", is based on the studies of Zhuang et al. (2002, 2004, 2005) and Ogata and Zhuang (2006). The background (spontaneous) seismicity rate varies with location in space but remains constant in time. The model defines two space-time windows to address the problem of data censoring: events in the smaller "target window" are used to obtain model parameters, whereas events in the bigger "auxiliary window", which contains more than one "target window", are used to calculate triggering effects that contribute to the occurrence of target events. The implementation of this model consists of three steps: (1) the estimation procedure, a combination of nonparametric (variable kernel) estimation of the background rate and parametric (maximum likelihood) estimation of the model parameters, carried out iteratively; (2) the simulation procedure, which simulates thousands of possible scenarios of earthquake occurrence within a future time interval; (3) the smoothing procedure, which smooths the events generated during the simulation step to obtain more stable spatiotemporal occurrence rates. Because the iterative algorithm for the simultaneous estimation of the background and model parameters involves heavy calculations, off-line optimization and on-line forecasting are used to alleviate the computational costs.
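
In outline, steps (2) and (3) amount to averaging many simulated scenarios into expected counts per bin, roughly as in the sketch below; simulate_catalog and bin_counts are hypothetical placeholders standing in for the model's own simulator and binning/smoothing routines:

    def forecast_from_simulations(simulate_catalog, bin_counts, n_sims=1000):
        # simulate_catalog() -> one synthetic catalog for the forecast interval;
        # bin_counts(catalog) -> NumPy array of event counts per space-magnitude bin.
        total = None
        for _ in range(n_sims):
            counts = bin_counts(simulate_catalog())
            total = counts if total is None else total + counts
        return total / n_sims      # expected counts per bin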

    • Jiancang Zhuang, Next-day earthquake forecasts for the Japan region generated by the ETAS model, Earth Planets Space, 63(3), 207-216, 2011, doi:10.5047/eps.2010.12.010.
  • HIST-ETAS and HIST-POISSON models

A space-time variant of the ETAS model (Ogata, 1998) was designed for earthquake clustering with a certain space-time functional form based on empirical laws for aftershocks. For more accurate seismicity prediction, Y. Ogata (2011) has modified it so that it can deal not only with anisotropic clustering but also with regionally distinct characteristics of seismicity. The former requires the identification of the centroid and correlation coefficient of each spatial cluster, while the latter requires the development of a space-time ETAS model with location-dependent parameters, called the hierarchical space-time ETAS (HIST-ETAS) model. Together with the GR law of magnitude frequencies with location-dependent b-values, the proposed models have been applied to short-term, mid-term, and long-term forecasting. Ogata submitted two slightly different variants of his HIST-ETAS model (HIST-ETAS5pa and HIST-ETAS7pa), plus a Poisson model (HIST-POISSON) based on the background seismicity rate used in the HIST-ETAS models. The first two are applied to all testing classes, and the last one only to the long-term classes (1 and 3 years). All models are applied to the "All Japan" and "Kanto" regions.
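
Purely to illustrate the idea of location-dependent parameters (Ogata's actual implementation uses its own spatial representation), a field of, say, b-values can be stored on a spatial grid and interpolated to any forecast point:

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Hypothetical b-value surface on a lon/lat grid (uniform 0.9 here).
    lons = np.linspace(128.0, 148.0, 41)
    lats = np.linspace(30.0, 46.0, 33)
    b_grid = np.full((lons.size, lats.size), 0.9)

    b_at = RegularGridInterpolator((lons, lats), b_grid)
    print(b_at([[140.0, 36.0]]))     # interpolated b-value at one point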

    • Yosihiko Ogata, Significant improvements of the space-time ETAS model for forecasting of accurate baseline seismicity, Earth Planets Space, 63(3), 217-229, 2011, doi:10.5047/eps.2010.09.001.
  • MARFS and MARFSTA models

C. Smyth and J. Mori (2011) have presented a model for forecasting the rate of earthquakes during a specified period and in a specified area. The model explicitly predicts, by applying an autoregressive process, the number of earthquakes and the b-value of the GR distribution for the period of interest. The model also introduces a time-dependency adjustment for larger magnitude ranges, assuming that the probability of another large earthquake increases with increasing time after the last large event within the area. These predictions are superimposed on a spatial density map obtained with a multivariate normal mixture model for historical earthquakes that occurred in the area. This forecast model differs from conventional models currently in use because of its density estimation and its assumption of temporal changes. Two variants, one using the base algorithm (MARFS) and the other its optional time adjustment (MARFSTA), have been submitted to the 3-month and 1-year testing classes and applied to the "All Japan" and "Mainland" regions.
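
A minimal sketch of the autoregressive idea only: fit a first-order autoregression to a per-period series (e.g., event counts or b-values) and predict the next value; the submitted model's order, variables, and fitting method are not reproduced here:

    import numpy as np

    def ar1_forecast(series):
        # Fit x_t = c + phi * x_{t-1} by least squares and return the
        # one-step-ahead prediction for the next period.
        x = np.asarray(series, dtype=float)
        X = np.column_stack([np.ones(x.size - 1), x[:-1]])
        c, phi = np.linalg.lstsq(X, x[1:], rcond=None)[0]
        return c + phi * x[-1]

    print(ar1_forecast([42, 55, 48, 61, 50, 57]))   # hypothetical quarterly counts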

    • Christine Smyth and Jim Mori, Statistical models for temporal variations of seismicity parameters to forecast seismicity rates in Japan, Earth Planets Space, 63(3), 231-238, 2011, doi:10.5047/eps.2010.10.001.
  • MGR model

Although the frequency-magnitude distribution, as expressed by the GR law, provides a basis for simple earthquake forecasting methods, F. Hirose and K. Maeda (2011) point out that this distribution can sometimes be approximated by a modified GR law that imposes a maximum magnitude. For their model development, these authors tested three earthquake forecast models: (1) the Cbv (Constant b-value) model, based only on the GR law with a spatially constant b-value; (2) the Vbv (Variable b-value) model, based only on the GR law with regionally variable b-values; (3) the MGR (Modified GR) model, based either on the original or a modified GR law (the choice being made according to the Akaike Information Criterion) with regionally variable b-values. They also incorporated both aftershock decay and minimum limits on expected seismicity into these models. Comparing the results of retrospective forecasts by the three models, Hirose and Maeda found that the MGR model was almost always better than the Vbv model, that the Cbv model was better than the Vbv model for 1-year forecasts, and that the MGR and Vbv models tended to be better than the Cbv model for forecasts of 3 years or longer. These researchers submitted the MGR-based model to the two long-term (1- and 3-year) classes for the "Mainland" region.
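
The choice between the original and modified GR fits can be made by comparing AIC values, roughly as below; the log-likelihoods and parameter counts here are illustrative inputs, not the paper's actual values:

    def aic(log_likelihood, n_params):
        # Akaike Information Criterion: smaller values indicate the preferred model.
        return -2.0 * log_likelihood + 2.0 * n_params

    def choose_gr_form(loglik_gr, loglik_mgr, k_gr=1, k_mgr=2):
        # The modified GR fit carries one extra parameter (the maximum magnitude),
        # so it must improve the likelihood enough to justify the added complexity.
        return "GR" if aic(loglik_gr, k_gr) <= aic(loglik_mgr, k_mgr) else "modified GR"

    print(choose_gr_form(loglik_gr=-1203.4, loglik_mgr=-1199.8))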

    • Fuyuki Hirose and Kenji Maeda, Earthquake forecast models for inland Japan based on the G-R law and the modified G-R law, Earth Planets Space, 63(3), 239-260, 2011, doi:10.5047/eps.2010.10.002.
  • RI model

K. Z. Nanjo (2011) contributed his model to nine categories: three testing classes (3 years, 1 year, and 3 months) as applied to all three testing regions. His RI algorithm is originally a binary-forecast system based on the working hypothesis that future large earthquakes are more likely to occur at locations of higher seismicity rates. The measure used is simply a count of the number of past earthquakes, which gives the model its name, the Relative Intensity (RI) of seismicity (Tiampo et al., 2002; Holliday et al., 2005). To improve its forecasting performance, Nanjo first expanded the RI algorithm so that it can be adapted to a general class of smoothed seismicity models. He then converted the RI representation from a binary system into a CSEP-testable model that gives forecasts of the number of earthquakes with prescribed magnitudes. The final submission consists of 36 variants, with four different smoothing representations (smoothing radii of 10, 30, 50, and 100 km) for each of the nine categories, so that it is possible to see which categories and which smoothing methods can make the most of the RI hypothesis.
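
A minimal sketch of the RI idea with a smoothing radius: score each cell by the count of past earthquakes averaged over neighboring cells within the chosen radius, then normalize into a relative-intensity map; the cell layout and counts below are hypothetical:

    import numpy as np

    def ri_map(counts, cell_xy, radius_km):
        # counts: past-earthquake count per cell; cell_xy: cell-center coordinates (km).
        counts = np.asarray(counts, dtype=float)
        xy = np.asarray(cell_xy, dtype=float)
        smoothed = np.empty_like(counts)
        for i, centre in enumerate(xy):
            near = np.hypot(xy[:, 0] - centre[0], xy[:, 1] - centre[1]) <= radius_km
            smoothed[i] = counts[near].mean()
        return smoothed / smoothed.sum()   # relative intensity per cell

    print(ri_map(counts=[3, 0, 7, 1], cell_xy=[[0, 0], [10, 0], [0, 10], [10, 10]],
                 radius_km=10.0))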

    • K. Z. Nanjo, Earthquake forecasts for the CSEP Japan experiment based on the RI algorithm, Earth Planets Space, 63 (3), 261-274, 2011, doi:10.5047/eps.2011.01.001.
  • Triple-S-Japan model

The Simple Smoothed Seismicity (Triple-S) model is based on Gaussian smoothing of historical seismicity. Epicenters of past earthquakes are supposed to contribute to earthquake density estimates, after those epicenters have been smoothed using a fixed length scale; this scale is optimized so that it minimizes the average area skill score misfit function in a retrospective experiment (Zechar and Jordan, 2010). The density map is scaled to match the average rate of historical seismicity. J. Zechar optimized the Triple-S model specifically for Japan and submitted it (called Triple-S-Japan) to nine categories: three testing classes (3 years, 1 year, and 3 months) for all three testing regions. The reader is referred to Zechar and Jordan (2010) for the CSEP Italy experiment.
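
A minimal sketch of Gaussian smoothing with a fixed length scale, rescaled to the average historical rate; the coordinates, length scale, and total rate below are placeholders, and the actual Triple-S-Japan optimization of the length scale is not reproduced:

    import numpy as np

    def gaussian_smoothed_rates(cell_xy, eq_xy, sigma_km, total_rate):
        # Each past epicenter contributes a 2-D Gaussian of width sigma_km to every
        # cell; the map is then rescaled so that its sum equals total_rate.
        cells = np.asarray(cell_xy, dtype=float)[:, None, :]   # (n_cells, 1, 2)
        eqs = np.asarray(eq_xy, dtype=float)[None, :, :]       # (1, n_events, 2)
        d2 = ((cells - eqs) ** 2).sum(axis=-1)
        density = np.exp(-d2 / (2.0 * sigma_km ** 2)).sum(axis=1)
        return total_rate * density / density.sum()

    print(gaussian_smoothed_rates(cell_xy=[[0, 0], [50, 0], [0, 50]],
                                  eq_xy=[[5, 5], [45, 10]],
                                  sigma_km=30.0, total_rate=12.0))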

    • J. Douglas Zechar and Thomas H. Jordan, Simple smoothed seismicity earthquake forecasts for Italy, Annals of Geophysics, 53(3), 99-105, 2010, doi:10.4401/ag-4845.