 Technical report
 Open access
Short-term prediction of geomagnetic secular variation with an echo state network
Earth, Planets and Space volume 76, Article number: 121 (2024)
Abstract
A technique for predicting the secular variation (SV) of the geomagnetic field based on the echo state network (ESN) model is proposed. SV is controlled by the geodynamo process in the Earth’s outer core, and modeling its nonlinear behaviors can be challenging. This study employs an ESN to represent the short-term temporal evolution of the geomagnetic field on the Earth’s surface. The hindcast results demonstrate that the ESN enables us to predict SV for a duration of several years with satisfactory accuracy. It is also found that the prediction is robust to the length of the training data period. This suggests that the recent features of the SV are important for short-term prediction and that the ESN effectively learns these features.
1 Introduction
The geomagnetic main field is thought to be driven by a dynamo process in the Earth’s outer core. It undergoes gradual and continuous changes known as secular variation (SV) as a result of the dynamo process. The magnitude of SV can exceed \(10\,\textrm{nT}\) per year on the Earth’s surface, which is comparable to or larger than that of ionospheric and magnetospheric origin. Hence, the prediction of SV on a time scale of several years is an important component of the geomagnetic model. The International Geomagnetic Reference Field (IGRF) model (Alken et al. 2021) includes an SV model for predicting the trend for the upcoming 5 years. However, since SV sometimes shows nonlinear behaviors such as geomagnetic jerks (e.g., Courtillot and Le Mouël 1984; Alexandrescu et al. 1996), its accurate prediction is difficult. Accordingly, various approaches were employed in the 14 SV candidate models that contributed to the latest IGRF model (Alken et al. 2021 and references therein). Some candidate models assimilated ground and satellite data into numerical geodynamo models (e.g., Minami et al. 2020; Fournier et al. 2021). Data assimilation is a straightforward approach to account for the nonlinear dynamics of the outer core. However, a typical geodynamo model represents the state of the geodynamo with millions of variables, whereas the IGRF model represents the geomagnetic main field on the Earth’s surface using about 100 internal Gauss coefficients. The computational cost of data assimilation using a geodynamo model is thus excessive for predicting the parameters of the geomagnetic field model.
The purpose of this study is to explore a machine-learning-based method for predicting SV efficiently. Machine learning models are usually trained to represent general nonlinear dynamics under various conditions, and a huge data set is generally required to cover all possible states (e.g., Fukagata 2023). For the problem considered here, however, we have observation data for only the most recent 100 to 1000 years, while it takes at least tens of thousands of years for the state of the outer core to circulate sufficiently to cover the solution trajectory. Hence, the available observations are obviously insufficient for covering all possible solution trajectories of the geodynamo system. Gwirtz et al. (2022) attempted to predict reversals of the geomagnetic field polarity rather than spatial structures of the geomagnetic field, and they reported that the prediction of reversals was difficult even with over one million years of paleomagnetic data. This study does not aim to consider all possible cases but concentrates on the local features of the solution trajectory for recent decades. Here, we employ an echo state network (ESN) model (Jaeger and Haas 2004; Lukoševičius and Jaeger 2009; Lukoševičius 2012) to achieve this goal. The ESN is a kind of reservoir computing framework; it can also be regarded as a recurrent neural network in which the connections and weights between hidden state variables are randomly set and fixed. An ESN is trained by optimizing the weights of only the output layer, and the number of model parameters is much smaller in the ESN than in the latest deep neural network models. Hence, a large amount of data is not required for training the ESN. In spite of its simplicity, an ESN shows satisfactory performance in various geophysical applications (e.g., Kataoka and Nakano 2021; Nakano and Kataoka 2022; Walleshauser and Bollt 2022).
In the present study, we apply an ESN to model the temporal evolution of the geomagnetic field, aiming to capture the nonlinear behaviors of SV effectively.
2 Method
Following many models of the Earth’s magnetic field, including the IGRF model, we represent the magnetic field \(\varvec{B}\) with a scalar potential V as:
\[ \varvec{B} = -\nabla V. \]
As we can assume the magnetic field is of internal origin, the potential V is expanded into spherical harmonics:
\[ V = a \sum _{n=1}^{N} \sum _{m=0}^{n} \left( \frac{a}{r}\right) ^{n+1} \left[ g_n^m(t) \cos m\phi + h_n^m(t) \sin m\phi \right] P_n^m(\cos \theta ), \]
where a denotes the Earth’s mean radius, r the geocentric radius, \(\theta \) the colatitude, \(\phi \) the longitude, and \(P_n^m\) the Schmidt quasi-normalized associated Legendre functions. The SV of the geomagnetic field is represented by the first time derivatives of the Gauss coefficients \(g_n^m(t)\) and \(h_n^m(t)\). We set \(N=10\) in this study.
As a nonlinear machine learning model cannot appropriately handle a time series with a monotonic trend, we remove the trend by considering the temporal difference of the Gauss coefficients as follows:
\[ \Delta g_n^m(t_k) = g_n^m(t_k) - g_n^m(t_{k-1}), \qquad \Delta h_n^m(t_k) = h_n^m(t_k) - h_n^m(t_{k-1}). \]
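For illustration, the detrending step above can be sketched as follows. This is a minimal sketch, not the authors' code, and the coefficient values are hypothetical:

```python
import numpy as np

# Minimal sketch of the detrending step: the monotonic trend of each Gauss
# coefficient is removed by taking the difference between consecutive epochs.
# The g_1^0 values below are made up, for illustration only.
def detrend_by_difference(g):
    """Return the series Delta g(t_k) = g(t_k) - g(t_{k-1})."""
    g = np.asarray(g, dtype=float)
    return g[1:] - g[:-1]

g10 = np.array([-29992.0, -29985.0, -29977.0, -29970.0])  # nT, hypothetical
dg10 = detrend_by_difference(g10)  # -> array([7., 8., 7.])
```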
We model the temporal variations of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) using an ESN. The state of the ESN at time \(t_k\) is represented by the state vector \(\varvec{x}_k\). The number of state variables \(M_x\) is set to 1000 in this study because the prediction was not improved even with \(M_x\) exceeding 1000. We feed the current \(\Delta g_n^m(t_{k-1})\) and \(\Delta h_n^m(t_{k-1})\) as the inputs of the ESN and aim to predict the next \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\), as depicted in Fig. 1. At each time step k, the ith element of \(\varvec{x}_k\), \(x_{k,i}\), is updated as follows:
\[ x_{k,i} = (1-\xi )\, x_{k-1,i} + \xi \tanh \left( \varvec{w}_i \cdot \varvec{x}_{k-1} + \varvec{u}_i \cdot \varvec{z}_k + \eta _i \right) , \]
where \(\varvec{z}_k\) denotes the input vector, \(\varvec{w}_i\) is the weight vector for connections among the state variables, and \(\varvec{u}_i\) is the weight vector for connections between the state variables and input variables. The value of \(\eta _i\) for each i was given by a random number drawn from a normal distribution with mean 0 and standard deviation 20 and was fixed. The parameter \(\xi \) is called the leakage rate (Jaeger et al. 2007) and we fixed its value at 0.5 in the present study. When the leakage rate is set between 0 and 1, Eq. (5) works as a low-pass filter for its activations with the cutoff frequency:
\[ f_c = \frac{\xi }{2\pi (1-\xi )} \]
(Lukoševičius and Jaeger 2009). The leakage rate 0.5 would thus smooth out variations with a timescale of less than \(2\pi \) steps. The weights \(\varvec{w}_i\) and \(\varvec{u}_i\) are given in advance and are fixed. We set \(90\%\) of the weights \(\{\varvec{w}_i\}\) and \(\{\varvec{u}_i\}\) (randomly chosen) to zero. Accordingly, the nodes in the network are randomly connected, as shown in Fig. 1. The values of the remaining nonzero elements of \(\varvec{u}_i\) are drawn randomly from a normal distribution with mean 0 and standard deviation \(\sigma _u\). The standard deviation \(\sigma _u\) is set to 1 in this study. The values of the nonzero elements of \(\varvec{w}_i\) are also drawn from a normal distribution. The weights \(\{\varvec{w}_i\}\) are then rescaled such that the maximum singular value of the weight matrix, which is defined as:
\[ \textbf{W} = \left( \varvec{w}_1, \ldots , \varvec{w}_{M_x} \right) ^T, \]
becomes 0.99. This rescaling is applied to satisfy the so-called “echo state property”, which guarantees that the state of the ESN is not affected by distant past inputs. The output of the ESN at time \(t_k\), \(\varvec{y}_k\), is then obtained from \(\varvec{x}_k\) as follows:
\[ \varvec{y}_k = \textbf{Q} \varvec{x}_k , \]
where \(\textbf{Q}\) is the weight matrix for obtaining the output. The output \(\varvec{y}_k\) corresponds to a prediction of the observation at time \(t_k\).
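The construction and update of the reservoir described above can be sketched as follows. This is an illustrative sketch, not the authors' code: the reservoir size is reduced from 1000 to 200 to keep the example fast, and the input dimension and random seed are arbitrary choices.

```python
import numpy as np

# Sketch of the reservoir: sparse random weights with ~90% of entries zeroed,
# input weights with standard deviation sigma_u = 1, bias terms with standard
# deviation 20, internal weights rescaled so the largest singular value is
# 0.99, and a leaky-integrator update with leakage rate xi = 0.5.
rng = np.random.default_rng(0)
M_x, M_in = 200, 120   # reduced reservoir size; the paper uses M_x = 1000
xi = 0.5               # leakage rate

def sparse_normal(shape, std):
    """Normal weights with ~90% of the entries set to zero at random."""
    w = rng.normal(0.0, std, size=shape)
    return w * (rng.random(shape) < 0.1)

U = sparse_normal((M_x, M_in), std=1.0)            # input weights, sigma_u = 1
eta = rng.normal(0.0, 20.0, size=M_x)              # fixed random bias terms
W = sparse_normal((M_x, M_x), std=1.0)             # internal weights
W *= 0.99 / np.linalg.svd(W, compute_uv=False)[0]  # max singular value -> 0.99

def esn_step(x, z):
    """One leaky-integrator update of the reservoir state (cf. Eq. 5)."""
    return (1.0 - xi) * x + xi * np.tanh(W @ x + U @ z + eta)

Q = np.zeros((M_in, M_x))           # readout weights, to be trained later
x = esn_step(np.zeros(M_x), np.zeros(M_in))
y = Q @ x                            # output y_k = Q x_k
```

Rescaling by the maximum singular value (rather than the spectral radius) gives a sufficient condition for the echo state property, which is why the sketch recomputes the SVD after drawing the sparse weights.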
Denoting the observation at time \(t_k\) as \(\varvec{d}_k\), the matrix \(\textbf{Q}\) is determined by minimizing the following objective function:
\[ J(\textbf{Q}) = \sum _{k} \left( \varvec{d}_k - \textbf{Q} \varvec{x}_k \right) ^T \textbf{R}_k^{-1} \left( \varvec{d}_k - \textbf{Q} \varvec{x}_k \right) + \Vert \textbf{Q}\Vert _F^2, \]
where \(\Vert \textbf{Q}\Vert _F\) denotes the Frobenius norm of the matrix \(\textbf{Q}\) and \(\textbf{R}_k\) is the matrix that controls the misfit to the data \(\varvec{d}_k\). In this study, we assume that \(\textbf{R}_k\) is a diagonal matrix decomposed as follows:
\[ \textbf{R}_k = \textrm{diag}\left( \lambda _1 \sigma _{k,1}^2, \ldots , \lambda _{M_y} \sigma _{k,M_y}^2 \right) . \]
We determine \(\sigma _{k,1}, \ldots , \sigma _{k,M_y}\) according to the uncertainties of each element of the data vector \(\varvec{d}_k\), while \(\lambda _{1}, \ldots , \lambda _{M_y}\) are tunable parameters, which will be discussed later. Decomposing \(\varvec{d}_k\) and \(\textbf{Q}\) as \(\varvec{d}_k=\left( d_{k,1}, \ldots , d_{k,M_y}\right) \) and \(\textbf{Q}=\left( \varvec{q}_{1}, \ldots , \varvec{q}_{M_y}\right) \), respectively, Eq. (9) can be rewritten as follows:
\[ J(\textbf{Q}) = \sum _{i=1}^{M_y} \left[ \sum _k \frac{\left( d_{k,i} - \varvec{q}_i^T \varvec{x}_k \right) ^2}{\lambda _i \sigma _{k,i}^2} + \Vert \varvec{q}_i\Vert ^2 \right] . \]
We can thus find the optimal weight matrix \(\textbf{Q}\) by obtaining the optimal value of each vector \(\varvec{q}_i\) that minimizes the following function:
\[ J_i(\varvec{q}_i) = \sum _k \frac{\left( d_{k,i} - \varvec{q}_i^T \varvec{x}_k \right) ^2}{\lambda _i \sigma _{k,i}^2} + \Vert \varvec{q}_i\Vert ^2 \]
for each i. For training the ESN, we use the observations as the input. Given a sequence of inputs, the state vector \(\varvec{x}_k\) for each step k is deterministically obtained via Eq. (5). The observation \(\varvec{d}_k\) is also given. As Eq. (12) is quadratic, the optimal \(\varvec{q}_{i}\) that minimizes \(J_i\) is analytically obtained by solving the following linear equation:
\[ \left( \sum _k \frac{\varvec{x}_k \varvec{x}_k^T}{\lambda _i \sigma _{k,i}^2} + \textbf{I} \right) \varvec{q}_i = \sum _k \frac{d_{k,i}}{\lambda _i \sigma _{k,i}^2}\, \varvec{x}_k . \]
We obtain the optimal \(\varvec{q}_i\) as:
\[ \varvec{q}_i = \left( \sum _k \frac{\varvec{x}_k \varvec{x}_k^T}{\lambda _i \sigma _{k,i}^2} + \textbf{I} \right) ^{-1} \sum _k \frac{d_{k,i}}{\lambda _i \sigma _{k,i}^2}\, \varvec{x}_k , \]
where \(\textbf{I}\) denotes the identity matrix.
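This closed-form solution is an ordinary weighted ridge regression, which can be sketched as follows. The sketch is not the authors' code, and the data are synthetic:

```python
import numpy as np

# Sketch of the readout training for one output channel i: a weighted ridge
# regression in which sample k is weighted by 1/(lambda_i * sigma_{k,i}^2).
# X stacks the reservoir states x_k as rows; d holds the observations d_{k,i}.
def train_readout_channel(X, d, sigma2, lam):
    w = 1.0 / (lam * sigma2)                  # per-sample weights
    A = (X.T * w) @ X + np.eye(X.shape[1])    # weighted normal matrix + I
    b = X.T @ (w * d)
    return np.linalg.solve(A, b)              # optimal q_i

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 20))                 # synthetic "reservoir states"
q_true = rng.normal(size=20)
d = X @ q_true                                # noiseless synthetic target
q = train_readout_channel(X, d, sigma2=np.ones(80), lam=1e-6)
```

With noiseless data and a small \(\lambda \), the recovered weights coincide with the true ones; in practice the regularization term keeps the readout stable when the reservoir states are nearly collinear.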
For training the ESN, \(\Delta g_n^m(t_{k-1})\) and \(\Delta h_n^m(t_{k-1})\) are fed into the ESN as the input \(\varvec{z}_k\) in Eq. (5), and \(\Delta g_n^m(t_{k})\) and \(\Delta h_n^m(t_{k})\) are used as the observation \(\varvec{d}_k\) in Eq. (9). We derive the time sequence of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) from the COV-OBS.x2 model (Huder et al. 2020) and use it for training. As \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) are used as the observations, the trained ESN yields a prediction of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) as the output \(\varvec{y}_k\). When we use the trained ESN for future prediction, the prediction of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) is fed back into the ESN as the input at the next time step, \(\varvec{z}_{k+1}\), allowing the model to predict \(\Delta g_n^m(t_{k+1})\) and \(\Delta h_n^m(t_{k+1})\).
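The closed-loop forecasting procedure can be sketched as follows. This is not the authors' code: the one-dimensional `step` function and the weights below are hypothetical stand-ins chosen so the loop is self-contained.

```python
import numpy as np

# Sketch of closed-loop forecasting: during training the observed coefficient
# differences drive the reservoir; for prediction, the readout output
# y_k = Q x_k is fed back as the input at the next step.
def forecast(step, Q, x, z, n_steps):
    """Run the ESN autonomously, feeding each output back as the next input."""
    preds = []
    for _ in range(n_steps):
        x = step(x, z)   # advance the reservoir state with the current input
        y = Q @ x        # readout: predicted Delta-coefficients
        preds.append(y)
        z = y            # feedback: the prediction becomes the next input
    return np.array(preds)

# Hypothetical one-dimensional stand-ins for esn_step and Q:
step = lambda x, z: 0.5 * x + 0.5 * np.tanh(0.3 * x + 0.7 * z)
preds = forecast(step, Q=np.array([[1.0]]),
                 x=np.zeros(1), z=np.array([0.1]), n_steps=10)
```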
3 Hindcast experiments
We conduct hindcast experiments to reproduce the temporal evolution of the geomagnetic main field after training the ESN using the COV-OBS.x2 model. The COV-OBS.x2 model gives the temporal evolution of the Gauss coefficients of the scalar potential V as an ensemble of field models sampled from a continuous stochastic model. Here we obtain the Gauss coefficients for every year and model the yearly temporal evolution of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) with the ESN. When \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) are obtained as the temporal difference over a 1-year interval, their typical scale is of the order of \(10\,\textrm{nT}\). To prevent the output from being affected by the setting of the initial value, we spun up the ESN by inputting the observations of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) for at least 20 years before the start time of the training period. Once input data for a sufficient number of time steps have been fed into an ESN that satisfies the echo state property, its state no longer depends on how the initial value was set. Since the COV-OBS.x2 model is available from 1840, we use the observations of \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) from 1840 to 1920 for spin-up and train the ESN using the observations from 1921 to 2005. We then predict \(\Delta g_n^m(t_k)\) and \(\Delta h_n^m(t_k)\) from 2006 to 2015 and obtain the hindcast of \(g_n^m(t_k)\) and \(h_n^m(t_k)\) accordingly.
To determine \(\varvec{q}_i\) using Eq. (14), the parameters \(\sigma _{k,i}\) and \(\lambda _{i}\) must be given in advance. The parameter \(\sigma _{k,i}\) corresponds to the uncertainty of the observation \(d_{k,i}\) and was determined from the variance of each coefficient \(g_n^m(t_k)\) and \(h_n^m(t_k)\) given by the COV-OBS.x2 model. As the minimization of \(J_i\) can be regarded as a Bayesian estimation problem where \(\lambda _i\) represents the variance of the Gaussian prior distribution for \(\varvec{q}_i\), the parameter \(\lambda _{i}\) was determined by maximizing the log marginal likelihood \(L_i\) for each i:
(e.g., Morris 1983; Casella 1985). The optimal \(\lambda _i\) was found by line search.
First, we set the start time of the hindcast experiments to 2005, with the ESN trained with the COV-OBS.x2 model up to this year. Figure 2 shows the results of the hindcast for \(g_1^0\), \(g_1^1\), \(h_1^1\), \(g_2^0\), \(g_2^1\), \(h_2^1\), \(g_2^2\), \(h_2^2\), and \(g_3^0\). Figure 3 shows the results of the hindcast of the SV for the corresponding coefficients \(\Delta g_1^0\), \(\Delta g_1^1\), \(\Delta h_1^1\), \(\Delta g_2^0\), \(\Delta g_2^1\), \(\Delta h_2^1\), \(\Delta g_2^2\), \(\Delta h_2^2\), and \(\Delta g_3^0\). In each panel of these figures, the blue line indicates the results of the hindcast conducted with the ESN, the red line indicates the COV-OBS.x2 model, and the green line indicates the prediction of the 10th-generation IGRF (IGRF-10) model (Maus et al. 2005). The COV-OBS.x2 model is regarded as the actual SV. Since the IGRF-10 was released in 2005, its prediction is shown as a benchmark of the prediction from 2005. The ESN provided better predictions for \(g_1^0\), \(g_1^1\), \(g_2^0\), \(g_2^1\), and \(h_2^2\). Although the prediction with the ESN deviated from the actual SV as described by the COV-OBS.x2 for \(g_2^2\) and \(g_3^0\) (coefficients whose trends changed around 2005), the performance of the ESN was comparable to or better than that of the IGRF-10 even for these coefficients. This result suggests that the ESN has significant potential for predicting SV with satisfactory accuracy. We also examined the performance for the case when the start time of the hindcast was set to 2010. Figures 4 and 5 show the results of the hindcast beginning in 2010 with the blue line. The COV-OBS.x2 model and the 11th-generation IGRF (IGRF-11) model (Finlay et al. 2010) are also indicated with the red and green lines, respectively, for reference. Although the IGRF-11 model predicted the SV from 2010 well, the ESN predicted the SV better than the IGRF-11 model.
We evaluate the accuracy of the prediction with the root-mean-square (RMS) differences (Whaler and Beggan 2017; Minami et al. 2020):
\[ dP_n(t_k) = (n+1) \sum _{m=0}^{n} \left[ \left( g_{n,p}^m(t_k) - g_{n,o}^m(t_k) \right) ^2 + \left( h_{n,p}^m(t_k) - h_{n,o}^m(t_k) \right) ^2 \right] , \qquad dP(t_k) = \sum _{n=1}^{N} dP_n(t_k), \]
where \(g_{n, p}^m\) and \(h_{n, p}^m\) are the predicted Gauss coefficients and \(g_{n, o}^m\) and \(h_{n, o}^m\) are the actual Gauss coefficients provided by the COV-OBS.x2 model. The former metric, \(\sqrt{dP_{n}}\), indicates the prediction error for each degree, and the latter metric, \(\sqrt{dP}\), indicates the overall error. In Fig. 6, the blue and solid green lines show the values of \(\sqrt{dP(t_k)}\) from 2005 for the prediction with the ESN and the IGRF-10 model, respectively. As the IGRF model did not match the COV-OBS model at the start time, a prediction with the SV model of the IGRF-10 initialized with the COV-OBS.x2 model at the beginning of 2005 was also performed, as shown by the dotted green line. Figure 7 shows the normalized errors defined as:
for the predictions shown in Fig. 6. The ESN achieved better accuracy than the IGRF-10 over the span of 10 years. Even when the prediction was initialized with the same model, the ESN outperformed the IGRF-10 SV model. Figure 8 compares \(\sqrt{dP(t_k)}\) from 2010 between the ESN prediction and the IGRF-11 model. Again, the prediction with the ESN was better than that of the IGRF-11 model. The RMS difference between the ESN prediction and the COV-OBS.x2 was about \(65\,\textrm{nT}\) in 2015, which was comparable to the result of the prediction based on the estimation of core surface flow and magnetic diffusion (Metman et al. 2020). Minami et al. (2020) hindcast the SV from 2015 with data assimilation using the ensemble-based variational method and attained \(\sqrt{dP_{n}(t_k)}\simeq 100\,\textrm{nT}\) in 2020. As indicated with a cyan line in Fig. 8, the ESN attained \(\sqrt{dP_{n}(t_k)}\simeq 80\,\textrm{nT}\) in 2020, which is better than the result with the ensemble-based variational method. On the other hand, Sanchez et al. (2020) conducted a hindcast with data assimilation based on the ensemble Kalman filter. Their prediction for \(\dot{g}_1^0\) was very similar to the COV-OBS.x2, while the ESN prediction deviated from the COV-OBS.x2 by about \(1\,\mathrm {nT/year}\). However, the ESN overall achieved high accuracy comparable to other methods at remarkably low computational cost.
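The RMS metrics can be computed directly from the coefficient differences. The sketch below is not the authors' code; it assumes the standard \((n+1)\)-weighted mean-square field difference per degree, consistent with the definition above:

```python
import numpy as np

# Sketch of the error metrics sqrt(dP_n) and sqrt(dP).  Coefficients are
# stored as {"g": {(n, m): value}, "h": {(n, m): value}}, with values in nT.
def dP_n(pred, obs, n):
    """Mean-square field difference contributed by degree n."""
    s = 0.0
    for m in range(n + 1):
        s += (pred["g"][(n, m)] - obs["g"][(n, m)]) ** 2
        if m > 0:  # h_n^0 is not defined
            s += (pred["h"][(n, m)] - obs["h"][(n, m)]) ** 2
    return (n + 1) * s

def dP(pred, obs, N):
    """Total mean-square difference summed over degrees 1..N."""
    return sum(dP_n(pred, obs, n) for n in range(1, N + 1))

# Tiny synthetic example with N = 1: errors of 3 nT in g_1^0 and 4 nT in h_1^1,
# so dP = 2 * (9 + 16) = 50 and the RMS difference is sqrt(50) nT.
pred = {"g": {(1, 0): -29400.0, (1, 1): -1450.0}, "h": {(1, 1): 4650.0}}
obs  = {"g": {(1, 0): -29403.0, (1, 1): -1450.0}, "h": {(1, 1): 4646.0}}
err = np.sqrt(dP(pred, obs, N=1))
```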
4 Discussion
To predict the evolution of a dynamical system in general, it is necessary to model the global structure of the trajectory in its phase space. To achieve this with data-driven approaches such as machine learning techniques, a large amount of data covering all possible cases is required. However, this is not possible for the geodynamo system because the time scale of the observations is much shorter than the time scale for the outer core system to pass through the entire trajectory, and because a large number of state vector elements is required (on the order of 100 for spherical harmonics up to degree 10). On the other hand, the results of the hindcasts presented in this article demonstrate that the ESN is potentially effective for predicting SV for several years. This could be interpreted as the ESN learning a local structure of the trajectory in the vicinity of the current state. Indeed, the ESN can predict the SV well even with training data covering a shorter period. Figure 9 compares \(\sqrt{dP}\) between the ESNs trained with the data for four different periods: 1861–2005, 1921–2005, 1961–2005, and 2000–2005. The starting point of the spin-up period was unchanged at 1840. The performances were very similar even when the length of the training data was shortened. This suggests that, for predicting SV, the ESN relies on the characteristics of the recent variations of the geomagnetic main field, for which the observations are more reliable and the uncertainties are small. Conversely, the prediction gets worse without recent observations. Figure 10 shows \(\sqrt{dP}\) for the ESN trained with the data for the period from 1861 to 1945, which has the same length as the case in Fig. 6. When the ESN was trained without recent observations, the prediction errors increased, suggesting the importance of the recent data.
There have been several previous studies on the prediction of SV by geodynamo-based data assimilation using ensemble Kalman filters (Aubert 2015; Sanchez et al. 2020) or by deterministic forecasts based on sequential solutions of the magnetic induction equation incorporating both core surface flow and magnetic diffusion (Metman et al. 2020). The two data assimilation studies needed to constrain state vectors with \(10^6\) elements or more in the Earth’s liquid outer core using very limited surface observations or geomagnetic field models such as COV-OBS. Moreover, they had to work with 100 to 1000 ensemble members at the same time, which inevitably led to prohibitive computational costs. The deterministic approach by Metman et al. (2020) required fewer computational resources, but their prediction errors depended on the length of the model fitting window. The ESN method employed here, however, achieved prediction errors comparable to those of these methods, as shown in Fig. 6, and its prediction errors are likely to be robust to the length of the training data period. The ESN calculation was light and fast mainly because it only needs to work with a state vector of 1000 elements. Moreover, the examples of our 10-year prediction for the first 9 internal Gauss coefficients in Fig. 2 suggest that the ESN has significant potential for short-term SV prediction of our planet’s intrinsic magnetic field for a duration of 5 to 10 years.
5 Summary
This study examined the applicability of the ESN, a kind of recurrent neural network with fixed connections among hidden state variables, for predicting SV. We trained the ESN using the COV-OBS.x2 model from 1840 to 2005 and conducted a hindcast of SV from 2005. The results demonstrate that the ESN can predict SV with satisfactory accuracy. In spite of the nonlinear behavior of SV, the temporal evolution of the geomagnetic field was successfully predicted over a 10-year prediction period. The performance was also not highly dependent on the length of the training data period. The availability of a highly accurate temporal evolution of the geomagnetic field for the last several years is thus important for predicting SV with the ESN.
Data availability
The COV-OBS.x2 geomagnetic field model is available via: http://www.spacecenter.dk/files/magnetic-models/COV-OBS-x2/.
Code availability
The code of the trained ESN model is available at https://doi.org/10.5281/zenodo.12650181.
Abbreviations
SV: Secular variation
ESN: Echo state network
IGRF: International Geomagnetic Reference Field
References
Alexandrescu M, Gibert D, Hulot G, Le Mouël JL, Saracco G (1996) Worldwide wavelet analysis of geomagnetic jerks. J Geophys Res 101:21975–21994. https://doi.org/10.1029/96JB01648
Alken P, Thébault E, Beggan CD, Amit H, Aubert J, Baerenzung J, Bondar TN, Brown WJ, Califf S, Chambodut A, Chulliat A, Cox GA, Finlay CC, Fournier A, Gillet N, Grayver A, Hammer MD, Holschneider M, Huder L, Hulot G, Jager T, Kloss C, Korte M, Kuang W, Kuvshinov A, Langlais B, Léger JM, Lesur V, Livermore PW, Lowes FJ, Macmillan S, Magnes W, Mandea M, Marsal S, Matzka J, Metman MC, Minami T, Morschhauser A, Mound JE, Nair M, Nakano S, Olsen N, Pavón-Carrasco FJ, Petrov VG, Ropp G, Rother M, Sabaka TJ, Sanchez S, Saturnino D, Schnepf NR, Shen X, Stolle C, Tangborn A, Tøffner-Clausen L, Toh H, Torta JM, Varner J, Vervelidou F, Vigneron P, Wardinski I, Wicht J, Woods A, Yang Y, Zeren Z, Zhou B (2021) International Geomagnetic Reference Field: the thirteenth generation. Earth Planets Space 73:49. https://doi.org/10.1186/s40623-020-01288-x
Alken P, Thébault E, Beggan CD, Aubert J, Baerenzung J, Brown WJ, Califf S, Chulliat A, Cox GA, Finlay CC, Fournier A, Gillet N, Hammer MD, Holschneider M, Hulot G, Korte M, Lesur V, Livermore PW, Lowes FJ, Macmillan S, Nair M, Olsen N, Ropp G, Rother M, Schnepf NR, Stolle C, Toh H, Vervelidou F, Vigneron P, Wardinski I (2021) Evaluation of candidate models for the 13th generation International Geomagnetic Reference Field. Earth Planets Space 73:48. https://doi.org/10.1186/s40623-020-01281-4
Aubert J (2015) Geomagnetic forecasts driven by thermal wind dynamics in the Earth’s core. Geophys J Int 203:1738–1751. https://doi.org/10.1093/gji/ggv394
Casella G (1985) An introduction to empirical Bayes data analysis. Am Stat 39:83–87
Courtillot V, Le Mouël JL (1984) Geomagnetic secular variation impulses: a review of observational evidence and geophysical consequences. Nature 311:709–716
Finlay CC, Maus S, Beggan CD, Bondar TN, Chambodut A, Chernova TA, Chulliat A, Golovkov VP, Hamilton B, Hamoudi M, Holme R, Hulot G, Kuang W, Langlais B, Lesur V, Lowes FJ, Lühr H, Macmillan S, Mandea M, McLean S, Manoj C, Menvielle M, Michaelis I, Olsen N, Rauberg J, Rother M, Sabaka TJ, Tangborn A, Tøffner-Clausen L, Thébault E, Thomson AWP, Wardinski I, Wei Z, Zvereva TI (2010) International Geomagnetic Reference Field: the eleventh generation. Geophys J Int 183:1216–1230. https://doi.org/10.1111/j.1365-246X.2010.04804.x
Fournier A, Aubert J, Lesur V, Ropp G (2021) A secular variation candidate model for IGRF-13 based on Swarm data and ensemble inverse geodynamo modelling. Earth Planets Space 73:43. https://doi.org/10.1186/s40623-020-01309-9
Fukagata K (2023) Reduced order modeling of fluid flows using convolutional neural networks. J Fluid Sci Technol 18:jfst0002. https://doi.org/10.1299/jfst.2023jfst0002
Gwirtz K, Davis T, Morzfeld M, Constable C, Fournier A, Hulot G (2022) Can machine learning reveal precursors of reversals of the geomagnetic axial dipole field? Geophys J Int 231:520–535. https://doi.org/10.1093/gji/ggac195
Huder L, Gillet N, Finlay CC, Hammer MD, Tchoungui H (2020) COV-OBS.x2: 180 years of geomagnetic field evolution from ground-based and satellite observations. Earth Planets Space 72:160. https://doi.org/10.1186/s40623-020-01194-2
Jaeger H, Haas H (2004) Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication. Science 304:78–80. https://doi.org/10.1126/science.1091277
Jaeger H, Lukoševičius M, Popovici D, Siewert U (2007) Optimization and applications of echo state networks with leakyintegrator neurons. Neural Netw 20:335–352. https://doi.org/10.1016/j.neunet.2007.04.016
Kataoka R, Nakano S (2021) Reconstructing solar wind profiles associated with extreme magnetic storms: a machine learning approach. Geophys Res Lett 48:e2021GL096275. https://doi.org/10.1029/2021GL096275
Lukoševičius M (2012) A practical guide to applying echo state networks. In: Montavon G, Orr G, Müller K (eds) Neural networks: tricks of the trade. Springer, Berlin, pp 659–686
Lukoševičius M, Jaeger H (2009) Reservoir computing approaches to recurrent neural network training. Comput Sci Rev 3:127–149. https://doi.org/10.1016/j.cosrev.2009.03.005
Maus S, Macmillan S, Chernova T, Choi S, Dater D, Golovkov V, Lesur V, Lowes F, Lühr H, Mai W, McLean S, Olsen N, Rother M, Sabaka T, Thomson A, Zvereva T (2005) The 10th-generation International Geomagnetic Reference Field. Geophys J Int 161:561–565. https://doi.org/10.1111/j.1365-246X.2005.02641.x
Metman MC, Beggan CD, Livermore PW, Mound JE (2020) Forecasting yearly geomagnetic variation through sequential estimation of core flow and magnetic diffusion. Earth Planets Space 72:149. https://doi.org/10.1186/s40623-020-01193-3
Minami T, Nakano S, Lesur V, Takahashi F, Matsushima M, Shimizu H, Nakashima R, Taniguchi H, Toh H (2020) A candidate secular variation model for IGRF-13 based on MHD dynamo simulation and 4DEnVar data assimilation. Earth Planets Space 72:136. https://doi.org/10.1186/s40623-020-01253-8
Morris CM (1983) Parametric empirical Bayes inference: theory and applications. J Am Stat Assoc 78:47–55
Nakano S, Kataoka R (2022) Echo state network model for analyzing solar-wind effects on the AU and AL indices. Ann Geophys 40:11–22. https://doi.org/10.5194/angeo-40-11-2022
Sanchez S, Wicht J, Bärenzung J (2020) Prediction of the geomagnetic secular variation based on the ensemble sequential assimilation of geomagnetic field models by dynamo simulations. Earth Planets Space 72:157. https://doi.org/10.1186/s40623-020-01279-y
Walleshauser B, Bollt E (2022) Predicting sea surface temperatures with coupled reservoir computers. Nonlin Process Geophys 29:255–264. https://doi.org/10.5194/npg-29-255-2022
Whaler KA, Beggan CD (2017) Derivation and use of core surface flows for forecasting secular variation. J Geophys Res Solid Earth 120:1400–1414. https://doi.org/10.1002/2014JB011697
Funding
This study was supported by a research program funded by the Joint Support Center for Data Science, Research Organization of Information and Systems (ROIS-DS-JOINT 007RP2022, 003RP2023). The work of SN was supported in part by the Japan Society for the Promotion of Science (JSPS) KAKENHI, Grant Number JP23K20375.
Author information
Contributions
SN and TH conceived the study. SN and SS developed the numerical technique and evaluated the technique. All authors have read and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Nakano, S., Sato, S. & Toh, H. Short-term prediction of geomagnetic secular variation with an echo state network. Earth Planets Space 76, 121 (2024). https://doi.org/10.1186/s40623-024-02064-x