


Publications of the Astronomical Society of the Pacific, 134:124201 (7pp), 2022 December. https://doi.org/10.1088/1538-3873/aca4a3
© 2022. The Astronomical Society of the Pacific. All rights reserved.

Machine Learning-based Prediction of Sunspots using Fourier Transform Analysis of the Time Series

José-Víctor Rodríguez 1,2, Ignacio Rodríguez-Rodríguez 3, and Wai Lok Woo 4

1 Universidad Politécnica de Cartagena, Departamento de Tecnologías de la Información y las Comunicaciones, E-30202, Cartagena, Spain
2 Universidad de Granada, Departamento de Física Teórica y del Cosmos, E-18071, Granada, Spain; [email protected], [email protected]
3 Universidad de Málaga, Departamento de Ingeniería de Comunicaciones, BioSIP Group, E-29071, Málaga, Spain; [email protected]
4 Department of Computer and Information Sciences, Northumbria University, Newcastle upon Tyne, NE1 8ST, UK; [email protected]

Received 2022 November 6; accepted 2022 November 21; published 2022 December 19

Abstract
The study of solar activity holds special importance since the changes in our star’s behavior affect both the Earth’s
atmosphere and the conditions of the interplanetary environment. They can interfere with air navigation, space
flight, satellites, radar, high-frequency communications, and overhead power lines, and can even negatively
influence human health. We present here a machine learning-based prediction of the evolution of the current
sunspot cycle (solar cycle 25). First, we analyze the Fourier Transform of the total time series (from 1749 to 2022)
to find periodicities with which to lag this series and then add attributes (predictors) to the forecasting models to
obtain the most accurate result possible. Consequently, we build a trained model of the series considering different
starting points (from 1749 to 1940, with 1 yr steps), applying Random Forests, Support Vector Machines, Gaussian
Processes, and Linear Regression. We find that the model with the lowest error in the test phase (cycle 24) arises
with Random Forest and with 1915 as the start year of the time series (yielding a Root Mean Squared Error of 9.59
sunspots). Finally, for cycle 25 this model predicts that the maximum number of sunspots (90) will occur in 2025
March.
Unified Astronomy Thesaurus concepts: Sunspots (1653); Solar activity (1475); Time series analysis (1916);
Random Forests (1935); Linear regression (1945); Gaussian Processes regression (1930); Support vector
machine (1936)

1. Introduction

Changes in solar activity affect both the conditions of the interplanetary environment and the Earth's atmosphere (Hiremath 2006; Pulkkinen 2007; Hathaway 2015; Sagir et al. 2015; Kim et al. 2018), such that an increase in our star's intensity may not only impact aircraft navigation, space flight, satellites, radar, high-frequency communications, and overhead power lines (Lybekk et al. 2012; Lewandowski 2015), but could also be harmful to humans (Azcárate et al. 2016; Qu 2016). Therefore, predicting solar activity is an area of great research interest to anticipate the potential impact of the Sun's intensity on space technology and life on Earth in general.

In this sense, the number of sunspots (SSN) (dark areas that appear on the solar disk) is one of the most important and simplest indicators to measure the Sun's activity (Usoskin 2017), not least because it correlates with several other phenomena, such as solar flares (Liu et al. 2008). As the SSN is a directly observable parameter, there are public records on it spanning 1749 to the present day. A simple observation of SSN behavior over this period (almost 300 yr) clearly reveals a fundamental cyclical pattern of solar activity that repeats approximately every 11 yr (we are currently in cycle number 25). However, the variations that these cycles undergo in both amplitude and duration (Cameron & Schüssler 2017)—pointing to the existence of other concurrent cycles of different periods as well as additional underlying phenomena—make the accurate prediction of the SSN evolution of the current and future solar cycles an ongoing research challenge. In fact, it should be mentioned that solar cycles may not actually form a multi-periodic system at all, but rather constitute a weakly chaotic system that is unlike a periodic, multi-periodic, or quasi-periodic dynamical system (Carbonell et al. 1994; Letellier et al. 2006; Hanslmeier & Brajša 2010). In any case, while numerous papers, using different approaches, have attempted to predict the SSN evolution of the current solar cycle (Han & Yin 2019; Labonville et al. 2019; Kakad et al. 2020; Kitiashvili 2020; McIntosh et al. 2020), the great disparity in the results of these and other works (Nandy 2021) makes it necessary to continue the search for alternative and innovative methods for the most accurate forecasting of solar activity.

A recent approach to the prediction of time series, such as SSN, is the use of machine learning (ML) techniques.


The currently available computational power, together with the latest artificial intelligence algorithms, enables the detection of patterns in extensive data sequences, permitting the modeling, classifying, or predicting of the behavior of the astrophysical phenomena underlying these time series. In this regard, several works using ML techniques have already attempted to predict the SSN behavior of cycle 25 (Okoh et al. 2018; Pala & Atici 2019; Covas et al. 2019; Dani & Sulistiani 2019; Dang et al. 2022). However, even within this approach, there is still a considerable disparity in the results; thus, it seems clear that the efforts to employ ML techniques in the search for accurate SSN prediction should continue, which in turn implies the application of alternative strategies as well as the consideration of the most up-to-date available data.

Regarding the above, this paper presents an ML-based prediction of the evolution of sunspot cycle 25. First, the Fourier Transform of the total time series (1749–2022) is analyzed to identify periodicities with which to lag this series and thus add attributes (predictors) to the forecasting models, aiming to obtain the most accurate results possible. Then, four different ML models are applied, namely, Random Forests (RF), Support Vector Machines (SVM), Gaussian Processes (GP), and Linear Regression (LR), with which different trained models of the series are built considering different starting points (from 1749 to 1940, with 1 yr steps). This will demonstrate which model and data range minimize the error.

2. Methodology

2.1. Data Description

This work uses the sunspot database of the World Data Center SILSO (V2.0), Royal Observatory of Belgium, Brussels. Specifically, we consider 3273 records of the 13 month smoothed monthly total SSN, from July 1749 to March 2022 (Figure 1). The 13 month smoothed monthly SSN is derived from a "tapered-boxcar" running mean of the monthly SSN over a 13 month window centered on the corresponding month (equal weights of 1, except for the first and last elements, at −6 and +6 months, which are weighted 0.5, with normalization by a factor of 1/12). In the data series, neither the first six months nor the last six months therefore have smoothed values. In applying the different modeling algorithms, the last 131 of the total number of records (part of cycle 24 and the beginning of cycle 25) are always used for the test stage, while the remainder (with a variable number of records depending on the considered series start year, selected in 1 yr steps from 1749) is dedicated to the training and validation stages, applying 6-fold cross-validation with an 8/2 ratio. In this way, the validation stage monitors the training stage to adjust the hyperparameters, while the test phase evaluates the goodness of each trained model by comparing its prediction against the 131 forthcoming records (which are actually known). Thus, once the best model (with the lowest Root Mean Squared Error, RMSE) has been found, together with the optimal length of the historical data, a prediction of the SSN evolution in cycle 25 (with a special interest in observing the maximum) and the beginning of cycle 26 (again totaling 131 positions ahead) is carried out.

Figure 1. 13 month smoothed monthly total SSN between 1749 and 2022.
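As a concrete illustration of this data preparation, the following minimal Java sketch (not the authors' code) applies the tapered-boxcar weights and reserves the last 131 records for the test stage; it assumes the raw monthly mean SSN values are already loaded into an array in chronological order.

    // Sketch of the 13 month tapered-boxcar smoothing and the train/test split of
    // Section 2.1. The array `monthly` is assumed to hold the raw monthly mean SSN
    // in chronological order; illustrative only, not the authors' implementation.
    public final class SsnPreprocessing {

        // 13 month smoothed SSN: weights of 1, except 0.5 at the -6 and +6 month ends,
        // normalized by 1/12; the first and last six months remain undefined (NaN).
        public static double[] smooth13(double[] monthly) {
            double[] smoothed = new double[monthly.length];
            java.util.Arrays.fill(smoothed, Double.NaN);
            for (int t = 6; t < monthly.length - 6; t++) {
                double sum = 0.5 * (monthly[t - 6] + monthly[t + 6]);
                for (int k = -5; k <= 5; k++) {
                    sum += monthly[t + k];
                }
                smoothed[t] = sum / 12.0;
            }
            return smoothed;
        }

        // Reserves the last `testLen` records (131 in this study) for the test stage.
        public static double[][] splitTrainTest(double[] series, int testLen) {
            double[] train = java.util.Arrays.copyOfRange(series, 0, series.length - testLen);
            double[] test = java.util.Arrays.copyOfRange(series, series.length - testLen, series.length);
            return new double[][] { train, test };
        }
    }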


2.2. Fourier Transform Analysis of the Time Series

To find periodicities with which to lag the sunspot series and thus add attributes (predictors) to the forecasting algorithms (to obtain the most accurate result possible), Figure 2 shows the Fourier Transform (periodogram) of the total time series (from 1749 to 2022), expressed as a function of the period in years/cycle.

Figure 2. Fourier Transform (periodogram) of the total time series (1749–2022) as a function of the period (years/cycle).

As can be observed, a peak (period) of maximum power clearly emerges around 11 yr (as mentioned in the Introduction), and such a period can also be seen with the naked eye—in the time domain—in Figure 1. However, there are other periods of lower intensity that must be taken into account in order to include the greatest amount of relevant information on the series in the modeling process. In this sense, the five periods of highest power in the periodogram (zoomed in Figure 3), i.e., 5.45, 8.52, 10.91, 15.15, and 54.53 yr, are considered, giving rise to five attributes (variables) lagged by the following numbers of record positions with respect to the original time series: 65, 102, 131, 182, and 654, respectively. In other words, besides the obvious 10.91 yr periodicity, the additional attributes correspond to the 5.45, 8.52, 15.15, and 54.53 yr periodicities, which will be taken into account to provide relevant information for the SSN prediction.
In this way, the originally univariate problem becomes autoregressive/multivariate (with five predictors to be included in the algorithms); therefore, it is likely to yield a more accurate model and a subsequently more precise prediction. The powers of the five selected periods are shown in decreasing order in Table 1.

Table 1. Powers of the Five Periods Selected from the Periodogram

Period (yr)    Power
10.91          5215127457
8.52           613702634
54.53          547208560
15.15          281656205
5.45           240345289
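The construction of these lagged attributes can likewise be sketched as follows: a minimal illustration, assuming the smoothed series is held in a plain array, in which rows that do not yet have all five lags available are simply skipped.

    // Sketch of the lagged-attribute construction described above: each row holds the
    // values 65, 102, 131, 182 and 654 positions earlier (predictors) plus the current
    // smoothed SSN value (target). Illustrative only.
    public final class LaggedAttributes {

        static final int[] LAGS = {65, 102, 131, 182, 654};

        // Returns rows of the form [lag65, lag102, lag131, lag182, lag654, target].
        public static double[][] build(double[] series) {
            int maxLag = 654;
            int rows = series.length - maxLag;
            double[][] data = new double[rows][LAGS.length + 1];
            for (int t = maxLag; t < series.length; t++) {
                for (int j = 0; j < LAGS.length; j++) {
                    data[t - maxLag][j] = series[t - LAGS[j]];
                }
                data[t - maxLag][LAGS.length] = series[t]; // target value
            }
            return data;
        }
    }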
The relevance of the five attributes (corresponding to the five selected periodicities) for building the model was evaluated. This was done by obtaining a ranking generated via the LR classifier (employed for feature selection purposes, with parameters ridge = 10^−8, seed = 1, threshold = 0.01, and 6-fold cross validation) (Novakovic et al. 2011). Table 2 presents the results.

Table 2. The Ranked Relevance of the Five Attributes Selected

Average Merit    Attribute
14.696           Lag of 131 positions (10.91 yr)
8.045            Lag of 65 positions (5.45 yr)
4.053            Lag of 182 positions (15.15 yr)
2.55             Lag of 102 positions (8.52 yr)
0.261            Lag of 654 positions (54.53 yr)

Clearly, the attribute with the highest relevance corresponds to the 131-position lag (arising from the 10.91 yr periodicity), followed by the 65-position lag (5.45 yr periodicity) and the 182-position lag (15.15 yr periodicity).

2.3. Implementation and Algorithms Considered

All the algorithms in this work are implemented in Java (version 8.0) and run on an Intel(R) Core(TM) i7-7500U PC with a 2.90 GHz CPU and 16 GB of RAM. Meanwhile, as previously mentioned, the following modeling and prediction techniques are applied: LR, SVM, RF, and GP. Each of these is briefly described below.

As one of the simpler techniques, LR is employed to estimate the model parameters, aiming to minimize the sum of the squared errors (Shmueli & Lichtendahl 2016). This technique can be modified using partial least squares or penalized models, e.g., the least absolute shrinkage and selection operator (LASSO) or ridge regression. These models are particularly apt as their interpretation is fairly straightforward. Meanwhile, the coefficients describing the relationships are usually simple to calculate, allowing several features to be used. On the other hand, such models show limited performance (Faloutsos et al. 2018), although a predictor/response relationship located on a hyperplane facilitates good results. However, higher-order (e.g., quadratic or cubic) nonlinear relationships may not be well captured by such models, indicating the need for a different approach (Kalekar 2004).

Non-linear trends can be captured with other models, wherein it is not necessary to precisely know the type of nonlinearity before the model is built. A commonly used model, SVM, uses dual learning algorithms, which calculate the dot-products of data in processing (Vapnik 2013), whereby a kernel function may be employed to ensure the proper calculation of such dot-products under variable rates (Schölkopf & Smola 2003). This enables SVM to identify the hyperplane that separates the examples to the maximum extent (maximum margin). Therefore, SVMs can resist model overfitting and yet show good generalization performance due to the use of the max-margin criterion during optimization. Furthermore, unlike other solutions that give only local optima, because of its convex optimization formulation, an SVM converges onto a global optimum (Kuhn & Johnson 2013).

Recently, Regression Trees, a suite of modeling algorithms, have been the subject of much research attention. Tree-based models identify predictors for partitioning data using if/then statements, and then use a model to forecast outcomes within such subsets (Fierrez et al. 2018). From a statistical point of view, incorporating randomness in the tree's construction allows any correlations between predictors to be minimized, as is the case in RF (Liaw & Wiener 2002). All developed models for a given set are then used to make predictions on new data sets; the final forecast comprises the average of these predictions. By choosing robust complex learners with low bias, RF models are able to reduce the variance, minimizing the number of errors and facilitating the removal of noisy responses (Oshiro et al. 2012).


Figure 3. Five main periods of the time series periodogram: (a) 5.45, (b) 8.52, (c) 10.91, (d) 15.15, and (e) 54.53 yr.


Finally, using Radial Basis Function (RBF) kernels (Blomqvist et al. 2020) and other comparative strategies, GP can ensure overall consistency and offer unrestricted basis functions. GP generally extracts discernible reactions from a set of training data points (function values), subsequently modeling them as multivariate standard random features (Seeger 2004), making it non-parametric. It assumes that the function data values have an a priori distribution, ensuring the smooth operation of the function. If the vectors being compared are close regarding their separation and sensitivity, the function values correlate closely, with divergence producing decay. Thus, we may make assumptions to estimate the distribution of the unpredicted function data, applying basic probability manipulation.

3. Results and Discussion

Figure 4 presents the average RMSE obtained for the test data with the different algorithms while considering, in each case, a different time series start year (with 1 yr steps).

Figure 4. Average RMSE values in the test stage for the four algorithms considered, as a function of the year of the time series start.

The initialization parameters assumed for the models are the following:

1. RF:
   (a) Bag size (percentage of training set size): 100.
   (b) Number of threads: 1.
   (c) Number of trees: 100.
   (d) Maximum depth of trees: unlimited.
   (e) Number of random features: 0.
   (f) Seed: 1.
2. LR:
   (a) Ridge parameter: 10^−8.
3. GP:
   (a) Level of Gaussian noise: 1.0.
   (b) Seed: 1.
   (c) Polynomial kernel: size of the cache = 250,007; exponent = 1.0.
4. SVM:
   (a) C parameter: 1.0.
   (b) Polynomial kernel: size of the cache = 250,007; exponent = 1.0.
   (c) Sequential minimal optimization (SMO) with epsilon for round-off error = 10^−12; epsilon parameter for the epsilon-insensitive loss function = 0.001; tolerance = 0.001; seed = 1.
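The parameter names listed above correspond to those of the Weka machine-learning workbench; assuming that toolkit is the Java implementation referred to in Section 2.3 (the paper does not name it explicitly), the Random Forest configuration might be expressed as in the following sketch, which is illustrative rather than the authors' code.

    // A possible mapping of the Random Forest settings listed above onto the Weka
    // toolkit (version 3.8 API assumed; this is an illustrative guess, not the
    // authors' code). "ssn_lagged.arff" is a hypothetical file holding the lagged
    // attributes of Section 2.2 with the smoothed SSN as the last (class) attribute.
    import weka.classifiers.trees.RandomForest;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public final class TrainRandomForest {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("ssn_lagged.arff");
            data.setClassIndex(data.numAttributes() - 1); // target = smoothed SSN

            RandomForest rf = new RandomForest();
            rf.setBagSizePercent(100);   // (a) bag size, % of training set
            rf.setNumExecutionSlots(1);  // (b) number of threads
            rf.setNumIterations(100);    // (c) number of trees
            rf.setMaxDepth(0);           // (d) 0 = unlimited depth
            rf.setNumFeatures(0);        // (e) random features (0 = default heuristic)
            rf.setSeed(1);               // (f) seed
            rf.buildClassifier(data);
        }
    }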
The purpose of this analysis is two-fold: first, to determine which model is the best fit, and second, to determine from which start year the series provides the most adequate quantity (and quality) of information to obtain a more accurate prediction.

Clearly, the minimum error is reached when considering a time series start of 1915 and the RF model (an average RMSE of 9.59 sunspots); however, for that same year, the GP and LR algorithms are also close, with SVM being slightly more distant. It is therefore worth noting that, in this case, taking 1915 as the starting year of the time series represents the optimal compromise between, on the one hand, having a sufficiently large number of records to perform a good prediction and, on the other hand, the fact that the more recent these data are, the better they reflect the current behavior of the changing solar activity and, consequently, the more accurate the forecast will be. It is also worth noting how the RMSE seems to show a certain oscillating behavior (with ups and downs common to the four models) with increasing time series start year. This fact could be explained by the increasing and decreasing influence of the different considered periods as the progressive cutting of the historical data eliminates elements that are more or less relevant in shaping such periods.
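For reference, the error metric used throughout this comparison is the usual root mean squared error over the 131 held-out test records; a minimal sketch:

    // Root Mean Squared Error over the held-out test records (131 in this study),
    // as used to compare the candidate models in the test stage.
    public final class Rmse {
        public static double of(double[] actual, double[] predicted) {
            double sumSq = 0.0;
            for (int i = 0; i < actual.length; i++) {
                double diff = predicted[i] - actual[i];
                sumSq += diff * diff;
            }
            return Math.sqrt(sumSq / actual.length);
        }
    }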


Therefore, with the best method (RF) and the optimal starting point of the time series (1915) identified, the results obtained for this algorithm and data range are shown in the following figures. Figure 5 depicts a comparison of the modeled and real data in a time slot of the training stage (since 1969).

Figure 5. Example of the training stage with the RF algorithm and the time series from 1915.

As can be seen, the agreement is excellent. On the other hand, Figure 6 shows the model prediction for the test stage (part of cycle 24 and the beginning of cycle 25) compared with the real known data.

Figure 6. Comparison of the real data and predicted data for the test stage (cycle 24 and beginning of cycle 25) with the RF algorithm.

Again, a good fit is observed in both the behavior of the curve (period) and the amplitude, yielding the aforementioned average RMSE of 9.59 sunspots. Finally, in Figure 7, based on the model obtained, a prediction of the SSN evolution for cycle 25 and the beginning of cycle 26 (until 2033) is presented.

Figure 7. Predicted data with the obtained RF model for cycle 25 and the beginning of cycle 26.
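Because the forecast horizon (131 positions) exceeds the shortest lags (65 and 102 positions), one plausible way to carry out such a forecast (not necessarily the authors' exact procedure) is to extend the series recursively, feeding each prediction back in as a lagged input once the horizon passes those lags. A minimal sketch follows; the Model interface is a hypothetical stand-in for the trained RF regressor.

    // Sketch of a 131-step-ahead forecast: the series is extended one month at a
    // time, so that the 65- and 102-position lags can draw on values that were
    // themselves predicted once the horizon exceeds those lags. Illustrative only.
    import java.util.ArrayList;
    import java.util.List;

    public final class RecursiveForecast {

        public interface Model {
            double predict(double[] laggedValues); // order: lags 65, 102, 131, 182, 654
        }

        static final int[] LAGS = {65, 102, 131, 182, 654};

        public static double[] forecast(double[] history, Model model, int horizon) {
            List<Double> extended = new ArrayList<>();
            for (double v : history) extended.add(v);
            double[] out = new double[horizon];
            for (int step = 0; step < horizon; step++) {
                double[] x = new double[LAGS.length];
                for (int j = 0; j < LAGS.length; j++) {
                    x[j] = extended.get(extended.size() - LAGS[j]);
                }
                out[step] = model.predict(x);
                extended.add(out[step]); // predicted value feeds later lags
            }
            return out;
        }
    }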

In view of the results, the model predicts that the maximum number of sunspots in cycle 25 will occur in March 2025, with a total of 90.

4. Conclusions

This paper presented an ML-based prediction of the SSN evolution of cycle 25. For this purpose, a Fourier Transform analysis of the total time series (from 1749 to 2022) was first performed to identify the five most intense periodicities. The series was then lagged by the record positions corresponding to these periods, and the resulting sequences were considered as additional attributes (predictors) to be included in the modeling algorithms used, i.e., RF, SVM, GP, and LR. The study was also carried out considering different starting points of the time series (from 1749 to 1940, with 1 yr steps) to determine which start year could provide the most adequate quantity (and quality) of information to obtain the minimum error and perform a more accurate prediction. The obtained results show that the model with the lowest error in the test stage (cycle 24) is RF, together with the use of 1915 as the time series start (yielding an average RMSE of 9.59 sunspots). Finally, this model predicted that the maximum number of sunspots in cycle 25 will occur in March 2025, with a total of 90.

ORCID iDs

José-Víctor Rodríguez: https://orcid.org/0000-0002-3298-6439
Ignacio Rodríguez-Rodríguez: https://orcid.org/0000-0002-0118-3406
Wai Lok Woo: https://orcid.org/0000-0002-8698-7605

References

Azcárate, T., Mendoza, B., & Levi, J. R. 2016, AdSpR, 58, 2116
Blomqvist, K., Kaski, S., & Heinonen, M. 2020, Proceedings of the Mining Data for Financial Applications (Ghent) (Berlin: Springer), 582
Cameron, R. H., & Schüssler, M. 2017, ApJ, 843, 111
Carbonell, M., Oliver, R., & Ballester, J. L. 1994, A&A, 290, 983
Covas, E., Peixinho, N., & Fernandes, J. 2019, SoPh, 294, 24
Dang, Y., Chen, Z., Li, H., & Shu, H. 2022, Appl. Artif. Intell., 36, 1
Dani, T., & Sulistiani, S. 2019, JPhCS, 1231, 012022
Faloutsos, C., Gasthaus, J., Januschowski, T., & Wang, Y. 2018, Proc. VLDB Endow., 11, 2102
Fierrez, J., Morales, A., Vera-Rodriguez, R., & Camacho, D. 2018, Inf. Fusion, 44, 57
Han, Y. B., & Yin, Z. Q. 2019, SoPh, 294, 107
Hanslmeier, A., & Brajša, R. 2010, A&A, 509, A5
Hathaway, D. H. 2015, LRSP, 12, 4
Hiremath, K. M. 2006, JApA, 27, 367
Kakad, B., Kumar, R., & Kakad, A. 2020, SoPh, 295, 88
Kalekar, P. S. 2004, Time Series Forecasting Using Holt-Winters Exponential Smoothing, Kanwal Rekhi School of Information Technology, Powai, Mumbai, 04329008
Kim, K. B., Kim, J. H., & Chang, H. Y. 2018, JASS, 35, 151
Kitiashvili, I. N. 2020, ApJ, 890, 36
Kuhn, M., & Johnson, K. 2013, Applied Predictive Modeling (1st edn.; New York: Springer)
Labonville, F., Charbonneau, P., & Lemerle, A. 2019, SoPh, 294, 82
Letellier, C., Aguirre, L. A., Maquet, J., & Gilmore, R. 2006, A&A, 449, 379


Lewandowski, K. 2015, J. Polish Safety and Reliability Association, 6, 91
Liaw, A., & Wiener, M. 2002, R News, 2, 18
Liu, C., Deng, N., Liu, Y., et al. 2008, ApJ, 622, 722
Lybekk, B., Pedersen, A., Haaland, S., et al. 2012, JGRA, 117, A1
McIntosh, S. W., Chapman, S., Leamon, R. J., Egeland, R., & Watkins, N. W. 2020, SoPh, 295, 163
Nandy, D. 2021, SoPh, 296, 54
Novakovic, J., Strbac, P., & Bulatovic, D. 2011, J. Oper. Res., 21, 119
Okoh, D. I., Seemala, G. K., Rabiu, A. B., et al. 2018, SpWea, 16, 1424
Oshiro, T. M., Perez, P. S., & Baranauskas, J. A. 2012, How Many Trees in a Random Forest?, in International Workshop on Machine Learning and Data Mining in Pattern Recognition (Berlin: Springer), 154
Pala, Z., & Atici, R. 2019, SoPh, 294, 1
Pulkkinen, T. 2007, LRSP, 4, 1
Qu, J. 2016, Reviews in Medical Virology, 26, 309
Sagir, S., Karatay, S., Atici, R., Yesil, A., & Ozcan, O. 2015, AdSpR, 55, 106
Schölkopf, B., & Smola, A. J. 2003, A Short Introduction to Learning with Kernels, in Advanced Lectures on Machine Learning (Berlin: Springer), 41
Seeger, M. 2004, IJNS, 14, 69
Shmueli, G., & Lichtendahl, K. C., Jr. 2016, Practical Time Series Forecasting with R: A Hands-on Guide (Green Cove Springs, FL: Axelrod Schnall Publishers)
Usoskin, I. G. 2017, LRSP, 14, 1
Vapnik, V. 2013, The Nature of Statistical Learning Theory (Berlin: Springer)
