
World of Computer Science and Information Technology Journal (WCSIT)

ISSN: 2221-0741
Vol. 1, No. 3,110-118, 2011

Artificial Neural Network Model for Forecasting Foreign Exchange Rate

Adewole Adetunji Philip, Akinwale Adio Taofiki, Akintomide Ayo Bidemi
Department of Computer Science, University of Agriculture, Abeokuta, Nigeria
[email protected], [email protected], [email protected]

Abstract— The statistical models presently used for forecasting cannot effectively handle the uncertain and unstable nature of foreign exchange data. In this work, an artificial neural network foreign exchange rate forecasting model (AFERFM) was designed for foreign exchange rate forecasting to correct some of these problems. The design was divided into two phases, namely training and forecasting. In the training phase, the back propagation algorithm was used to train on the foreign exchange rates and learn how to approximate the input. The Sigmoid Activation Function (SAF) was used to transform the input into a standard range [0, 1]. The learning weights were randomly assigned in the range [-0.1, 0.1] to obtain output consistent with the training. The SAF was implemented using a hyperbolic tangent in order to increase the learning rate and make learning efficient. A feed-forward network was used to improve the efficiency of the back propagation. A multilayer perceptron network was designed for forecasting. Datasets from the Oanda website were used as input to the back propagation for the evaluation and forecasting of foreign exchange rates. The design was implemented in MATLAB 7.6 and Visual Studio because of their support for implementing forecasting systems. The system was tested using mean square error and standard deviation with a learning rate of 0.10, an input layer, 3 hidden layers and an output layer. The best known related work, the Hidden Markov foreign exchange rate forecasting model (HFERFM), showed an accuracy of 69.9% as against the 81.2% accuracy of AFERFM. This shows that the new approach provides an improved technique for carrying out foreign exchange rate forecasting.

Keywords- Artificial Neural Network; Back Propagation Algorithm; Hidden Markov Model; Baum-Welch Algorithm; Sigmoid Activation Function; Foreign Exchange Rate.

I. INTRODUCTION

Financial time series forecasting is regarded as one of the most challenging applications of modern time series forecasting. As explained by [16], financial time series are inherently noisy, non-stationary and deterministically chaotic. The noisy characteristic refers to the unavailability of complete information from the past behavior of financial markets to fully capture the dependency between future and past prices. One general assumption made in such cases is that the historical data incorporate all of this behavior. As a result, the historical data are the major player in the prediction process. Because of the highly volatile, complex and noisy market environment, neural network techniques are prime candidates for prediction purposes.

The information that is not included in the model is considered as noise. The non-stationary characteristic implies that the distribution of financial time series changes over time. By deterministically chaotic, one means that financial time series are short-term random but long-term deterministic. In recent years, neural networks have been successfully used for modeling financial time series. Neural networks are universal function approximators that can map any non-linear function without a priori assumptions about the properties of the data.
Neural networks are also more noise tolerant, having the ability to learn complex systems with incomplete and corrupted data. In addition, they are more flexible, having the capability to learn dynamic systems through a retraining process using new data patterns. The foreign exchange market is the largest and most liquid of the financial markets, with an estimated $3 trillion traded every day as at 2008 [5]. Foreign exchange rates are amongst the most important economic indices in the international monetary markets. Forecasting them poses many theoretical and experimental challenges, given the abandonment of fixed exchange rates and the implementation of the floating exchange rate system in the 1970s; moreover, foreign exchange rates are affected by many highly correlated economic, political and even psychological factors, and these factors interact in a very complex fashion. Therefore, forecasting changes in foreign exchange rates is generally very difficult. Researchers and practitioners have been striving for an explanation of the movement of exchange rates, and various kinds of forecasting methods have been developed by many researchers and experts. Technical and fundamental analyses are the basic and major forecasting methodologies in popular use in financial forecasting. Like many other economic time series, the foreign exchange market has its own trend, cycle, season and irregularity. To identify, model, extrapolate and recombine these patterns to give foreign exchange market (forex) forecasts is the major challenge.

Well-trained ANNs can predict complex biological patterns, structures, or functions of newly discovered sequences. The foreign exchange market not only has known inputs and outputs but is also affected by external information causing uncertainty. The Hidden Markov Model (HMM) approach is basically used to predict the hidden relationship between inputs and outputs, and has been one of the most attractive research areas in the field of information systems. This approach can be used in simulating the foreign exchange market by taking a small subset of known information to reduce the effect of this uncertainty and noise.

A neural network is an alternative powerful data modeling tool that is able to capture and represent complex input/output relationships. This study describes the application of neural networks to foreign exchange rate forecasting among the major currencies US dollar (USD), European currency (EURO), Great Britain Pound (GBP) and Japanese currency (Yen) against the Nigerian currency (Naira). Technical indicators and time series data are fed to neural networks to capture the movement in currency exchange rates.

One of the significant contributions of this paper is our ability to propose an artificial neural network model that outperformed the Hidden Markov Model. Also, we have been able to substantiate weaknesses of the existing models. The remainder of this paper is organized as follows. Section 2 discusses related works and section 3 discusses the proposed method. Section 4 gives an overview of the datasets that we have used, while section 5 gives the experiments that we have performed and descriptions of the results. Section 6 gives the conclusion and future research directions of the work.

II. REVIEW OF RELATED WORKS

Different methods are used in foreign exchange rate prediction. These methods are distinguishable from each other by what they hold to be constant into the future. For example, according to [3], hidden Markov models are too unstable to be taken in as a trading tool on foreign exchange data, with too many factors influencing the results. The HMMs attempt to generate or predict an output signal given a model [4], and according to [3] they do not improve results as one might have expected. [17] investigated the stability and robustness of alternative novel neural network architectures when applied to the task of forecasting and trading the Euro/Dollar (EUR/USD) exchange rate using the European Central Bank (ECB) fixing series with only autoregressive terms as inputs. Also, according to [6], artificial neural networks (ANNs) are mathematical models simulating the learning and decision making processes of the human brain. The foreign exchange market, unlike the stock market, is an over-the-counter market built by a number of different banks. The participants of the foreign exchange market can be roughly divided into the central bank, commercial banks, non-bank financial entities, commercial companies and retail traders [3]. The central bank has a significant influence on the foreign exchange markets by virtue of its role in controlling the country's money supply, inflation, and/or interest rates. ANNs were originally developed to model human brain function. ANNs are parameterized graphical models consisting of networks with three prime architectures: recurrent, feed-forward and layered [9]. [1] conducted a survey on the use of neural networks in business applications that contains a list of works covering bankruptcy prediction. [21] focuses on portfolio optimization and short term equity forecasting. [15] mentioned the varying degree to which ANNs have the capability to forecast financial markets. [7] proposed a novel flexible model called neuron coefficient smooth transition auto regression (NCSTAR), an ANN to test for and model the nonlinearities in monthly exchange rates. Traditionally, statistical models such as Box-Jenkins models dominate time series forecasting [11]. [29] suggested that the relationship between neural networks and conventional statistical approaches for time series forecasting is complementary. [24] also indicated that traditional statistical techniques for forecasting have reached their limitation in applications with nonlinearities in the data set, such as stock indices. Neural network technology has seen many application areas in business, especially when the problem domain involves classification, recognition and prediction. According to a survey conducted by [8], more than 127 neural network business applications had been published in international journals up to September 1994. The number rose to 213 after a year. [10] said that multilayer feed-forward networks are one of the most important and most popular

classes of ANNs in real world applications. According to him, a multilayer perceptron has three distinctive characteristics:

- The model of each neuron in the network usually includes a non-linear activation function, sigmoid or hyperbolic.
- The network contains one or more layers of hidden neurons that are not part of the input or output of the network, enabling it to learn complex and highly nonlinear tasks by extracting progressively more meaningful features from the input patterns.
- The network exhibits a high degree of connectivity from one layer to the next.

III. METHODOLOGY

The proposed forecasting of foreign exchange rates used AFERFM, with consideration of the existing HFERFM.

A. Hidden Markov Model

The Hidden Markov Model (HMM) is a variant of a finite state machine having a set of hidden states Q, an output alphabet (observations) O, transition probabilities A, output (emission) probabilities B, and initial state probabilities Π. The current state is not observable; instead, each state produces an output with a certain probability (B). Usually the states Q and outputs O are understood, so an HMM is said to be a triple (A, B, Π).

Hidden states Q = {q_i; i = 1, ..., N}.

Transition probabilities A = {a_ij = P(q_j at t+1 | q_i at t)}, where P(a | b) is the conditional probability of a given b, t = 1, ..., T is time, and q_i is in Q. Informally, A is the probability that the next state is q_j given that the current state is q_i.

Observations (symbols) O = {o_k}, k = 1, ..., M.

Emission probabilities B = {b_ik = b_i(o_k) = P(o_k | q_i)}, where o_k is in O. Informally, B is the probability that the output is o_k given that the current state is q_i.

Initial state probabilities Π = {p_i = P(q_i at t = 1)}.

The model is characterized by the complete set of parameters: Λ = {A, B, Π}.

There are 3 canonical problems to solve with HMMs:

- Given the model parameters, compute the probability of a particular output sequence. This problem is solved by the Forward and Backward algorithms.
- Given the model parameters, find the most likely sequence of (hidden) states which could have generated a given output sequence. This is solved by the Viterbi algorithm and posterior decoding.
- Given an output sequence, find the most likely set of state transition and output probabilities. This is solved by the Baum-Welch algorithm.

In this work, the Baum-Welch algorithm was used.

B. Theoretical Aspect of the Baum-Welch Algorithm

Let us define ξ_t(i, j), the joint probability of being in state q_i at time t and state q_j at time t+1, given the model and the observed sequence:

    ξ_t(i, j) = P(q(t) = q_i, q(t+1) = q_j | Λ, O)                              (1)

Therefore we get

    ξ_t(i, j) = α_t(i) a_ij b_j(o(t+1)) β_{t+1}(j) / P(O | Λ)                   (2)

The probability of the output sequence can be expressed as

    P(O | Λ) = Σ_{i=1..N} Σ_{j=1..N} α_t(i) a_ij b_j(o(t+1)) β_{t+1}(j)
             = Σ_{i=1..N} α_t(i) β_t(i)                                         (3)

The probability of being in state q_i at time t is:

    γ_t(i) = α_t(i) β_t(i) / P(O | Λ) = Σ_{j=1..N} ξ_t(i, j)                    (4)

Estimate of the initial probabilities:

    p̂_i = γ_1(i)                                                               (5)

Estimate of the transition probabilities:

    â_ij = Σ_{t=1..T-1} ξ_t(i, j) / Σ_{t=1..T-1} γ_t(i)                         (6)
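The forward-backward quantities and the re-estimation formulas of Eqs. (1) to (7) can be sketched compactly in code. The following is an illustrative toy implementation, not the authors' code; the function names and the small model used in testing are our own, and no scaling is applied, so it only suits short sequences.

```python
# Illustrative sketch (assumption: not the paper's implementation) of one
# Baum-Welch re-estimation pass for a discrete HMM Lambda = (A, B, pi).

def forward(A, B, pi, obs):
    """alpha[t][i] = P(o_1..o_t, q_t = i | Lambda)."""
    N, T = len(A), len(obs)
    alpha = [[0.0] * N for _ in range(T)]
    for i in range(N):
        alpha[0][i] = pi[i] * B[i][obs[0]]
    for t in range(1, T):
        for j in range(N):
            alpha[t][j] = sum(alpha[t-1][i] * A[i][j] for i in range(N)) * B[j][obs[t]]
    return alpha

def backward(A, B, obs):
    """beta[t][i] = P(o_{t+1}..o_T | q_t = i, Lambda)."""
    N, T = len(A), len(obs)
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(N):
            beta[t][i] = sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j] for j in range(N))
    return beta

def baum_welch_step(A, B, pi, obs):
    """One re-estimation pass: returns updated (A, B, pi) and P(O | Lambda)."""
    N, T = len(A), len(obs)
    alpha, beta = forward(A, B, pi, obs), backward(A, B, obs)
    p_obs = sum(alpha[T-1][i] for i in range(N))        # P(O | Lambda), Eq. (3)
    # xi[t][i][j] per Eq. (2); gamma[t][i] per Eq. (4)
    xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t+1]] * beta[t+1][j] / p_obs
            for j in range(N)] for i in range(N)] for t in range(T - 1)]
    gamma = [[alpha[t][i] * beta[t][i] / p_obs for i in range(N)] for t in range(T)]
    new_pi = gamma[0][:]                                # Eq. (5)
    new_A = [[sum(xi[t][i][j] for t in range(T - 1)) /
              sum(gamma[t][i] for t in range(T - 1))
              for j in range(N)] for i in range(N)]     # Eq. (6)
    M = len(B[0])
    new_B = [[sum(gamma[t][j] for t in range(T) if obs[t] == k) /
              sum(gamma[t][j] for t in range(T))
              for k in range(M)] for j in range(N)]     # Eq. (7)
    return new_A, new_B, new_pi, p_obs
```

Iterating `baum_welch_step` until P(O | Λ) stops improving yields the training loop shown in fig. 1.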

Estimate of the emission probabilities:

    b̂_j(k) = Σ_{t: o(t) = k} γ_t(j) / Σ_{t=1..T} γ_t(j)                        (7)

Baum-Welch Algorithm

The Baum-Welch algorithm will learn the parameters from the data and implicitly discovers the motif. We use the Viterbi algorithm to determine the motif for the states of each input data. The algorithm is depicted in fig. 1. While this method may require significantly more effort due to the amount of data needed, it may result in a more accurate projection if the data is accurate, hence the use of AFERFM.

    Input: a set of observed sequences O^1, O^2, O^3, O^4, ...
    Initialization: select arbitrary model parameters
        λ' = (a_ij, e_i(.)) ; score' = Σ_d P(O^d | λ')
    repeat
    {
        λ = λ' ; score = score'
        for each sequence O^d
        {
            calculate α(t, i) for O^d using the forward algorithm
            calculate β(t, i) for O^d using the backward algorithm
            calculate the contribution of O^d to A using
                A_ij = (1 / P(O^d)) Σ_t α(t, i) a_ij e_j(O^d_{t+1}) β(t+1, j)
            calculate the contribution of O^d to E using
                E_i(σ) = Σ_d (1 / P(O^d)) Σ_{t: O^d_t = σ} α(t, i) β(t, i)
        }
        a_ij = A_ij / Σ_j A_ij ; e_i(σ) = E_i(σ) / Σ_r E_i(r)
        score' = Σ_d P(O^d | a_ij, e_i(.))
    }
    until the change in score is less than some predefined threshold

Figure 1: Baum-Welch Algorithm

IV. BACK PROPAGATION ALGORITHM OF ARTIFICIAL NEURAL NETWORK

Training basically involves feeding training samples as input vectors through a neural network, calculating the error of the output layer, and then adjusting the weights of the network to minimize the error. Each "training epoch" involves one exposure of the network to a training sample from the training set, and adjustment of each of the weights of the network layer by layer.

The back propagation algorithm has emerged as one of the most used learning procedures for multilayer networks, as shown in fig. 2. It has been shown to have great potential for financial forecasting. The process of determining the magnitude of the weight factors that result in accurate output is called training. Several methods are available to accomplish this, but the back propagation algorithm is the most commonly used. The method according to [14] is based on determining the error between the predicted output variables and the known values of the training data set. The error parameter is commonly defined as the root mean square of the errors, and all the processes take the form of determining the partial derivatives of the errors with respect to each of the weights. The algorithm used to propagate the error correction back into the network according to [14] is generally of the form:

    w_ij(new) = w_ij(old) - η ∂E/∂w_ij                                          (8)

where E is the error parameter and η is the proportional factor called the learning rate.

The process of adjusting weights is continued until the error is less than some desired limit, after which the network is considered trained. Once the network is trained, it can receive new input data that were not used for training and apply the weight factors obtained during training [23]. Input vectors are applied to the network, and the gradients calculated at each training sample are added to determine the change in weights and biases. The traingdx function is used for the training of the network. The network is created using newff, which creates a feed-forward back-propagation network. The connection of the input to the hidden layer, and then of the first hidden layer to the output layer, is automatically achieved when the newff function is called. As each layer has its own transfer function, newff provides a means of specifying the transfer function of the layers in its syntax. Selection of training samples from the training set involved going through each training sample in order. The network performance was assessed using "outside" samples which make up the "validation" set. Testing was done using samples outside the training set, which enabled us to confirm that the network is also capable of classifying.
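As a concrete illustration of the update rule in Eq. (8) and the delta computations of fig. 2, the following sketch performs one stochastic training step for a single-hidden-layer sigmoid network. It is a minimal hypothetical example, not the paper's MATLAB newff/traingdx implementation; the function names are our own, and the default learning rate follows the paper's 0.10.

```python
# Illustrative sketch (assumption: not the authors' code) of one back
# propagation step: forward pass, output/hidden deltas, weight update.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(W_h, W_o, x, t, eta=0.10):
    """One gradient step. W_h: hidden weights [nh][ni]; W_o: output
    weights [no][nh]; x: input vector; t: target vector."""
    # forward pass through hidden and output layers
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W_h]
    o = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W_o]
    # output deltas: delta_k = o_k (1 - o_k)(t_k - o_k)
    d_o = [ok * (1 - ok) * (tk - ok) for ok, tk in zip(o, t)]
    # hidden deltas: delta_h = h (1 - h) * sum_k w_kh delta_k
    d_h = [h[j] * (1 - h[j]) * sum(W_o[k][j] * d_o[k] for k in range(len(W_o)))
           for j in range(len(h))]
    # weight updates: w <- w + eta * delta * input (equivalent to Eq. (8))
    for k in range(len(W_o)):
        for j in range(len(h)):
            W_o[k][j] += eta * d_o[k] * h[j]
    for j in range(len(W_h)):
        for i in range(len(x)):
            W_h[j][i] += eta * d_h[j] * x[i]
    # squared error of this sample before the update
    return sum((tk - ok) ** 2 for tk, ok in zip(t, o))
```

Repeating `train_step` over the training set, with weights initialized in the paper's range [-0.1, 0.1], drives the error down until the termination condition of fig. 2 is met.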

V. DATA COLLECTION

A currency exchange rate usually consists of two numbers, the bid and the ask price. For simplicity, the exchange rate was considered as a single number. The data used consist of daily averages and were downloaded from the http://www.oanda.com website. The data set amounts to approximately 800 daily price quotes from the years 2003-2005 for each currency, of which 500 daily data were used for training, 200 for validation and 100 for testing, as shown in table 1. The first 500 daily data are used as inputs to the neural network and are fed to the neural networks to predict the following 100 days' rates after training and validation. The Sigmoid Activation Function (SAF) was used to transform the input into a standard range [0, 1]. The equation and the graph are depicted in fig. 3.

TABLE I. DISTRIBUTION OF SAMPLE DATA

    Types of data sets    Size of data sets
    Train data            500
    Validation data       200
    Test data             100
    Total                 800

    1. Create a feed-forward network with ni inputs, nh hidden units, and no output units.
    2. Initialize all weights to small random values (e.g. between -0.1 and 0.1).
    3. Until the termination condition is met, do
    4.     For each training sample (x, t) do
    5.         Compute the output o_u for every unit of x
    6.         For each output unit k, calculate δ_k = o_k(1 - o_k)(t_k - o_k)
    7.         For each hidden unit h, calculate δ_h = o_h(1 - o_h) Σ_{k ∈ downstream(h)} w_kh δ_k
    8.         Update each network weight w_ji as follows: w_ji ← w_ji + Δw_ji, where Δw_ji = η δ_j x_ji

Figure 2: Back propagation algorithm

VI. IMPLEMENTATION AND RESULTS

The back propagation algorithm for AFERFM described in figure 2 and the Baum-Welch algorithm in figure 1 were developed in Visual Studio and MATLAB 7.6. Each training sample is of the form (x, t), where x is the input vector, t is the target vector and η is the learning rate. The parameters ni, nh and no are the numbers of input, hidden and output nodes respectively. The input i to unit j is denoted x_ij and its weight by w_ij. Data collected from the Oanda website served as input to both models. The results of the training of both models in Nigerian currency (Naira) as against US dollars, European currency (EURO), Great Britain Pound (GBP) and Japanese currency (Yen) are illustrated in tables 2 to 5. Looking at tables 2 to 5, the values of both AFERFM and HFERFM for US dollar, EURO, GBP and Japanese Yen are very close as against the Nigerian currency (Naira). From the results of tables 2 to 5, the AFERFM and HFERFM forecast accuracies were computed together with the mean square error (MSE) and standard deviation (SDEV) as follows:

    AFERFM accuracy forecast = (Σ(pv) - Σ(av)) / Σ(av) × 100                    (9)

    HFERFM accuracy forecast = (Σ(pv) - Σ(av)) / Σ(av) × 100                    (10)

where the standard deviation is

    σ = sqrt(E[(X - μ)²])

and the mean square error is

    MSE = Σ(FE)² / (N - 1)

and FE is the forecast error, calculated as the difference between the actual value and the predicted value. The task is to have a minimal value of MSE. The results of MSE, SDEV and percentage accuracy of both models for one day, one year, five years and 10 years are depicted in tables 6 to 9. Looking at tables 6 to 9, the MSE and SDEV of HFERFM for US dollar, EURO, GBP and Yen as against the Naira are close to 1 compared with the values of AFERFM. This indicates the weakness of HFERFM values for prediction. For all the tests 1 to 4, the percentage accuracies of AFERFM are higher than those of HFERFM. This is evidence that AFERFM is better than HFERFM for prediction.

Figure 3: Sigmoid Activation Function (SAF)
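The evaluation quantities of Eqs. (9) and (10) can be sketched as follows. This is an assumed Python rendering (the paper's implementation was in MATLAB and Visual Studio); the function names are our own, the accuracy formula is implemented exactly as written in Eqs. (9)-(10), and the sample numbers in the test are illustrative values drawn from table 2.

```python
# Illustrative sketch (assumption: not the authors' code) of the evaluation
# metrics: forecast accuracy per Eqs. (9)-(10), MSE, and SDEV of the
# forecast errors FE = actual - predicted.
import math

def accuracy_forecast(pv, av):
    """(sum(pv) - sum(av)) / sum(av) * 100, in absolute value, as in
    Eqs. (9)-(10); pv = predicted values, av = actual values."""
    return abs(sum(pv) - sum(av)) / sum(av) * 100

def mse(pv, av):
    """Mean square error of the forecast errors, Sigma(FE)^2 / (N - 1)."""
    fe = [a - p for a, p in zip(av, pv)]
    return sum(e * e for e in fe) / (len(fe) - 1)

def sdev(pv, av):
    """Standard deviation of the forecast errors, sqrt(E[(X - mu)^2])."""
    fe = [a - p for a, p in zip(av, pv)]
    mu = sum(fe) / len(fe)
    return math.sqrt(sum((e - mu) ** 2 for e in fe) / len(fe))
```

A model with forecasts close to the actual series yields a small MSE and SDEV, which is the behavior the paper reports for AFERFM relative to HFERFM.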

The forecasting of foreign exchange rates into the distant future using both models is illustrated in figs. 4 to 7. Looking at figs. 4, 5 and 6, except fig. 7, they produced various magnitudes of foreign exchange rate forecasts, each of which could be compared with the actual values. They also showed the highest magnitude levels of uncertainty and the non-stationary nature of the currencies for AFERFM and HFERFM. As indicated in the figures, the magnitude level of AFERFM is certain and stationary compared with that of HFERFM.

VII. CONCLUSION AND FUTURE WORK

From the various tests performed on the results of the training and validation, it was confirmed that AFERFM performs better in estimating foreign exchange rates. The percentage accuracies are good evidence of the fact that, given enough data at its disposal, the AFERFM can ensure foreign exchange rate forecasting. It can also be observed that when the performance of AFERFM is compared with that of HFERFM for foreign exchange rate projection, the former performs better than the latter. The best known related work, the Hidden Markov Model HFERFM, showed an accuracy of 69.9%. The evaluation results showed that AFERFM had an accuracy of 81.2%. This shows that the new approach provides an improved technique for carrying out foreign exchange rate forecasting. The scope of the work has been the Nigerian foreign exchange rates as against four currencies; an improvement can be made by extending the work to other countries' foreign exchange rates. The research work can also be improved by developing an analytical model using an artificial neural network with another model or any suitable analytical method.

TABLE II. NAIRA AS AGAINST USA DOLLARS

    S/n   Actual   AFERFM   HFERFM
    1     129.63   129.61   130.06
    2     129.63   129.61   130.06
    3     133.17   133.20   132.74
    4     133.62   133.65   133.07
    5     133.89   133.92   133.28
    ...
    65    129.63   129.61   130.06
    66    129.63   129.61   130.06
    67    134.04   134.11   133.45
    68    133.83   133.86   133.23
    69    129.63   129.61   130.06
    70    133.83   133.86   133.23
    ...
    97    129.63   129.61   130.06
    98    129.63   129.61   130.06
    99    134.90   134.93   134.25
    100   134.93   134.96   134.29

TABLE III. NAIRA AS AGAINST EURO

    S/n   Actual   AFERFM   HFERFM
    1     154.37   154.38   154.37
    2     152.74   152.76   152.74
    3     152.65   152.67   152.65
    4     151.90   151.92   151.90
    5     156.51   156.53   156.54
    ...
    65    156.81   156.81   156.81
    66    155.64   155.64   155.64
    67    158.51   158.49   158.51
    68    151.68   151.73   151.69
    69    151.68   151.73   151.68
    70    151.70   151.73   151.70
    ...
    97    151.86   151.89   151.86
    98    156.53   156.53   156.53
    99    157.35   157.34   157.35
    100   158.14   158.11   158.14

TABLE IV. NAIRA AS AGAINST GBP

    S/n   Actual   AFERFM   HFERFM
    1     206.78   206.76   206.78
    2     206.78   206.76   206.78
    3     213.15   206.75   206.78
    4     213.21   212.33   212.43
    5     213.58   213.13   213.15
    ...
    65    206.78   205.21   213.21
    66    206.78   205.58   213.21
    67    206.78   205.78   213.58
    68    213.85   206.78   206.78
    69    213.48   206.68   206.78
    70    214.23   206.68   206.78
    ...
    97    214.54   206.69   206.74
    98    206.78   213.46   213.49
    99    206.78   214.13   214.25
    100   206.78   206.76   206.78

TABLE V. NAIRA AS AGAINST JAPANESE YEN

    S/n   Actual   AFERFM   HFERFM
    1     1.099    1.099    1.099
    2     1.086    1.086    1.086
    3     1.086    1.086    1.086
    4     1.084    1.083    1.084
    5     1.123    1.123    1.123
    ...
    65    1.124    1.125    1.124
    66    1.121    1.121    1.121
    67    1.137    1.137    1.137
    68    1.092    1.092    1.092
    69    1.092    1.092    1.092
    70    1.092    1.091    1.091
    ...
    97    1.092    1.096    1.096
    98    1.138    1.138    1.138
    99    1.136    1.136    1.136
    100   1.141    1.141    1.141

TABLE VI. PERCENTAGE ACCURACY OF THE AFERFM AND HFERFM FOR USA DOLLARS AGAINST NIGERIAN CURRENCY (NAIRA)

    M   T   Days   ME     S      %
    H   1   1      0.99   0.99   68
    A   1   1      0.25   0.26   80
    H   2   365    0.97   0.97   69
    A   2   365    0.25   0.24   81
    H   3   1825   0.98   0.98   68
    A   3   1825   0.25   0.25   80
    H   4   3650   0.99   0.99   67
    A   4   3650   0.25   0.25   79

    where, in tables VI to IX, H = HFERFM, A = AFERFM, T = Test, ME = MSE, S = SDEV, M = Model, % = percentage accuracy

TABLE VII. PERCENTAGE ACCURACY OF THE AFERFM AND HFERFM FOR EURO AGAINST NIGERIAN CURRENCY (NAIRA)

    M   T   Days   ME     S      %
    H   1   1      0.99   0.99   67
    A   1   1      0.25   0.25   80
    H   2   365    0.99   0.99   68
    A   2   365    0.25   0.25   80
    H   3   1825   0.99   0.99   68
    A   3   1825   0.25   0.25   80
    H   4   3650   0.99   0.99   65
    A   4   3650   0.25   0.25   79

TABLE VIII. PERCENTAGE ACCURACY OF THE AFERFM AND HFERFM FOR GREAT BRITAIN POUNDS AGAINST NIGERIAN CURRENCY (NAIRA)

    M   T   Days   ME     S      %
    H   1   1      0.99   0.99   68
    A   1   1      0.25   0.26   80
    H   2   365    0.99   0.99   68
    A   2   365    0.25   0.26   80
    H   3   1825   0.99   0.99   68
    A   3   1825   0.25   0.26   80
    H   4   3650   0.99   0.99   66
    A   4   3650   0.25   0.26   80

TABLE IX. PERCENTAGE ACCURACY OF THE AFERFM AND HFERFM FOR JAPANESE YEN AGAINST NIGERIAN CURRENCY (NAIRA)

    M   T   Days   ME     S      %
    H   1   1      0.99   0.99   66
    A   1   1      0.25   0.26   80
    H   2   365    0.99   0.99   68
    A   2   365    0.25   0.26   80
    H   3   1825   0.99   0.99   66
    A   3   1825   0.25   0.26   80
    H   4   3650   0.99   0.99   66
    A   4   3650   0.25   0.26   80

Figure 4: Forecasting Exchange Rate for USA dollars against Nigerian Currency (Naira)


Figure 7: Forecasting Exchange Rate for Japanese Yen against Nigeria


Figure 5: Forecasting Exchange Rate for EURO against Nigeria Currency (Naira)
Currency (Naira)

REFERENCES
[1] A. Atiya and S. Yaser, Introduction to Financial Forecasting, Journal of
Applied Intelligence, (1996) Vol. 6, pp. 205-213
[2] M. Adya and F. Collopy, How Effective are Neural Networks at
Forecasting and Prediction?, A Review and Evaluation, Journal of
Forecasting, (1998), vol. 17 pp. 487-495
[3] P. Idval and C. Johnson, University Essay from Linkopings Universitet,
Matematiska Institutionen, Linkopings Universitet, 2008
[4] Y. Lu and P. Perron, Modeling and Forecasting Stock Return Volatility using a Random Level Shift Model, Journal of Empirical Finance, (2010), Vol. 17(1), pp. 138-156, Elsevier
[5] M. Fliess and C. Join, Time Series Technical Analysis via new fast
Estimation Methods: A Preliminary Study in Mathematical Finance, 23rd
IAR Workshop on Advanced Control and Diagnosis, Coventry, United
Kingdom, 2008
[6] K. Jason, Forecasting Financial Markets using Neural Networks: An
Analysis of Methods and Accuracy, Naval Postgraduate School,
Monterey, California, USA, 1998
[7] H. Rose and D. Dannie, Computational Genomics and Molecular Biology, Carnegie Mellon University, Pittsburgh, 2006
[8] E.M. Azoff, Neural network time series forecasting of financial markets.
Chichester; New York, Wiley, 1994
[9] P. Baldi and S. Brunak, Bioinformatics: The Machine Learning
Approach, The MIT Press, Cambridge, 2001
[10] M. C. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, Oxford, United Kingdom, 1995

Figure 6: Forecasting Exchange Rate for Great Britain Pound against Nigerian Currency (Naira)
[11] G.E.P Box and G. Jenkins, Developed a two-Coefficient Time Series
Model of Factored Form,
www.census.gov/ts/papers/findleymartinwills2002.pdf, 1996
[12] B. Vanstone and C. Tan, Artificial Neural Networks in Financial
Trading, Intelligent Information System, M. Khosrow Edition, IGI
Publishing, 2005
[13] B. Vanstone and C. Tan, Evaluating the Application of Neural Networks
and Fundamental Analysis in the Australian Stock Market, In Proceedings of the IASTED, Calgary, Alberta, Canada, 2005
[14] M. Caudill and C. Butler, Understanding Neural Networks:
Massachusetts MIT Press, (1992) Vol. 1 pg 354, Cambridge

[15] C. Jacobs, Average Speed Prediction Using Artificial Neural Networks, M.Sc. project submitted to the Faculty of Information Technology and Systems, Delft University of Technology, 2003
[16] G. Deboeck, Trading on the Edge: Neural, Genetic and Fuzzy Systems for Chaotic Financial Markets, New York, Wiley, 1994
[17] C. Dunis, J. Laws and G. Sermpinis, Higher Order and Recurrent Neural Architectures for Trading the EUR/USD Exchange Rate, Journal of Quantitative Finance, CIBEF Working Paper at http://www.cbef.com, 2010
[18] S. Eriksson and C. Roding, Algorithmic Trading Uncovered: Impacts on an Electronic Exchange of Increasing Automation in Futures Trading, Royal Institute of Technology, Stockholm, 2007
[19] E. Gately, Neural Networks for Financial Forecasting, John Wiley and Sons, Inc., 1996
[20] L. Harris, Trading and Exchanges: Market Microstructure for Practitioners, Oxford University Press, 2002
[21] K. Holmstrom and T. Hellstrom, Predicting the Stock Market, Technical Report Series IMa-TOM-1997-07, Malardalen University, Vasteras, Sweden, 1997
[22] P.H. Halmari, Using Neural Networks to Forecast Automobile Sales, in W.G. Voge and M.H. Mickle (eds.), The Twenty-Third Annual Pittsburgh Conference on Modelling and Simulation, University of Pittsburgh, U.S.A., 1992
[23] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, Englewood Cliffs, Second Edition, 1998
[24] J. Yao, Y. Li and C. L. Tan, Option Price Forecasting using Neural Networks, International Journal of Management Science (Omega), (2000), Vol. 28, pp. 455-466
[25] R. R. Lawrence, A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition, Proceedings of the IEEE, 77(2), (1989), pp. 257-286
[26] C. Man-Chung and W. Chi-Chung, Financial Time Series Forecasting by Neural Network Using Conjugate Gradient Learning Algorithm and Multiple Linear Regression Weight Initialization, Computing in Economics and Finance 61, Society for Computational Economics, 2000
[27] C. M. Marcelo and C.E. Pedreira, "What are the effects of forecasting linear time series with neural networks," Textos para discussão 446, ideas.repec.org/e/ppe25.html, 2001
[28] P. J. Werbos, Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences, Ph.D. dissertation, Committee on Applied Mathematics, Harvard University, 1994
[29] H. White, Economic prediction using neural networks: The case of IBM daily stock returns, IEEE International Conference on Neural Networks, (1988), pp. 451-459, San Diego
