
Time Series Forecasting using Recurrent Neural Networks (RNN) in TensorFlow

Last Updated : 09 Apr, 2025

Time series data, such as stock prices, is a sequence of observations that exhibits patterns such as trends and seasonality. Each data point is linked to a timestamp that records exactly when the value was observed. Many fields, including finance, economics, weather forecasting and machine learning, work with this type of data. Because the observations are ordered in time, Recurrent Neural Networks (RNNs), which are designed for sequential data, are a natural choice for prediction.

To demonstrate this, we will use stock price data, one of the most common types of time series data.
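
Before loading real data, here is a tiny, self-contained illustration (with made-up prices, not part of the main pipeline) of what a time series looks like: values indexed by timestamps.

Python
import pandas as pd

# Three made-up closing prices, each tied to a trading-day timestamp
prices = pd.Series(
    [101.2, 102.8, 101.9],
    index=pd.to_datetime(["2024-01-02", "2024-01-03", "2024-01-04"]),
    name="Close",
)
print(prices)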

1. Importing Required Libraries

We will import libraries such as NumPy, pandas, Matplotlib, yfinance, scikit-learn and TensorFlow.

Python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import yfinance as yf
from sklearn.metrics import mean_squared_error, mean_absolute_error
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, SimpleRNN

2. Fetching Data from Yahoo Finance

We fetch historical stock data from Yahoo Finance using the yfinance library.

  • yf.download(ticker, start, end): Downloads daily price data for the chosen ticker and date range.
  • We extract the 'Close' price from the dataset which is commonly used for forecasting in stock prediction models.
  • values.reshape(-1, 1): Reshapes the prices into a 2D array (required for the scaler and the RNN input); a quick shape check follows the code below.
Python
ticker = 'AAPL'  
data = yf.download(ticker, start="2020-01-01", end="2025-01-01")
data = data['Close'].values.reshape(-1, 1)
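
As an optional sanity check (not part of the original steps), you can print the array's shape to confirm it is 2D with a single column, as the reshape above intends.

Python
# Expect a NumPy array of shape (number_of_trading_days, 1)
print(type(data), data.shape)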

3. Normalize the Data

Normalizing the data to values between 0 and 1 prevents large values from dominating smaller ones. This makes training more efficient and improves model stability and convergence, especially for deep learning models.

  • MinMaxScaler(feature_range=(0, 1)): Normalizes the stock price data to a range between 0 and 1; a small worked example follows the code below.
  • scaler.fit_transform(data): Fits the scaler to the dataset and scales all values accordingly.
Python
scaler = MinMaxScaler(feature_range=(0, 1))
scaled_data = scaler.fit_transform(data)
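
To see what the scaler actually does, here is a small worked example on a made-up array (separate from the stock data): MinMaxScaler applies x_scaled = (x - x_min) / (x_max - x_min), so the smallest value maps to 0 and the largest to 1.

Python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Tiny made-up array so the mapping is easy to verify by hand
toy = np.array([[10.0], [15.0], [20.0]])
toy_scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(toy)
print(toy_scaled.ravel())  # [0.  0.5 1. ]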

4. Prepare the Data for Time Series

We will write a helper function that turns the scaled series into input windows and targets for training the RNN model; a shape check follows the code below.

  • create_dataset is a function that generates the input data (X) and target data (y). It slides a window of length time_step (60) over the stock prices and stores the next price as the target.
  • The X array is reshaped into a 3D array as required by the SimpleRNN layer: [samples, time steps, features].
Python
def create_dataset(data, time_step=60):
    X, y = [], []
    for i in range(len(data) - time_step - 1):
        X.append(data[i:(i + time_step), 0])
        y.append(data[i + time_step, 0])
    return np.array(X), np.array(y)

X, y = create_dataset(scaled_data)
X = X.reshape(X.shape[0], X.shape[1], 1)
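
A quick, optional shape check confirms that the windowing produced the 3D layout the SimpleRNN layer expects.

Python
# X: (samples, time_steps, features), y: (samples,)
print("X shape:", X.shape)   # e.g. (n_samples, 60, 1)
print("y shape:", y.shape)   # e.g. (n_samples,)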

5. Split the Data into Training and Testing Sets

We will split the data into training and testing sets in a ratio of 80% training to 20% testing.

Python
train_size = int(len(X) * 0.8)
X_train, X_test = X[:train_size], X[train_size:]
y_train, y_test = y[:train_size], y[train_size:]
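
Note that the split is chronological rather than random: the earliest 80% of windows are used for training and the most recent 20% for testing, so the model is never trained on data that comes after the test period. A quick check of the resulting sizes:

Python
print("Training samples:", X_train.shape[0])
print("Testing samples:", X_test.shape[0])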

6. Build the RNN Model

The model we are using here is a Recurrent Neural Network (RNN). It is well suited to sequential data modeling tasks such as time series forecasting.

  • The model is built using SimpleRNN layers.
  • units=50 defines the number of units (neurons) in each RNN layer.
  • return_sequences=True in the first RNN layer ensures that the output is fed to the next RNN layer.
  • Dense(units=1) is the output layer, predicting a single value (the next stock price).
  • The model is compiled using the Adam optimizer and Mean Squared Error loss function.
Python
model = Sequential()
model.add(SimpleRNN(units=50, return_sequences=True, input_shape=(X_train.shape[1], 1)))
model.add(SimpleRNN(units=50, return_sequences=False))
model.add(Dense(units=1))
model.compile(optimizer='adam', loss='mean_squared_error')
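
Once the model is compiled, you can optionally inspect its architecture and parameter counts with model.summary().

Python
# Prints the two SimpleRNN layers, the Dense output layer and their parameter counts
model.summary()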

7. Training the Model

The model is trained on historical data and used for making predictions.

  • model.fit(X_train, y_train, epochs=20, batch_size=64): Trains the model for 20 epochs using batches of 64 samples at a time.
  • model.predict(X_test): Generates predictions for the test data.
  • scaler.inverse_transform(predictions): Converts the predicted values back to the original scale since they were normalized earlier using MinMaxScaler.
Python
model.fit(X_train, y_train, epochs=20, batch_size=64)

predictions = model.predict(X_test)
predictions = scaler.inverse_transform(predictions)
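
As an optional variation (not part of the original steps), you can hold out a slice of the training windows as a validation set while fitting, which makes it easier to spot overfitting; Keras uses the last fraction of the provided samples for validation.

Python
history = model.fit(
    X_train, y_train,
    epochs=20,
    batch_size=64,
    validation_split=0.1,  # last 10% of the training windows used for validation
)
print(history.history.keys())  # includes 'loss' and 'val_loss'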

8. Evaluating the Model

The model’s performance is evaluated using standard error metrics. These metrics provide insights into how well the model's predictions align with actual values.

  • scaler.inverse_transform(y_test.reshape(-1, 1)): Converts the test targets back to the original price scale so they can be compared with the inverse-transformed predictions.
  • mean_squared_error(y_test_actual, predictions): Computes the Mean Squared Error (MSE) which measures the average squared difference between actual and predicted values.
  • np.sqrt(mse): Computes the Root Mean Squared Error (RMSE) which expresses the error in the same units as the stock price.
  • mean_absolute_error(y_test_actual, predictions): Computes the Mean Absolute Error (MAE) which measures the average absolute difference between actual and predicted values.
Python
# Convert the test targets back to the original price scale, since the
# predictions were already inverse-transformed above
y_test_actual = scaler.inverse_transform(y_test.reshape(-1, 1))

mse = mean_squared_error(y_test_actual, predictions)
rmse = np.sqrt(mse)
mae = mean_absolute_error(y_test_actual, predictions)

print(f"Mean Squared Error (MSE): {mse}")
print(f"Root Mean Squared Error (RMSE): {rmse}")
print(f"Mean Absolute Error (MAE): {mae}")

The exact error values vary from run to run because the network weights are initialized randomly. Since the test targets and the predictions are both expressed in dollars, the RMSE and MAE can be read directly as the typical prediction error in the stock price. We can further improve accuracy by refining the data preprocessing or by using a more advanced architecture such as LSTM or GRU; a sketch of the LSTM variant follows below.
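
As mentioned above, one common improvement is to replace the SimpleRNN layers with LSTM layers. The sketch below shows only the architecture change (a minimal, untuned variant); training and evaluation would proceed exactly as before.

Python
from tensorflow.keras.layers import LSTM

lstm_model = Sequential()
lstm_model.add(LSTM(units=50, return_sequences=True, input_shape=(X_train.shape[1], 1)))
lstm_model.add(LSTM(units=50, return_sequences=False))
lstm_model.add(Dense(units=1))
lstm_model.compile(optimizer='adam', loss='mean_squared_error')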

9. Visualize the Results

This visualization compares the actual stock prices with the predicted stock prices, helping assess the model’s performance visually.

Python
plt.figure(figsize=(10,6))
plt.plot(scaler.inverse_transform(y_test.reshape(-1, 1)), color='blue', label='Real Stock Price')
plt.plot(predictions, color='red', label='Predicted Stock Price')
plt.title(f'{ticker} Stock Price Prediction')
plt.xlabel('Time')
plt.ylabel('Stock Price')
plt.legend()
plt.show()

Output:

[Figure: Model Prediction - actual vs. predicted AAPL stock price]

The graph shows the performance of the RNN model in predicting Apple (AAPL) stock prices. The blue line represents the actual prices from the test data, while the red line shows the predicted values. The RNN captures the underlying trends in the stock price, with the predicted values closely following the actual ones. Although minor deviations are visible, the overall result shows that the model effectively learned the temporal patterns in the stock data.
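
As a final, optional sketch (not part of the steps above), the trained model can produce a one-step-ahead forecast by feeding it the most recent 60 scaled closing prices and mapping the output back to the original price scale.

Python
# Last 60 scaled closes, reshaped to (1, time_steps, 1) as the model expects
last_window = scaled_data[-60:].reshape(1, 60, 1)
next_scaled = model.predict(last_window)                  # shape: (1, 1)
next_price = scaler.inverse_transform(next_scaled)[0, 0]
print(f"Predicted next close for {ticker}: {next_price:.2f}")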

