Like the Elman Neural Network, the Jordan Neural Network is a simple recurrent neural network. It has proven to be a popular tool for applied time series modeling and is often trained alongside the Elman network because the two are very similar.
A Jordan Neural Network has a single hidden layer. The only difference between Elman and Jordan is that the context-layer neurons are fed from the output layer instead of the hidden layer. Thus, the Jordan network "remembers" the output from the previous time step.
Like the Elman network, the Jordan network is useful for predicting time series that exhibit short-term memory.
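To make the feedback structure concrete, a single time step of a Jordan network can be sketched as below. This is only an illustrative forward pass with hypothetical weight matrices and a tanh activation, not the training code used later in this section.

#Illustrative Jordan forward step (hypothetical weights W_in, W_context, W_out):
#the context fed into the hidden layer is the previous *output* y_prev,
#whereas an Elman network would feed back the previous hidden state instead.
jordan_step = function(x_t, y_prev, W_in, W_context, W_out, b_h, b_o) {
  h_t = tanh(W_in %*% x_t + W_context %*% y_prev + b_h)  #hidden state at time t
  y_t = W_out %*% h_t + b_o                              #output, becomes the next context
  list(hidden = h_t, output = y_t)
}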
First, we take the log of the data and then scale it to the range [0, 1]. Then, we use the quantmod package to create 12 time-lagged attributes (12 because the data is monthly).
= read.csv("data.csv")[,3]
data = ts(data, start = c(2001, 1), frequency = 12)
data #
= read.csv("data.csv")[,1]
date = as.Date(date, format = "%d/%m/%Y")
date
#Log-Transformation
y = ts(log(data), start = c(2001, 1), frequency = 12)

#Normalization
range.data = function(x){(x - min(x))/(max(x) - min(x))}
min.data = min(y) #12.57856
max.data = max(y) #14.10671
y = range.data(y)
unscale.data = function(x, xmin, xmax){x*(xmax - xmin) + xmin}
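As a quick check (an addition, not part of the original listing), unscaling and exponentiating the normalized series should reproduce the raw data up to floating-point error:

#Sanity check: unscale.data followed by exp should invert both transforms
y.restored = exp(unscale.data(as.numeric(y), min.data, max.data))
all.equal(y.restored, as.numeric(data)) #expected: TRUE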
#Lag Selection
require(quantmod)
y = as.zoo(y)
x1  = Lag(y, k = 1)
x2  = Lag(y, k = 2)
x3  = Lag(y, k = 3)
x4  = Lag(y, k = 4)
x5  = Lag(y, k = 5)
x6  = Lag(y, k = 6)
x7  = Lag(y, k = 7)
x8  = Lag(y, k = 8)
x9  = Lag(y, k = 9)
x10 = Lag(y, k = 10)
x11 = Lag(y, k = 11)
x12 = Lag(y, k = 12)
x = cbind(x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12)
x = cbind(y, x)
x = x[-(1:12), ] #Missing Value Removal
n = nrow(x)      #236 observations

#Train/Test Split
n.train = 224
train   = 1:(n - 12)
outputs = x$y
inputs  = x[, 2:13]
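Before fitting, it can be worth confirming the shape of the design matrix and the split; the expected counts below assume this particular series (236 usable rows after dropping the first 12). This check is an addition to the original code.

#Quick dimension check on the lagged design matrix and the split
dim(inputs)       #expected: 236 rows, 12 lag columns
length(train)     #expected: 224 training rows
n - length(train) #expected: 12 hold-out rows (the last year)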
To train the network, we use 224 observations. The first 12 rows were removed earlier so that every remaining observation has values at all 12 lags.
The RSNNS package contains the function jordan, which estimates a Jordan Neural Network. We build three models with different numbers of nodes in the hidden layer.

require(RSNNS)
set.seed(2018)
fit1 = jordan(inputs[train], outputs[train], size = 64,  learnFuncParams = c(0.01), maxit = 1000)

set.seed(2018)
fit2 = jordan(inputs[train], outputs[train], size = 106, learnFuncParams = c(0.01), maxit = 1000)

set.seed(2018)
fit3 = jordan(inputs[train], outputs[train], size = 109, learnFuncParams = c(0.01), maxit = 1000)
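Before looking at the plots, the final training error of the three fits can be compared directly; RSNNS model objects store the per-iteration sum of squared errors in the IterativeFitError component. This comparison is an addition to the original listing, and the values will vary with the training run.

#Final training SSE for each network (last entry of the iterative error)
sapply(list(Model.1 = fit1, Model.2 = fit2, Model.3 = fit3),
       function(m) tail(m$IterativeFitError, 1))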
The plotIterativeError function plots the training error over the iterations. The plotRegressionError function helps visualize the relationship between the actual and fitted values.
Note: only the code for the first plot is shown; the other two plots are created with the same code.
par(mfrow=c(1,2))
plotIterativeError(fit1)
plotRegressionError(outputs[1:n.train,], fit1$fitted.values)
par(mfrow=c(1,1))
title("Model 1 Error and Fit")
We use the predict function to make predictions. Because we log-transformed and normalized the series, the predicted values are not in the original units.
To convert them back to the original units, we reverse the scaling with the user-defined unscale.data function and then take the exponential. We do the same with the actual test values.
# Prediction
pred1 = predict(fit1, inputs[-train])
output.pred1 = exp(unscale.data(pred1, min.data, max.data))

pred2 = predict(fit2, inputs[-train])
output.pred2 = exp(unscale.data(pred2, min.data, max.data))

pred3 = predict(fit3, inputs[-train])
output.pred3 = exp(unscale.data(pred3, min.data, max.data))

# Actual Data
output.actual = exp(unscale.data(outputs[225:236], min.data, max.data))
output.actual = as.matrix(output.actual)
pred.dates    = rownames(output.actual) #Prediction Dates
#Collect actuals and forecasts in a data frame so columns can be accessed with $
result = data.frame(Actual  = as.numeric(output.actual),
                    Model.1 = as.numeric(output.pred1),
                    Model.2 = as.numeric(output.pred2),
                    Model.3 = as.numeric(output.pred3),
                    row.names = pred.dates)
library(Metrics)
round(c( mape(result$Actual, result$Model.1),
rmse(result$Actual, result$Model.1),
mape(result$Actual, result$Model.2),
rmse(result$Actual, result$Model.2),
mape(result$Actual, result$Model.3),
rmse(result$Actual, result$Model.3) ),5)
## [1] 0.07592 74657.97582 0.09367 93776.58728 0.05725 71608.57912
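A quick visual comparison of the three forecasts against the actual hold-out values can be produced with matplot; this plotting step is an addition and assumes the result data frame built above.

#Plot the 12 hold-out months: actual values vs. the three model forecasts
matplot(result, type = "l", lty = 1:4, col = 1:4,
        xlab = "Hold-out month", ylab = "Thousand Mcf",
        main = "Actual vs. Jordan Network Forecasts")
legend("topleft", legend = colnames(result), lty = 1:4, col = 1:4)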
Overall, the Jordan Neural Network is able to capture the trend, seasonality, and most of the underlying dynamics of our time series.
U.S. Consumption of Electricity Generated by Natural Gas (Thousand Mcf)

| Dates    | Actual    | Model 1   | Model 2   | Model 3   |
|----------|-----------|-----------|-----------|-----------|
| Sep 2020 | 1006071.1 | 1064280.4 | 1122930.6 | 1025189.8 |
| Oct 2020 | 924056.2  | 904008.6  | 918533.6  | 909775.6  |
| Nov 2020 | 737935.2  | 837158.3  | 832361.0  | 819863.2  |
| Dec 2020 | 839912.6  | 895515.0  | 912376.2  | 828382.0  |
| Jan 2021 | 833783.3  | 962217.8  | 930818.0  | 885501.5  |
| Feb 2021 | 759358.2  | 866818.8  | 849875.3  | 815062.7  |
| Mar 2021 | 715165.1  | 811165.7  | 826456.9  | 773812.0  |
| Apr 2021 | 724125.8  | 740006.6  | 771464.3  | 710971.7  |
| May 2021 | 787027.2  | 826697.5  | 841688.4  | 845217.3  |
| Jun 2021 | 1051774.8 | 1044398.2 | 1077180.5 | 1012727.0 |
| Jul 2021 | 1199673.3 | 1283053.1 | 1359408.4 | 1398298.0 |
| Aug 2021 | 1223328.0 | 1287099.5 | 1350961.4 | 1204196.9 |

All values in Thousand Mcf. Source: US Energy Information Administration