Section 5.2 homework for Time Series Analysis
5.1 Complete the parts given in Shumway and Stoffer and add the following:
g) Find the residual ACF and PACF plots for the ARFIMA(1,d,0) model. What do these plots suggest about the model?
h) Find the forecast x̂_1001 using the ARFIMA(1,d,0) model.
Partial answers:
a)
> load("C:\\chris\\unl\\Dropbox\\NEW\\STAT_time_series\\tsa3.rda")
> head(arf)
[1] -0.02936339 0.74870380 -0.33855273 -1.03319141 -0.26266177 -0.49428885
> x<-arf
> win.graph(width = 8, height = 6, pointsize = 10)
> plot(x = x, ylab = expression(x[t]), xlab = "t", type = "l", col = "red", lwd =
1, main = "Plot of fracdiff.dat", panel.first=grid(col = "gray", lty =
"dotted"))
> points(x = x, pch = 20, col = "blue")
Possibly an upward trend indicating nonstationarity in the mean.
b)
> par(mfrow = c(1,2))
> acf(x = x, lag.max = 100, type = "correlation", main = "Estimated ACF for
fracdiff.dat", xlim = c(1,100), ylim = c(-1,1))
> pacf(x = x, lag.max = 100, main = "Estimated PACF for fracdiff.dat", xlim =
c(1,100), ylim = c(-1,1))
> par(mfrow = c(1,1))
ACF goes to 0 very slowly indicating non-stationarity.
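For comparison, a stationary long-memory process can also produce a very slowly decaying ACF. An ARFIMA(0,d,0) process with 0 < d < 0.5 has theoretical autocorrelations ρ(h) = Γ(h+d)Γ(1-d)/(Γ(h-d+1)Γ(d)), which decay hyperbolically rather than geometrically. A quick sketch of this decay (Python, for illustration only; not part of the original R session):

```python
from math import gamma

def arfima0_acf(h, d):
    # Theoretical ACF of ARFIMA(0, d, 0): rho(h) = G(h+d)G(1-d) / (G(h-d+1)G(d))
    return gamma(h + d) * gamma(1 - d) / (gamma(h - d + 1) * gamma(d))

d = 0.25
# rho(1) simplifies to d / (1 - d)
print(round(arfima0_acf(1, d), 4))  # -> 0.3333
print([round(arfima0_acf(h, d), 3) for h in (10, 50, 100)])  # slow hyperbolic decay
```

Even at lag 100 the autocorrelation remains well above 0, which is why a slowly decaying sample ACF alone cannot distinguish nonstationarity from long memory.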
c)
> mean(x)
[1] 3.746601
> x.adj<-x-mean(x)
> library(fracdiff)
> mod.fit<-fracdiff(x = x.adj, nar = 1)
> mod.fit
Call:
fracdiff(x = x.adj, nar = 1)
Coefficients:
d ar ma
0.2646309 0.8630677 0.0000000
a list with components:
[1] "log.likelihood" "n" "msg" "d" "ar"
[6] "ma" "covariance.dpq" "stderror.dpq" "correlation.dpq" "h"
[11] "d.tol" "M" "hessian.dpq" "length.w" "call"
> summary(mod.fit)
Call:
fracdiff(x = x.adj, nar = 1)
Coefficients:
Estimate Std. Error z value Pr(>|z|)
d 0.264631 0.009653 27.41 <2e-16 ***
ar 0.863068 0.016921 51.01 <2e-16 ***
ma 0.000000 0.009653 0.00 1
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
sigma[eps] = 0.9871988
[d.tol = 0.0001221, M = 100, h = 1.483e-05]
Log likelihood: -1406 ==> AIC = 2818.562 [3 deg.freedom]
The estimated model is (1 - 0.8631B)(1 - B)^0.2646 (x_t - 3.7466) = w_t
Both hypothesis tests indicate strong evidence against the parameters being equal to 0 due to the small p-values.
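The z values in the summary are Wald statistics, the estimate divided by its standard error, compared against a standard normal distribution. A quick check of the reported values (Python sketch; not part of the original R session):

```python
from math import erfc, sqrt

def wald_z(estimate, se):
    # Wald statistic and two-sided normal p-value: 2 * P(Z > |z|)
    z = estimate / se
    p = erfc(abs(z) / sqrt(2))
    return z, p

z_d, p_d = wald_z(0.264631, 0.009653)
z_ar, p_ar = wald_z(0.863068, 0.016921)
print(round(z_d, 2), round(z_ar, 2))  # -> 27.41 51.01, matching the summary
```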
d) The ACF decays slowly, and the plot of x_t vs. t shows a possible linear trend over the last half of the series.
e)
> #ACF and PACF of first differences
> par(mfrow = c(1,2))
> acf(x = diff(x = x, lag = 1, differences = 1), type = "correlation", lag.max =
100, ylim = c(-1,1), main = expression(paste("Estimated ACF plot for ",
x[t] - x[t-1])))
> abline(v = seq(from = 10, to = 100, by = 10), lty = "dotted", col = "gray")
> pacf(x = diff(x = x, lag = 1, differences = 1), lag.max = 100, ylim = c(-1,1),
xlab = "h", main = expression(paste("Estimated PACF plot for ", x[t] -
x[t-1])))
> abline(v = seq(from = 10, to = 100, by = 10), lty = "dotted", col = "gray")
> par(mfrow = c(1,1))
The ACF no longer has the slow-decaying look it had before. There are some larger autocorrelations and partial autocorrelations around lag 13. If more were known about how the data were collected (is there a season?), this would warrant further investigation.
f) I think Shumway and Stoffer expect an ARIMA(1,1,0) model to be fit, although the problem itself does not make this clear. They fit this model in their answer key.
> mod.fit<-arima(x = x, order = c(1, 1, 0))
> mod.fit
Call:
arima(x = x, order = c(1, 1, 0))
Coefficients:
ar1
0.1698
s.e. 0.0312
sigma^2 estimated as 0.9994: log likelihood = -1417.24, aic = 2838.49
> examine.mod(mod.fit.obj = mod.fit, mod.name = "ARIMA(1,1,0)", max.lag = 100)
$z
ar1
5.442477
$p.value
ar1
5.254481e-08
There still appears to be some autocorrelations around lag 13 that need to be investigated.
g) Find the residual ACF and PACF plots for the ARFIMA(1,d,0) model. What do these plots suggest about the model?
> mod.fit<-fracdiff(x = x.adj, nar = 1)
> mod.fit
Call:
fracdiff(x = x.adj, nar = 1)
Coefficients:
d ar ma
0.2646309 0.8630677 0.0000000
a list with components:
[1] "log.likelihood" "n" "msg" "d" "ar"
[6] "ma" "covariance.dpq" "stderror.dpq" "correlation.dpq" "h"
[11] "d.tol" "M" "hessian.dpq" "length.w" "call"
> pi.vec<-numeric(length(x.adj))
> pi.vec[1]<--mod.fit$d/(0+1)
> w<-numeric(length(x.adj))
> w[1]<-x.adj[1]
> for (j in 2:(length(x))) {
pi.vec[j]<-(j-1-mod.fit$d)*pi.vec[j-1]/(j-1+1)
}
> w[2]<-x.adj[2] + pi.vec[1]*x.adj[1] - mod.fit$ar*x.adj[1]
> for (j in 3:(length(x))) {
w[j]<-x.adj[j]+sum(pi.vec[1:(j-1)]*rev(x.adj[1:(j-1)])) -
mod.fit$ar*x.adj[j-1] - mod.fit$ar*sum(pi.vec[1:(j-2)]*rev(x.adj[1:(j-2)]))
}
> all<-data.frame(x.adj, w, pi.vec)
> head(all, n = 10)
x.adj w pi.vec
1 -3.775964 -3.77596400 -0.26463092
2 -2.997897 1.26025278 -0.09730070
3 -4.085153 -1.19943424 -0.05628421
4 -4.779792 -0.67054280 -0.03848952
5 -4.009262 0.72425410 -0.02875451
6 -4.240889 -0.50647944 -0.02269387
7 -4.206783 -0.14490323 -0.01859396
8 -3.778315 0.22211935 -0.01565465
9 -3.463896 0.05306272 -0.01345494
10 -3.505799 -0.28765277 -0.01175339
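The recursion used above for the π weights matches the closed form π_j = Γ(j - d)/(Γ(j + 1)Γ(-d)) from the binomial expansion of (1 - B)^d. A cross-check of the first few weights (Python sketch using the d estimate; not part of the original R session):

```python
from math import gamma

d = 0.2646309  # d estimate from fracdiff

# Recursion as in the R code: pi_1 = -d, pi_j = (j - 1 - d) * pi_{j-1} / j
pi_rec = [-d]
for j in range(2, 6):
    pi_rec.append((j - 1 - d) * pi_rec[-1] / j)

# Closed form from the binomial expansion of (1 - B)^d
pi_closed = [gamma(j - d) / (gamma(j + 1) * gamma(-d)) for j in range(1, 6)]

print([round(p, 8) for p in pi_rec])  # matches pi.vec[1:5] from the R output
```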
> par(mfrow = c(1,2))
> acf(x = w, lag.max = 100, type = "correlation", main = "Est. ACF for
residuals", ylim = c(-1,1), panel.first=grid(col = "gray", lty = "dotted"))
> pacf(x = w, lag.max = 100, main = "Est. PACF for residuals", ylim = c(-1,1),
panel.first=grid(col = "gray", lty = "dotted"))
> par(mfrow = c(1,1))
It appears the model is pretty good! There are only a few marginally significant autocorrelations and partial autocorrelations (some are to be expected given the large number of lags examined). This indicates the resulting residuals are approximately white noise as given in the model’s assumptions.
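A Ljung-Box portmanteau statistic, Q = n(n+2) Σ_{h=1}^{H} ρ̂(h)²/(n-h), gives a formal complement to these visual checks; under white-noise residuals, Q is approximately χ² with H minus the number of estimated parameters degrees of freedom. A minimal sketch of the statistic itself (Python; illustrative only, not the course's examine.mod function):

```python
import numpy as np

def ljung_box_Q(resid, H):
    # Ljung-Box statistic Q = n(n+2) * sum_{h=1}^{H} rho_hat(h)^2 / (n - h)
    r = np.asarray(resid, dtype=float)
    r = r - r.mean()
    n = len(r)
    denom = np.sum(r ** 2)
    q = 0.0
    for h in range(1, H + 1):
        rho_h = np.sum(r[h:] * r[:-h]) / denom  # sample autocorrelation at lag h
        q += rho_h ** 2 / (n - h)
    return n * (n + 2) * q

# A strongly autocorrelated toy series gives a very large Q
print(round(ljung_box_Q([1.0, -1.0] * 100, 1), 1))  # -> 201.0
```

White-noise residuals would instead give a Q near H, so a Q far out in the right tail of the χ² reference distribution is evidence against the model.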
h) Find the forecast x̂_1001 using the ARFIMA(1,d,0) model.
x̂_1001 = E(x_1001 | I_1000)
= E(w_1001 - π_1x_1000 - π_2x_999 - … - π_1000x_1 + φ_1x_1000 + φ_1π_1x_999 + φ_1π_2x_998 + … + φ_1π_999x_1 | I_1000)
= -π_1x_1000 - π_2x_999 - … - π_1000x_1 + φ_1x_1000 + φ_1π_1x_999 + … + φ_1π_999x_1
using E(w_1001 | I_1000) = 0.
> x.adj.1001<--pi.vec%*%rev(x.adj) + mod.fit$ar*x.adj[1000] +
mod.fit$ar*pi.vec[1:999]%*%rev(x.adj[1:999])
> x.adj.1001
[,1]
[1,] -4.2509
> x.adj.1001+mean(x)
[,1]
[1,] -0.5042992
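In general form, the one-step-ahead forecast computed above is x̂_{n+1} = -Σ_{j=1}^{n} π_j x_{n+1-j} + φ_1 x_n + φ_1 Σ_{j=1}^{n-1} π_j x_{n-j} for the mean-adjusted series. A self-contained sketch of the same computation on toy data (Python; the data, d, and φ values here are hypothetical, not the fracdiff estimates):

```python
import numpy as np

def arfima_one_step(x, d, phi):
    # One-step-ahead forecast for a mean-adjusted ARFIMA(1, d, 0) series,
    # truncating the pi-weight expansion of (1 - B)^d at the sample size
    x = np.asarray(x, dtype=float)
    n = len(x)
    pi = np.empty(n)
    pi[0] = -d  # pi_1 = -d; pi_j = (j - 1 - d) * pi_{j-1} / j
    for j in range(2, n + 1):
        pi[j - 1] = (j - 1 - d) * pi[j - 2] / j
    # xhat = -sum_j pi_j x_{n+1-j} + phi x_n + phi sum_{j<=n-1} pi_j x_{n-j}
    return (-np.dot(pi, x[::-1]) + phi * x[-1]
            + phi * np.dot(pi[:n - 1], x[:-1][::-1]))

# Hypothetical toy series and parameters
print(arfima_one_step([0.5, -0.2, 0.8, 0.1, -0.4], d=0.3, phi=0.6))
```

Setting d = 0 makes every π weight zero, so the formula collapses to the familiar AR(1) forecast φ_1 x_n, which is a useful sanity check on the indexing.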