ar {ts}    R Documentation
Fit Autoregressive Models to Time Series
Description
Fit an autoregressive time series model to the data, by default selecting the complexity by AIC.
Usage
ar(x, aic = TRUE, order.max = NULL,
   method = c("yule-walker", "burg", "ols", "mle"), na.action, series)
ar.burg(x, aic = TRUE, order.max = NULL, na.action, demean = TRUE, series,
        var.method = 1)
ar.yw(x, aic = TRUE, order.max = NULL, na.action, demean = TRUE, series)
ar.ols(x, aic = TRUE, order.max = NULL, na.action, demean = TRUE, series)
ar.mle(x, aic = TRUE, order.max = NULL, na.action, demean = TRUE, series)
predict(ar.obj, newdata, n.ahead = 1, se.fit = TRUE)
Arguments
x: A univariate or multivariate time series.
aic: Logical flag. If TRUE, the Akaike Information Criterion is used to choose the order of the autoregressive model; if FALSE, the model of order order.max is fitted.
order.max: Maximum order (or order) of the model to fit. Defaults to 10*log10(N), where N is the number of observations.
method: Character string giving the method used to fit the model. Must be one of the strings in the default argument (the first few characters are sufficient). Defaults to "yule-walker".
na.action: Function to be called to handle missing values.
demean: Should a mean be estimated during fitting?
series: Name for the series. Defaults to deparse(substitute(x)).
var.method: The method used to estimate the innovations variance (see Details).
ar.obj: A fit from ar().
newdata: Data to which to apply the prediction.
n.ahead: Number of steps ahead at which to predict.
se.fit: Logical: return estimated standard errors of the prediction error?
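For illustration, a minimal sketch of typical calls with these arguments (the lh series from the standard data sets is assumed):

data(lh)
ar(lh, method = "yule")               # method strings may be abbreviated
ar(lh, aic = FALSE, order.max = 5)    # fit a fixed order instead of selecting by AIC
ar(lh, na.action = na.fail)           # stop if the series contains missing values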
Details
For definiteness, note that the AR coefficients have the sign in
(x[t] - m) = a[0] + a[1]*(x[t-1] - m) + ... + a[p]*(x[t-p] - m) + e[t]
and a[0] is zero except for an OLS fit (which can fit non-stationary models).
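As a sketch of this convention (the LakeHuron series from the standard data sets is assumed), the one-step prediction can be rebuilt by hand from the fitted coefficients and mean:

data(LakeHuron)
fit <- ar(LakeHuron, aic = FALSE, order.max = 2)   # fix p = 2 for clarity
m <- fit$x.mean
x <- as.numeric(LakeHuron)
n <- length(x)
m + sum(fit$ar * (x[n:(n - fit$order + 1)] - m))   # prediction of x[n+1] by hand
predict(fit, n.ahead = 1)$pred                     # should agree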
ar is just a wrapper for the functions ar.yw, ar.burg, ar.ols and ar.mle.
Order selection is done by AIC if aic is true. This is problematic, as of the methods here only ar.mle performs true maximum likelihood estimation. The AIC is computed as if the variance estimate were the MLE, omitting the determinant term from the likelihood. Note that this is not the same as the Gaussian likelihood evaluated at the estimated parameter values. In ar.yw the variance matrix of the innovations is computed from the fitted coefficients and the autocovariance of x, and in ar.ols from the variance matrix of the residuals.
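A small sketch of the order selection (lh dataset assumed): with aic = TRUE the order is chosen up to order.max, while aic = FALSE forces the order to order.max.

data(lh)
ar(lh)$order                               # order chosen by minimizing the AIC
ar(lh, aic = FALSE, order.max = 3)$order   # order fixed at 3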
ar.burg allows two methods to estimate the innovations
variance and hence AIC. Method 1 is to use the update given by
the Levinson-Durbin recursion (Brockwell and Davis, 1991, (8.2.6)
on page 242), and follows S-PLUS. Method 2 is the mean of the sum
of squares of the forward and backward prediction errors
(as in Brockwell and Davis, 1996, page 145). Percival and Walden
(1998) discuss both.
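A sketch comparing the two variance estimates (univariate case, lh dataset assumed); because the AIC depends on the variance estimate, the selected order can also differ:

data(lh)
fit1 <- ar.burg(lh, var.method = 1)   # Levinson-Durbin update (as in S-PLUS)
fit2 <- ar.burg(lh, var.method = 2)   # mean of forward/backward squared errors
c(fit1$var.pred, fit2$var.pred)       # the two variance estimates
c(fit1$order, fit2$order)             # orders selected under each AIC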
Remember that ar by default includes a constant in the model: it removes the overall mean of x before fitting the AR model, or (ar.mle) estimates a constant to subtract, or (ar.ols) subtracts the mean and estimates an additive constant.
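A brief sketch of these different treatments of the mean (lh dataset assumed):

data(lh)
ar.yw(lh)$x.mean                  # sample mean, removed before fitting
ar.mle(lh, order.max = 2)$x.mean  # mean estimated as part of the ML fit
ar.ols(lh)$x.intercept            # additive constant in the model for x - x.mean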
ar.ols fits the general AR model (containing an intercept if demean = TRUE) to a possibly non-stationary and/or multivariate system of series x. The resulting unconstrained least squares estimates are consistent, even if some of the series are non-stationary and/or co-integrated.
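A sketch of a bivariate OLS fit (the BJsales data used in the Examples below is assumed); for a multivariate series the ar component holds one coefficient matrix per lag:

data(BJsales)
fit <- ar.ols(ts.union(BJsales, BJsales.lead), aic = FALSE, order.max = 3)
dim(fit$ar)        # one coefficient matrix per lag
fit$x.intercept    # one additive constant per series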
Value
For ar and its methods a list of class "ar" with the following elements:
order: The order of the fitted model. This is chosen by minimizing the AIC if aic = TRUE, otherwise it is order.max.
ar: Estimated autoregression coefficients for the fitted model.
var.pred: The prediction variance: an estimate of the portion of the variance of the time series that is not explained by the autoregressive model.
x.mean: The estimated mean of the series used in fitting and for use in prediction.
x.intercept: (ar.ols only.) The intercept in the model for x - x.mean.
aic: The value of the aic argument.
n.used: The number of observations in the time series.
order.max: The value of the order.max argument.
partialacf: The estimate of the partial autocorrelation function up to lag order.max.
resid: Residuals from the fitted model, conditioning on the first order observations of the series.
method: The value of the method argument.
series: The name(s) of the time series.
asy.var.coef: (Univariate case, not ar.ols.) The asymptotic-theory variance matrix of the coefficient estimates.
asy.se.coef: (ar.ols only.) The asymptotic-theory standard errors of the coefficient estimates.
For predict.ar, a time series of predictions, or if se.fit = TRUE, a list with components pred, the predictions, and se, the estimated standard errors. Both components are time series.
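A sketch of using these components (the sunspot.year data from the Examples below is assumed) to form approximate 95% prediction limits:

data(sunspot)
fit <- ar(sunspot.year)
fc <- predict(fit, n.ahead = 10)
cbind(pred = fc$pred,
      lower = fc$pred - 1.96 * fc$se,
      upper = fc$pred + 1.96 * fc$se)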
Note
Only the univariate cases of ar.burg and ar.mle are implemented.
Fitting by method = "mle" to long series can be very slow.
Author(s)
Martyn Plummer. Univariate case of ar.yw, ar.mle and C code for ar.burg by B. D. Ripley; ar.ols by Adrian Trapletti.
References
Brockwell, P. J. and Davis, R. A. (1991) Time Series: Theory and Methods. Second edition. Springer, New York. Section 11.4.
Brockwell, P. J. and Davis, R. A. (1996) Introduction to Time Series and Forecasting. Springer, New York. Sections 5.1 and 7.6.
Luetkepohl, H. (1991) Introduction to Multiple Time Series Analysis. Springer-Verlag, New York. pp. 368-370.
Percival, D. P. and Walden, A. T. (1998) Spectral Analysis for Physical Applications. Cambridge University Press.
Whittle, P. (1963) On the fitting of multivariate autoregressions and the approximate canonical factorization of a spectral density matrix. Biometrika 50, 129-134.
Examples
data(lh)
ar(lh)
ar(lh, method="burg")
ar(lh, method="ols")
ar(lh, FALSE, 4)   # fit an AR(4)
data(LakeHuron)
ar(LakeHuron)
ar(LakeHuron, method="burg")
ar(LakeHuron, method="ols")
data(sunspot)
sunspot.ar <- ar(sunspot.year)
sunspot.ar
ar(x = sunspot.year, method = "burg")
ar(x = sunspot.year, method = "ols")
## Not run: ## next is slow and may have convergence problems,
## as it constrains the fitted model to be stationary
ar(x = sunspot.year, method = "mle")
## End(Not run)
predict(sunspot.ar, n.ahead=25)
data(BJsales)
ar(ts.union(BJsales, BJsales.lead))
data(EuStockMarkets)
x <- diff(log(EuStockMarkets))
ar.ols(x, order.max=6)