This study note is based on *Time Series Analysis: Based on R* by Wang.
Once preprocessing shows that a time series contains information worth extracting, the next step is to build a model and use it for forecasting. This note covers three important models for fitting time series.

I. AR model
The AR(p) model, where p is the autoregressive order, is written as

x_t = φ_0 + φ_1 x_{t-1} + φ_2 x_{t-2} + … + φ_p x_{t-p} + ε_t,  φ_p ≠ 0,

where ε_t is a zero-mean white-noise series. When φ_0 = 0, the model is called a centered AR(p) model.
The two functions most commonly used to generate such series are arima.sim and filter. The arima.sim function can simulate stationary AR series, MA series, stationary ARMA series, and ARIMA series. The function call is:
arima.sim(n, list(ar = , ma = , order = ), sd = )
# n: length of the simulated series.
# list: specifies the model parameters, where:
(1) To simulate a stationary AR(p) model, supply the autoregressive coefficients; if the specified AR model is non-stationary, the function reports an error.
(2) To simulate an MA(q) model, supply the moving-average coefficients.
(3) To simulate a stationary ARMA(p,q) model, supply both the autoregressive and the moving-average coefficients; if the specified model is non-stationary, an error is reported.
(4) To simulate an ARIMA(p,d,q) model, supply the autoregressive and moving-average coefficients and also add the order option, order = c(p, d, q), where p is the autoregressive order, d the differencing order, and q the moving-average order.
# sd: standard deviation of the innovation series; the default is sd = 1.
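A short sketch of these options (the coefficient values below are chosen only for illustration and are not from the note):

```r
# Stationary AR(1) with phi = 0.8
x <- arima.sim(n = 200, list(ar = 0.8))

# ARIMA(1,1,1): order = c(p, d, q) adds first-order differencing
y <- arima.sim(n = 200, list(order = c(1, 1, 1), ar = 0.5, ma = 0.4), sd = 2)

# A non-stationary AR specification is rejected, as described above:
# arima.sim(n = 200, list(ar = 1.1))  # Error: 'ar' part of model is not stationary
```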
The filter function can directly generate AR series (stationary or not) and MA series. The function call is:

x <- filter(e, filter = , method = , circular = )

# e: name of the random-fluctuation (white noise) series.
# filter: the model coefficients.
# method: specifies whether an AR or an MA model is generated:
(1) method = "recursive" for an AR model;
(2) method = "convolution" for an MA model.
# circular: an option specific to generating MA series; circular = TRUE avoids NA values at the ends of the series.
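A minimal sketch of both uses of filter; the noise series e and the coefficients are assumed here for illustration:

```r
e <- rnorm(100)  # white-noise innovation series

# AR(1): method = "recursive" computes x_t = 0.8 * x_{t-1} + e_t
x_ar <- filter(e, filter = 0.8, method = "recursive")

# MA(1): method = "convolution" with sides = 1 computes x_t = e_t + 0.5 * e_{t-1};
# circular = TRUE wraps the filter around the series ends, so no NAs appear
x_ma <- filter(e, filter = c(1, 0.5), method = "convolution",
               sides = 1, circular = TRUE)
anyNA(x_ma)  # FALSE
```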
Take the following four models as examples and first examine their stationarity:
e <- rnorm(100)
x1 <- arima.sim(n = 100, list(ar = 0.8))
x2 <- filter(e, filter = -1.1, method = "recursive")
x3 <- arima.sim(n = 100, list(ar = c(1, -0.5)))
x4 <- filter(e, filter = c(1, 0.5), method = "recursive")
ts.plot(x1); ts.plot(x2); ts.plot(x3); ts.plot(x4)
The results are as follows:
From the plots, x1 and x3 can be judged stationary, while x2 and x4 are non-stationary (x2 and x4 had to be generated with the filter function because arima.sim reports an error for them, which itself indicates non-stationarity). The graphical method is intuitive but rough; we now introduce two exact stationarity criteria: the characteristic-root criterion and the stationary-domain criterion.
The characteristic-root criterion: an AR(p) model is stationary if and only if every root of its characteristic equation λ^p − φ_1 λ^{p-1} − … − φ_p = 0 lies inside the unit circle, i.e. |λ_i| < 1 (equivalently, every root of 1 − φ_1 z − … − φ_p z^p = 0 lies outside the unit circle).

The stationary-domain criterion states the same condition as explicit restrictions on the coefficients: an AR(1) model is stationary if and only if |φ_1| < 1, and an AR(2) model is stationary if and only if |φ_2| < 1 and φ_2 ± φ_1 < 1.
Applying the characteristic-root and stationary-domain criteria to the four models above: x1 (φ_1 = 0.8) and x3 (φ_1 = 1, φ_2 = −0.5) satisfy the conditions and are stationary; x2 (φ_1 = −1.1) and x4 (φ_1 = 1, φ_2 = 0.5, so φ_1 + φ_2 = 1.5 > 1) violate them and are non-stationary.
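The characteristic-root criterion can be checked numerically with polyroot; a minimal sketch (the helper function ar_stationary is not from the note):

```r
# An AR(p) model with coefficient vector phi is stationary iff all roots of
# 1 - phi_1*z - ... - phi_p*z^p = 0 lie outside the unit circle
ar_stationary <- function(phi) all(Mod(polyroot(c(1, -phi))) > 1)

ar_stationary(0.8)          # x1: TRUE  (|phi_1| < 1)
ar_stationary(-1.1)         # x2: FALSE (|phi_1| > 1)
ar_stationary(c(1, -0.5))   # x3: TRUE  (complex roots with modulus sqrt(2))
ar_stationary(c(1, 0.5))    # x4: FALSE (phi_1 + phi_2 = 1.5 > 1)
```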
The autocorrelation coefficients of a stationary AR(p) model have two notable properties: they tail off (never cut off) and they decay exponentially to zero.
Take the stationary AR models (1) and (3) above as examples:
x1 <- arima.sim(n = 1000, list(ar = 0.8))
x3 <- arima.sim(n = 1000, list(ar = c(1, -0.5)))
acf(x1); acf(x3)
The plots show that, for both the AR(1) and the AR(2) model, and whether the characteristic roots are positive or negative, the sample autocorrelation coefficients tail off and decay exponentially to zero.
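The tailing-off behaviour can also be checked against the theoretical autocorrelations: for an AR(1) model with φ_1 = 0.8 they are exactly ρ_k = 0.8^k, which ARMAacf computes directly:

```r
# Theoretical ACF of the AR(1) model with phi = 0.8: decays as 0.8^k,
# approaching zero but never cutting off at any lag
rho <- ARMAacf(ar = 0.8, lag.max = 5)
round(rho, 4)
# lags 0..5: 1.0000 0.8000 0.6400 0.5120 0.4096 0.3277
```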
The partial autocorrelation coefficients of a stationary AR(p) model cut off after lag p (p-step truncation).
Again taking the stationary AR models (1) and (3) as examples:

pacf(x1); pacf(x3)
Because of sampling randomness, the sample partial autocorrelation coefficients do not cut off as strictly as the theoretical ones. Still, the plots show that for the AR(1) model the lag-1 sample partial autocorrelation is clearly non-zero while the values beyond lag 1 are approximately zero, and that for the AR(2) model the values at lags 1 and 2 are clearly non-zero while those beyond lag 2 are approximately zero. The sample partial autocorrelation plots thus visually confirm the truncation property of the AR model's partial autocorrelation coefficients.
II. MA model
The MA(q) model, where q is the highest moving-average order, is written as

x_t = μ + ε_t − θ_1 ε_{t-1} − θ_2 ε_{t-2} − … − θ_q ε_{t-q},  θ_q ≠ 0,

where ε_t is a zero-mean white-noise series. When μ = 0, the model is called a centered MA(q) model.
The autocorrelation coefficients of an MA(q) model cut off after lag q. Take the following four models as examples:
x1 <- arima.sim(n = 1000, list(ma = -2))
x2 <- arima.sim(n = 1000, list(ma = -0.5))
x3 <- arima.sim(n = 1000, list(ma = c(-4/5, 16/25)))
x4 <- arima.sim(n = 1000, list(ma = c(-5/4, 25/16)))
acf(x1); acf(x2); acf(x3); acf(x4)
The sample autocorrelation plots clearly show that the autocorrelation coefficients of the MA(1) models cut off after lag 1 and those of the MA(2) models cut off after lag 2.
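A side note on why these four models come in pairs, which the note does not spell out: MA models whose characteristic roots are reciprocals of one another share the same theoretical autocorrelation function, which can be verified with ARMAacf:

```r
# x1 (theta = -2) and x2 (theta = -0.5) have identical theoretical ACFs,
# as do x3 and x4; their characteristic roots are reciprocal pairs
ARMAacf(ma = -2,   lag.max = 3)   # 1.0 -0.4 0.0 0.0
ARMAacf(ma = -0.5, lag.max = 3)   # identical to the line above
ARMAacf(ma = c(-4/5, 16/25), lag.max = 3)
ARMAacf(ma = c(-5/4, 25/16), lag.max = 3)  # identical to the line above
```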
The partial autocorrelation coefficients of an MA(q) model tail off:

pacf(x1); pacf(x2); pacf(x3); pacf(x4)
The partial autocorrelation plots show that the partial autocorrelation coefficients of both the MA(1) and MA(2) models tail off rather than cut off.
III. ARMA model

The centered ARMA(p,q) model is written as

x_t = φ_1 x_{t-1} + … + φ_p x_{t-p} + ε_t − θ_1 ε_{t-1} − … − θ_q ε_{t-q}.

When q = 0, the ARMA(p,q) model reduces to the AR(p) model; when p = 0, it reduces to the MA(q) model. For an ARMA(p,q) model, neither the autocorrelation coefficients nor the partial autocorrelation coefficients cut off. The ARMA(1,1) model serves as an example below. Summarizing the autocorrelation and partial autocorrelation properties of the AR(p), MA(q), and ARMA(p,q) models:

Model      | ACF                  | PACF
AR(p)      | tails off            | cuts off after lag p
MA(q)      | cuts off after lag q | tails off
ARMA(p,q)  | tails off            | tails off
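Since the note does not list the ARMA(1,1) coefficients, the sketch below assumes φ_1 = 0.5 and θ_1 = −0.8 purely for illustration:

```r
# ARMA(1,1) with assumed coefficients phi = 0.5 (ar) and theta = -0.8 (ma)
x <- arima.sim(n = 1000, list(ar = 0.5, ma = -0.8))
acf(x)   # sample ACF tails off: no clear cut-off lag
pacf(x)  # sample PACF tails off as well
```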