Abstract
Lattice algorithms for estimating the parameters of a multivariate autoregression are generalized to deal with subset models in which some of the coefficient matrices are constrained to be zero. We first establish a recursive prediction-error version of the empirical Yule-Walker equations. The estimated coefficient matrices obtained from these recursions are the coefficients of the best linear one-step predictors of the process under the assumption that the autocovariance function is the same as the sample autocovariance function. By modifying the recursions to allow for certain inherent shortcomings, we then derive new estimators which generalize the Vieira-Morf, Nuttall-Strand and Burg estimators to the multivariate subset case. We show that the new estimators minimize weighted sums of squares of the forward and backward prediction errors in recursive schemes which closely resemble the original scheme of Burg. The performances of the estimators are compared in a simulation study.
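To illustrate the kind of lattice recursion the abstract refers to, here is a minimal univariate sketch of Burg's method (not the paper's multivariate subset generalization): at each stage the reflection coefficient is chosen to minimize the sum of squared forward and backward prediction errors, and the prediction-error filter is updated by a Levinson-type recursion. The function name `burg_ar` and the simulated AR(1) example are illustrative assumptions, not from the paper.

```python
import numpy as np

def burg_ar(x, p):
    """Estimate AR(p) coefficients by Burg's method: at each lattice
    stage, pick the reflection coefficient minimizing the sum of
    squared forward and backward prediction errors."""
    f = np.asarray(x, dtype=float)   # forward prediction errors
    b = f.copy()                     # backward prediction errors
    a = np.array([1.0])              # prediction-error filter, a[0] = 1
    for _ in range(p):
        ff, bb = f[1:], b[:-1]       # align errors for the next lag
        # reflection coefficient minimizing ||ff + k*bb||^2 + ||bb + k*ff||^2
        k = -2.0 * (ff @ bb) / (ff @ ff + bb @ bb)
        # Levinson-type update of the filter polynomial
        a = np.append(a, 0.0) + k * np.append(a, 0.0)[::-1]
        f, b = ff + k * bb, bb + k * ff  # lattice update of both error series
    return -a[1:]                    # AR coefficients phi_1, ..., phi_p

# usage: recover the coefficient of a simulated AR(1) process x_t = 0.6 x_{t-1} + e_t
rng = np.random.default_rng(0)
n = 5000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + e[t]
phi = burg_ar(x, 1)
```

A known attraction of this scheme, which the paper's multivariate subset estimators inherit in spirit, is that the fitted model is always stationary (all reflection coefficients have modulus less than one).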
| Original language | English |
| --- | --- |
| Pages (from-to) | 197-213 |
| Number of pages | 17 |
| Journal | Statistica Sinica |
| Volume | 15 |
| Issue number | 1 |
| State | Published - Jan 2005 |
Keywords
- Lattice algorithm
- Linear prediction
- Multistep prediction
- Multivariate autoregression
- Recursive autoregression
- VAR process