Abstract:
Vector autoregression (VAR) is a fundamental tool for modeling the joint dynamics of multivariate time series. However, as the number of component series increases, the VAR model quickly becomes overparameterized, making reliable estimation difficult and impeding its adoption as a forecasting tool in high-dimensional settings. A number of authors have sought to address this issue with regularized approaches, such as the lasso, that impose sparse or low-rank structures on the estimated coefficient parameters of the VAR. More traditional approaches attempt to address overparameterization by selecting a low lag order, based on the assumption that dynamic dependence among components is short-range.
However, these methods typically assume a single, universal lag order that applies across all components, unnecessarily constraining the dynamic relationships among the components and impeding forecast performance. The lasso-based approaches are more flexible but do not incorporate the notion of lag order selection.
We propose a new class of regularized VAR models, called hierarchical vector autoregression (HVAR), that embed the notion of lag selection into a convex regularizer. The key convex modeling tool is a group lasso with nested groups, which ensures that the sparsity pattern of the autoregressive lag coefficients honors the ordered structure inherent to the VAR.
We provide computationally efficient algorithms for solving HVAR problems that can be parallelized across the components. A simulation study demonstrates improved forecasting and lag order selection relative to previous approaches, and a macroeconomic application further highlights these forecasting gains as well as the convenient, interpretable output of an HVAR model.
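The nested-group idea can be made concrete. With lag groups nested so that the group for lag k contains all coefficients at lags k through p, the proximal operator of the penalty can be evaluated by composing group soft-thresholds from the deepest lag outward (a known property of tree-structured group penalties), which forces deeper lags to zero out first. The sketch below is illustrative only: the function name is invented, and it shows the simplest variant in which all coefficients at a lag form one group, not the authors' actual estimator.

```python
import numpy as np

def prox_hier_group(beta, lam):
    """Proximal operator of the nested group-lasso penalty
    lam * sum_k ||beta[k-1:, :]||_F, where beta is a (p, d) array of
    lag coefficients (row k-1 holds the coefficients at lag k).
    For nested groups, the prox is the composition of group
    soft-thresholds applied from the innermost group (deepest lag)
    outward, so zeros always appear as a suffix of the lags."""
    beta = beta.copy()
    p = beta.shape[0]
    for k in range(p, 0, -1):          # groups: lags k..p, for k = p, ..., 1
        block = beta[k - 1:]
        nrm = np.linalg.norm(block)    # Frobenius norm of the group
        if nrm <= lam:
            beta[k - 1:] = 0.0         # whole tail of lags is zeroed
        else:
            beta[k - 1:] = (1.0 - lam / nrm) * block
    return beta
```

Applying this operator inside a proximal gradient loop yields estimates whose nonzero lags form a contiguous initial block, which is exactly the ordered sparsity that embeds lag selection into the convex program.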
Abstract:
The vector autoregression (VAR) has long proven to be an effective method for modeling the joint dynamics of macroeconomic time series, as well as for forecasting. A major shortcoming that has hindered the VAR's applicability is its heavy parameterization: the parameter space grows quadratically with the number of series included, quickly exhausting the available degrees of freedom. Consequently, forecasting using VARs is intractable for low-frequency, high-dimensional macroeconomic data. However, empirical evidence suggests that VARs that incorporate more component series tend to produce more accurate forecasts. Conventional methods that allow for the estimation of large VARs tend either to require ad hoc, subjective specifications or to be computationally infeasible.
Moreover, as global economies become more intricately intertwined, there has been substantial interest in incorporating the impact of stochastic, unmodeled exogenous variables. Vector autoregression with exogenous variables (VARX) extends the VAR to allow for the inclusion of unmodeled variables, but it similarly faces dimensionality challenges.
We introduce the VARX-L framework, a structured family of VARX models, and provide methodology that allows for both efficient estimation and accurate forecasting in high-dimensional analysis.
VARX-L adapts several prominent scalar regression regularization techniques to a vector time series context in order to greatly reduce the parameter space of VAR and VARX models. We also highlight a compelling extension that allows for shrinking toward reference models, such as a vector random walk. We demonstrate the efficacy of VARX-L in both low- and high-dimensional macroeconomic forecasting applications and simulated data examples. Our methodology is easily reproducible in a publicly available R package.
Abstract:
Many recent developments in the statistical time series literature have centered on incorporating the Lasso, a feature selection procedure originally designed for least squares problems, into applications with time-dependent data. The Lasso requires the specification of a penalty parameter that determines the degree of sparsity to impose. The most popular cross-validation approaches that respect time dependence are very computationally intensive and are not appropriate for modeling certain classes of time series. We propose a novel adaptive penalty parameter selection procedure that takes advantage of the sequentially observed nature of time series data to improve both computational performance and forecast accuracy relative to existing methods.
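A toy sketch of sequential penalty selection (a generic online scheme for illustration, not the proposed procedure; a ridge-penalized AR(1) is used because its fit has a closed form, but the same selection logic applies to Lasso penalties): as each observation arrives, one-step-ahead squared forecast errors are accumulated for a grid of candidate penalties, and the current choice is the candidate with the smallest running error.

```python
import numpy as np

def sequential_penalty_selection(y, lambdas, burn=20):
    """Illustrative online penalty selection for a scalar AR(1).
    At each time t, fit phi with every candidate ridge penalty using
    data observed so far, forecast y[t+1], and accumulate the squared
    forecast error per candidate; select the running-error minimizer."""
    y = np.asarray(y, float)
    cum_err = np.zeros(len(lambdas))
    best = lambdas[0]
    for t in range(burn, len(y) - 1):
        x, z = y[:t], y[1:t + 1]              # history observed so far
        for i, lam in enumerate(lambdas):
            phi = (x @ z) / (x @ x + lam)     # ridge-penalized AR(1) coefficient
            cum_err[i] += (y[t + 1] - phi * y[t]) ** 2
        best = lambdas[int(np.argmin(cum_err))]
    return best, cum_err
```

Unlike rolling cross-validation, no model is ever refit on artificially held-out blocks: the errors used for selection are exactly the out-of-sample errors produced as the series unfolds.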
Abstract:
The assumption of strict stationarity is too strong for observations in many financial time series applications; however, distributional properties may be at least locally stable in time. We define multivariate measures of homogeneity to quantify local stationarity and propose an empirical approach for robustly estimating time-varying windows of stationarity. Finally, we consider bivariate series that are believed to be locally cointegrated, assess our estimates, and discuss applications to pairs trading of financial assets.
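The notion of a time-varying window of stationarity can be illustrated with a deliberately crude sketch (the function, the statistic, and the threshold are all invented for illustration and are much simpler than the measures defined in the text): grow a backward-looking window from the most recent observation and keep it while a homogeneity statistic comparing the two halves of the window stays below a threshold.

```python
import numpy as np

def homogeneous_suffix(y, min_len=12, step=4, thresh=1.0):
    """Return the length of the longest recent stretch of y judged
    locally stationary by a crude homogeneity check: the absolute
    difference of half-window means, scaled by the pooled half-window
    standard deviation. Illustrative sketch only."""
    y = np.asarray(y, float)
    n = len(y)
    good = min_len
    for w in range(min_len, n + 1, step):
        seg = y[n - w:]                       # candidate backward window
        a, b = seg[:w // 2], seg[w // 2:]     # older half vs newer half
        pooled = np.sqrt((a.var() + b.var()) / 2.0) + 1e-12
        if abs(a.mean() - b.mean()) / pooled > thresh:
            return good                       # break detected; keep last good window
        good = w
    return good
```

A practical estimator would use multivariate statistics, robust scale estimates, and a calibrated threshold, but the same grow-until-inhomogeneous logic underlies windowed assessments of local stationarity, including the locally cointegrated bivariate setting.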