
Regression coefficient and autoregressive order shrinkage and selection via the lasso



References (48)

Copyright
© 2007 Royal Statistical Society
ISSN
1369-7412
eISSN
1467-9868
DOI
10.1111/j.1467-9868.2007.00577.x
Publisher site
See Article on Publisher Site

Abstract

Summary: The least absolute shrinkage and selection operator ('lasso') has been widely used in regression shrinkage and selection. We extend its application to the regression model with autoregressive errors. Two types of lasso estimators are carefully studied. The first is similar to the traditional lasso estimator, with only two tuning parameters: one for the regression coefficients and the other for the autoregression coefficients. These tuning parameters can be easily calculated via a data-driven method, but the resulting lasso estimator may not be fully efficient. To overcome this limitation, we propose a second lasso estimator that uses a different tuning parameter for each coefficient. We show that this modified lasso can produce estimators as efficient as the oracle. Moreover, we propose an algorithm to estimate the tuning parameters and thereby obtain the modified lasso estimator. Simulation studies demonstrate that the modified estimator is superior to the traditional estimator. An empirical example is also presented to illustrate the usefulness of lasso estimators. The extension of the lasso to the autoregression with exogenous variables model is briefly discussed.

Journal

Journal of the Royal Statistical Society Series B (Statistical Methodology), Oxford University Press

Published: Jan 12, 2007

Keywords: Autoregression with exogenous variables; Lasso; Oracle estimator; Regression model with autoregressive errors
