In partially linear models, we consider methodology for simultaneous model selection and parameter estimation with random coefficient autoregressive errors by using lasso and shrinkage strategies. We provide natural adaptive estimators that significantly improve upon the classical procedures in the situation where some of the predictors are nuisance variables that may or may not affect the association between the response and the main predictors. In the context of two competing partially linear regression models (full and submodels), we consider an adaptive shrinkage estimation strategy and propose the shrinkage estimator and the positive‐rule shrinkage estimator. We develop the properties of these estimators by using the notion of asymptotic distributional risk. The shrinkage estimators are shown to have higher efficiency than the classical estimators for a wide class of models. For the lasso‐type estimation strategy, we devise efficient algorithms to obtain numerical results. We compare the relative performance of the lasso with the shrinkage and other estimators. Monte Carlo simulation experiments are conducted for various combinations of the nuisance parameters and sample size, and the performance of each method is evaluated in terms of simulated mean squared error. The comparison reveals that the lasso and shrinkage strategies outperform the classical procedure, and that their relative performance is comparable. The shrinkage estimators perform better than the lasso strategy in the effective part of the parameter space when, and only when, there are many nuisance variables in the model. A data example is showcased to illustrate the usefulness of the suggested methods. Copyright © 2011 John Wiley & Sons, Ltd.
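The positive‐rule shrinkage idea described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: an AR(1) error process stands in for the random coefficient autoregressive errors, the Gaussian‐kernel smoother with bandwidth h = 0.05 is an arbitrary choice for the Speckman‐type partial residuals, the (q − 2) scaling in the shrinkage factor is the classical Stein choice (not necessarily the paper's), and all variable names (`beta_true`, `Tn`, `beta_pr`, and so on) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated partially linear model y = X @ beta + g(t) + u.
# Only the first two coefficients are active; the rest are nuisance.
# AR(1) errors stand in for the paper's random coefficient AR errors.
n, p = 200, 6
beta_true = np.array([2.0, 1.0, 0.0, 0.0, 0.0, 0.0])
t = np.linspace(0.0, 1.0, n)
g = np.sin(2 * np.pi * t)                       # nonparametric component
X = rng.normal(size=(n, p))
u = np.zeros(n)
for i in range(1, n):
    u[i] = 0.4 * u[i - 1] + rng.normal(scale=0.5)
y = X @ beta_true + g + u

# Speckman-type partial residuals: smooth g out of y and X with a
# Gaussian-kernel smoother matrix (bandwidth h = 0.05 is arbitrary).
h = 0.05
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
S = K / K.sum(axis=1, keepdims=True)            # row-normalized smoother
y_t, X_t = y - S @ y, X - S @ X

# Full-model least squares versus the submodel that sets the
# candidate nuisance coefficients to zero.
idx = [0, 1]                                    # candidate active set
beta_full = np.linalg.lstsq(X_t, y_t, rcond=None)[0]
beta_sub = np.zeros(p)
beta_sub[idx] = np.linalg.lstsq(X_t[:, idx], y_t, rcond=None)[0]

# Wald-type statistic T_n for the nuisance block, then the
# positive-rule (positive-part Stein) shrinkage of the full-model
# estimate toward the submodel estimate.
nui = [i for i in range(p) if i not in idx]
q = len(nui)
resid = y_t - X_t @ beta_full
sigma2 = resid @ resid / (n - p)
cov = sigma2 * np.linalg.inv(X_t.T @ X_t)
Tn = beta_full[nui] @ np.linalg.solve(cov[np.ix_(nui, nui)], beta_full[nui])
shrink = max(0.0, 1.0 - (q - 2) / Tn)           # positive-part weight
beta_pr = beta_sub + shrink * (beta_full - beta_sub)
print(np.round(beta_pr, 2))
```

The positive‐part truncation keeps the shrinkage weight in [0, 1], so the estimator interpolates between the submodel and full‐model fits rather than over‐shooting; a lasso‐type competitor would instead soft‐threshold each coefficient individually.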
Applied Stochastic Models in Business and Industry – Wiley
Published: May 1, 2012