Penalized principal component regression
In linear regression problems, ordinary least squares (OLS) produces an unbiased estimate. However, OLS has two serious drawbacks: if the design matrix X is not of full rank, the estimator is no longer unique, and if the columns of X are collinear, the estimates can have extremely large variances. To address these problems, several penalized regression methods have been developed, such as Ridge, Lasso, and Bridge, each of which improves on OLS in some respects. We study principal component (PC) regression with several different penalties to see whether it offers another way to improve on the above-mentioned methods. Using various simulations and a real-data setting, we compare and contrast the different types of regression methods with our PC regression methods. The simulation results show that PC regression combined with two different penalties performs well in different situations.
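As a rough illustration of the idea described above, the sketch below builds a collinear design matrix, then fits a penalized PC regression by projecting onto principal components and applying a ridge penalty to the component scores. This is only a minimal sketch assuming scikit-learn; the choice of penalty (ridge), the number of retained components, and all variable names are illustrative assumptions, not the thesis's exact methodology.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, p = 100, 10

# Collinear design: the last column nearly duplicates the first,
# so X'X is close to singular and OLS estimates become unstable.
X = rng.standard_normal((n, p))
X[:, -1] = X[:, 0] + 0.01 * rng.standard_normal(n)

beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]           # illustrative true coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)

# PC regression with a ridge penalty: project X onto its leading
# principal components (dropping the near-null collinear direction),
# then shrink the score coefficients via the penalty.
pcr_ridge = make_pipeline(PCA(n_components=9), Ridge(alpha=1.0))
pcr_ridge.fit(X, y)
y_hat = pcr_ridge.predict(X)
```

Discarding the lowest-variance component removes the direction responsible for the collinearity, and the ridge penalty further stabilizes the remaining coefficients; swapping `Ridge` for `Lasso` would give a sparse variant in the same spirit.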