Show simple item record

dc.contributor.author    Byrd, Ayanna
dc.date.accessioned      2014-03-04T03:18:56Z
dc.date.available        2014-03-04T03:18:56Z
dc.date.issued           2008-05
dc.identifier.other      byrd_ayanna_n_200805_ms
dc.identifier.uri        http://purl.galileo.usg.edu/uga_etd/byrd_ayanna_n_200805_ms
dc.identifier.uri        http://hdl.handle.net/10724/24569
dc.description.abstract  In linear regression problems, Ordinary Least Squares (OLS) produces an unbiased estimate. OLS has two serious drawbacks: first, if X is less than full rank, the estimator is no longer unique; second, if the design matrix is collinear, the estimates can have extremely large variances. To address these two problems, several penalized regression methods have been developed, such as Ridge, Lasso, and Bridge, all of which improve on OLS in some respects. We study principal component (PC) regression with several different penalties to see whether it offers another way to improve the above-mentioned methods. Using various simulations and a real data setting, we compare and contrast the different regression methods with our PC regression methods. The simulation results show that PC regression combined with two different penalties performs well in different situations.
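The abstract describes principal component regression with a penalty applied to the component scores. A minimal sketch of that idea, here using a ridge-type penalty on the PC scores (the specific penalties, data, and tuning used in the thesis are not given in this record; everything below is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated collinear design: the abstract notes that OLS estimates
# can have extremely large variances when the design matrix is collinear.
n, p = 100, 5
z = rng.normal(size=(n, 1))
X = z + 0.01 * rng.normal(size=(n, p))        # nearly rank-one columns
beta = np.array([1.0, 0.5, 0.0, 0.0, -0.5])   # hypothetical true coefficients
y = X @ beta + rng.normal(scale=0.1, size=n)

# Principal components of the centered design via SVD: Xc = U S Vt.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Ridge-penalized regression on the PC scores. Because the scores are
# orthogonal, the penalized solution is diagonal:
#   gamma_j = s_j (u_j' y) / (s_j^2 + lam)
lam = 1.0
yc = y - y.mean()
gamma = S * (U.T @ yc) / (S**2 + lam)

# Map the score coefficients back to the original predictor scale.
beta_hat = Vt.T @ gamma
print(beta_hat.shape)
```

The penalty shrinks the coefficients of low-variance components (small singular values) most strongly, which is exactly where collinearity inflates the OLS variance; swapping in a Lasso- or Bridge-type penalty on `gamma` would follow the same pattern.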
dc.language              eng
dc.publisher             uga
dc.rights                public
dc.subject               penalized regression
dc.subject               Ridge
dc.subject               Lasso
dc.subject               Bridge
dc.subject               Principal Component
dc.title                 Penalized principal component regression
dc.type                  Thesis
dc.description.degree    MS
dc.description.department  Statistics
dc.description.major     Statistics
dc.description.advisor   Cheolwoo Park
dc.description.committee Cheolwoo Park
dc.description.committee William McCormick
dc.description.committee Jeongyoun Ahn


Files in this item


There are no files associated with this item.
