Sufficient dimension reduction and sufficient variable selection
The development of theory and methodology for sufficient dimension reduction has provided a powerful tool for tackling the challenging problem of high-dimensional data analysis, and it has been widely applied in many scientific fields in recent years. However, inverse approaches raise two major concerns: they rest on probabilistic assumptions on the predictors (such as the linearity condition), and they may fail to be exhaustive. Forward methods, such as minimum average variance estimation (MAVE), require no such probabilistic assumption and can exhaust the central mean subspace (CMS) even for dependent data, but their computational burden grows rapidly with the sample size. This dissertation proposes three projects. The first further develops sparse MAVE so that it not only exhausts the CMS but also achieves sparsity; in addition, we introduce new and more efficient criteria for estimating the structural dimension and selecting the shrinkage tuning parameters. The second extends sparse MAVE to three scenarios: models with a categorical variable W, multivariate responses, and generalized partial linear single-index models (GPLSIM). The third develops local dimension reduction methods with variable selection based on the k-nearest-neighbor idea, applied here primarily to sliced inverse regression (SIR), although many other methods can be adapted. The new method not only removes the linearity condition but also exhausts the entire central subspace (CS).
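Since SIR is the running example for the inverse family discussed above, a minimal sketch of classical sliced inverse regression may help fix ideas. This illustrates only the standard textbook algorithm (standardize, slice on the response, eigendecompose the covariance of slice means), not the dissertation's sparse or k-nearest-neighbor extensions; the function name, slicing scheme, and demo data are our own choices.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Classical sliced inverse regression: estimate central-subspace
    directions from the inverse regression curve E[X | y]."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice on the order statistics of y; average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    beta = inv_sqrt @ v[:, ::-1][:, :n_directions]
    return beta / np.linalg.norm(beta, axis=0)

# Demo: y depends on X only through the first coordinate,
# so SIR should recover (approximately) the direction e_1.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + 0.1 * rng.normal(size=500)
beta_hat = sir_directions(X, y)
```

The linearity condition enters through the standardization step: only when E[Z | beta'Z] is linear in beta'Z do the slice means of Z lie in the central subspace, which is precisely the assumption the dissertation's local k-nearest-neighbor approach seeks to remove.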