Reference: http://scikit-learn.org/stable/modules/unsupervised_reduction.html
For high-dimensional features, unsupervised dimensionality reduction is often needed before the supervised step.
Translations of the three sections below will be appended later.
4.4.1. PCA: principal component analysis
decomposition.PCA looks for a combination of features that captures well the variance of the original features. See Decomposing signals in components (matrix factorization problems). A Chinese translation of that section is available at http://blog.csdn.net/mmc2015/article/details/46867597.
Examples
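A minimal sketch of unsupervised reduction with decomposition.PCA; the synthetic data and the choice of n_components=10 here are illustrative assumptions, not values from the documentation.
```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
X = rng.rand(100, 50)          # 100 samples, 50 features (synthetic data)

pca = PCA(n_components=10)     # keep the 10 directions of largest variance
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                       # (100, 10)
print(pca.explained_variance_ratio_.sum())   # fraction of variance retained
```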
4.4.2. Random projections
The random_projection module provides several tools for data reduction by random projections. See the relevant section of the documentation: Random Projection.
Examples
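A minimal sketch using SparseRandomProjection from the random_projection module (GaussianRandomProjection works the same way); the data shape is an assumption for illustration.
```python
import numpy as np
from sklearn.random_projection import SparseRandomProjection

rng = np.random.RandomState(0)
X = rng.rand(100, 10000)       # high-dimensional synthetic data

# n_components='auto' picks the target dimension from the
# Johnson-Lindenstrauss lemma given the number of samples
transformer = SparseRandomProjection(random_state=0)
X_new = transformer.fit_transform(X)

print(X_new.shape)             # (100, n_components), far fewer than 10000
```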
4.4.3. Feature agglomeration
cluster.FeatureAgglomeration applies Hierarchical
clustering to group together features that behave similarly.
Examples
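A minimal sketch of cluster.FeatureAgglomeration on the digits dataset; n_clusters=32 is an illustrative choice, not a recommended value.
```python
from sklearn import datasets, cluster

digits = datasets.load_digits()
X = digits.data                          # shape (1797, 64): 8x8 pixel features

agglo = cluster.FeatureAgglomeration(n_clusters=32)
X_reduced = agglo.fit_transform(X)       # merge similarly behaving pixels into 32 grouped features

print(X_reduced.shape)                   # (1797, 32)
```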
Feature scaling
Note that if features have very different scaling or statistical properties, cluster.FeatureAgglomeration may
not be able to capture the links between related features. Using a preprocessing.StandardScaler can
be useful in these settings.
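A minimal sketch of standardizing before agglomeration, assuming synthetic data whose feature scales differ by several orders of magnitude.
```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import FeatureAgglomeration

rng = np.random.RandomState(0)
X = rng.rand(100, 20) * np.logspace(0, 4, 20)   # feature scales from 1 to 10**4

X_scaled = StandardScaler().fit_transform(X)    # remove scale differences first
X_reduced = FeatureAgglomeration(n_clusters=5).fit_transform(X_scaled)

print(X_reduced.shape)     # (100, 5)
```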
Pipelining: The unsupervised data reduction and the supervised estimator can be chained in one step. See Pipeline: chaining estimators.
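A minimal sketch of chaining an unsupervised reduction step and a supervised estimator in a single Pipeline; PCA plus LogisticRegression on the digits data is an illustrative combination, not the only possible one.
```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("reduce_dim", PCA(n_components=30)),        # unsupervised data reduction
    ("clf", LogisticRegression(max_iter=1000)),  # supervised estimator
])
pipe.fit(X_train, y_train)
print(pipe.score(X_test, y_test))                # accuracy on held-out data
```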