Python dataset visualization: plotting two features of a dataset in 2-D, and reducing the dataset with principal component analysis (PCA) for 3-D visualization (to better understand how the dimensions interact)
# Reducing the dataset with PCA for 3-D visualization (to better understand how the dimensions interact)
print(__doc__)


# Code source: Gaël Varoquaux
# Modified for documentation by Jaques Grobler
# License: BSD 3 clause

import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.decomposition import PCA

# Load the iris dataset
iris = datasets.load_iris()
X = iris.data[:, :2]  # take only the first two features (sepal length, sepal width)
y = iris.target

x_min, x_max = X[:, 0].min() - .5, X[:, 0].max() + .5
y_min, y_max = X[:, 1].min() - .5, X[:, 1].max() + .5

plt.figure(2, figsize=(8, 6))
plt.clf()

# Plot the training points in 2-D using the two raw features
plt.scatter(X[:, 0], X[:, 1], c=y, cmap=plt.cm.Set1,
            edgecolor='k')
plt.xlabel('Sepal length')
plt.ylabel('Sepal width')

plt.xlim(x_min, x_max)
plt.ylim(y_min, y_max)
plt.xticks(())
plt.yticks(())

# To get a better understanding of the interaction of the dimensions,
# plot the first three PCA dimensions
fig = plt.figure(1, figsize=(8, 6))
ax = fig.add_subplot(111, projection='3d')  # Axes3D(fig, ...) is deprecated
ax.view_init(elev=-150, azim=110)
X_reduced = PCA(n_components=3).fit_transform(iris.data)
ax.scatter(X_reduced[:, 0], X_reduced[:, 1], X_reduced[:, 2], c=y,
           cmap=plt.cm.Set1, edgecolor='k', s=40)
ax.set_title("First three PCA directions")
ax.set_xlabel("1st eigenvector")
ax.set_ylabel("2nd eigenvector")
ax.set_zlabel("3rd eigenvector")
# Hide tick labels on all three axes (w_xaxis etc. are deprecated)
ax.set_xticklabels([])
ax.set_yticklabels([])
ax.set_zticklabels([])

plt.show()
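To check how faithful the 3-D projection is, one can inspect `explained_variance_ratio_`, which reports the fraction of the dataset's total variance captured by each principal component. A minimal sketch (the components for iris are the same ones plotted above):

```python
from sklearn import datasets
from sklearn.decomposition import PCA

iris = datasets.load_iris()
pca = PCA(n_components=3)
pca.fit(iris.data)

# Fraction of total variance carried by each of the three components
print(pca.explained_variance_ratio_)
# and the total retained by the 3-D view
print(pca.explained_variance_ratio_.sum())
```

For iris, the first component alone dominates, and the three components together retain nearly all of the variance, which is why the 3-D scatter separates the classes so cleanly.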
Reference: sklearn
Reference: The Iris Dataset