
Andrew Ng Deep Learning 1.3 Example: Classifying Data with a Single-Hidden-Layer Neural Network



# Planar data classification with one hidden layer

Classify planar data with a neural network that has a single hidden layer.

Welcome to your week 3 programming assignment. It’s time to build your first neural network, which will have a hidden layer. You will see a big difference between this model and the one you implemented using logistic regression.

**You will learn how to:**

- (1) Implement a 2-class classification neural network with a single hidden layer

- (2) Use units with a non-linear activation function, such as tanh

- (3) Compute the cross entropy loss (the formula is given after this list)

- (4) Implement forward and backward propagation
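
For reference, the cross entropy loss computed in this assignment, where $a^{[2](i)}$ is the network's output for example $i$ and $m$ is the number of training examples, is

$$J = -\frac{1}{m}\sum_{i=1}^{m}\left( y^{(i)}\log\left(a^{[2](i)}\right) + \left(1-y^{(i)}\right)\log\left(1-a^{[2](i)}\right) \right)$$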

## 1 - Packages

1. Import the required packages

Let’s first import all the packages that you will need during this assignment.

- numpy is the fundamental package for scientific computing with Python.

- sklearn provides simple and efficient tools for data mining and data analysis.

- matplotlib is a library for plotting graphs in Python.

- testCases provides some test examples to assess the correctness of your functions.

- planar_utils provides various useful functions used in this assignment, such as sigmoid and plot_decision_boundary (a sketch of sigmoid is given after this list).
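
As a point of reference, here is a minimal sketch of what the sigmoid helper in planar_utils typically looks like; the version actually shipped with the assignment may differ in details:

```python
import numpy as np

def sigmoid(z):
    """Compute the sigmoid of z element-wise.

    Arguments:
    z -- a scalar or a numpy array of any size
    Returns:
    s -- sigmoid(z), with the same shape as z
    """
    s = 1 / (1 + np.exp(-z))
    return s
```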

```python
# Package imports
import numpy as np
import matplotlib.pyplot as plt
from testCases import *
import sklearn
import sklearn.datasets
import sklearn.linear_model
from planar_utils import plot_decision_boundary, sigmoid, load_planar_dataset, load_extra_datasets

np.random.seed(1)  # set a seed so that the results are consistent
```

## 2 - Dataset

2.1 Prepare the dataset

First, let’s get the dataset you will work on. The following code will load a “flower” 2-class dataset into the variables X and Y.

```python
X, Y = load_planar_dataset()
```
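
load_planar_dataset is provided by planar_utils. Purely for intuition, here is a hedged sketch of how a two-class “flower” dataset of this shape can be generated; the function name make_flower_dataset and its parameters are assumptions for illustration, and the real helper may differ:

```python
import numpy as np

def make_flower_dataset(m=400, petals=4, a=4, seed=1):
    """Generate a 2-class 'flower'-shaped dataset (illustrative sketch).

    Returns:
    X -- features of shape (2, m)
    Y -- labels of shape (1, m), values in {0, 1}
    """
    rng = np.random.RandomState(seed)
    n = m // 2                      # points per class
    X = np.zeros((m, 2))
    Y = np.zeros((m, 1), dtype=int)

    for j in range(2):              # one pass per class
        ix = range(n * j, n * (j + 1))
        t = np.linspace(j * 3.12, (j + 1) * 3.12, n) + rng.randn(n) * 0.2  # angle
        r = a * np.sin(petals * t) + rng.randn(n) * 0.2                    # radius
        X[ix] = np.c_[r * np.sin(t), r * np.cos(t)]
        Y[ix] = j

    return X.T, Y.T                 # shapes (2, m) and (1, m)
```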

Visualize the dataset using matplotlib. The data looks like a “flower” with some red (label y=0) and some blue (y=1) points. Your goal is to build a model to fit this data.

2.2 Visualize the data

Visualize the data:

```python
plt.scatter(X[0, :], X[1, :], c=np.squeeze(Y), s=40, cmap=plt.cm.Spectral)
plt.show()
```

You have:

- a numpy-array (matrix) X that contains your features (x1, x2)

- a numpy-array (vector) Y that contains your labels (red:0, blue:1).

Let’s first get a better sense of what our data is like.

How many training examples do you have? In addition, what is the shape of the variables X and Y?

How do you get the shape of a numpy array?

2.3 Print the dimensions of the data

```python
shape_X = X.shape  # X.shape is (2, 400)
shape_Y = Y.shape  # Y.shape is (1, 400)

m = shape_X[1]  # training set size

print('The shape of X is: ' + str(shape_X))  # (2, 400)
print('The shape of Y is: ' + str(shape_Y))  # (1, 400)
print('I have m = %d training examples!' % (m))  # m = 400
```

## 3 - Simple Logistic Regression

3. Classify the data with a simple classifier (logistic regression)

Before building a full neural network, let’s first see how logistic regression performs on this problem. You can use sklearn’s built-in functions to do that. Run the code below to train a logistic regression classifier on the dataset.

```python
# Train the logistic regression classifier
clf = sklearn.linear_model.LogisticRegressionCV()
clf.fit(X.T, Y.T.ravel())  # ravel() gives Y the 1-D shape that sklearn expects
```

You can now plot the decision boundary of these models. Run the code below.

```python
# Plot the decision boundary for logistic regression
plot_decision_boundary(lambda x: clf.predict(x), X, Y)
plt.title("Logistic Regression")
plt.show()
```
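
plot_decision_boundary is supplied by planar_utils. For intuition, here is a minimal sketch of how such a helper is typically written, assuming it evaluates the model on a grid and draws filled contours; the actual version in the assignment may differ in details such as the margin and grid step:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_decision_boundary(model, X, y):
    """Draw the decision regions of `model` over the 2-D points in X.

    Arguments:
    model -- a callable mapping an (n_points, 2) array to predicted labels
    X -- data of shape (2, m)
    y -- labels of shape (1, m)
    """
    # Build a grid that covers the data with a small margin
    x_min, x_max = X[0, :].min() - 1, X[0, :].max() + 1
    y_min, y_max = X[1, :].min() - 1, X[1, :].max() + 1
    xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.01),
                         np.arange(y_min, y_max, 0.01))

    # Predict a label for every grid point and reshape back to the grid
    Z = model(np.c_[xx.ravel(), yy.ravel()])
    Z = Z.reshape(xx.shape)

    # Filled contours for the regions, then the training points on top
    plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral)
    plt.scatter(X[0, :], X[1, :], c=np.squeeze(y), s=40, cmap=plt.cm.Spectral)
```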

```python
LR_predictions = clf.predict(X.T)  # predictions for all 400 examples
print('Accuracy of logistic regression: %d ' % float(
    (np.dot(Y, LR_predictions) + np.dot(1 - Y, 1 - LR_predictions)) / float(Y.size) * 100) +
      '% ' + "(percentage of correctly labelled datapoints)")
```

Here np.dot(Y, LR_predictions) counts the examples correctly predicted as 1, and np.dot(1 - Y, 1 - LR_predictions) counts the examples correctly predicted as 0; their sum divided by Y.size gives the fraction of correct predictions. A tiny worked example is shown below.
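
A tiny worked example of this accuracy trick (the toy labels and predictions here are made up purely for illustration):

```python
import numpy as np

# Toy labels and predictions: 5 examples, 4 of them predicted correctly
Y_toy = np.array([[1, 0, 1, 1, 0]])       # shape (1, 5), like Y
pred_toy = np.array([1, 0, 0, 1, 0])      # shape (5,), like LR_predictions

correct_ones = np.dot(Y_toy, pred_toy)[0]           # correctly predicted as 1 -> 2
correct_zeros = np.dot(1 - Y_toy, 1 - pred_toy)[0]  # correctly predicted as 0 -> 2
accuracy = (correct_ones + correct_zeros) / Y_toy.size * 100
print(accuracy)  # 80.0
```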

Interpretation: The dataset is not linearly separable, so logistic regression doesn’t perform well. Hopefully a neural network will do better. Let’s try this now!

## 4 - Neural Network model

4. Use a neural network model to handle the data

Logistic regression did not work well on the “flower dataset”. You are going to train a Neural Network with a single hidden layer.

Reminder: The general methodology to build a Neural Network is to:

(1.) Define the neural network structure (# of input units, # of hidden units, etc.). A minimal sketch of this step is given below.
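
A minimal sketch of step (1), assuming the helper follows the layer_sizes(X, Y) convention used in this course; the hidden layer size of 4 used as a default here is an assumption, and the assignment may set it separately:

```python
def layer_sizes(X, Y, n_h=4):
    """Determine the sizes of the layers from the data shapes.

    Arguments:
    X -- input dataset of shape (input size, number of examples)
    Y -- labels of shape (output size, number of examples)
    n_h -- size of the hidden layer (assumed default for this sketch)

    Returns:
    (n_x, n_h, n_y) -- sizes of the input, hidden and output layers
    """
    n_x = X.shape[0]   # number of input features
    n_y = Y.shape[0]   # number of outputs (1 for binary classification)
    return (n_x, n_h, n_y)
```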
