Preface: This article implements gradient descent for a multivariate linear regression model in Python. Using a bivariate linear regression model as the example, it walks through the gradient descent code and then visualizes the trained model as a 3D surface.
1. Gradient descent code for a bivariate linear regression model
For this dataset, the learning rate is set to 0.0001, all parameters are initialized to 0, and the maximum number of gradient-descent iterations is 1000.
import numpy as np
from numpy import genfromtxt
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the 3D projection on older matplotlib

# Load the data
data = genfromtxt(r'Delivery.csv', delimiter=',')

# Split into features and target
x_data = data[:, 0:-1]  # all rows, every column except the last
y_data = data[:, -1]    # all rows, the last column

# Learning rate
lr = 0.0001

# Bivariate linear model: y_data = theta0 + theta1 * x_data[0] + theta2 * x_data[1]
theta0_init = 0
theta1_init = 0
theta2_init = 0

# Maximum number of iterations
ite = 1000

# Cost function: half the mean squared error
def compute_error(theta0, theta1, theta2, x_data, y_data):
    total_error = 0
    for i in range(len(x_data)):
        total_error += ((theta0 + theta1 * x_data[i, 0] + theta2 * x_data[i, 1]) - y_data[i]) ** 2
    return total_error / float(len(x_data)) / 2.0

# Gradient descent
def gradient_descent_runner(theta0, theta1, theta2, lr, ite, x_data, y_data):
    m = float(len(x_data))  # number of samples
    for i in range(ite):
        theta0_grad = 0
        theta1_grad = 0
        theta2_grad = 0
        for j in range(len(x_data)):
            residual = (theta0 + theta1 * x_data[j, 0] + theta2 * x_data[j, 1]) - y_data[j]
            theta0_grad += (1 / m) * residual
            theta1_grad += (1 / m) * residual * x_data[j, 0]
            theta2_grad += (1 / m) * residual * x_data[j, 1]
        # Update all parameters simultaneously
        theta0 -= lr * theta0_grad
        theta1 -= lr * theta1_grad
        theta2 -= lr * theta2_grad
    return theta0, theta1, theta2

if __name__ == '__main__':
    print('Starting theta0 = {0}, theta1 = {1}, theta2 = {2}, error = {3}'.format(
        theta0_init, theta1_init, theta2_init,
        compute_error(theta0_init, theta1_init, theta2_init, x_data, y_data)))
    print('Gradient Descent Running...')
    theta0_end, theta1_end, theta2_end = gradient_descent_runner(
        theta0_init, theta1_init, theta2_init, lr, ite, x_data, y_data)
    print('After {0} iterations theta0 = {1}, theta1 = {2}, theta2 = {3}, error = {4}'.format(
        ite, theta0_end, theta1_end, theta2_end,
        compute_error(theta0_end, theta1_end, theta2_end, x_data, y_data)))

    # Plot the sample points and the fitted plane in 3D
    fig = plt.figure()
    # Note: Axes3D(fig) no longer attaches the axes in recent matplotlib,
    # so add_subplot(projection='3d') is used instead
    ax = fig.add_subplot(projection='3d')
    x0 = x_data[:, 0]
    x1 = x_data[:, 1]
    ax.scatter(x0, x1, y_data, c='r', marker='o', s=100)
    # Build a grid and evaluate the fitted plane on it
    x0, x1 = np.meshgrid(x0, x1)
    z = theta0_end + theta1_end * x0 + theta2_end * x1
    # Draw the fitted surface
    ax.plot_surface(x0, x1, z)
    # Axis labels
    ax.set_xlabel('Miles')
    ax.set_ylabel('Num of deliveries')
    ax.set_zlabel('Time')
    # Show the figure
    plt.show()
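The per-sample loops above can also be written in vectorized NumPy form, which is both shorter and faster. The sketch below uses synthetic data in place of Delivery.csv (whose contents are not reproduced here), and the learning rate and iteration count are tuned for that synthetic data, not the article's dataset.

```python
import numpy as np

def gradient_descent_vectorized(x_data, y_data, lr, ite):
    m = len(x_data)
    # Design matrix with a leading column of ones so theta[0] acts as the intercept
    X = np.hstack([np.ones((m, 1)), x_data])
    theta = np.zeros(X.shape[1])
    for _ in range(ite):
        residual = X @ theta - y_data      # predictions minus targets
        grad = (X.T @ residual) / m        # gradient of the (1/2m) squared-error cost
        theta -= lr * grad
    return theta

# Tiny synthetic check: y = 1 + 2*x0 + 3*x1 with no noise
rng = np.random.default_rng(0)
x = rng.random((50, 2))
y = 1 + 2 * x[:, 0] + 3 * x[:, 1]
theta = gradient_descent_vectorized(x, y, lr=0.1, ite=5000)
print(theta)  # approaches [1, 2, 3]
```

Because the gradient is computed from the full residual vector before any parameter changes, the update is simultaneous by construction, matching the three-variable version above.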
2. Training results
Starting theta0 = 0, theta1 = 0, theta2 = 0, error = 23.639999999999997
Gradient Descent Running...
After 1000 iterations theta0 = 0.006971416196678633, theta1 = 0.08021042690771771, theta2 = 0.07611036240566814, error = 0.3865635716109059
As the output shows, training reduces the model error substantially (from about 23.64 down to about 0.39).
The trained model therefore fits the sample data well.
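One way to sanity-check a gradient-descent fit is to compare it against the closed-form least-squares solution, which gradient descent should converge toward. The sketch below does this on synthetic data (Delivery.csv itself is not reproduced here, and the coefficient values are illustrative, not the article's results).

```python
import numpy as np

# Synthetic data loosely shaped like the delivery problem:
# two features plus a small amount of noise
rng = np.random.default_rng(1)
x_data = rng.random((30, 2)) * 10
y_data = 0.5 + 0.08 * x_data[:, 0] + 0.07 * x_data[:, 1] + rng.normal(0, 0.05, 30)

# Design matrix with an intercept column
X = np.hstack([np.ones((len(x_data), 1)), x_data])

# Closed-form least-squares solution: the target gradient descent converges toward
theta_ols, *_ = np.linalg.lstsq(X, y_data, rcond=None)
print(theta_ols)
```

If gradient descent is run long enough with a suitable learning rate, its parameters should agree with `theta_ols` to within the chosen tolerance; a large gap usually means the learning rate is too small or the iteration budget too low.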
3. Data download
Link: https://pan.baidu.com/s/1RW3IyWwZQSDwik-09fT_OQ
Extraction code: 9nxf