python3 hand-coded linear regression (4 methods)


import numpy as np

x = np.array([0, 1, 2, 3])

y = np.array([-1, 0.2, 0.9, 2.1])


# Solving the simple linear regression equation (y = b*x + a + e)

def regression_formula(x, y):
    x_mean = np.mean(x)
    y_mean = np.mean(y)
    # SP: sum of cross-products of deviations; SS_x: sum of squared deviations of x
    sp = sum((x[i] - x_mean) * (y[i] - y_mean) for i in range(len(x)))
    ss_x = sum((x[i] - x_mean) ** 2 for i in range(len(x)))
    b = sp / ss_x            # slope
    a = y_mean - b * x_mean  # intercept
    return b, a

b, a = regression_formula(x, y)  # b = 1.0, a = -0.95

Calculation reference: Linear regression and correlation, Part 1: concepts of regression and correlation, and simple linear regression (including the least-squares derivation)
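For reference, the closed-form least-squares estimates implemented by the function above can be written as

$$
b = \frac{SP_{xy}}{SS_x} = \frac{\sum_{i}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i}(x_i - \bar{x})^2},
\qquad
a = \bar{y} - b\,\bar{x}.
$$

For this data set both sums equal 5, so b = 1.0 and a = 0.55 - 1.0 × 1.5 = -0.95, which agrees with the results of the three methods below.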

# Linear regression via the matrix (normal-equation) solution (y = b0 + b1*x + e)

def regression_matrix(x, y):
    # Design matrix with a leading column of ones for the intercept
    X = np.vstack([np.ones(len(x)), x]).T
    # Normal equations: coef = (X^T X)^(-1) X^T y
    coef = np.mat(np.dot(X.T, X)).I * np.mat(np.dot(X.T, y)).T
    b0, b1 = round(float(coef[0, 0]), 2), round(float(coef[1, 0]), 2)
    return b0, b1  # -0.95, 1.0

b0, b1 = regression_matrix(x, y)

Calculation reference: Linear regression and correlation, Part 3: the matrix solution of simple linear regression and derivation of the formulas
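The same normal equations can also be solved with plain ndarrays, since NumPy's documentation no longer recommends the matrix class used above. A minimal sketch (the names X, beta, b0_alt, b1_alt are mine, not from the original post):

X = np.vstack([np.ones(len(x)), x]).T      # design matrix with an intercept column
beta = np.linalg.solve(X.T @ X, X.T @ y)   # solves (X^T X) beta = X^T y
b0_alt, b1_alt = beta                      # approximately -0.95 and 1.0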

# Calling np.linalg.lstsq

A = np.vstack([x, np.ones(len(x))]).T  # columns are [x, 1], so the solution is (slope, intercept)

m, c = np.linalg.lstsq(A, y, rcond=None)[0]  # m = 1.0, c = -0.95
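Beyond the coefficients, np.linalg.lstsq also returns the residual sum of squares, the rank of the design matrix, and its singular values. A short example of unpacking the full return value (added here for illustration; variable names are mine):

solution, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ solution   # fitted values m*x + c
# residuals is a length-1 array here because A has full column rank and more rows than columns
print(residuals)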

# Calling sklearn

from sklearn.linear_model import LinearRegression

# reshape your data either using array.reshape(-1, 1) if your data has a single feature

# or array.reshape(1, -1) if it contains a single sample.

reg = LinearRegression().fit(x.reshape(-1, 1), y)

slope = reg.coef_[0]  # reg.coef_ is an array

intercept = reg.intercept_
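A short usage example, added here for illustration, that checks the fitted model:

print(slope, intercept)                  # approximately 1.0 and -0.95
print(reg.predict(np.array([[4.0]])))    # prediction at x = 4, roughly 3.05
print(reg.score(x.reshape(-1, 1), y))    # R^2 of the fit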

# Plot comparison
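The plotting code itself is not shown above; a minimal sketch of such a comparison, assuming matplotlib is installed and reusing the slope m and intercept c from np.linalg.lstsq:

import matplotlib.pyplot as plt

plt.scatter(x, y, label='data')                     # raw points
plt.plot(x, m * x + c, 'r-', label='fitted line')   # m = 1.0, c = -0.95
plt.legend()
plt.show()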
