PyTorch Basics: Logistic Regression


import torch
import torch.nn as nn
import numpy as np

print(torch.__version__)

1.1.0
Logistic regression is a kind of generalized linear regression and has much in common with multiple linear regression. The two models share essentially the same form, wx+b, where w and b are the parameters to be solved for; the difference lies in the dependent variable. Multiple linear regression uses wx+b directly as the dependent variable, i.e. y = wx+b, whereas logistic regression maps wx+b to a hidden state p through a function L, p = L(wx+b), and the value of the dependent variable is then decided by comparing p with 1-p.

When L is the logistic function this is logistic regression; when L is a polynomial function it is polynomial regression.

Logistic regression is mainly used for binary classification. The sigmoid function is the most common logistic function: its output is a probability between 0 and 1, and the prediction is 1 when the probability is greater than 0.5 and 0 when it is smaller.
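To make the decision rule concrete, here is a minimal sketch (my own illustration, not from the original post; the weight, bias, and input values are made up):

import torch

w = torch.tensor([0.8, -0.5])    # example weights (arbitrary values)
b = torch.tensor(0.1)            # example bias (arbitrary value)
x = torch.tensor([1.2, 0.7])     # one input sample

p = torch.sigmoid(x @ w + b)     # hidden state p = L(wx + b), with L = sigmoid
pred = int(p > 0.5)              # predict 1 when p > 0.5, otherwise 0
print(p.item(), pred)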

# Load the data
data = np.loadtxt('./data/german.data-numeric')
n, l = data.shape

# Normalize the data: z-score each feature column (zero mean, unit variance)
for j in range(l - 1):
    meanval = np.mean(data[:, j])
    stdval = np.std(data[:, j])
    data[:, j] = (data[:, j] - meanval) / stdval

# Shuffle the data
np.random.shuffle(data)

# Split into training and test sets
train_data = data[:900, :l - 1]
train_tag = data[:900, l - 1] - 1    # raw labels are 1/2; shift them to 0/1
test_data = data[900:, :l - 1]
test_tag = data[900:, l - 1] - 1
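An aside on the ordering above (my note, not in the original post): the features are standardized before shuffling and splitting, so the test rows influence the mean and standard deviation used for normalization. A stricter, leakage-free variant splits first and reuses the training statistics, sketched here with the same names and assuming data has just been loaded but not yet normalized:

import numpy as np

np.random.shuffle(data)
train, test = data[:900], data[900:]    # views into data
mu = train[:, :l - 1].mean(axis=0)      # statistics from training rows only
sigma = train[:, :l - 1].std(axis=0)
train[:, :l - 1] = (train[:, :l - 1] - mu) / sigma
test[:, :l - 1] = (test[:, :l - 1] - mu) / sigma

On a dataset this small the difference is usually minor, but it keeps the evaluation honest.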

# Define the network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc = nn.Linear(24, 2)    # 24 input features, 2 output classes
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        out = self.fc(x)
        out = self.sigmoid(out)
        return out

def test(pred, lab):
    # pred.max(-1)[1] is the index of the larger output, i.e. the predicted class
    t = pred.max(-1)[1] == lab
    return torch.mean(t.float())      # fraction of correct predictions
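A design note (my aside, not from the original post): nn.CrossEntropyLoss applies log-softmax internally and expects raw scores, so feeding it sigmoid outputs, as the forward above does, is unusual. Training still works because sigmoid is monotonic, but the conventional formulation returns the raw logits; a minimal sketch of that variant (NetLogits is a hypothetical name):

import torch.nn as nn

class NetLogits(nn.Module):
    def __init__(self):
        super(NetLogits, self).__init__()
        self.fc = nn.Linear(24, 2)

    def forward(self, x):
        return self.fc(x)    # raw logits; CrossEntropyLoss applies softmax itself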

net = Net()
criterion = nn.CrossEntropyLoss()               # cross-entropy loss
optimizer = torch.optim.Adam(net.parameters())  # Adam optimizer
epochs = 1000

for epoch in range(epochs):
    net.train()    # switch the model to training mode
    x = torch.from_numpy(train_data).float()
    y = torch.from_numpy(train_tag).long()
    y_pred = net(x)
    loss = criterion(y_pred, y)    # compute the loss
    optimizer.zero_grad()          # zero the accumulated gradients
    loss.backward()                # backpropagate
    optimizer.step()
    if (epoch + 1) % 100 == 0:
        net.eval()    # switch the model to evaluation mode
        test_in = torch.from_numpy(test_data).float()
        test_t = torch.from_numpy(test_tag).long()
        test_out = net(test_in)
        # compute the accuracy with the test function
        accu = test(test_out, test_t)
        print('epoch: {}, loss: {}, accuracy:{}'.format(epoch + 1, loss.item(), accu))
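Once the loop finishes, the trained net can also score individual samples; a small usage sketch reusing net and test_data from above (the index 0 is arbitrary):

import torch

with torch.no_grad():                                # no gradients needed at inference
    sample = torch.from_numpy(test_data[0]).float()  # one test row, 24 features
    out = net(sample)
    print('predicted class:', out.max(-1)[1].item())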

epoch: 100, loss: 0.6658604145050049, accuracy:0.699999988079071

epoch: 200, loss: 0.6306068897247314, accuracy:0.8199999928474426

epoch: 300, loss: 0.6095178723335266, accuracy:0.8100000023841858

epoch: 400, loss: 0.5955496430397034, accuracy:0.8100000023841858

epoch: 500, loss: 0.5853410363197327, accuracy:0.800000011920929

epoch: 600, loss: 0.5774123072624207, accuracy:0.8199999928474426

epoch: 700, loss: 0.5710282921791077, accuracy:0.8199999928474426

epoch: 800, loss: 0.5657661557197571, accuracy:0.8199999928474426

epoch: 900, loss: 0.5613517165184021, accuracy:0.8199999928474426

epoch: 1000, loss: 0.5575944185256958, accuracy:0.8199999928474426
