Analysis of a Python-Based BP Neural Network and Its XOR Implementation


The BP (back-propagation) neural network is the simplest neural network model; with just three layers (input, hidden, output) it can already fit a nonlinear function such as XOR.
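Before the full listing, here is a minimal sketch (added here as an illustration, not part of the original code) of what the three-layer forward computation looks like: each layer applies a sigmoid to `input · W - b`, and it is the composition of the two sigmoid layers that gives the network its nonlinearity. The names sigmoid, W1, b1, W2, b2 are placeholders chosen for this sketch; the class below learns the corresponding parameters by back-propagation.

import numpy

def sigmoid(z):
    return 1.0 / (1.0 + numpy.exp(-z))

# illustrative (untrained) parameters for a 2-2-1 network, drawn from [-1, 1) as in the listing below
W1, b1 = numpy.random.rand(2, 2) * 2 - 1.0, numpy.random.rand(2) * 2 - 1.0  # input  -> hidden
W2, b2 = numpy.random.rand(2, 1) * 2 - 1.0, numpy.random.rand(1) * 2 - 1.0  # hidden -> output

x = numpy.array([1, 0])                        # one XOR input
hidden = sigmoid(numpy.dot(x, W1) - b1)        # hidden-layer activations
output = sigmoid(numpy.dot(hidden, W2) - b2)   # network output, a single value in (0, 1)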

The difficult part is the back-propagation step: deriving how the output error is turned into weight and bias updates, layer by layer.
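Concretely, the training loop below relies on the fact that the derivative of the sigmoid can be written from its output, sigmoid'(z) = y(1 - y). For one sample it computes an output-layer gradient g = (target - prediction) * prediction * (1 - prediction), pushes it back to a hidden-layer error e, and nudges weights and biases by a fixed step length of 0.3. The following single-sample update is a sketch of that same arithmetic with the per-element loops of the listing rewritten as outer products; all names here are illustrative, not the author's.

import numpy

step = 0.3
# illustrative 2-2-1 parameters (the listing below stores them as self.__weight / self.__bias)
weight = [numpy.random.rand(2, 2) * 2 - 1.0, numpy.random.rand(2, 1) * 2 - 1.0]
bias   = [numpy.random.rand(2) * 2 - 1.0,    numpy.random.rand(1) * 2 - 1.0]

x, y_true = numpy.array([1, 0]), 1                       # one XOR training pair
hidden = 1.0 / (1.0 + numpy.exp(-(numpy.dot(x, weight[0]) - bias[0])))
out    = 1.0 / (1.0 + numpy.exp(-(numpy.dot(hidden, weight[1]) - bias[1])))

g = (y_true - out) * out * (1.0 - out)                   # output-layer gradient
e = numpy.dot(weight[1], g) * hidden * (1.0 - hidden)    # error pushed back to the hidden layer
weight[1] += step * numpy.outer(hidden, g)               # hidden -> output weight update
bias[1]   -= step * g                                    # minus sign: the forward pass uses dot(...) - bias
weight[0] += step * numpy.outer(x, e)                    # input -> hidden weight update
bias[0]   -= step * e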

'''
neural networks
created on 2019.9.24
author: vince
'''
import math
import logging
import numpy
import random
import matplotlib.pyplot as plt

'''
neural network
'''
class NeuralNetwork:

    def __init__(self, layer_nums, iter_num = 10000, batch_size = 1):
        # indices of the input, hidden and output layers, and the total layer count
        self.__ili = 0;
        self.__hli = 1;
        self.__oli = 2;
        self.__tln = 3;

        if len(layer_nums) != self.__tln:
            raise Exception("layer_nums length must be 3");

        self.__layer_nums = layer_nums;  # array [layer0_num, layer1_num ... layerN_num]
        self.__iter_num = iter_num;
        self.__batch_size = batch_size;

    def train(self, x, y):
        x = numpy.array(x);
        y = numpy.array(y);

        self.l = [];  # mean loss recorded after every iteration

        # initialize parameters: weights and biases drawn uniformly from [-1, 1), fixed step length 0.3
        self.__weight = [];
        self.__bias = [];
        self.__step_len = [];
        for layer_index in range(1, self.__tln):
            self.__weight.append(numpy.random.rand(self.__layer_nums[layer_index - 1], self.__layer_nums[layer_index]) * 2 - 1.0);
            self.__bias.append(numpy.random.rand(self.__layer_nums[layer_index]) * 2 - 1.0);
            self.__step_len.append(0.3);

        logging.info("bias:%s" % (self.__bias));
        logging.info("weight:%s" % (self.__weight));

        for iter_index in range(self.__iter_num):
            # stochastic training: pick one random sample per iteration
            sample_index = random.randint(0, len(x) - 1);
            logging.debug("-----round:%s, select sample %s-----" % (iter_index, sample_index));

            output = self.forward_pass(x[sample_index]);

            # output-layer gradient: (target - prediction) * sigmoid'(prediction)
            g = (-output[2] + y[sample_index]) * self.activation_drive(output[2]);
            logging.debug("g:%s" % (g));

            # update hidden -> output weights and the output bias
            for j in range(len(output[1])):
                self.__weight[1][j] += self.__step_len[1] * g * output[1][j];
            self.__bias[1] -= self.__step_len[1] * g;

            # propagate the error back to the hidden layer
            e = [];
            for i in range(self.__layer_nums[self.__hli]):
                e.append(numpy.dot(g, self.__weight[1][i]) * self.activation_drive(output[1][i]));
            e = numpy.array(e);
            logging.debug("e:%s" % (e));

            # update input -> hidden weights and the hidden bias
            for j in range(len(output[0])):
                self.__weight[0][j] += self.__step_len[0] * e * output[0][j];
            self.__bias[0] -= self.__step_len[0] * e;

            # record the mean squared error over the whole training set
            l = 0;
            for i in range(len(x)):
                predictions = self.forward_pass(x[i])[2];
                l += 0.5 * numpy.sum((predictions - y[i]) ** 2);
            l /= len(x);
            self.l.append(l);

            logging.debug("bias:%s" % (self.__bias));
            logging.debug("weight:%s" % (self.__weight));
            logging.debug("loss:%s" % (l));

        logging.info("bias:%s" % (self.__bias));
        logging.info("weight:%s" % (self.__weight));
        logging.info("l:%s" % (self.l));

    def activation(self, z):
        # sigmoid activation
        return (1.0 / (1.0 + numpy.exp(-z)));

    def activation_drive(self, y):
        # derivative of the sigmoid, expressed through its output y = sigmoid(z)
        return y * (1.0 - y);

    def forward_pass(self, x):
        data = numpy.copy(x);
        result = [];
        result.append(data);
        for layer_index in range(self.__tln - 1):
            data = self.activation(numpy.dot(data, self.__weight[layer_index]) - self.__bias[layer_index]);
            result.append(data);
        # layers have different sizes, so return a plain (ragged) list of per-layer activations
        return result;

    def predict(self, x):
        return self.forward_pass(x)[self.__oli];

def main():
    logging.basicConfig(level = logging.INFO,
            format = '%(asctime)s %(filename)s[line:%(lineno)d] %(levelname)s %(message)s',
            datefmt = '%a, %d %b %Y %H:%M:%S');

    logging.info("training begin.");
    nn = NeuralNetwork([2, 2, 1]);

    # XOR training set: inputs and their targets
    X = numpy.array([[0, 0], [1, 0], [1, 1], [0, 1]]);
    Y = numpy.array([0, 1, 0, 1]);
    nn.train(X, Y);

    logging.info("training end. predict begin.");
    for x in X:
        print(x, nn.predict(x));

    # plot the loss recorded during training
    plt.plot(nn.l)
    plt.show();

if __name__ == "__main__":
    main();
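The predictions printed by main() are raw sigmoid activations in (0, 1) rather than hard 0/1 labels. As a small addition (not part of the original script), one way to read them off as XOR labels is to threshold the single output unit at 0.5, assuming nn is the trained network from main():

for x in numpy.array([[0, 0], [1, 0], [1, 1], [0, 1]]):
    label = int(nn.predict(x)[0] > 0.5)   # threshold the sigmoid output at 0.5
    print(x, "->", label)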

Convergence results (loss curve):
