Writing Networks with Keras

1. Using Sequential from keras.models
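
A minimal sketch of this approach, assuming a toy two-layer network (the layer sizes and input shape here are illustrative and not taken from the original post):

from keras.models import Sequential
from keras.layers import Dense

# stack layers one after another with Sequential
toy_model = Sequential()
toy_model.add(Dense(64, activation='relu', input_shape=(10,)))  # hypothetical input of 10 features
toy_model.add(Dense(1))
toy_model.compile(optimizer='adam', loss='mse')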

2. Building a model with a class (model subclassing)

Below is an example of an autoencoder:

from keras.layers import Lambda, Input, Dense, LSTM, RepeatVector, TimeDistributed
import keras
from keras import layers
from keras.models import Model, Sequential
from keras.datasets import mnist
from keras.losses import mse, binary_crossentropy
from keras.utils import plot_model
from keras import backend as K
from keras import losses
import numpy as np
import matplotlib.pyplot as plt

window_size = 500
features = 1

# generate five random walks of length 500 as toy data
data = np.random.uniform(-0.1, 0.1, size=(5, 500))
data = data.cumsum(axis=1)
plt.plot(data.T)  # visualize the five series before reshaping
plt.show()

data = data.reshape(5, window_size, features)

# The same model written with Sequential (commented out):
# model = Sequential()
## model.add(LSTM(256, input_shape=(window_size, features), return_sequences=True))
# model.add(LSTM(128, input_shape=(window_size, features), return_sequences=True))
# # model.add(RepeatVector(window_size))  # increase the dimension to 512
## model.add(LSTM(128, input_shape=(window_size, features), return_sequences=True))
# model.add(LSTM(256, input_shape=(window_size, features), return_sequences=True))
# model.add(TimeDistributed(Dense(1)))  # reduce the dimension to 1

class AE_LSTM(keras.Model):
    def __init__(self, window_size, features):
        super(AE_LSTM, self).__init__(name='ae_lstm')
        self.lstm_1 = layers.LSTM(256, input_shape=(window_size, features), return_sequences=True)
        self.lstm_2 = layers.LSTM(128, input_shape=(window_size, features), return_sequences=True)
        self.lstm_3 = layers.LSTM(128, input_shape=(window_size, features), return_sequences=True)
        self.lstm_4 = layers.LSTM(256, input_shape=(window_size, features), return_sequences=True)
        self.distributed = layers.TimeDistributed(Dense(1))

    def call(self, inputs):
        x = self.lstm_1(inputs)
        x = self.lstm_2(x)
        x = self.lstm_3(x)
        x = self.lstm_4(x)
        x = self.distributed(x)
        return x

model = AE_LSTM(window_size, features)
model.compile(optimizer='adam', loss='mse')
model.fit(data, data, epochs=100, verbose=1)

# axis=0 adds a new dimension at position 0 (the batch dimension); verbose=0 suppresses logging to stdout
yhat = model.predict(np.expand_dims(data[1, :, :], axis=0), verbose=0)
yhat = yhat.reshape(-1)
plt.plot(yhat, c='b')

d = np.expand_dims(data[1, :, :], axis=0)
d = d.reshape(-1)
plt.plot(d, c='r')
plt.show()
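
To quantify how well the autoencoder reconstructs its input, you can compare the predictions with the original series. The following is a minimal sketch added for illustration; the per-sample MSE computation is not part of the original post:

# per-sample reconstruction error (mean squared error over time steps and features)
recon = model.predict(data, verbose=0)
mse_per_sample = np.mean((recon - data) ** 2, axis=(1, 2))
print(mse_per_sample)  # one error value per input series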
