Implementing a Bidirectional LSTM in Keras on Randomly Generated Data


For example, take the following 10 random numbers:

0.63144003 0.29414551 0.91587952 0.95189228 0.32195638 0.60742236 0.83895793 0.18023048 0.84762691 0.29165514

Each position is labeled 1 once the cumulative sum exceeds a preset threshold, and 0 otherwise. With a threshold of 2.5, the labels for the input above are:

0 0 0 1 1 1 1 1 1 1
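As a quick check of the labeling rule, here is a minimal numpy sketch (the numbers and the 2.5 threshold are taken from the example above):

from numpy import array, cumsum

x = array([0.63144003, 0.29414551, 0.91587952, 0.95189228, 0.32195638,
           0.60742236, 0.83895793, 0.18023048, 0.84762691, 0.29165514])
# label each timestep by whether the running sum has crossed the threshold
y = (cumsum(x) > 2.5).astype(int)
print(y)  # [0 0 0 1 1 1 1 1 1 1]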

The only difference from a unidirectional LSTM is the Bidirectional wrapper:

model.add(Bidirectional(LSTM(20, return_sequences=True), input_shape=(n_timesteps, 1)))
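By default the wrapper runs the sequence forwards and backwards and concatenates the two 20-unit outputs (merge_mode='concat'), so each timestep is represented by 40 features; 'sum', 'mul', and 'ave' merge modes are also available. A quick shape check:

from keras.models import Sequential
from keras.layers import LSTM, Bidirectional

model = Sequential()
model.add(Bidirectional(LSTM(20, return_sequences=True), input_shape=(10, 1)))
model.summary()  # per-timestep output dimension is 40 (2 x 20)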

The complete example:

from random import random
from numpy import array
from numpy import cumsum
from keras.models import Sequential
from keras.layers import LSTM
from keras.layers import Dense
from keras.layers import TimeDistributed
from keras.layers import Bidirectional

# create a sequence classification instance
def get_sequence(n_timesteps):
    # create a sequence of random numbers in [0,1]
    x = array([random() for _ in range(n_timesteps)])
    # calculate cut-off value to change class values
    limit = n_timesteps / 4.0
    # determine the class outcome for each item in the cumulative sequence
    y = array([0 if s < limit else 1 for s in cumsum(x)])
    # reshape input and output data to be suitable for LSTMs
    x = x.reshape(1, n_timesteps, 1)
    y = y.reshape(1, n_timesteps, 1)
    return x, y

# define problem properties
n_timesteps = 10

# define LSTM
model = Sequential()
model.add(Bidirectional(LSTM(20, return_sequences=True), input_shape=(n_timesteps, 1)))
model.add(TimeDistributed(Dense(1, activation='sigmoid')))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['acc'])

# train LSTM
for epoch in range(1000):
    # generate a new random sequence
    x, y = get_sequence(n_timesteps)
    # fit model for one epoch on this sequence
    model.fit(x, y, epochs=1, batch_size=1, verbose=2)

# evaluate LSTM
x, y = get_sequence(n_timesteps)
# threshold the sigmoid outputs (predict_classes was removed in recent Keras versions)
yhat = (model.predict(x, verbose=0) > 0.5).astype(int)
for i in range(n_timesteps):
    print('expected:', y[0, i], 'predicted:', yhat[0, i])
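Because every sequence is sampled fresh, a single spot check can be noisy. A minimal sketch for averaging per-timestep accuracy over many new sequences, assuming the model and get_sequence defined above:

from numpy import mean

accs = []
for _ in range(100):
    x, y = get_sequence(n_timesteps)
    yhat = (model.predict(x, verbose=0) > 0.5).astype(int)
    # fraction of the 10 timesteps labeled correctly for this sequence
    accs.append(mean(yhat == y))
print('mean per-timestep accuracy: %.3f' % mean(accs))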
