Keras Custom Layers

2021-10-05 09:53:40

To define a custom layer in Keras (or override an existing one), you need to implement three methods: `build(input_shape)`, `call(inputs)`, and `compute_output_shape(input_shape)`.
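As a minimal sketch of this three-method contract, here is a framework-free stand-in (the class name `MyDense` and its NumPy internals are illustrative assumptions, not real Keras API; in actual Keras you would subclass `keras.layers.Layer` and create weights with `self.add_weight`):

```python
import numpy as np

class MyDense:
    """Illustrative stand-in showing the build/call/compute_output_shape contract."""

    def __init__(self, units):
        self.units = units
        self.built = False

    def build(self, input_shape):
        # create weights once the input dimension is known
        fan_in = input_shape[-1]
        self.kernel = np.random.randn(fan_in, self.units) * 0.01
        self.bias = np.zeros(self.units)
        self.built = True

    def call(self, inputs):
        # the forward computation
        return inputs @ self.kernel + self.bias

    def compute_output_shape(self, input_shape):
        # only the last (feature) axis changes
        return input_shape[:-1] + (self.units,)

layer = MyDense(4)
layer.build((None, 8))
out = layer.call(np.ones((2, 8)))
print(out.shape)  # (2, 4)
```

Keras calls `build` lazily on the first input, which is why weight creation is separated from `__init__`.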


At the end, we work through a conditional layer normalization example.

Conditional text generation based on conditional layer normalization.
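The conditioning idea can be sketched in plain NumPy before reading the full layer below: a condition vector is projected into per-feature shift and scale terms, then broadcast over the sequence axis (the shapes and the random projection matrices here are illustrative assumptions; in the real layer the projections are zero-initialized `Dense` layers):

```python
import numpy as np

batch, seq_len, units, cond_dim = 2, 5, 8, 16
inputs = np.random.randn(batch, seq_len, units)
cond = np.random.randn(batch, cond_dim)   # one condition vector per sample

# stand-ins for the beta_dense / gamma_dense projections
w_beta = np.random.randn(cond_dim, units) * 0.01
w_gamma = np.random.randn(cond_dim, units) * 0.01
beta = cond @ w_beta            # (batch, units): per-feature shift
gamma = 1.0 + cond @ w_gamma    # (batch, units): per-feature scale

# expand dims so the condition broadcasts over the sequence axis
beta = beta[:, None, :]         # (batch, 1, units)
gamma = gamma[:, None, :]

mean = inputs.mean(axis=-1, keepdims=True)
std = inputs.std(axis=-1, keepdims=True)
out = (inputs - mean) / (std + 1e-12) * gamma + beta
print(out.shape)  # (2, 5, 8)
```

Because the projections start at zero, the layer initially behaves like an ordinary layer norm and learns how much the condition should modulate it.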

```python
# a custom layer needs to implement three methods
from keras import activations, initializers
from keras.layers import Layer, Dense
import keras.backend as K


class LayerNormalization(Layer):
    """(Conditional) Layer Normalization

    The hidden_* arguments are only used with conditional input
    (conditional=True).
    hidden_units: projection dimension; when the condition matrix is
        large, project it down first and then transform it.
    hidden_activation: usually a linear activation.
    """
    def __init__(
        self,
        center=True,
        scale=True,
        epsilon=None,
        conditional=False,
        hidden_units=None,
        hidden_activation='linear',
        hidden_initializer='glorot_uniform',
        **kwargs
    ):
        super(LayerNormalization, self).__init__(**kwargs)
        self.center = center
        self.scale = scale
        self.conditional = conditional
        self.hidden_units = hidden_units
        self.hidden_activation = activations.get(hidden_activation)
        self.hidden_initializer = initializers.get(hidden_initializer)
        self.epsilon = epsilon or 1e-12

    def build(self, input_shape):
        super(LayerNormalization, self).build(input_shape)  # sets self.built = True

        if self.conditional:
            # with a condition, input_shape is a list: [inputs, cond]
            shape = (input_shape[0][-1],)
        else:
            shape = (input_shape[-1],)

        if self.center:
            self.beta = self.add_weight(
                shape=shape, initializer='zeros', name='beta'
            )
        if self.scale:
            self.gamma = self.add_weight(
                shape=shape, initializer='ones', name='gamma'
            )

        if self.conditional:
            if self.hidden_units is not None:
                # projects the condition down to hidden_units dimensions
                self.hidden_dense = Dense(
                    units=self.hidden_units,
                    activation=self.hidden_activation,
                    use_bias=False,
                    kernel_initializer=self.hidden_initializer
                )
            if self.center:
                self.beta_dense = Dense(
                    units=shape[0], use_bias=False, kernel_initializer='zeros'
                )
            if self.scale:
                self.gamma_dense = Dense(
                    units=shape[0], use_bias=False, kernel_initializer='zeros'
                )

    def call(self, inputs):
        """For conditional layer norm, the input is expected to be a list
        whose second element is the condition."""
        if self.conditional:
            inputs, cond = inputs
            if self.hidden_units is not None:
                cond = self.hidden_dense(cond)  # reduce dimensionality
            # expand dims until cond has the same rank as inputs
            for _ in range(K.ndim(inputs) - K.ndim(cond)):
                cond = K.expand_dims(cond, 1)
            if self.center:
                beta = self.beta_dense(cond) + self.beta
            if self.scale:
                gamma = self.gamma_dense(cond) + self.gamma
        else:
            if self.center:
                beta = self.beta
            if self.scale:
                gamma = self.gamma

        outputs = inputs
        if self.center:
            # layer normalization: statistics are taken over the last
            # (feature) axis of each sample, not over the batch
            mean = K.mean(outputs, axis=-1, keepdims=True)
            outputs = outputs - mean
        if self.scale:
            variance = K.mean(K.square(outputs), axis=-1, keepdims=True)
            std = K.sqrt(variance + self.epsilon)
            outputs = outputs / std
            outputs = outputs * gamma
        if self.center:
            outputs = outputs + beta

        return outputs

    # input_shape is a list in the conditional case; defines the output shape
    def compute_output_shape(self, input_shape):
        if self.conditional:
            return input_shape[0]
        else:
            return input_shape

    # merge this class's config with the parent config
    def get_config(self):
        config = {
            'center': self.center,
            'scale': self.scale,
            'epsilon': self.epsilon,
            'conditional': self.conditional,
            'hidden_units': self.hidden_units,
            'hidden_activation': activations.serialize(self.hidden_activation),
            'hidden_initializer': initializers.serialize(self.hidden_initializer),
        }
        base_config = super(LayerNormalization, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
```
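The normalization arithmetic inside `call()` can be checked with plain NumPy (assuming the defaults `center=True`, `scale=True`, `epsilon=1e-12`, and the unconditional branch where `beta` and `gamma` are just the learned vectors):

```python
import numpy as np

x = np.random.randn(2, 3, 8)
beta, gamma = np.zeros(8), np.ones(8)   # initial values of the weights
epsilon = 1e-12

# same sequence of operations as call()
mean = x.mean(axis=-1, keepdims=True)
out = x - mean
variance = (out ** 2).mean(axis=-1, keepdims=True)
out = out / np.sqrt(variance + epsilon)
out = out * gamma + beta

# every feature vector now has mean ~0 and std ~1
assert np.allclose(out.mean(axis=-1), 0, atol=1e-6)
assert np.allclose(out.std(axis=-1), 1, atol=1e-4)
```

Note that the mean/variance are computed over the last axis only, which is what distinguishes layer normalization from batch normalization.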
