TensorFlow: complexity and learning rate

2021-10-08 01:19:39 · 3646 characters · 4290 reads

import tensorflow as tf

w = tf.Variable(tf.constant(5, dtype=tf.float32))
epoch = 40
lr_base = 0.2    # initial learning rate
lr_decay = 0.99  # learning-rate decay factor
lr_step = 1      # number of epochs (batches) fed in between learning-rate updates

for epoch in range(epoch):
    # top-level loop over the dataset, repeated epoch times; here the "dataset"
    # is just the single variable w, initialized to the constant 5, iterated 40 times
    lr = lr_base * lr_decay ** (epoch / lr_step)
    with tf.GradientTape() as tape:
        # the with block records the computation whose gradient we need
        loss = tf.square(w + 1)
    grads = tape.gradient(loss, w)
    # .gradient specifies what to differentiate with respect to what
    w.assign_sub(lr * grads)
    # .assign_sub decrements the variable in place: w -= lr * grads, i.e. w = w - lr * grads
    print("after %s epoch,w is %f,loss is %f,lr is %f" % (epoch, w.numpy(), loss, lr))

after 0 epoch,w is 2.600000,loss is 36.000000,lr is 0.200000
after 1 epoch,w is 1.174400,loss is 12.959999,lr is 0.198000
after 2 epoch,w is 0.321948,loss is 4.728015,lr is 0.196020
after 3 epoch,w is -0.191126,loss is 1.747547,lr is 0.194060
after 4 epoch,w is -0.501926,loss is 0.654277,lr is 0.192119
after 5 epoch,w is -0.691392,loss is 0.248077,lr is 0.190198
after 6 epoch,w is -0.807611,loss is 0.095239,lr is 0.188296
after 7 epoch,w is -0.879339,loss is 0.037014,lr is 0.186413
after 8 epoch,w is -0.923874,loss is 0.014559,lr is 0.184549
after 9 epoch,w is -0.951691,loss is 0.005795,lr is 0.182703
after 10 epoch,w is -0.969167,loss is 0.002334,lr is 0.180876
after 11 epoch,w is -0.980209,loss is 0.000951,lr is 0.179068
after 12 epoch,w is -0.987226,loss is 0.000392,lr is 0.177277
after 13 epoch,w is -0.991710,loss is 0.000163,lr is 0.175504
after 14 epoch,w is -0.994591,loss is 0.000069,lr is 0.173749
after 15 epoch,w is -0.996452,loss is 0.000029,lr is 0.172012
after 16 epoch,w is -0.997660,loss is 0.000013,lr is 0.170292
after 17 epoch,w is -0.998449,loss is 0.000005,lr is 0.168589
after 18 epoch,w is -0.998967,loss is 0.000002,lr is 0.166903
after 19 epoch,w is -0.999308,loss is 0.000001,lr is 0.165234
after 20 epoch,w is -0.999535,loss is 0.000000,lr is 0.163581
after 21 epoch,w is -0.999685,loss is 0.000000,lr is 0.161946
after 22 epoch,w is -0.999786,loss is 0.000000,lr is 0.160326
after 23 epoch,w is -0.999854,loss is 0.000000,lr is 0.158723
after 24 epoch,w is -0.999900,loss is 0.000000,lr is 0.157136
after 25 epoch,w is -0.999931,loss is 0.000000,lr is 0.155564
after 26 epoch,w is -0.999952,loss is 0.000000,lr is 0.154009
after 27 epoch,w is -0.999967,loss is 0.000000,lr is 0.152469
after 28 epoch,w is -0.999977,loss is 0.000000,lr is 0.150944
after 29 epoch,w is -0.999984,loss is 0.000000,lr is 0.149434
after 30 epoch,w is -0.999989,loss is 0.000000,lr is 0.147940
after 31 epoch,w is -0.999992,loss is 0.000000,lr is 0.146461
after 32 epoch,w is -0.999994,loss is 0.000000,lr is 0.144996
after 33 epoch,w is -0.999996,loss is 0.000000,lr is 0.143546
after 34 epoch,w is -0.999997,loss is 0.000000,lr is 0.142111
after 35 epoch,w is -0.999998,loss is 0.000000,lr is 0.140690
after 36 epoch,w is -0.999999,loss is 0.000000,lr is 0.139283
after 37 epoch,w is -0.999999,loss is 0.000000,lr is 0.137890
after 38 epoch,w is -0.999999,loss is 0.000000,lr is 0.136511
after 39 epoch,w is -0.999999,loss is 0.000000,lr is 0.135146
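The numbers above can be checked by hand: the gradient of (w + 1)^2 with respect to w is 2(w + 1), so each step performs w ← w − lr · 2(w + 1), with lr = lr_base · lr_decay^(epoch / lr_step). The following is a minimal pure-Python re-derivation of the first rows, no TensorFlow required; the helper name `run` is just for illustration:

```python
# Re-derive the first rows of the printed output using the same schedule
# lr = lr_base * lr_decay ** (epoch / lr_step) and the analytic gradient
# d/dw (w + 1)^2 = 2 * (w + 1).

def run(epochs, w=5.0, lr_base=0.2, lr_decay=0.99, lr_step=1):
    rows = []
    for epoch in range(epochs):
        lr = lr_base * lr_decay ** (epoch / lr_step)
        loss = (w + 1) ** 2      # loss is computed before the update, as in the TF code
        grad = 2 * (w + 1)       # analytic gradient of the loss
        w -= lr * grad           # mirrors w.assign_sub(lr * grads)
        rows.append((epoch, w, loss, lr))
    return rows

rows = run(3)
for epoch, w, loss, lr in rows:
    print("after %s epoch,w is %f,loss is %f,lr is %f" % (epoch, w, loss, lr))
```

For example, at epoch 0: lr = 0.2, grad = 2 · (5 + 1) = 12, so w = 5 − 0.2 · 12 = 2.6, matching the first printed row. TensorFlow also ships this schedule ready-made as tf.keras.optimizers.schedules.ExponentialDecay, which computes the same initial_learning_rate · decay_rate^(step / decay_steps) formula.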
