Implementing Linear Regression with the TensorFlow v2 Library

2022-06-14 17:42:14


This example uses a deliberately low-level approach so that all the mechanics behind the training process are easier to understand.

from __future__ import absolute_import, division, print_function

import tensorflow as tf

import numpy as np

rng = np.random

# Parameters
learning_rate = 0.01
training_steps = 1000
display_step = 50

# Training data
x = np.array([3.3, 4.4, 5.5, 6.71, 6.93, 4.168, 9.779, 6.182, 7.59,
              2.167, 7.042, 10.791, 5.313, 7.997, 5.654, 9.27, 3.1])
y = np.array([1.7, 2.76, 2.09, 3.19, 1.694, 1.573, 3.366, 2.596, 2.53,
              1.221, 2.827, 3.465, 1.65, 2.904, 2.42, 2.94, 1.3])
n_samples = x.shape[0]

# Randomly initialize the weight and bias
w = tf.Variable(rng.randn(), name="weight")
b = tf.Variable(rng.randn(), name="bias")

# Linear regression (w*x + b)
def linear_regression(x):
    return w * x + b

# Mean squared error
def mean_square(y_pred, y_true):
    return tf.reduce_sum(tf.pow(y_pred - y_true, 2)) / (2 * n_samples)
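As a quick sanity check on the cost above (a NumPy-only sketch, not part of the original script): with predictions [2.0, 3.0] against targets [1.0, 5.0], the squared errors are 1 and 4, so the cost is (1 + 4) / (2 * 2) = 1.25. The extra factor of 2 in the denominator is a common convention that cancels the 2 produced when differentiating the square.

```python
import numpy as np

# NumPy version of the mean_square cost above: sum of squared
# errors divided by 2 * n_samples.
def mean_square_np(y_pred, y_true):
    n = y_true.shape[0]
    return np.sum((y_pred - y_true) ** 2) / (2 * n)

print(mean_square_np(np.array([2.0, 3.0]), np.array([1.0, 5.0])))  # 1.25
```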

# Stochastic gradient descent optimizer
optimizer = tf.optimizers.SGD(learning_rate)

# Optimization process
def run_optimization():
    # Wrap the computation inside a GradientTape for automatic differentiation
    with tf.GradientTape() as g:
        pred = linear_regression(x)
        loss = mean_square(pred, y)

    # Compute gradients
    gradients = g.gradient(loss, [w, b])

    # Update w and b following the gradients
    optimizer.apply_gradients(zip(gradients, [w, b]))
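run_optimization leaves differentiation to GradientTape, but for this model the gradients of the cost have a simple closed form: d(cost)/dw = sum((w*x + b - y) * x) / n and d(cost)/db = sum(w*x + b - y) / n. The same update step can be sketched in plain NumPy (illustrative only; the toy data and learning rate here are assumptions, not from the original script):

```python
import numpy as np

def sgd_step(w, b, x, y, lr=0.1):
    # Analytic gradients of sum((w*x + b - y)^2) / (2n)
    n = x.shape[0]
    err = w * x + b - y
    grad_w = np.sum(err * x) / n
    grad_b = np.sum(err) / n
    return w - lr * grad_w, b - lr * grad_b

# On data lying exactly on y = 2x, repeated steps drive w -> 2, b -> 0
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
w, b = 0.0, 0.0
for _ in range(2000):
    w, b = sgd_step(w, b, x, y)
print(w, b)  # w close to 2.0, b close to 0.0
```

This is exactly what `optimizer.apply_gradients` does internally for plain SGD: subtract the learning rate times each gradient from the corresponding variable.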

# Run training for the given number of steps
for step in range(1, training_steps + 1):
    # Run the optimization to update the w and b values
    run_optimization()

    if step % display_step == 0:
        pred = linear_regression(x)
        loss = mean_square(pred, y)
        print("step: %i, loss: %f, w: %f, b: %f"
              % (step, loss, w.numpy(), b.numpy()))

Output:

step: 50, loss: 0.210631, w: 0.458940, b: -0.670898

step: 100, loss: 0.195340, w: 0.446725, b: -0.584301

step: 150, loss: 0.181797, w: 0.435230, b: -0.502807

step: 200, loss: 0.169803, w: 0.424413, b: -0.426115

step: 250, loss: 0.159181, w: 0.414232, b: -0.353942

step: 300, loss: 0.149774, w: 0.404652, b: -0.286021

step: 350, loss: 0.141443, w: 0.395636, b: -0.222102

step: 400, loss: 0.134064, w: 0.387151, b: -0.161949

step: 450, loss: 0.127530, w: 0.379167, b: -0.105341

step: 500, loss: 0.121742, w: 0.371652, b: -0.052068

step: 550, loss: 0.116617, w: 0.364581, b: -0.001933

step: 600, loss: 0.112078, w: 0.357926, b: 0.045247

step: 650, loss: 0.108058, w: 0.351663, b: 0.089647

step: 700, loss: 0.104498, w: 0.345769, b: 0.131431

step: 750, loss: 0.101345, w: 0.340223, b: 0.170753

step: 800, loss: 0.098552, w: 0.335003, b: 0.207759

step: 850, loss: 0.096079, w: 0.330091, b: 0.242583

step: 900, loss: 0.093889, w: 0.325468, b: 0.275356

step: 950, loss: 0.091949, w: 0.321118, b: 0.306198

step: 1000, loss: 0.090231, w: 0.317024, b: 0.335223
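At step 1000 the fit is still drifting (w falling, b rising). As a verification sketch (not part of the original script), the SGD result can be compared against the closed-form least-squares solution for the same data, which np.polyfit computes directly:

```python
import numpy as np

x = np.array([3.3, 4.4, 5.5, 6.71, 6.93, 4.168, 9.779, 6.182, 7.59,
              2.167, 7.042, 10.791, 5.313, 7.997, 5.654, 9.27, 3.1])
y = np.array([1.7, 2.76, 2.09, 3.19, 1.694, 1.573, 3.366, 2.596, 2.53,
              1.221, 2.827, 3.465, 1.65, 2.904, 2.42, 2.94, 1.3])

# A degree-1 polynomial fit is ordinary least squares for a line
slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)  # roughly 0.25 and 0.80
```

With more training steps, the w and b printed above would converge toward this analytic optimum.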

import matplotlib.pyplot as plt

# Plot the fitted line against the data
plt.plot(x, y, 'ro', label='Original data')
plt.plot(x, np.array(w * x + b), label='Fitted line')
plt.legend()
plt.show()
