Spark MLlib: Stochastic Gradient Descent (SGD)


Code:

package mllib

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

import scala.collection.mutable.HashMap

/**
  * Stochastic gradient descent (SGD)
  * Created by 汪本成 on 2016/8/5.
  */
object SGD {

  // Training data: 50 points (x, y) on the line y = 2x
  val data = new HashMap[Int, Int]()
  def getData(): HashMap[Int, Int] = {
    for (i <- 1 to 50) {
      data += (i -> (2 * i))
    }
    data
  }

  // Assume the coefficient starts at a = 0
  var a: Double = 0
  // Step coefficient (learning rate)
  var b: Double = 0.1

  // One SGD update: move a against the residual of the prediction a * x
  def sgd(x: Double, y: Double) = {
    a = a - b * ((a * x) - y)
  }

  def main(args: Array[String]): Unit = {
    // Spark setup (master and app name reconstructed from the run log);
    // the SGD loop itself runs entirely on the driver.
    val conf = new SparkConf().setMaster("local").setAppName("SGD")
    val sc = new SparkContext(conf)

    val dataSource = getData()
    println("data:")
    dataSource.foreach(p => println("(" + p._1 + "," + p._2 + ")"))

    println("result:")
    var num = 1
    dataSource.foreach(p => {
      println(num + ":" + a + "(" + p._1 + "," + p._2 + ")")
      sgd(p._1, p._2)
      num += 1
    })

    // Show the final coefficient
    println("Final result: a = " + a)

    sc.stop()
  }
}
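The sgd method moves the coefficient against the residual of the current prediction: a ← a − b·(a·x − y). Tracing the first two points the run below happens to visit reproduces the logged values: starting from a = 0, the point (23, 46) gives a = 0 − 0.1·(0·23 − 46) = 4.6, and the next point (50, 100) then gives a = 4.6 − 0.1·(4.6·50 − 100) = −8.4, which match lines 2 and 3 of the result output.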

Run output:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/g:/home/download/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/g:/home/download/spark-1.6.1-bin-hadoop2.6/lib/spark-examples-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

16/08/05 00:48:28 INFO Slf4jLogger: Slf4jLogger started

16/08/05 00:48:28 INFO Remoting: Starting remoting

16/08/05 00:48:28 INFO Remoting: Remoting started; listening on addresses :[akka.tcp:

data:

(23,46)

(50,100)

(32,64)

(41,82)

(17,34)

(8,16)

(35,70)

(44,88)

(26,52)

(11,22)

(29,58)

(38,76)

(47,94)

(20,40)

(2,4)

(5,10)

(14,28)

(46,92)

(40,80)

(49,98)

(4,8)

(13,26)

(22,44)

(31,62)

(16,32)

(7,14)

(43,86)

(25,50)

(34,68)

(10,20)

(37,74)

(1,2)

(19,38)

(28,56)

(45,90)

(27,54)

(36,72)

(18,36)

(9,18)

(21,42)

(48,96)

(3,6)

(12,24)

(30,60)

(39,78)

(15,30)

(42,84)

(24,48)

(6,12)

(33,66)

result:

1:0.0(23,46)

2:4.6000000000000005(50,100)

3:-8.400000000000002(32,64)

4:24.880000000000006(41,82)

5:-68.92800000000003(17,34)

6:51.649600000000035(8,16)

7:11.929920000000003(35,70)

8:-22.82480000000001(44,88)

9:86.40432000000006(26,52)

10:-133.04691200000013(11,22)

11:15.504691199999996(29,58)

12:-23.65891328(38,76)

13:73.84495718400001(47,94)

14:-263.82634158080003(20,40)

15:267.82634158080003(2,4)

16:214.66107326464004(5,10)

17:108.33053663232002(14,28)

18:-40.53221465292802(46,92)

19:155.1159727505409(40,80)

20:-457.3479182516227(49,98)

21:1793.4568811813288(4,8)

22:1076.8741287087973(13,26)

23:-320.46223861263934(22,44)

24:388.95468633516725(31,62)

25:-810.6048413038511(16,32)

26:489.56290478231085(7,14)

27:148.2688714346932(43,86)

28:-480.6872757344877(25,50)

29:726.0309136017315(34,68)

30:-1735.6741926441557(10,20)

31:2.0000000000002274(37,74)

32:1.999999999999386(1,2)

33:1.9999999999994476(19,38)

34:2.000000000000497(28,56)

35:1.9999999999991056(45,90)

36:2.00000000000313(27,54)

37:1.9999999999946787(36,72)

38:2.000000000013835(18,36)

39:1.999999999988932(9,18)

40:1.999999999998893(21,42)

41:2.0000000000012172(48,96)

42:1.9999999999953737(3,6)

43:1.9999999999967615(12,24)

44:2.000000000000648(30,60)

45:1.999999999998704(39,78)

46:2.0000000000037588(15,30)

47:1.9999999999981206(42,84)

48:2.0000000000060134(24,48)

49:1.999999999991581(6,12)

50:1.9999999999966325(33,66)

Final result: a = 2.0000000000077454

16/08/05 00:48:28 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.

Process finished with exit code 0

Analysis:

When the step coefficient α (b in the code above) is 0.1, the result is usually reached within about 30 updates; with α = 0.5, roughly 15 updates already give the correct result; with α = 1, even all 50 updates do not produce a usable result.
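To make that claim easy to check, here is a minimal, self-contained sketch in plain Scala (no Spark needed). The object name StepSizeCheck, the 1e-6 tolerance, and the printed messages are illustrative choices, not part of the original program; it reuses the same data and update rule and reports after how many updates a gets close to 2 for each step size. Because the visit order of a HashMap is implementation-defined, the exact counts can differ from those quoted above.

import scala.collection.mutable.HashMap

object StepSizeCheck {
  def main(args: Array[String]): Unit = {
    // Same data as the example: y = 2x for x = 1..50
    val data = new HashMap[Int, Int]()
    for (i <- 1 to 50) data += (i -> (2 * i))

    // Compare the three step sizes discussed above
    for (alpha <- Seq(0.1, 0.5, 1.0)) {
      var a = 0.0
      var step = 0
      var reachedAt = -1
      data.foreach { case (x, y) =>
        a = a - alpha * (a * x - y)   // same update rule as sgd() above
        step += 1
        if (reachedAt < 0 && math.abs(a - 2.0) < 1e-6) reachedAt = step
      }
      if (reachedAt > 0)
        println(s"alpha = $alpha: a is within 1e-6 of 2 after $reachedAt updates (final a = $a)")
      else
        println(s"alpha = $alpha: a never got within 1e-6 of 2 in $step updates (final a = $a)")
    }
  }
}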
