Spark submit script explained

2021-12-30 10:13:41

When developing with Spark you will sooner or later have to write a spark-submit script. As a beginner myself, I will give a brief explanation of spark-submit here.

vi wordcount.sh

/usr/local/spark/bin/spark-submit \
--class cn.spark.study.core.wordcountcluster \
--num-executors 3 \
--driver-memory 100m \
--executor-memory 100m \
--executor-cores 3 \
--master spark: \
/usr/local/sparktest-0.0.1-snapshot-jar-with-dependencies.jar

The script first invokes spark-submit and passes it a set of options. --class specifies the main class to run; --num-executors tells Spark how many executors to launch; --driver-memory sets how much memory the driver gets; --executor-memory sets how much memory each executor gets; --executor-cores sets how many CPU cores each executor uses; --master points at the Spark cluster when running in standalone mode (its value is normally a spark://host:port URL, which is cut off in the script above); and /usr/local/sparktest-0.0.1-snapshot-jar-with-dependencies.jar is the path to the application jar.
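For comparison, here is a slightly fuller sketch of the same command. The master URL spark://master-host:7077 and the input path passed after the jar are hypothetical placeholders (the original post does not show them), so replace them with your own cluster's values.

# sketch of the same submit, standalone mode, client deploy
# spark://master-host:7077 and the hdfs:// input path are placeholders
/usr/local/spark/bin/spark-submit \
--class cn.spark.study.core.wordcountcluster \
--master spark://master-host:7077 \
--deploy-mode client \
--num-executors 3 \
--driver-memory 100m \
--executor-memory 100m \
--executor-cores 3 \
/usr/local/sparktest-0.0.1-snapshot-jar-with-dependencies.jar \
hdfs://master-host:9000/input/words.txt

Note that --num-executors is mainly honored on YARN (and Kubernetes); on a standalone master the number of executors is usually controlled by --executor-cores together with --total-executor-cores instead.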

By default wordcount.sh is not executable, so change its permissions: chmod 777 wordcount.sh (chmod +x wordcount.sh is also enough).

Then run it: ./wordcount.sh
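Putting the last two steps together (a minimal sequence, assuming wordcount.sh is in the current directory):

# make the script executable, then launch it
chmod +x wordcount.sh
./wordcount.sh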
