Supporting Volcano Scheduling for Local Spark Submission

2021-10-08 14:46:23 · 2,541 characters · 2,589 reads

Refactoring the Volcano client vcctl

1.1 Add the commands

Register the two new commands in vcctl:

rootCmd.AddCommand(buildSpark())

rootCmd.AddCommand(buildSparkOperator())

1.2 Create the command root and subcommand

func buildSparkOperator() *cobra.Command {
	sparkOperatorCmd := &cobra.Command{ /* Use, Short, ... (elided in the original) */ }
	sparkSubmitCmd := &cobra.Command{ /* Use, Short, RunE, ... (elided in the original) */ }
	// initialize the flags
	spark_operator.InitSparkOperatorFlags(sparkSubmitCmd)
	sparkOperatorCmd.AddCommand(sparkSubmitCmd)
	return sparkOperatorCmd
}
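The wiring above builds a two-level command tree (vcctl → spark-operator → submit). For illustration only, the same dispatch pattern can be sketched with nothing but the standard library; this hand-rolled `command` type is a stand-in for `*cobra.Command`, not cobra's actual API:

```go
package main

import (
	"fmt"
	"os"
)

// command is a minimal stand-in for *cobra.Command: a name,
// an optional run function, and a set of subcommands.
type command struct {
	name string
	run  func(args []string) error
	subs map[string]*command
}

func (c *command) addCommand(sub *command) {
	if c.subs == nil {
		c.subs = map[string]*command{}
	}
	c.subs[sub.name] = sub
}

// execute walks the argument list down the command tree,
// analogous to rootCmd.Execute() in cobra.
func (c *command) execute(args []string) error {
	if len(args) > 0 {
		if sub, ok := c.subs[args[0]]; ok {
			return sub.execute(args[1:])
		}
	}
	if c.run == nil {
		return fmt.Errorf("%s: no action defined", c.name)
	}
	return c.run(args)
}

func main() {
	root := &command{name: "vcctl"}
	sparkOperator := &command{name: "spark-operator"}
	sparkOperator.addCommand(&command{
		name: "submit",
		run: func(args []string) error {
			fmt.Println("submitting spark application")
			return nil
		},
	})
	root.addCommand(sparkOperator)

	if err := root.execute(os.Args[1:]); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

Running `vcctl spark-operator submit` would walk the tree from root to the leaf and invoke its run function, which is exactly the shape cobra provides out of the box.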

1.3 Construct the YAML file and upload the local jar package to the file server

func runSparkOperatorSubmit() error {
	// build the labels and volume mounts for the driver and executor
	sf.Spec.Driver.Labels = map[string]string{ /* ... (elided in the original) */ }
	sf.Spec.Driver.VolumeMounts.VolumeMount = volumeMount
	sf.Spec.Executor.Labels = map[string]string{ /* ... (elided in the original) */ }
	sf.Spec.Executor.VolumeMounts.VolumeMount = volumeMount

	// marshal the spec into a YAML byte stream
	fs, err := yaml.Marshal(&sf)
	if err != nil {
		return err
	}

	// create the YAML file
	f, err := os.Create(sf.Metadata.Name + ".yaml")
	if err != nil {
		return err
	}

	// drop the redundant wrapper lines so the output matches the API
	rmVolume := regexp.MustCompile("volume:\n ")
	rmVolumeMount := regexp.MustCompile("volumeMount:\n ")
	yamlString := string(fs)
	yamlString = rmVolume.ReplaceAllString(yamlString, "")
	yamlString = rmVolumeMount.ReplaceAllString(yamlString, "")

	// write the cleaned YAML to the file
	_, err = f.WriteString(yamlString)
	if err != nil {
		return err
	}

	// upload the jar package
	uploadFile(cf.FilePath, "")

	// run the submit command
	output, err := cmd.Output()
	if err != nil {
		return err
	}
	fmt.Printf("execute shell:%s finished with output:\n%s", cmd, string(output))
	return err
}
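The two regexp.MustCompile calls above exist because yaml.Marshal emits a wrapper key (e.g. "volumeMount:") for each intermediate struct field, which the target API does not expect. A self-contained sketch of that post-processing step; the YAML fragment here is illustrative, not the real SparkApplication output:

```go
package main

import (
	"fmt"
	"regexp"
)

// stripWrapper removes the intermediate "volumeMount:" wrapper line
// (plus the one space of extra indentation that follows the newline),
// mirroring the post-processing in runSparkOperatorSubmit.
func stripWrapper(s string) string {
	rmVolumeMount := regexp.MustCompile("volumeMount:\n ")
	return rmVolumeMount.ReplaceAllString(s, "")
}

func main() {
	// illustrative marshal output with the unwanted wrapper key
	in := "volumeMount:\n name: data\n mountPath: /data\n"
	fmt.Printf("%q\n", stripWrapper(in))
	// → "name: data\n mountPath: /data\n"
}
```

Note that matching a literal newline plus a fixed amount of indentation is fragile: it only works because yaml.Marshal indents deterministically. A more robust fix would be to shape the Go structs (or use `yaml:",inline"`-style tags) so the wrapper key is never emitted.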

The structs backing the YAML file are defined as follows (field bodies elided in the original):

type sparkOperatorFlags struct {
	Spec struct {
		Volumes  struct{ /* ... */ }
		Driver   struct{ /* ... */ }
		Executor struct{ /* ... */ }
	}
}

type volumeMount struct{ /* ... */ }

type volume struct{ /* ... */ }

type hostPath struct{ /* ... */ }

1.4 Modify the webhook so that Volcano can intercept requests carrying the labels

const (
	// defaultQueue stores the name of the queue as "default"
	defaultQueue         = "default"
	defaultSchedulerName = "volcano"

	init_container_name          = "spark-init"
	odin_file_server_addr        = "10.180.210.37" // "odin-file-server"
	odin_file_server_port        = 80
	odin_file_download_key       = "odin.io/filename"
	odin_image_registry_addr_key = "odin.registry/addr"
	odin_base_image              = "library/centos-ssh:latest"
)

func init() {
	// register the MutatingWebhookConfiguration object
	var service = &router.AdmissionService{
		/* path, handler, ... (elided in the original) */
		Rule: whv1beta1.Rule{
			APIVersions: []string{ /* ... */ },
			Resources:   []string{ /* ... */ },
		},
	}
	/* registration of the service elided in the original */
}
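Most of the init() body is elided in the source, but the decision it enables on the admission side is "mutate only the pods that carry the odin label". A minimal stdlib sketch of that predicate; the helper name shouldMutate is hypothetical, not part of Volcano's API:

```go
package main

import "fmt"

// odinFileDownloadKey mirrors the odin_file_download_key constant above.
const odinFileDownloadKey = "odin.io/filename"

// shouldMutate is a hypothetical predicate: the webhook patches only
// pods whose labels carry the odin file-download key, so unrelated
// workloads pass through the admission chain untouched.
func shouldMutate(labels map[string]string) bool {
	_, ok := labels[odinFileDownloadKey]
	return ok
}

func main() {
	fmt.Println(shouldMutate(map[string]string{"odin.io/filename": "app.jar"})) // true
	fmt.Println(shouldMutate(map[string]string{"app": "nginx"}))                // false
}
```

In a real MutatingWebhookConfiguration the same filtering is usually expressed declaratively with an objectSelector / label selector, so requests without the label never reach the webhook at all.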
