Hadoop Debugging Notes 2

2022-09-14 21:48:41

I hadn't touched Hadoop for a long time after getting HBase working last time. Today I wanted to write a small MapReduce program, so I first ran the bundled wordcount example, and it failed.

The output was as follows:

kevin@ubuntu:~/usr/hadoop/hadoop$ ./bin/hadoop jar hadoop-examples-1.2.1.jar wordcount readme.txt output

15/05/11 08:20:04 INFO input.FileInputFormat: Total input paths to process : 1
15/05/11 08:20:04 INFO util.NativeCodeLoader: Loaded the native-hadoop library
15/05/11 08:20:05 INFO mapred.JobClient: Running job: job_201505110806_0003
15/05/11 08:20:06 INFO mapred.JobClient:  map 0% reduce 0%
15/05/11 08:20:06 INFO mapred.JobClient: Task Id : attempt_201505110806_0003_m_000002_0, Status : FAILED
Error initializing attempt_201505110806_0003_m_000002_0:
java.io.IOException: Exception reading file:/~/usr/hadoop/hadoop/tmp/mapred/local/ttprivate/taskTracker/kevin/jobcache/job_201505110806_0003/jobToken
    at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:135)
    at org.apache.hadoop.mapreduce.security.TokenCache.loadTokens(TokenCache.java:178)
    at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1289)
    at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1226)
    at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2603)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: File file:/~/usr/hadoop/hadoop/tmp/mapred/local/ttprivate/taskTracker/kevin/jobcache/job_201505110806_0003/jobToken does not exist.
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:402)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:255)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:125)
    at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:283)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:436)
    at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:129)
    ... 5 more

The error is that the jobToken file cannot be read. Since the example program itself is known-good, my guess was that I broke Hadoop's core-site.xml when configuring HBase last time. Someone with the same problem on Stack Overflow was advised to delete the cache-directory setting from that file, because Hadoop creates it with a sensible default and it does not need to be configured by hand. That is, this section:

<property>
  <name>hadoop.tmp.dir</name>
  <value></value>
  <description>A base for other temporary directories.</description>
</property>

Deleting this block solved the problem (judging from the `file:/~/...` path in the stack trace, the configured value began with a literal `~`, which Hadoop does not expand, so the directory never existed). I stopped and restarted Hadoop, and a new problem appeared: the JobTracker is in safe mode:

15/05/11 08:47:40 ERROR security.UserGroupInformation: PriviledgedActionException as:kevin cause:org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.mapred.SafeModeException: JobTracker is in safe mode
    at org.apache.hadoop.mapred.JobTracker.checkSafeMode(JobTracker.java:5188)
    at org.apache.hadoop.mapred.JobTracker.getStagingAreaDir(JobTracker.java:3677)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)

Forcing an exit from safe mode fixed it:

bin/hadoop dfsadmin -safemode leave
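Before forcing an exit, it can be worth checking whether HDFS is simply still starting up. The commands below are a sketch run from the Hadoop install root as in this post; `dfsadmin` talks to HDFS safe mode, and on Hadoop 1.x the JobTracker normally follows once the NameNode reports enough blocks:

```shell
bin/hadoop dfsadmin -safemode get     # report whether safe mode is ON or OFF
bin/hadoop dfsadmin -safemode wait    # block until HDFS leaves safe mode on its own
bin/hadoop dfsadmin -safemode leave   # or force it out immediately
```

Forcing `leave` is fine on a single-node test box, but on a real cluster `wait` is the safer choice, since safe mode usually just means block reports are still coming in.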

With that, both problems were solved and wordcount ran successfully.
