${HADOOP_HOME}/bin/hadoop job
Usage: JobClient <command> <args>
    [-submit <job-file>]
    [-status <job-id>]
    [-counter <job-id> <group-name> <counter-name>]
    [-kill <job-id>]
    [-abort <job-id>]
    [-suspend <job-id> [hours]]
    [-recover <job-id> [-force] [-jobconf name=value] [-file local-path] [-cachearchive]]
    [-set-priority <job-id> <priority>]    valid priorities: VERY_HIGH, HIGH, NORMAL, LOW, VERY_LOW
    [-set-map-capacity <job-id> <map-capacity>]
    [-set-reduce-capacity <job-id> <reduce-capacity>]
    [-set-map-over-capacity <job-id> <true/false>]
    [-set-reduce-over-capacity <job-id> <true/false>]
    [-events <job-id> <from-event-#> <#-of-events>]
    [-history <jobOutputDir>]
    [-list [all]]
    [-kill-task <task-id>]
    [-fail-task <task-id>]
    [-input-add <job-id> <input>]
    [-input-done <job-id>]
- -kill <job-id> Kill a job; the job's final state is KILLED.
- -kill-task <task-id> Kill a task attempt; its final state is KILLED. The corresponding task schedules a new attempt, and killed attempts do not count toward task failure.
- -fail-task <task-id> Fail a task attempt; its final state is FAILED. If a task's failed attempts exceed the limit (4 by default), the whole task fails.
- -set-priority <job-id> <priority> Set the priority of a job.
- -status <job-id> Get the status of a job
- -list [all] List jobs; with no argument, lists running jobs; with `all`, lists all jobs.
- -suspend <job-id> [hours], -recover <job-id> Covered in the section on breakpoint restart (suspending and recovering jobs).
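To script these commands, one can wrap the CLI in a thin helper. The sketch below is a minimal example, assuming `hadoop` is on the `PATH`; the function and job-ID names are hypothetical, not part of the jobclient itself.

```python
# Minimal sketch of driving `hadoop job` subcommands from Python.
# Assumption: the `hadoop` binary is on PATH (e.g. ${HADOOP_HOME}/bin).
import subprocess

VALID_PRIORITIES = {"VERY_HIGH", "HIGH", "NORMAL", "LOW", "VERY_LOW"}

def job_args(action, *params):
    """Build the argument list for `hadoop job -<action> <params...>`."""
    return ["hadoop", "job", f"-{action}", *params]

def kill_job(job_id):
    """Kill a job; its final state becomes KILLED."""
    subprocess.run(job_args("kill", job_id), check=True)

def set_priority(job_id, priority):
    """Set a job's priority to one of the five valid values."""
    if priority not in VALID_PRIORITIES:
        raise ValueError(f"invalid priority: {priority}")
    subprocess.run(job_args("set-priority", job_id, priority), check=True)
```

For example, `kill_job("job_201101281410_0001")` runs `hadoop job -kill job_201101281410_0001` (the job ID here is illustrative).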
Using the Hadoop client