Error: Mapper class ClassNotFoundException when running the MaxTemperature program from Hadoop: The Definitive Guide
The fix: change the book's

JobConf job = new JobConf(MaxTemperature.class);

to

JobConf job = new JobConf();
job.setJar("/root/hadoop-resources/code/maxtemperature.jar");
Here is how I worked through it. A frustrating day: I followed the book (Hadoop: The Definitive Guide) exactly, but running the MaxTemperature example on Hadoop kept failing because the Mapper class could not be found.
My first suspicion was an environment-variable problem, so I configured the Hadoop variables in /etc/profile:
export HADOOP_HOME="/usr/hadoop"
for f in $HADOOP_HOME/*.jar; do
  HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f
done
for f in $HADOOP_HOME/lib/*.jar; do
  HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f
done
export HADOOP_CLASSPATH=".:$HADOOP_CLASSPATH"
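The loop pattern above can be sketched in a self-contained way. This is a minimal illustration only: it builds the same kind of colon-separated classpath string from a throwaway directory of dummy jar files, so the paths here are placeholders, not real Hadoop locations.

```shell
#!/bin/sh
# Sketch of the classpath-building loop, using a throwaway directory
# with empty dummy jars (DEMO_HOME stands in for $HADOOP_HOME).
DEMO_HOME=$(mktemp -d)
mkdir -p "$DEMO_HOME/lib"
touch "$DEMO_HOME/a.jar" "$DEMO_HOME/lib/b.jar"

CP=""
for f in "$DEMO_HOME"/*.jar; do
  CP=${CP}:$f            # append each top-level jar
done
for f in "$DEMO_HOME"/lib/*.jar; do
  CP=${CP}:$f            # append each jar under lib/
done
CP=".${CP}"              # put the current directory first
echo "$CP"
```

The leading `.` matters: it is what lets the `hadoop` launcher see classes sitting in the current working directory.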
Re-ran the job; the error was still there.
Then I found posts online saying the classes need to be packaged into a jar. I didn't think that should be necessary: the book doesn't package them, and I was using the same Hadoop version as the author. But solving the problem came first; the why could wait.
Package the compiled class files into a jar:

jar -cvf MaxTemperature.jar classes/*.class
Then run it with:

hadoop jar MaxTemperature.jar MaxTemperature input/sample.txt output
Executed the command; the error was still there.
Then I noticed that people online whose programs ran fine had placed their jar under the bin directory of the Hadoop installation. Full of doubt, I copied my jar into bin, ran it again, and... OK!
So copying the jar into Hadoop's bin directory is one workaround that gets the job running.
But that workaround is too clumsy. I finally found the proper fix on Stack Overflow: where the book's sample code has

JobConf job = new JobConf(MaxTemperature.class);

change it to

JobConf job = new JobConf();
job.setJar("/root/hadoop-resources/code/maxtemperature.jar");

Passing a class to the JobConf constructor asks Hadoop to locate the jar containing that class on the classpath, which fails if the jar isn't there; setJar names the job jar explicitly, so Hadoop knows exactly which file to ship to the tasks.
Then repackage and run, and the problem is solved cleanly ~