MapReduce Error: java.io.EOFException at java.io.DataInputStream.readFully(DataInputStream.java:197)

Source: Internet
Author: User
I had this error when I ran a MapReduce job:

13/07/23 22:53:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/07/23 22:53:05 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/07/23 22:53:05 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/07/23 22:53:05 INFO input.FileInputFormat: Total input paths to process : 44
13/07/23 22:53:10 INFO mapred.JobClient: Running job: job_local_0001
13/07/23 22:53:10 INFO input.FileInputFormat: Total input paths to process : 44
13/07/23 22:53:10 INFO mapred.MapTask: io.sort.mb =
13/07/23 22:53:14 INFO mapred.JobClient:  map 0% reduce 0%
13/07/23 22:53:14 INFO mapred.MapTask: data buffer = 79691776/99614720
13/07/23 22:53:14 INFO mapred.MapTask: record buffer = 262144/327680
13/07/23 22:53:14 INFO mapred.MapTask: Starting flush of map output
13/07/23 22:53:14 WARN mapred.LocalJobRunner: job_local_0001
java.io.EOFException
    at java.io.DataInputStream.readFully(DataInputStream.java:197)
    at org.apache.hadoop.io.Text.readFields(Text.java:265)
    at Cooccurrence$TextPair.readFields(Cooccurrence.java:74)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
    at org.apache.hadoop.mapreduce.ReduceContext.nextKeyValue(ReduceContext.java:113)
    at org.apache.hadoop.mapreduce.ReduceContext.nextKey(ReduceContext.java:92)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:175)
    at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1222)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1265)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1129)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:549)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:623)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
13/07/23 22:53:15 INFO mapred.JobClient: Job complete: job_local_0001
13/07/23 22:53:15 INFO mapred.JobClient: Counters: 0
A few observations: 1. The program compiles without any warnings. 2. A StackOverflow answer said this is a while-loop problem, so I commented out the while loop in my program, but the error remained. 3. StackOverflow also said the error can be intermittent: it sometimes appears on one run and not on another. I tried many fixes suggested online, but none of them worked.
The cause I finally found, in my TextPair class:

@Override
public void readFields(DataInput in) throws IOException {
    first.readFields(in);
    second.readFields(in);
}

@Override
public void write(DataOutput out) throws IOException {
    first.write(out);
    out.write('\t');    // cause of the error
    second.write(out);
}
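The failure mechanism can be reproduced with plain java.io, without Hadoop. In this sketch, DataOutputStream.writeUTF stands in for Text's length-prefixed serialization (Text actually writes a vint length, writeUTF a two-byte one); the class and method names here are mine, not from the original job:

```java
import java.io.*;

public class TabBugDemo {
    // Writes two length-prefixed fields with a stray '\t' between them
    // (like the buggy TextPair.write) and reads them back without
    // consuming the tab (like the buggy TextPair.readFields).
    public static String demo() throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);

        out.writeUTF("first");
        out.write('\t');          // the extra byte the read side never consumes
        out.writeUTF("second");
        out.flush();

        DataInputStream in =
                new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        String result = in.readUTF();   // fine: consumes its own length prefix and bytes
        try {
            // The next two bytes are 0x09 0x00, misread as length 0x0900 = 2304,
            // so the internal readFully runs off the end of the stream.
            result += "|" + in.readUTF();
        } catch (EOFException e) {
            result += "|EOFException";
        }
        return result;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo());   // prints first|EOFException
    }
}
```

This is exactly the shape of the stack trace above: a length is computed from bytes that the stray tab has shifted, and readFully then asks for more bytes than the stream holds.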
The immediate cause of the error is that DataInput's readFully method throws an EOFException when it reaches the end of the stream before it has read the requested number of bytes.
The readFields method of the TextPair class directly calls the readFields method of the Text class:
public void readFields(DataInput in) throws IOException {
    int newLength = WritableUtils.readVInt(in);
    setCapacity(newLength, false);
    in.readFully(bytes, 0, newLength);
    length = newLength;
}
This is the source of the Text class's readFields method. It involves two classes: Java's DataInput and Hadoop's WritableUtils. Roughly, when Text serializes itself it writes the data's length first (as a vint), then the bytes. In my write method, after writing the first variable I wrote an extra tab character before writing the next variable, but I did not consume that extra byte when reading. As a result, when the second variable is read, the bytes interpreted as its length are not the length Text originally wrote; the stray '\t' has shifted everything by one byte. If the misread length happens to be shorter than the bytes remaining, no exception is thrown (though the result is wrong); if it is longer than what remains, there is nothing left to read, and readFully must throw an EOFException.

The fix is simply to remove out.write('\t').
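With the stray byte gone, write and readFields are symmetric and the round trip works. A stdlib-only sketch of the corrected pattern (again using writeUTF in place of Text's vint-prefixed format; names are mine):

```java
import java.io.*;

public class FixedPairDemo {
    // Symmetric write/read: every byte written is consumed on the read side.
    public static String roundTrip() throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);

        out.writeUTF("first");
        out.writeUTF("second");   // no separator: each field carries its own length
        out.flush();

        DataInputStream in =
                new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        return in.readUTF() + "|" + in.readUTF();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip());   // prints first|second
    }
}
```

If a tab-separated text form is wanted for job output, it belongs in the pair's toString() method, not in the binary write method.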
Thanks to my cs402 classmate.