MapReduce Working Principles
1. MapReduce working mechanism analysis diagram:
1. First, we write our MapReduce program and submit it from a client node. (In general, any node in the Hadoop cluster can act as the client, as long as Hadoop is installed on it and it can connect to the cluster.)
2. After receiving this request, the JobClient contacts the JobTracker and asks it for a new job ID. (The JobTracker is easy to locate from the core configuration files.)
3. The job's code and related resources are copied into HDFS so they can be distributed to the nodes that will run the job.
4. The job is then submitted to the JobTracker. (A minimal driver sketch covering steps 1-4 follows this list.)
5. The JobTracker initializes the job: it builds a set of in-memory data structures to record the job's running state and places the job in a queue, where it waits for the job scheduler to pick it up.
6. The JobTracker asks HDFS's NameNode which files hold the job's input data and which nodes those file blocks are scattered across. Because MapReduce tries to "run the computation near the data", that is, to place the program on the same nodes as the data it processes, this location information is needed to run the job.
7. The JobTracker and the TaskTrackers stay in touch through periodic heartbeats, so the JobTracker knows which TaskTrackers can take part in the computation: a TaskTracker must be alive (not down), and its load should be relatively low; if it is busy running other jobs, it is not a good candidate for new tasks, and an idle node is preferred.
8. Once it is decided which TaskTrackers will run the tasks, those TaskTrackers fetch the relevant Java code from HDFS, start a local Java Virtual Machine for each task, and run the task inside it. (A Mapper/Reducer sketch for this step appears after the driver below.)
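To make steps 1 through 4 concrete, here is a minimal driver sketch using the classic org.apache.hadoop.mapred API (the JobClient/JobTracker architecture described above, i.e. Hadoop 1.x). The word-count example and the class names WordCountDriver, WordCountMapper, and WordCountReducer are illustrative, not taken from the tutorial; the Mapper and Reducer classes it refers to are sketched further below.

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws IOException {
        // Step 1: the job configuration is built on the client node.
        JobConf conf = new JobConf(WordCountDriver.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(WordCountMapper.class);
        conf.setReducerClass(WordCountReducer.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        // Steps 2-4: JobClient asks the JobTracker for a job ID, copies the
        // job jar and configuration to HDFS, submits the job, and then waits
        // for it to complete.
        JobClient.runJob(conf);
    }
}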
That is the general process.
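The Java code that step 8 describes the TaskTrackers fetching from HDFS and running in a freshly launched JVM is essentially the user's Mapper and Reducer classes. Here is a sketch of the WordCountMapper and WordCountReducer referenced by the driver above, again using the classic mapred API; the class names and the word-count logic are illustrative.

// WordCountMapper.java
import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

// Runs in a child JVM started by a TaskTracker, ideally on a node that also
// holds a block of the input file (the "run near the data" idea from step 6).
public class WordCountMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            output.collect(word, one);   // emit (word, 1) for every token
        }
    }
}

// WordCountReducer.java (same imports as above)
public class WordCountReducer extends MapReduceBase
        implements Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> values,
                       OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
        int sum = 0;
        while (values.hasNext()) {
            sum += values.next().get();  // add up the 1s emitted by the mappers
        }
        output.collect(key, new IntWritable(sum));
    }
}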
The above content is based on the "Refining Data into Gold" tutorial: "MapReduce Working Principles, Illustrated".