HDFS, as a distributed file system, is the foundation of all these projects, so analyzing HDFS is a good way to understand the other systems. Since Hadoop's HDFS and MapReduce live in the same project, we analyze them together.
If you think of the whole of Hadoop as a Java class, then HDFS is the static variable of that class, and the other projects are its methods.
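The analogy above can be rendered as a toy class. This is purely illustrative; none of these identifiers exist in the real Hadoop codebase.

```java
// Illustrative only: a toy rendering of the "Hadoop as a class" analogy.
// None of these identifiers are real Hadoop classes.
public class Hadoop {
    // HDFS as the shared "static variable": state every component builds on.
    static final String HDFS = "the distributed file system";

    // Other projects as "methods" operating on that shared state.
    static String mapReduce() {
        return "processes data stored in " + HDFS;
    }

    public static void main(String[] args) {
        System.out.println(Hadoop.mapReduce());
    }
}
```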
HDFS
Implementation of HDFS, the Hadoop Distributed File System
FS
File system abstraction, which can be understood as a unified file access interface that supports multiple file system implementations
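The idea of a unified interface over multiple file system implementations can be sketched as follows. This is a simplified, hypothetical model (ToyFileSystem, InMemoryFileSystem), not Hadoop's actual org.apache.hadoop.fs.FileSystem API:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// A toy "file system" abstraction: callers program against one interface,
// while concrete backends (local disk, in-memory, HDFS, ...) plug in behind it.
interface ToyFileSystem {
    byte[] read(String path) throws IOException;
    void write(String path, byte[] data) throws IOException;
}

// One possible backing implementation: an in-memory map standing in for a store.
class InMemoryFileSystem implements ToyFileSystem {
    private final Map<String, byte[]> files = new HashMap<>();

    public byte[] read(String path) { return files.get(path); }
    public void write(String path, byte[] data) { files.put(path, data); }
}

public class FsDemo {
    public static void main(String[] args) throws IOException {
        ToyFileSystem fs = new InMemoryFileSystem(); // could be any implementation
        fs.write("/tmp/hello.txt", "hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(new String(fs.read("/tmp/hello.txt"), StandardCharsets.UTF_8));
    }
}
```

Swapping in a different backend only means constructing a different implementation; the calling code is unchanged, which is the point of the abstraction.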
IPC
A simple IPC implementation that relies on the encode/decode functionality provided by the io package
Reference: http://zhangyu8374.javaeye.com/blog/86306
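The dependency on the codec layer can be illustrated roughly: an RPC call reduces to encoding a method name and its arguments into bytes, shipping them, and decoding on the other side. The sketch below skips the network and shows only the encode/decode round trip; the class and method names are hypothetical, not Hadoop's ipc classes.

```java
import java.io.*;

public class IpcSketch {
    // Encode a "call" (method name + one int argument) into bytes,
    // the way an IPC layer would before writing them to a socket.
    static byte[] encodeCall(String method, int arg) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeUTF(method);
        out.writeInt(arg);
        return buf.toByteArray();
    }

    // Decode on the "server" side and dispatch (here: just echo the call).
    static String decodeAndDispatch(byte[] wire) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(wire));
        String method = in.readUTF();
        int arg = in.readInt();
        return method + "(" + arg + ")";
    }

    public static void main(String[] args) throws IOException {
        byte[] wire = encodeCall("getBlockSize", 64);
        System.out.println(decodeAndDispatch(wire)); // prints "getBlockSize(64)"
    }
}
```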
io
The presentation layer: encodes/decodes various kinds of data so they can be transmitted over the network easily
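Hadoop's io package centers on record types that know how to serialize and deserialize themselves (the Writable pattern). A minimal sketch of that pattern, using hypothetical names rather than Hadoop's actual Writable interface:

```java
import java.io.*;

// The core contract of the Writable pattern: a type serializes itself
// to a DataOutput and restores itself from a DataInput.
interface ToyWritable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

class IntRecord implements ToyWritable {
    int value;

    public void write(DataOutput out) throws IOException { out.writeInt(value); }
    public void readFields(DataInput in) throws IOException { value = in.readInt(); }
}

public class IoDemo {
    public static void main(String[] args) throws IOException {
        IntRecord a = new IntRecord();
        a.value = 42;

        // Encode to bytes, as if sending over the network...
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        a.write(new DataOutputStream(buf));

        // ...and decode on the receiving end.
        IntRecord b = new IntRecord();
        b.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
        System.out.println(b.value); // prints 42
    }
}
```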
MapReduce
Hadoop's implementation of Map/Reduce
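The programming model itself, independent of Hadoop's runtime, can be sketched with a toy word count: a map step emits (word, 1) pairs, a shuffle groups them by key, and a reduce step sums each group. This is plain Java, not the org.apache.hadoop.mapreduce API; here the three phases are collapsed into one in-memory loop.

```java
import java.util.*;

public class WordCountSketch {
    public static void main(String[] args) {
        List<String> lines = Arrays.asList("a b a", "b c");

        // Map: each line emits (word, 1) pairs.
        // Shuffle + Reduce: group by word and sum the ones.
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.split(" ")) {
                counts.merge(word, 1, Integer::sum);
            }
        }
        System.out.println(counts); // prints {a=2, b=2, c=1}
    }
}
```

In the real framework the map and reduce steps run as separate distributed tasks, and the shuffle moves data between machines; the logic per record is the same.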
Filecache
Provides a local cache of HDFS files to speed up Map/Reduce data access
Learning Hadoop in the footsteps of predecessors: structure and key points