1. Learning Hadoop centers on four frameworks: HDFS, MapReduce, Hive, and HBase. These four are the core of Hadoop, the hardest to learn, and also the most widely used.
2. Before starting, make sure you have the prerequisites: Java fundamentals, a Linux environment, and the common Linux commands.
3. Cover the basic knowledge of Hadoop: the features of the HDFS file system, Map/Reduce, reading the Hadoop Mapper class, reading the Hadoop Reducer class, and the MapReduce shuffle and sort.
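To build intuition for the map → shuffle/sort → reduce flow before reading the Mapper and Reducer classes, here is a minimal word-count sketch using plain Unix pipes, in the spirit of Hadoop Streaming. The stage names are only an analogy; real Hadoop runs Mapper/Reducer classes in parallel across a cluster.

```shell
# map -> shuffle/sort -> reduce, simulated with Unix pipes
printf 'hadoop hive hadoop hbase hadoop hive\n' |
  tr ' ' '\n' |          # "map": emit one word (key) per line
  sort |                 # "shuffle and sort": identical keys become adjacent
  uniq -c |              # "reduce": aggregate a count per key
  awk '{print $2, $1}'   # tidy output as "word count"
```

Running this prints one `word count` pair per line (`hadoop 3`, `hbase 1`, `hive 2`), which is exactly the grouping-by-key behavior the shuffle phase provides to reducers in Hadoop.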
4. Hadoop has three deployment modes: standalone (single-machine), pseudo-distributed, and fully distributed. Standalone mode needs little attention. For learning, my personal advice is to build a pseudo-distributed cluster; fully distributed mode is what production environments use, so once you are comfortable with the pseudo-distributed setup, make sure you also understand full distribution.
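As an illustration of what distinguishes the modes in configuration, a pseudo-distributed setup points the default filesystem at a local HDFS daemon in `core-site.xml` and sets the block replication factor to 1 in `hdfs-site.xml` (the port and paths below follow the common defaults; adjust them to your own install):

```xml
<!-- etc/hadoop/core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- etc/hadoop/hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

In a fully distributed cluster the same properties exist, but `fs.defaultFS` names the NameNode host and `dfs.replication` is typically left at 3, which is why understanding the pseudo-distributed configuration transfers directly.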
5. Learn about the features and stability of each Hadoop release and choose a stable version suited to your learning.
6. Once the environment is set up, practice the basics of working with the installation: the Hadoop shell commands.
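For example, a typical first session with the filesystem shell looks like the following (this assumes a running cluster; the file and directory names are placeholders for your own):

```shell
hadoop fs -mkdir -p /user/student          # create a directory in HDFS
hadoop fs -put notes.txt /user/student/    # upload a local file into HDFS
hadoop fs -ls /user/student                # list the directory contents
hadoop fs -cat /user/student/notes.txt     # print a file stored in HDFS
hadoop fs -get /user/student/notes.txt ./copy.txt   # download back to local disk
hadoop fs -rm /user/student/notes.txt      # delete the file from HDFS
```

Notice how the subcommands deliberately mirror the familiar Linux commands (`mkdir`, `ls`, `cat`, `rm`), which is why the Linux prerequisites from step 2 pay off here.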
7. With the basics in hand, this is a good time to find a book and study Hadoop systematically, e.g. "Hadoop: The Definitive Guide" (2nd edition).
The above section is drawn from: Hadoop Novice Learning Guide
[Hadoop]hadoop Learning Route