Liaoliang's Most Popular One-Stop Cloud Computing, Big Data and Mobile Internet Solutions Course V3, Hadoop Enterprise Complete Training: Rocky 16 Lessons (HDFS & MapReduce & HBase & Hive & ZooKeeper & Sqoop & Pig & Flume & Project)




Hadoop is the de facto standard software framework of cloud computing: it is the realization and commercialization of cloud computing's ideas and mechanisms, and it is the core and most valuable content in all of cloud computing technology learning.



How to start from the perspective of enterprise-level development practice and, through real hands-on operation, master Hadoop step by step in a comprehensible way is the heart of this course.





The voice of cloud computing learners:



How to take an enterprise-level development perspective, keep practicing hands-on, and master Hadoop step by step until one can start enterprise work directly is the core question puzzling many friends interested in cloud computing. This course solves exactly that problem: learners only need to follow the videos and operate step by step to gain complete, pain-free entry into Hadoop enterprise-level development.



At the same time, this course analyzes Hadoop's core source code, so that learners gain a real ability to modify the Hadoop framework and build a framework suited to their own business situation.






Four firsts in the Hadoop field:



1. Full coverage of all of Hadoop's core content

2. A focus on hands-on implementation, mastering Hadoop enterprise-level combat techniques step by step

3. In-depth analysis of Hadoop's core source code during the course, giving students the ability to modify the Hadoop framework

4. Mastery of the entire process of analyzing, developing, and deploying a complete Hadoop project





Lecturer:



Liaoliang (email: [email protected]; phone: 18610086859; QQ: 1740415547)



China's only all-in-one expert spanning mobile internet and cloud computing big data;



President and chief expert of the Spark Asia-Pacific Research Institute for cloud computing and big data;






As president and chief expert of the Spark Asia-Pacific Research Institute and a Spark source-level expert, he has devoted more than two years of painstaking research to Spark (since January 2012), completing a thorough study of the source code of 14 different Spark versions while continually applying Spark's features in the real world. He wrote the world's first systematic book on Spark, and launched both the world's first systematic Spark course and the world's first high-end Spark course (covering Spark core profiling, source-code interpretation, performance optimization, and business case analysis). A Spark source-code research enthusiast, he is fascinated by the transformation and application of Spark's new big data processing model.



A Hadoop source-level expert, he once led the development of a well-known company's Hadoop-like framework and focuses on providing one-stop Hadoop solutions. As one of the earliest practitioners of distributed big data processing in cloud computing and an avid Hadoop enthusiast, he constantly uses Hadoop in practice to solve storage and processing problems across different big data domains, and is currently responsible for Hadoop research and development for a search engine. He authored "The Master Road of Cloud Computing Distributed Big Data Hadoop Combat: Starting from Scratch", "The Master Road of Cloud Computing Distributed Big Data Hadoop Combat: The Master's Rise", and "The Master Road of Cloud Computing Distributed Big Data Hadoop Combat: The Master's Summit", among others;






Android architect, senior engineer, consultant, training expert;



Proficient in Android, HTML5, Hadoop, English broadcasting and bodybuilding;



Dedicated to one-stop software, hardware, and cloud integrated solutions for Android, HTML5, and Hadoop;



One of China's earliest (2007) technical experts and technical entrepreneurs engaged in Android system porting, software-hardware integration, framework modification, and application development, as well as Android system testing and application testing.






One of the earliest practitioners in the HTML5 field (2009), who successfully implemented customized HTML5 browsers for multiple organizations and participated in the development of a well-known HTML5 browser;



Author of more than 10 best-selling IT books;





Comprehensive Hadoop professional training

Training targets

1. Anyone interested in cloud computing, distributed data storage and processing, and big data
2. Administrators of traditional databases such as Oracle, MySQL, and DB2
3. Java developers
4. Web server-side developers

Prerequisites for taking the course

A strong interest in cloud computing and the ability to read basic Java syntax.

Target abilities after training

Get started with Hadoop directly, able to take on the work of a Hadoop development engineer or system administrator right away.

Training Skills Objectives

• Thoroughly understand the cloud computing technology that Hadoop represents

• Ability to build and manage Hadoop clusters

• Ability to modify the Hadoop framework

• Ability to develop your own network disk (cloud storage) application

• Ability to modify specific parts of the HDFS source code

• Ability to analyze the specific MapReduce execution process from the source-code level and develop MapReduce code

• Mastery of how Hadoop transforms HDFS files into key-value pairs for map() calls

• Mastery of MapReduce's internal operation and implementation details, and the ability to modify MapReduce

• The practical skills of a real-world enterprise Hadoop administrator

• Ability to understand and operate ZooKeeper in two ways, from the command line and through Java

• Mastery of HBase enterprise-level development and administration

• Mastery of Pig enterprise-level development and administration

• Mastery of Hive enterprise-level development and administration

• Ability to use Sqoop to move data freely between traditional relational databases and HDFS

• Ability to collect and manage distributed logs with Flume

• Mastery of the entire process of analyzing, developing, and deploying a complete Hadoop project

Training career goals

Hadoop engineer, able to develop Hadoop distributed applications of any complexity

Hadoop administrator, able to build and manage Hadoop clusters

The ability to research and modify Hadoop framework source code

The ability to carry out the entire process of analyzing, developing, and deploying a complete Hadoop project





Training Content





Day 1

Topic 1: Three questions about Hadoop (thoroughly understanding Hadoop)

1. Why is Hadoop the true open-source standard software framework for distributed big data in cloud computing?

2. How does Hadoop work?

3. What is Hadoop's ecosystem architecture, and what are the specific functions of each module?

 

Topic 2: Hadoop clusters and cluster management (gaining the ability to build and manage Hadoop clusters)

1. Building a Hadoop cluster

2. Monitoring the Hadoop cluster

3. Managing the Hadoop cluster

4. Running MapReduce programs on the cluster

 

Topic 3: Thoroughly mastering HDFS (gaining the ability to develop your own network disk application)

1. Anatomy of the HDFS architecture

2. The NameNode, DataNode, and SecondaryNameNode architecture

3. Best practices for ensuring NameNode high reliability

4. How DataNodes partition blocks and store them

5. Changing the NameNode and DataNode data storage locations

6. Operating HDFS from the command line

7. Operating HDFS from Java (see the sketch after this list)
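
As a taste of point 7, here is a minimal sketch of driving HDFS through the Java FileSystem API; the NameNode URI hdfs://master:9000 and the /demo paths are illustrative assumptions, not values from the course.

    import java.io.InputStream;
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class HdfsDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Assumed NameNode address; use your cluster's fs.default.name here.
            FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), conf);

            // Create a directory and write a file (the Java equivalents of
            // "hadoop fs -mkdir" and "hadoop fs -put" from point 6).
            fs.mkdirs(new Path("/demo"));
            FSDataOutputStream out = fs.create(new Path("/demo/hello.txt"));
            out.writeUTF("hello hdfs");
            out.close();

            // Read the file back (the equivalent of "hadoop fs -cat").
            InputStream in = fs.open(new Path("/demo/hello.txt"));
            IOUtils.copyBytes(in, System.out, 4096, true);
            fs.close();
        }
    }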

 

Topic 4: Mastering HDFS in depth (gaining the ability to modify specific parts of the HDFS source code)

1. Anatomy of the RPC architecture

2. Source-level analysis of how Hadoop is built on RPC

3. Source-level analysis of the HDFS RPC implementation

4. Source-level analysis of RPC communication between the client and the NameNode (a minimal RPC sketch follows this list)
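
The sketch below illustrates the RPC layer these four points dissect, written against the Hadoop 1.x-era ipc API; the EchoProtocol interface, port 9999, and the localhost address are illustrative assumptions. The client-side proxy mirrors how DFSClient reaches the NameNode through ClientProtocol.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.ipc.RPC;
    import org.apache.hadoop.ipc.Server;
    import org.apache.hadoop.ipc.VersionedProtocol;

    public class RpcDemo {
        // Hadoop 1.x RPC protocols extend VersionedProtocol, just as
        // ClientProtocol (client <-> NameNode) does in the HDFS source.
        public interface EchoProtocol extends VersionedProtocol {
            long VERSION = 1L;
            Text echo(Text message) throws IOException;
        }

        static class EchoServer implements EchoProtocol {
            public Text echo(Text message) { return message; }
            public long getProtocolVersion(String protocol, long clientVersion) {
                return VERSION;
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Server side: the same call the NameNode uses to expose its protocols.
            Server server = RPC.getServer(new EchoServer(), "localhost", 9999, conf);
            server.start();
            // Client side: obtain a typed proxy, as DFSClient does for the NameNode.
            EchoProtocol proxy = (EchoProtocol) RPC.getProxy(
                    EchoProtocol.class, EchoProtocol.VERSION,
                    new InetSocketAddress("localhost", 9999), conf);
            System.out.println(proxy.echo(new Text("hello rpc")));
            RPC.stopProxy(proxy);
            server.stop();
        }
    }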

 


Day 2


Topic 1: Thoroughly mastering MapReduce (analyzing the specific MapReduce execution process from the source-code level and gaining the ability to develop MapReduce code)

1. The classic steps of MapReduce execution

2. Analysis of the WordCount execution process

3. Anatomy of Mapper and Reducer

4. Custom Writable types

5. The differences between the old and new APIs and how to use them

6. Packaging the MapReduce program into a jar and running it from the command line (a WordCount sketch follows this list)
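
Pulling points 1 to 6 together, here is a minimal sketch of the standard textbook WordCount against the new (org.apache.hadoop.mapreduce) API of the Hadoop 1.x era; class names and paths are the usual textbook choices, not code from the course.

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);  // emit (word, 1)
                }
            }
        }

        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values,
                    Context context) throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) sum += v.get();
                context.write(key, new IntWritable(sum));  // emit (word, total)
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "word count"); // 1.x-style constructor
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Packaged into a jar, it would be launched with a command of the form "hadoop jar wordcount.jar WordCount /input /output", where the input and output paths are illustrative.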






Topic 2: Mastering MapReduce in depth (mastering how Hadoop converts HDFS files into key-value pairs for map() calls)

1. How does Hadoop convert HDFS files into key-value pairs?

2. Source-level analysis of how Hadoop reads an HDFS file and converts it into key-value pairs

3. Source-level analysis of how map() is called on the converted key-value pairs (an InputFormat sketch follows this list)
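
The contract behind points 1 to 3: an InputFormat's getSplits() carves the HDFS file into InputSplits, and a RecordReader turns each split into the (key, value) pairs handed to map(). Below is a minimal sketch that delegates to the standard LineRecordReader; the class name DemoInputFormat is made up.

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;

    public class DemoInputFormat extends FileInputFormat<LongWritable, Text> {
        @Override
        public RecordReader<LongWritable, Text> createRecordReader(
                InputSplit split, TaskAttemptContext context) {
            // LineRecordReader produces exactly the pairs map() receives:
            // key = byte offset of the line in the file, value = the line text.
            return new LineRecordReader();
        }
    }

TextInputFormat in the Hadoop source is essentially this plus a splittability check, which is why a plain map() sees (LongWritable offset, Text line) pairs by default.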






Topic 3: Mastering MapReduce thoroughly (mastering MapReduce's internal operation and implementation details, and gaining the ability to modify MapReduce)

1. Hadoop's built-in counters and how to define custom counters

2. The specific role of the Combiner, how to use it, and the restrictions on its use

3. Best practices for using a Partitioner

4. Analysis of Hadoop's built-in sorting algorithm

5. Custom sorting algorithms

6. Hadoop's built-in grouping algorithm

7. Custom grouping algorithms

8. Common MapReduce scenarios and their algorithm implementations (a counter and Partitioner sketch follows this list)
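
To make points 1 and 3 concrete, here is a minimal sketch of a custom counter and a custom Partitioner; the counter group and name ("Quality"/"EMPTY_LINES") and the two-reducer split are made-up examples.

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;

    public class CountersAndPartitioner {

        // Point 1: a custom counter incremented from the mapper; it shows up
        // in the job's counter report alongside the built-in counters.
        public static class CountingMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                if (value.toString().trim().isEmpty()) {
                    context.getCounter("Quality", "EMPTY_LINES").increment(1);
                    return;  // skip empty lines entirely
                }
                context.write(new Text(value.toString()), new IntWritable(1));
            }
        }

        // Point 3: a custom Partitioner decides which reducer receives each
        // key; here keys starting with a-m go to reducer 0 and the rest to
        // reducer 1, assuming the job sets two reduce tasks.
        public static class FirstLetterPartitioner
                extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                char c = Character.toLowerCase(key.toString().charAt(0));
                return (c <= 'm' ? 0 : 1) % numPartitions;
            }
        }
    }

A job would wire these in with job.setMapperClass(CountingMapper.class), job.setPartitionerClass(FirstLetterPartitioner.class), and job.setNumReduceTasks(2).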






Topic 4: Advanced Hadoop cluster practices (the practical skills of a real-world enterprise Hadoop administrator)

1. Dynamically adding Hadoop slave nodes

2. Dynamically changing the replication factor in Hadoop

3. Managing a Hadoop cluster with commands in practice

4. Analyzing Hadoop's security model

5. Hands-on log analysis






Day 3











Topic 1: ZooKeeper in action (gaining the ability to understand and operate ZooKeeper in two ways, from the command line and through Java)

1. ZooKeeper architecture analysis and cluster setup

2. Operating ZooKeeper from the command line

3. Operating ZooKeeper from Java (see the sketch after this list)
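
As a taste of point 3, a minimal sketch using the standard ZooKeeper Java client; the connection string localhost:2181 and the /demo znode are illustrative assumptions.

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.WatchedEvent;
    import org.apache.zookeeper.Watcher;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class ZkDemo {
        public static void main(String[] args) throws Exception {
            // Connect to the ensemble; the Watcher receives session events.
            ZooKeeper zk = new ZooKeeper("localhost:2181", 30000, new Watcher() {
                public void process(WatchedEvent event) {
                    System.out.println("event: " + event);
                }
            });

            // Equivalent to the CLI's "create /demo hello" from point 2.
            zk.create("/demo", "hello".getBytes(),
                    ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);

            // Equivalent to "get /demo".
            byte[] data = zk.getData("/demo", false, null);
            System.out.println(new String(data));

            zk.close();
        }
    }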


Topic 2: HBase in action (gaining the ability to master HBase enterprise-level development and administration)

1. The HBase architecture and its implementation

2. The HBase data model and storage model

3. Operating HBase from the shell

4. Operating HBase from Java (see the sketch after this list)

5. Importing bulk data into HBase with MapReduce
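
For point 4, a minimal sketch against the classic (0.9x-era) HBase client API; the ZooKeeper quorum, the table name user_info, and the info:name column are illustrative assumptions, and the table is assumed to exist already.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            conf.set("hbase.zookeeper.quorum", "localhost"); // assumed quorum

            // Assumes a table created beforehand, e.g. in the shell from
            // point 3: create 'user_info', 'info'
            HTable table = new HTable(conf, "user_info");

            // Write one cell: row "row1", column info:name, value "alice".
            Put put = new Put(Bytes.toBytes("row1"));
            put.add(Bytes.toBytes("info"), Bytes.toBytes("name"),
                    Bytes.toBytes("alice"));
            table.put(put);

            // Read the cell back.
            Result result = table.get(new Get(Bytes.toBytes("row1")));
            System.out.println(Bytes.toString(
                    result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"))));

            table.close();
        }
    }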


Topic 3: Pig in action (gaining the ability to master Pig enterprise-level development and administration)

1. The Pig architecture

2. Manipulating data with Pig

3. Hands-on Pig examples with real data


Topic 4: Hive in action (gaining the ability to master Hive enterprise-level development and administration)

1. Anatomy of the Hive architecture

2. How Hive stores data in HDFS

3. Using MySQL as the Hive metastore

4. Internal tables, partitioned tables, external tables, and bucketed tables

5. Views

6. Custom Hive functions (a UDF sketch follows this list)
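
For point 6, a minimal custom Hive function in the classic UDF style, where Hive discovers the evaluate() method by reflection; the class name LowerCase is made up for illustration.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public final class LowerCase extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // Hive passes NULL columns as null
            }
            return new Text(input.toString().toLowerCase());
        }
    }

Packaged into a jar, it would be registered from the Hive shell with ADD JAR and CREATE TEMPORARY FUNCTION my_lower AS 'LowerCase', then called like any built-in function.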



Day 4












Topic 1: Sqoop in action (gaining the ability to use Sqoop to move data freely between traditional relational databases and HDFS)

1. The Sqoop architecture

2. Hands-on: importing data from MySQL into HDFS with Sqoop

3. Hands-on: exporting data from HDFS into MySQL with Sqoop

4. Defining Sqoop jobs






Topic 2: Flume in action (gaining the ability to collect and manage distributed logs with Flume)

1. Analysis of the Flume architecture

2. Agent configuration

3. Dynamically monitoring file changes in a directory

4. Importing data into HDFS

5. A worked example: monitoring directory changes and importing the data into HDFS






Topic 3: Advanced Hadoop system administration (gaining the practical skills of an enterprise Hadoop system administrator)

1. Hadoop safe mode

2. System monitoring

3. System maintenance

4. Commissioning and decommissioning nodes

5. System upgrades

6. More system administration tools in practice

7. Best practices in system administration






Topic 4: Telecom log project (gaining the ability to master the entire process of analyzing, developing, and deploying a complete Hadoop project)



By collecting a telecom operator's logs of users' calls, internet access, and so on, and then analyzing and monitoring users' phone and network behavior, this project ties together the main content of the previous lessons and makes everyone familiar with the entire process of analyzing, developing, and deploying a complete Hadoop project.





