Hadoop Learning: Preface

Source: Internet
Author: User
When I was looking for a job, my goal was to join an Internet or e-commerce company, but I ran into a lot of trouble. The reason is that my experience is in enterprise-level development, which does not align with Internet development. Internet companies all want Internet development experience, and I fall short in that regard.

During an interview, I asked the interviewer what technologies Internet companies commonly use. He told me that Internet development is mainly focused on big data processing and cloud computing; specifically, Hadoop, HBase, high concurrency, and network programming.

So I have recently started learning Hadoop, and I am leaving my learning footprints here.


First, the introduction to Hadoop from Baidu Encyclopedia:

Hadoop:

Hadoop is a distributed-system infrastructure developed by the Apache Software Foundation.

It lets you develop distributed programs without understanding the underlying distributed details, making full use of the cluster's power for high-speed computing and storage.

[1] Hadoop implements a distributed file system, the Hadoop Distributed File System (HDFS). HDFS is highly fault-tolerant and designed to be deployed on low-cost hardware. It also provides high throughput for accessing application data, making it suitable for applications with large data sets. HDFS relaxes some POSIX requirements to allow streaming access to data in the file system.

The core design of the Hadoop framework consists of HDFS and MapReduce: HDFS provides storage for massive data, while MapReduce provides computation over it. [2]
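To get a feel for the MapReduce model described above, here is a minimal local sketch of its three phases (map, shuffle, reduce) applied to the classic word-count problem. This is only an in-memory illustration in Python, not Hadoop itself; real Hadoop jobs are typically written against the Java MapReduce API and run distributed across a cluster, with the framework performing the shuffle between phases.

```python
from collections import defaultdict

def map_phase(line):
    """Map: emit a (word, 1) pair for every word in one line of input."""
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework does
    between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: sum the counts collected for one word."""
    return key, sum(values)

lines = ["hello hadoop", "hello hdfs and mapreduce"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'hello': 2, 'hadoop': 1, 'hdfs': 1, 'and': 1, 'mapreduce': 1}
```

The point of the model is that `map_phase` and `reduce_phase` are independent per key, so the framework can run them in parallel on many machines while HDFS supplies the input splits.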


Statement: This article is licensed under CC BY-NC-SA | yu blog. Please credit the source when reprinting.
