When I was looking for a job, my goal was to join an Internet or e-commerce company, but I ran into a lot of trouble. The reason is that my experience so far has been in enterprise-level development, which is quite different from Internet development. Internet companies all want Internet development experience, and I fall short in that regard.
During an interview, I asked the interviewer what technologies are commonly used in the Internet industry. He told me that Internet development mainly revolves around big data processing and cloud computing, and specifically around Hadoop, HBase, high concurrency, and network programming.
So recently I have been preparing to learn Hadoop, and I am leaving my learning footprints here.
First, the introduction to Hadoop from Baidu Encyclopedia:
Hadoop:
A distributed system infrastructure developed by the Apache Software Foundation. It lets users develop distributed programs without understanding the underlying details of distribution, making full use of the power of a cluster for high-speed computing and storage. Hadoop implements a distributed file system, HDFS. HDFS features high fault tolerance and is designed to be deployed on low-cost hardware. It also provides high-throughput access to application data, making it suitable for applications with very large data sets. HDFS relaxes certain POSIX requirements and allows streaming access to the data in the file system. The core design of the Hadoop framework consists of HDFS and MapReduce: HDFS provides storage for massive amounts of data, while MapReduce provides computation over massive amounts of data.
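To get a feel for the HDFS-plus-MapReduce split, here is the classic word-count job, essentially the example from the official Hadoop MapReduce tutorial: the mapper emits a (word, 1) pair for every word it sees, and the reducer sums the pairs for each word. The input and output paths passed on the command line are placeholders for directories on HDFS.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: for every word in its input split, emit the pair (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sum all the counts emitted for one word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // args[0]: HDFS input directory, args[1]: HDFS output directory.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Assuming the class is packaged as wordcount.jar, it would be submitted with something like `hadoop jar wordcount.jar WordCount /user/yu/input /user/yu/output` (the paths here are hypothetical); note that the output directory must not already exist, or the job will refuse to start.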
Statement: this article is licensed under the CC BY-NC-SA license. | yu blog
Please credit the source when reprinting.