SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:59.240s
[INFO] Finished at: Thu Jan 18:51:59 JST 2015
[INFO] Final memory: 168M/435M
[INFO] ------------------------------------------------------------------------
The compiled binary package is located at Hadoop-2.3.0-src/hadoop-dist/target/hadoop-2.3.0.tar.gz
PS: When installing
Install and configure Hadoop in Linux
Before installing Hadoop on Linux, you need to install two programs:
JDK 1.6 or later;
SSH (the Secure Shell protocol); we recommend installing OpenSSH (a minimal install sketch appears a few lines below).
The following describes the reasons for installing these two programs:
Hadoop
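A minimal sketch of installing the two prerequisites on an Ubuntu/Debian-style system (the package names openjdk-7-jdk and openssh-server are assumptions; other distributions and JDK versions use different names):

$ sudo apt-get install openjdk-7-jdk openssh-server   # JDK plus the SSH daemon
$ java -version    # confirm the JDK is visible
$ ssh localhost    # confirm sshd accepts connections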
Install Hadoop in Linux (pseudo-distributed mode). A note before we begin: when installing Hadoop in Linux, pay attention to permission issues and grant the necessary permissions to a non-root hadoop user. This article does not cover how to create a new user in
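A minimal sketch of creating such a non-root hadoop user and giving it sudo rights (the user name is illustrative; on CentOS the admin group is wheel rather than sudo):

$ sudo useradd -m hadoop            # create the user with a home directory
$ sudo passwd hadoop                # set its password
$ sudo usermod -aG sudo hadoop      # let it run sudo on Ubuntu/Debian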
Xshell, Xftp. Logging in through the VirtualBox console makes uploading files troublesome, so use Xshell for remote login and Xftp to upload files. Upload hadoop-2.7.3.tar.gz and jdk-8u91-linux-x64.rpm to the /usr/local directory. Novice tip: in the right-hand (remote) pane select the /usr/local directory, then double-click the archive in the left-hand (local) pane and the upload succeeds. Configuring the Hadoop
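If you prefer the command line to Xftp, the same upload can be done with scp (the user and host address are placeholders for your virtual machine):

$ scp hadoop-2.7.3.tar.gz jdk-8u91-linux-x64.rpm root@192.168.92.128:/usr/local/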
I recently tried to set up a Hadoop environment, but at first I really did not know how to go about it: every step turned up a new error. Many of the answers online also share common pitfalls (the most typical being command case sensitivity: hadoop commands are lower case, yet many people write Hadoop), so when you encount
Two Cyan. Email: [Email protected]; Weibo: HTTP://WEIBO.COM/XTFGGEF. Now it is time to learn Hadoop in a systematic way; it may be a bit late, but to learn a hot technology you have to start somewhere, so let's start with the installation environment. Official documents. The software and versions used in this article are as follows:
Ubuntu 14.10 Server Edition
Hadoop 2.6.0
JDK 1.7.0_71
SSH
Rsync
First, you prepare a machine with
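Since the list above calls for SSH and rsync, a minimal sketch of installing them on Ubuntu Server (they are often preinstalled; apt-get is assumed):

$ sudo apt-get update
$ sudo apt-get install ssh rsync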
Configuration requirements
Host memory: 4 GB.
Disk: more than GB.
The host machine runs a common Linux distribution.
Linux Container (LXD). Take a host running Ubuntu 16.04 as an example.
Install LXD:
$ sudo apt install lxd
$ sudo lxd init
To view the available image sources (if you use the default image source, you can skip the next two steps and go directly to the launch step):
$ lxc remote list
Selec
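A minimal sketch of the launch step itself (the container name hadoop-master and the ubuntu:16.04 image alias are illustrative choices):

$ lxc launch ubuntu:16.04 hadoop-master   # create and start an Ubuntu 16.04 container
$ lxc list                                # check that it is running
$ lxc exec hadoop-master -- bash          # open a shell inside it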
This article assumes the reader has a basic understanding of Docker, has mastered basic Linux commands, and understands Hadoop's general installation and simple configuration.
Experimental environment: Windows 10 + VMware Workstation 11 + Linux 14.04 Server + Docker 1.7.
Windows 10 is the physical (host) operating system on the 10.41.0.0/24 network segment; the virtual machine uses NAT networking with the 192.168.92.0/24 subnet and gateway
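A minimal sketch of starting a container for a Hadoop node in that environment (the image tag and container name are illustrative, not from the original article):

$ docker pull ubuntu:14.04
$ docker run -it --name hadoop-node ubuntu:14.04 /bin/bash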
Modifying the virtual machine's local address for Hadoop on Linux
Hadoop runs on Linux, but when we remotely access Hadoop HDFS from Eclipse on Windows, the two machines cannot reach each other if their addresses are not in the same subnet (CIDR block).
In Windows, you only need to ping from the DOS prompt to test whether rem
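A minimal sketch of that reachability check from the Windows command prompt (the address is a placeholder; 9000 is only a commonly used fs.defaultFS port, not necessarily yours):

> ping 192.168.92.128
> telnet 192.168.92.128 9000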
A lot of people have written about installing Hadoop under a Linux virtual machine, but the actual installation involves quite a few details that others do not mention yet really need attention to keep everything running; this also serves as a small summary for myself.
First, about the Linux system settings
1. Set static
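Assuming the truncated item refers to a static IP, a minimal sketch for a CentOS 6-style system (interface name, address, netmask and gateway are placeholders):

# /etc/sysconfig/network-scripts/ifcfg-eth0
DEVICE=eth0
ONBOOT=yes
BOOTPROTO=static
IPADDR=192.168.92.130
NETMASK=255.255.255.0
GATEWAY=192.168.92.2

$ sudo service network restart    # apply the new address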
1. Preparing the Linux environment
1.1 Shutting down the firewall
# check the firewall status
service iptables status
# stop the firewall
service iptables stop
# check whether the firewall starts on boot
chkconfig iptables --list
# disable the firewall on boot
chkconfig iptables off
1.2 Modifying sudo
su root
vim /etc/sudoers
Add sudo permission for the hadoop user:
hadoop ALL=(ALL) ALL
To close the Linux
Learning Hadoop begins, as is customary, with compiling the source code and importing it into Eclipse, so that whenever we need to understand a particular piece, or hit a problem, we can go straight to the source. Before compiling the hadoop-2.4.1 source code, Maven and Ant environments must be installed, and Hadoop needs protoc 2.5.0 support, so download protoc as well. I downloaded protobuf-2.5.0.tar.bz2. There are several dependencies to insta
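A minimal sketch of the protoc and Hadoop build steps described above (the directory names are illustrative; the Maven flags follow the usual Hadoop build recipe, so verify them against BUILDING.txt in the source tree):

$ tar -xjf protobuf-2.5.0.tar.bz2 && cd protobuf-2.5.0
$ ./configure && make && sudo make install && sudo ldconfig
$ protoc --version                       # should report libprotoc 2.5.0
$ cd ../hadoop-2.4.1-src
$ mvn package -Pdist,native -DskipTests -Dtar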
because it works in parallel, speeding up processing through parallelism. Hadoop is also scalable and can handle petabyte-scale data. In addition, Hadoop relies on commodity servers, so its cost is low and it can be used by anyone.
As you may have thought, Hadoop is ideal for running on a Linux production platform.
NNStorageRetentionManager: Going to retain 1 images with txid >= 0
15/01/14 19:30:17 INFO util.ExitUtil: Exiting with status 0
15/01/14 19:30:17 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/60.191.124.254
************************************************************/
2. Start Hadoop
Run start-all.sh under hadoop/sbin, or start-dfs.sh and start-yarn.sh.
3. Verifying the installation
Such a complete distributed
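A minimal sketch of the verification step (the jps process names are what a healthy start normally shows; 50070 and 8088 are the Hadoop 2.x default web UI ports):

$ jps                       # expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
$ hdfs dfsadmin -report     # summary of live DataNodes
# NameNode web UI: http://<namenode-host>:50070, ResourceManager web UI: http://<host>:8088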
The system is CentOS 6.9, the Hadoop version is 2.8.3, and the virtual machine is VMware Workstation. This article focuses on Linux virtual machine installation, environment configuration, and Hadoop local (standalone) mode installation. Pseudo-distributed and installation under Window
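A minimal sketch of checking a local (standalone) mode installation with the bundled example job, as in the official Hadoop docs (run from the unpacked hadoop-2.8.3 directory; input/ and output/ are illustrative paths):

$ mkdir input && cp etc/hadoop/*.xml input/
$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.3.jar grep input output 'dfs[a-z.]+'
$ cat output/*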
Source: http://suxain.iteye.com/blog/1748356
Hadoop is a distributed system that runs on Linux. As developers with limited resources, we often have to run Hadoop clusters in terminal-only virtual machines, and in that environment development and debugging become difficult. So, is there a way to develop and debug from Windows? The answer is yes.