Hadoop needs more than one machine for the data processing that comes later, so for someone learning on a tight budget, how do you build a suitable environment to study Hadoop? That is the preparation this article introduces.
There is not much to say about the environment itself; what you need in order to build it is:
Hardware: a computer running Win7
Software: virtual machine software VMware Workstation Pro
Operating system: CentOS 6.7
Terminal emulator: Xshell
Tunneling (reverse proxy) software: Ngrok
Installing and configuring VMware and installing the CentOS system are not covered here; the focus is on the following configuration.
1. SSH Service
Working directly in the virtual machine console is a hassle; once the SSH service is running you can connect from the Win7 host with a terminal tool instead.
First open the terminal and enter rpm -qa | grep ssh to check whether SSH is already installed on the current system.
If it is not installed, install it with yum install openssh-server openssh-clients.
When the installation is complete, start the SSH service with the service sshd start command.
Shut down the firewall with the service iptables stop command.
Bring up the network card with ifup eth0 (substitute your own network card configuration).
Run ifconfig to view the IP address.
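Put together, a minimal command sequence for this step on CentOS 6 looks roughly like this (eth0 is assumed to be your interface name, and the chkconfig lines are optional extras so the settings survive a reboot):

    rpm -qa | grep ssh                            # check whether openssh is already installed
    yum install openssh-server openssh-clients    # install it if it is missing
    service sshd start                            # start the SSH daemon
    chkconfig sshd on                             # optional: start sshd automatically at boot
    service iptables stop                         # stop the firewall (fine for a learning setup)
    chkconfig iptables off                        # optional: keep it off after a reboot
    ifup eth0                                     # bring up the network card
    ifconfig                                      # note the IP address to connect to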
Open Xshell and click New to open the new session window.
Fill the CentOS IP into the Host box; the port is 22 by default, so change it only if you use a different port.
When connecting, enter the username and password you use to log in to CentOS.
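If you prefer to test the connection from a command line rather than the Xshell dialog, any ordinary SSH client does the same job; the IP and username below are placeholders for your own values:

    ssh root@192.168.1.100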
2. Reverse Proxy
Go to the root directory with cd /.
Create a folder for the proxy software: mkdir ngrok.
Enter the directory: cd ngrok.
On Win7, download the Linux version of the Ngrok package.
In Xshell, enter yum install lrzsz to install the file upload/download tool.
Enter rz -e, click OK, and select the Ngrok package you just downloaded.
Extract the files: unzip linux_amd64.zip.
Enter the unpacked directory: cd linux_amd64.
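As a single sequence, the whole download-and-unpack step looks like this (the zip file name is the one used above and may differ slightly depending on the exact package you downloaded):

    cd /
    mkdir ngrok
    cd ngrok
    yum install lrzsz        # provides the rz/sz upload and download commands
    rz -e                    # pick the Ngrok zip you downloaded on Win7
    unzip linux_amd64.zip
    cd linux_amd64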
Edit the ngrok.cfg file: vim ngrok.cfg.
The modified file content looks like this (laid out in the standard ngrok 1.x client config format):

    server_addr: "tunnel.qydev.com:4443"
    trust_host_root_certs: false
    tunnels:
      ssh:
        remote_port: 2222
        proto:
          tcp: "22"

Save and exit.
Go back to the command line inside the virtual machine.
Start the service with the command ./ngrok -config=ngrok.cfg start ssh.
One line of the output shows tcp://tunnel.qydev.com:2222 -> 127.0.0.1:22, meaning the server's port 2222 is now forwarded to port 22 on the virtual machine.
You can now create a new Xshell session with host tunnel.qydev.com and port 2222, log in with the same username and password, and the system is reachable from the external network.
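You can also verify the tunnel from any machine with a plain SSH client (replace root with your own username):

    ssh -p 2222 root@tunnel.qydev.com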
At this point, anyone who wants to learn Hadoop can basically get a machine up and running with the configuration above. That is the foundation of a cluster: multiple computers that can reach each other over the network.
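As a small preview of what that foundation is for: when several such machines are later wired into a Hadoop cluster, each one usually lists the others in /etc/hosts so they can find each other by name; the hostnames and IPs below are just made-up examples:

    192.168.1.101  hadoop-master
    192.168.1.102  hadoop-slave1
    192.168.1.103  hadoop-slave2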