Virtual machines
- 1. Set the virtual machines' NICs to NAT mode
- 2. It is best to use a few virtual machines; modify each one's hostname and give it a static IP in /etc/network/interfaces. Here there are three Ubuntu hosts, s101, s102, and s103; change the /etc/hostname file on each
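For the static IP step, a hypothetical /etc/network/interfaces stanza might look like the one below. The interface name (ens33) and the 192.168.xx addresses are placeholders, not values from this guide; match them to your NAT network's subnet and gateway:

```
auto ens33
iface ens33 inet static
    address 192.168.xx.101
    netmask 255.255.255.0
    gateway 192.168.xx.2
    dns-nameservers 192.168.xx.2
```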
- 3. Install SSH
- On the first host, s101, create a public/private key pair
- >ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
- >cd ~/.ssh
- >cp id_rsa.pub authorized_keys (creates the key store)
- Upload id_rsa.pub to the other hosts, into their ~/.ssh directories
- On the receiving host: nc -l 8888 > ~/.ssh/authorized_keys
- On the client: nc s102 8888 < id_rsa.pub
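The authorized_keys step above can be sketched as follows. This is an illustration only: it uses a throwaway /tmp/demo-ssh directory instead of ~/.ssh, and a stand-in key string rather than a real one. The chmod lines are worth noting, since sshd refuses keys whose files are group/world writable:

```shell
# Sketch: build authorized_keys from a public key and set the strict
# permissions sshd expects. /tmp/demo-ssh stands in for ~/.ssh.
SSH_DIR=/tmp/demo-ssh
mkdir -p "$SSH_DIR"
echo "ssh-rsa AAAAB3... ubuntu@s101" > "$SSH_DIR/id_rsa.pub"   # stand-in key material
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"        # same as: cp id_rsa.pub authorized_keys
chmod 700 "$SSH_DIR"
chmod 600 "$SSH_DIR/authorized_keys"
```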
Start installing Hadoop and the JDK
- Install VMware Tools so you can drag files from Windows 10 into Ubuntu
- Create the directory /soft
- Change its owner: chown ubuntu:ubuntu /soft, so transferring files does not hit permission problems
- Put the files into /soft (you can cp/mv src dst from the desktop)
- tar -zxvf the JDK or Hadoop archive; extraction creates the directory automatically
- Configure the installation environment (/etc/environment)
- Add JAVA_HOME=/soft/jdk-... (the JDK directory)
- Add HADOOP_HOME=/soft/hadoop (the Hadoop directory)
- Add /soft/jdk-.../bin:/soft/hadoop/bin:/soft/hadoop/sbin to PATH
- java -version printing a version number means the JDK is set up
- hadoop version printing a version number means Hadoop is set up
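Put together, /etc/environment might end up looking like the fragment below (a sketch: the jdk-... directory name is truncated in this guide, so substitute your actual extracted JDK folder; note that /etc/environment takes literal assignments, not shell `export` lines):

```
JAVA_HOME="/soft/jdk-..."
HADOOP_HOME="/soft/hadoop"
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/soft/jdk-.../bin:/soft/hadoop/bin:/soft/hadoop/sbin"
```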
Start configuring HDFS: four files, core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml
1. core-site.xml

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://s101:9000</value>
  </property>
</configuration>
```
2. hdfs-site.xml

```xml
<configuration>
  <!-- configurations for NameNode: -->
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/data/hdfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/data/hdfs/data</value>
  </property>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>s101:50090</value>
  </property>
  <property>
    <name>dfs.namenode.http-address</name>
    <value>s101:50070</value>
    <description>The address and the base port where the DFS NameNode web UI
    will listen on. If the port is 0 then the server will start on a free
    port.</description>
  </property>
  <property>
    <name>dfs.namenode.checkpoint.dir</name>
    <value>file:/data/hdfs/checkpoint</value>
  </property>
  <property>
    <name>dfs.namenode.checkpoint.edits.dir</name>
    <value>file:/data/hdfs/edits</value>
  </property>
</configuration>
```
3. mapred-site.xml

```xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
```
4. yarn-site.xml

```xml
<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>s101</value>
  </property>
</configuration>
```
At this point you're halfway to success.......
Create the folders
- mkdir -p /data/hdfs/tmp /data/hdfs/var /data/hdfs/logs /data/hdfs/dfs /data/hdfs/data /data/hdfs/name /data/hdfs/checkpoint /data/hdfs/edits
Remember to modify the directory permissions
- sudo chown ubuntu:ubuntu /data
Next, transfer the /soft folder to the other hosts.
Create an executable xsync file
- sudo touch xsync
- sudo chmod 777 xsync (make it executable)
- sudo nano xsync
```shell
#!/bin/bash
# xsync: rsync a file or directory to the other hosts (s102, s103)
pcount=$#
if ((pcount < 1)); then
  echo no args
  exit
fi
p1=$1
fname=$(basename "$p1")                    # name of the file/dir to sync
pdir=$(cd -P "$(dirname "$p1")" && pwd)    # absolute parent directory
cuser=$(whoami)
for ((host = 102; host < 104; host = host + 1)); do
  echo --------s$host--------
  rsync -rvl "$pdir/$fname" "$cuser@s$host:$pdir"
done
```
- xsync /soft --------> sends the folder to the other hosts
- xsync /data
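The path-splitting logic xsync relies on can be sanity-checked locally. The snippet below is a demo only, using a throwaway /tmp/xsync-demo directory rather than /soft:

```shell
# Demo of how xsync splits its argument into a name and an absolute parent dir
mkdir -p /tmp/xsync-demo/sub
p1=/tmp/xsync-demo/sub
fname=$(basename "$p1")                    # the piece rsync will send: "sub"
pdir=$(cd -P "$(dirname "$p1")" && pwd)    # symlink-resolved parent directory
echo "$fname"                              # prints: sub
```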
Create an xcall script to send commands to the other hosts
```shell
#!/bin/bash
# xcall: run the given command locally and on the other hosts (s102, s103)
pcount=$#
if ((pcount < 1)); then
  echo no args
  exit
fi
echo --------localhost--------
"$@"
for ((host = 102; host < 104; host = host + 1)); do
  echo --------s$host--------
  ssh s$host "$@"
done
```
Don't worry, it's almost done.
You also need to configure the workers file
- Put in it the hostnames of the nodes that should run as DataNodes, one per line
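For the three-host setup in this guide, a sketch of writing that file could look like this. It assumes s102 and s103 are the DataNodes; HADOOP_HOME falls back to /tmp/hadoop here only so the snippet runs anywhere:

```shell
# Write the workers file listing the DataNode hosts, one per line.
# /tmp/hadoop is a stand-in default; normally HADOOP_HOME=/soft/hadoop.
WORKERS="${HADOOP_HOME:-/tmp/hadoop}/etc/hadoop/workers"
mkdir -p "$(dirname "$WORKERS")"
printf '%s\n' s102 s103 > "$WORKERS"
cat "$WORKERS"
```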
Points to note.
Quite a few problems came up along the way:
1. rsync permission denied: delete the folder, or change its owner with chown
2. Learn to read the logs
Ubuntu 16.04 Hadoop fully distributed build