With the holiday over, I just couldn't get my heart back into work, so to make these two difficult days feel worthwhile I decided to take on a big project..... PS: Why do I look forward to holidays so much? Lying on the couch like a dead man really is the good life...
Current version: hadoop2.3.0-cdh5.0.2
Machines:

|       | NN  | DN  | JN  | RM  | NM  | JH  | HMaster | HRegionServer |
|-------|-----|-----|-----|-----|-----|-----|---------|---------------|
| mast1 | Yes | Yes | Yes | Yes | Yes |     | Yes     | Yes           |
| mast2 | Yes | Yes | Yes | Yes | Yes |     | Yes     | Yes           |
| mast3 |     | Yes | Yes |     | Yes | Yes |         | Yes           |
Target version: hadoop2.6.0-cdh5.7.0
Upgrade method: Upgrading unmanaged CDH Using the Command line
Upgrade considerations:
① When upgrading from a version below cdh5.4.0 to cdh5.4.0 or later, an HDFS metadata upgrade is required.
② When upgrading from a version below cdh5.2.0, you must additionally:
Upgrade the HDFS metadata
Upgrade the Sentry database
Upgrade the Hive database
Upgrade the Sqoop 2 database
③ Also make sure to upgrade the Oozie database and shared library, as described below.
If you uploaded the Spark assembly jar file to HDFS, upload the latest version of the file.
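Which of these extra upgrades apply depends on the version you are starting from. As a minimal sketch (the version strings here are assumptions; in practice you would parse them out of `hadoop version`), the comparison against the 5.4.0 cutoff can be done with `sort -V`:

```shell
# Hypothetical helper: decide whether an HDFS metadata upgrade is needed,
# i.e. whether the current CDH version is older than 5.4.0.
current="5.0.2"    # assumption: parsed from "Hadoop 2.3.0-cdh5.0.2"
threshold="5.4.0"
# sort -V orders version strings numerically; if the current version
# sorts first and differs from the threshold, it is older.
if [ "$(printf '%s\n%s\n' "$current" "$threshold" | sort -V | head -n1)" = "$current" ] \
   && [ "$current" != "$threshold" ]; then
  echo "metadata upgrade required"
else
  echo "no metadata upgrade needed"
fi
```

For 5.0.2 this prints "metadata upgrade required", which is why the fsimage backup below matters.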
Upgrade steps:
1. Prepare for the upgrade:
① Put the NameNode into safe mode and back up the fsimage
[[email protected] conf]$ hdfs haadmin -getServiceState mast1
active
[[email protected] conf]$ hdfs dfsadmin -safemode enter
Safe mode is ON
[[email protected] conf]$ hdfs dfsadmin -saveNamespace
② confirm that no Hadoop service is running
[[email protected] ~]# ps -aef | grep java
root      9540  8838  0 15:34 pts/0    00:00:00 grep java
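If services are still up, Cloudera's upgrade guide stops them through the init scripts on each node. A sketch of that, plus a re-check that no JVMs are left (the `[j]ava` pattern keeps grep from matching its own process):

```shell
# Stop every Hadoop service managed by an init script on this node.
for x in $(cd /etc/init.d ; ls hadoop-* 2>/dev/null) ; do
  sudo service "$x" stop
done
# Re-check: this prints nothing if no Java processes remain.
ps -aef | grep '[j]ava'
```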
③ Back up the metadata on the NameNode (the active NameNode under HA). Note: if a lock file is found in the directory, start over from the beginning.
[[email protected] ~]# cd /app/hdp/dfs/name
[[email protected] name]# tar -cvf /root/nn_backup_data.tar .
./
./edits/
./edits/current/
./edits/current/edits_inprogress_0000000000000000624
./edits/current/edits_0000000000000000413-0000000000000000533
./edits/current/edits_0000000000000000620-0000000000000000621
./edits/current/edits_0000000000000000618-0000000000000000619
./edits/current/edits_0000000000000000062-0000000000000000180
./edits/current/edits_0000000000000000622-0000000000000000623
./edits/current/edits_0000000000000000038-0000000000000000050
./edits/current/edits_0000000000000000534-0000000000000000615
./edits/current/edits_0000000000000000181-0000000000000000182
./edits/current/edits_0000000000000000284-0000000000000000412
./edits/current/edits_0000000000000000051-0000000000000000061
./edits/current/seen_txid
./edits/current/edits_0000000000000000183-0000000000000000283
./edits/current/VERSION
./edits/current/edits_0000000000000000616-0000000000000000617
./current/
./current/fsimage_0000000000000000000
./current/seen_txid
./current/VERSION
./current/fsimage_0000000000000000000.md5
./namesecondary/
2. Download CDH 5 "1-click" repository
https://archive.cloudera.com/cdh5/one-click-install/redhat/6/x86_64/cloudera-cdh-5-0.x86_64.rpm
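On RHEL/CentOS 6 the repository package is installed with yum on every node; a sketch following Cloudera's 1-click instructions (the GPG key URL follows Cloudera's archive layout):

```shell
# Download and install the 1-click repository package on each node.
wget https://archive.cloudera.com/cdh5/one-click-install/redhat/6/x86_64/cloudera-cdh-5-0.x86_64.rpm
sudo yum --nogpgcheck localinstall cloudera-cdh-5-0.x86_64.rpm
# Import Cloudera's repository signing key.
sudo rpm --import https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/RPM-GPG-KEY-cloudera
```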
3. Perform the upgrade
Upgrade hadoop2.3.0-cdh5.0.2 to hadoop2.6.0-cdh5.7.0.
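A sketch of the update itself, assuming the standard CDH package names (adjust the globs to what each node actually has installed); finalizing the HDFS metadata upgrade is deliberately left last because it is irreversible:

```shell
# Run on every node: pull in the 5.7.0 packages from the CDH repository.
sudo yum clean all
sudo yum update 'hadoop*' 'hbase*' 'zookeeper*'
# After restarting services, verify the new version on each node.
ver=$(hadoop version | head -n 1)    # e.g. "Hadoop 2.6.0-cdh5.7.0"
case "$ver" in
  *cdh5.7.0*) echo "upgrade ok: $ver" ;;
  *)          echo "unexpected version: $ver" >&2 ;;
esac
# Only once HDFS is confirmed healthy (this cannot be rolled back):
hdfs dfsadmin -finalizeUpgrade
```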