It was hard work getting Hadoop 1.0.1 deployed to a batch of machines, only to find that the TaskTracker on the DataNodes would not start and Java reported a fatal error.
Once again I came to admire Google's strength; Baidu turned up nothing at all.
The cause of the TaskTracker startup failure turned out to be:
part of the underlying C code of HDFS is built against gcc 4.1, while the newest gcc shipped with RedHat AS 4 is 3.4.
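Before upgrading, it is worth confirming what the machine currently has (a quick check of my own; the exact output depends on the system):
gcc --version
cat /etc/redhat-release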
So: upgrade gcc. However, this is a hard job; gcc depends on many other packages, which have to be compiled first. Here is a list of the packages that gcc 4.7 depends on:
1. glibc-devel
During gcc compilation, an error is reported, saying:
/usr/bin/ld: crti.o: No such file or directory
In fact, this file exists on the system. The solution is to find the package to which crti.o belongs:
1) locate crti.o:
locate crti.o
/usr/lib64/crti.o
2) find the package that owns the file:
rpm -qf /usr/lib64/crti.o
glibc-devel-2.3.4-2.36
3) install the corresponding package:
yum install glibc-devel-2.3.4-2.36
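As a quick verification afterwards (not part of the original fix, just a sanity check):
ls -l /usr/lib64/crti.o
rpm -V glibc-devel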
2. Install xz (http://tukaani.org/xz/xz-5.0.3.tar.gz)
Decompress the package into /home/hadoop/xz-5.0.3, then:
./configure --prefix=/home/hadoop/local
make
make install
Next, modify ~/.bashrc:
export PATH=/usr/newgcc/bin:/home/hadoop/local/bin:$PATH
export LD_LIBRARY_PATH=/usr/newgcc/lib64:/home/hadoop/local/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/usr/newgcc/lib:/home/hadoop/local/lib:$LD_LIBRARY_PATH
Here /usr/newgcc is the final install location for gcc; its paths are added to .bashrc in one go.
Then make the settings take effect: source ~/.bashrc
To test whether the installation is correct, enter xz; the usual usage prompts should appear.
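For example, checking the version (illustrative only; the exact output depends on the release):
xz --version
xz (XZ Utils) 5.0.3
liblzma 5.0.3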
3. Download mpc, mpfr, and gmp
1) mpfr http://www.mpfr.org/
2) mpc http://www.multiprecision.org/
3) gmp http://ftp.tsukuba.wide.ad.jp/software/gmp/gmp-5.0.0.tar.bz2
Normally you would compile the three libraries separately and then compile gcc. However, the gcc website describes a lazy method: decompress the three libraries, rename the resulting directories to gmp, mpc, and mpfr, and mv them into the gcc source directory; gcc will then compile the three libraries together with itself. Source: http://gcc.gnu.org/install/prerequisites.html
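A sketch of that method (the mpfr and mpc version numbers below are placeholders; use whichever versions you downloaded, and match the gcc source directory name):
tar xjf gmp-5.0.0.tar.bz2
mv gmp-5.0.0 gcc-4.7.0/gmp
tar xjf mpfr-3.1.0.tar.bz2
mv mpfr-3.1.0 gcc-4.7.0/mpfr
tar xzf mpc-0.9.tar.gz
mv mpc-0.9 gcc-4.7.0/mpc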
4. After that, you can compile and install gcc.
1) ./configure --prefix=/usr/newgcc --enable-shared --enable-threads=posix --enable-checking --with-zlib=/home/hadoop/local/bin --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --build=x86_64-redhat-linux --enable-languages=c,c++
Note that --build is used here, not --host; with --host an error is reported, for reasons unknown.
2) make
3) make install
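A note on step 2): compiling gcc takes a long time on old machines. If the box has multiple cores, a parallel build with GNU make (a common option, not from the original steps) speeds it up considerably:
make -j4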
Enter gcc -v:
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/newgcc/libexec/gcc/x86_64-redhat-linux/4.7.0/lto-wrapper
Target: x86_64-redhat-linux
Configured with: ./configure --prefix=/usr/newgcc --enable-shared --enable-threads=posix --with-gmp=/home/hadoop/local/ --with-mpfr=/home/hadoop/local --with-mpc=/home/hadoop/local/ --enable-checking --with-zlib=/home/hadoop/local/bin --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --build=x86_64-redhat-linux --enable-languages=c,c++
Thread model: posix
gcc version 4.7.0 (GCC)
The installation is successful.
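As a final sanity check, you can compile a trivial program with the new gcc (a minimal test of my own, not from the original):
echo 'int main(void) { return 0; }' > test.c
gcc test.c -o test
./test && echo OK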
PS: each login user who wants to use the new gcc needs the same modifications in their own .bashrc.
Finally, a link to another author's article that helped me a great deal: http://www.linuxidc.com/Linux/2012-06/62591.htm