Red Hat AS 4 successfully upgraded to the latest GCC


It was hard enough to get a batch of machines ready to deploy Hadoop 1.0.1, only to find that the TaskTracker on the DataNodes would not start, with Java reporting a fatal error.

Once again I admire the power of Google; Baidu could not find anything at all.
The cause of the TaskTracker startup failure turned out to be:
part of the underlying C code of HDFS is built for GCC 4.1, while the latest GCC on Red Hat AS 4 is 3.4.

So GCC has to be upgraded. This is a painful job, however: GCC depends on many other packages, which also need to be compiled. First, here are some of the packages that GCC 4.7 depends on:

1. glibc-devel
During GCC compilation, an error is reported:
/usr/bin/ld: crti.o: No such file or directory

In fact, this file exists on the system. The solution is to find the package that crti.o belongs to:
1) locate crti.o:
locate crti.o
/usr/lib64/crti.o
2) find the package that owns the file:
rpm -qf /usr/lib64/crti.o
glibc-devel-2.3.4-2.36
3) install the corresponding package:
yum install glibc-devel-2.3.4-2.36
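
To confirm the fix before resuming the GCC build, a quick check is to link a trivial C program with the system compiler (a minimal sketch; the temporary file names are arbitrary):

  # crti.o should be back in place after the reinstall
  ls -l /usr/lib64/crti.o
  # linking any program exercises the path that failed before
  echo 'int main(void) { return 0; }' > /tmp/linktest.c
  gcc /tmp/linktest.c -o /tmp/linktest && echo "link OK"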

2. Install xz: http://tukaani.org/xz/xz-5.0.3.tar.gz
Decompress the package into /home/hadoop/xz-5.0.3, then:
./configure --prefix=/home/hadoop/local
make
make install
Then modify ~/.bashrc:

  export PATH=/usr/newgcc/bin:/home/hadoop/local/bin:$PATH
  export LD_LIBRARY_PATH=/usr/newgcc/lib64:/home/hadoop/local/lib:$LD_LIBRARY_PATH
  export LD_LIBRARY_PATH=/usr/newgcc/lib:/home/hadoop/local/lib:$LD_LIBRARY_PATH

Here, /usr/newgcc is the final installation location for GCC; its paths are added to .bashrc in one go here.
Then make the settings take effect: source ~/.bashrc
To test whether the installation is correct, enter "xz"; the relevant usage prompts will appear.
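
For a slightly more explicit check than a bare xz, asking for the version also confirms that the PATH change picked up the copy under /home/hadoop/local (a minimal sketch):

  # should print something like: xz (XZ Utils) 5.0.3
  xz --version
  # should resolve to /home/hadoop/local/bin/xz
  which xz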


3. Download mpc, mpfr, and gmp
1) mpfr: http://www.mpfr.org/
2) mpc: http://www.multiprecision.org/
3) gmp: http://ftp.tsukuba.wide.ad.jp/software/gmp/gmp-5.0.0.tar.bz2

Normally, these three libraries are each compiled and installed before GCC is compiled. However, the GCC official site describes a lazy shortcut: decompress the libraries, rename them, and move gmp, mpc, and mpfr into the GCC source directory; the GCC build process will then compile the three libraries along with GCC itself. The original documentation: http://gcc.gnu.org/install/prerequisites.html
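
A minimal sketch of that shortcut, assuming the GCC 4.7.0 source was unpacked to /home/hadoop/gcc-4.7.0 and the three tarballs were downloaded next to it (the mpfr and mpc version numbers below are only examples; use whatever you downloaded):

  cd /home/hadoop
  tar -xjf gmp-5.0.0.tar.bz2
  tar -xjf mpfr-3.1.0.tar.bz2
  tar -xzf mpc-0.9.tar.gz
  # the directories must carry the plain names gmp, mpfr, and mpc
  # inside the gcc source tree for the combined build to pick them up
  mv gmp-5.0.0 gcc-4.7.0/gmp
  mv mpfr-3.1.0 gcc-4.7.0/mpfr
  mv mpc-0.9 gcc-4.7.0/mpc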

4. With that, GCC itself can be compiled and installed.
1) ./configure --prefix=/usr/newgcc --enable-shared --enable-threads=posix --enable-checking --with-zlib=/home/hadoop/local/bin --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --build=x86_64-redhat-linux --enable-languages=c,c++

Note that the --build parameter is used here, not --host. With --host, an error is reported; the reason is unknown.

2) make (see the note below on speeding this step up)
3) make install
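
A note on step 2: GCC takes a long time to build, and if the machine has several cores a parallel make helps (a minimal sketch; AS 4's coreutils has no nproc, so the core count is read from /proc/cpuinfo):

  # run one make job per CPU core
  N=$(grep -c '^processor' /proc/cpuinfo)
  make -j"$N"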
Then run gcc -v:

  Using built-in specs.
  COLLECT_GCC=gcc
  COLLECT_LTO_WRAPPER=/usr/newgcc/libexec/gcc/x86_64-redhat-linux/4.7.0/lto-wrapper
  Target: x86_64-redhat-linux
  Configured with: ./configure --prefix=/usr/newgcc --enable-shared --enable-threads=posix --with-gmp=/home/hadoop/local/ --with-mpfr=/home/hadoop/local --with-mpc=/home/hadoop/local/ --enable-checking --with-zlib=/home/hadoop/local/bin --enable-__cxa_atexit --disable-libunwind-exceptions --enable-java-awt=gtk --build=x86_64-redhat-linux --enable-languages=c,c++
  Thread model: posix
  gcc version 4.7.0 (GCC)

The installation is successful.
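
As a final smoke test, it is worth compiling something with the new toolchain and checking which libstdc++ the result links against (a minimal sketch; the file names are arbitrary):

  echo 'int main() { return 0; }' > /tmp/hello.cc
  g++ /tmp/hello.cc -o /tmp/hello
  # with LD_LIBRARY_PATH set as above, this should point into /usr/newgcc
  ldd /tmp/hello | grep libstdc++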

PS: for other login users to use the new GCC, their .bashrc needs the same modifications.
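
A minimal sketch of that per-user change, assuming the same paths as in step 2 (run as the user in question; single quotes keep the variables unexpanded in .bashrc):

  echo 'export PATH=/usr/newgcc/bin:/home/hadoop/local/bin:$PATH' >> ~/.bashrc
  echo 'export LD_LIBRARY_PATH=/usr/newgcc/lib64:/home/hadoop/local/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
  echo 'export LD_LIBRARY_PATH=/usr/newgcc/lib:/home/hadoop/local/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
  source ~/.bashrc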

Attached is another fellow's article that helped me a lot: http://www.linuxidc.com/Linux/2012-06/62591.htm
