LAMP: httpd-2.4.6 + mysql-5.6.13 + php-5.4.22
System environment: Amazon Linux (64-bit), similar to RHEL 6.4 x86_64
1. Compile and install httpd-2.4.6
1) Install the packages that httpd depends on:
$ sudo yum groupinstall Development
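For context, a source build of httpd 2.4 on this platform usually also needs the apr, apr-util, and pcre development headers; a hedged sketch of the dependency step (package names are assumptions based on the standard yum repositories, not taken from the article):

```
sudo yum groupinstall "Development Tools"
sudo yum install apr-devel apr-util-devel pcre-devel openssl-devel
```

These commands require root privileges and network access, so adjust package names to whatever your repository actually provides.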
The DNS configuration in RedHat Linux AS 5.3 differs from that in RedHat Linux AS 4.5. Here I note the DNS configuration steps in RedHat Linux AS 5.3, as shown below:
I. Check the BIND packages. The bind and bind-chroot packages are
I. Installation process
1. Download, compile, and install libevent-2.0.12-stable:
wget http://httpsqs.googlecode.com/files/libevent-2.0.12-stable.tar.gz
tar zxvf libevent-2.0.12-stable.tar.gz
cd libevent-2.0.12-stable/
./configure --prefix
After the Apache server is installed on CentOS 6.5, it cannot be accessed. Similar symptoms are described in the references, but the solution here is slightly modified.
Reference 1 recommends adding a rule to the firewall to open port 80. The method is
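On CentOS 6, opening port 80 typically means adding a rule like the following to /etc/sysconfig/iptables (a sketch; the exact chain layout depends on your existing ruleset) and then restarting the firewall with `service iptables restart`:

```
-A INPUT -m state --state NEW -m tcp -p tcp --dport 80 -j ACCEPT
```

The rule must appear before any blanket REJECT rule in the INPUT chain, or it will never be reached.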
Unable to execute upgrade script /xx/schema-40to410.sql
Problem background: after CloudStack 4.1.1 is installed on a freshly installed operating system and cloudstack-setup-management is executed, an error is reported when CloudStack starts
Environment: CentOS 6.4 x86_64
Required applications: vsftpd-3.0.2.tar.gz
I have described the httpd source installation in detail in another article. (Address:) Here I will introduce the vsftpd source installation and compare it with what is
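As orientation before the build, a minimal vsftpd.conf often looks like the following; every value here is an illustrative assumption, not taken from the article:

```
# /etc/vsftpd/vsftpd.conf (illustrative)
anonymous_enable=NO      # disable anonymous logins
local_enable=YES         # allow local system users to log in
write_enable=YES         # permit uploads for local users
chroot_local_user=YES    # confine users to their home directories
listen=YES               # run standalone rather than via xinetd
```

With `chroot_local_user=YES`, recent vsftpd versions also require the chroot directory itself to be non-writable by the user.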
Subversion has a standard directory structure. For example, if the project is proj and the svn address is svn://proj/, the standard svn layout is:
svn://proj/
 +- trunk
 +- branches
 +- tags
This is the standard layout; trunk is the main development
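The standard layout can be created in a single commit with `svn mkdir`; the URL below reuses the article's svn://proj/ address and is only a sketch against a hypothetical server:

```
svn mkdir -m "Create standard layout" \
    svn://proj/trunk \
    svn://proj/branches \
    svn://proj/tags
```

Passing all three URLs to one `svn mkdir` keeps the layout creation atomic as one revision.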
Hadoop initially solved the problem of storing massive amounts of data for companies such as Google and Facebook; as the technology has developed, more and more enterprises have been using Hadoop to process big data. Understanding the
Automated synchronization documentation
1. Prepare local synchronization
Synchronize data from 172.18.1.247 to 172.18.1.249.
1) Set up mutual SSH trust between the two machines. Do not disturb the environment of the automated build platform 172.18.1.247; you only need to
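Mutual trust here usually means password-less SSH in both directions. A minimal sketch (the peer address 172.18.1.249 is from the article; the key path and remote user are illustrative assumptions):

```shell
# Generate a password-less RSA key pair (illustrative key path)
ssh-keygen -t rsa -N '' -f /tmp/sync_key -q

# Install the public key on the peer so 172.18.1.247 can log in without a password
# (asks for the peer's password once; commented out because it needs the network)
# ssh-copy-id -i /tmp/sync_key.pub root@172.18.1.249

# Repeat the same steps on 172.18.1.249 for trust in the other direction
ls -l /tmp/sync_key /tmp/sync_key.pub
```

In practice you would generate the key under ~/.ssh/ so ssh picks it up automatically.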
1. Compile and install keepalived-1.2.8
Download: http://www.keepalived.org/download.html
Note that the latest version is not necessarily the best, so 1.2.8 was chosen.
tar xf keepalived-1.2.8.tar.gz
cd keepalived-1.2.8
mkdir -p /data/soft/keepalived
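After the build, keepalived is driven by keepalived.conf; a minimal VRRP instance looks roughly like this (the interface name, router ID, password, and virtual IP are all illustrative assumptions):

```
vrrp_instance VI_1 {
    state MASTER          # set to BACKUP on the standby node
    interface eth0        # interface carrying the virtual IP
    virtual_router_id 51  # must match on both nodes
    priority 100          # higher value wins the master election
    advert_int 1
    authentication {
        auth_type PASS
        auth_pass secret
    }
    virtual_ipaddress {
        192.168.1.100
    }
}
```

The backup node uses the same file with `state BACKUP` and a lower `priority`.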
I wanted to change the port of an external web service on an Ubuntu 13.04 server to 8000, but after editing the configuration myself the service could not be accessed and the port was still 80. So I searched for a way to modify the port online. I have not modified a
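On Ubuntu, Apache's listening port is usually set in /etc/apache2/ports.conf and must match the port in the VirtualHost declaration; a sketch of the change to port 8000 (the site file name varies by Apache version and is an assumption here):

```
# /etc/apache2/ports.conf
Listen 8000

# /etc/apache2/sites-available/default
<VirtualHost *:8000>
    ...
</VirtualHost>
```

After changing both places, restart Apache with `sudo service apache2 restart`; a common cause of "the port is still 80" is updating only one of the two files.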
As a new generation of version control tools, SVN has many advantages, such as convenient management, clear logic, high security, and strong code consistency. SVN has two data storage back ends: BDB (a transaction-safe database type) and
Ganglia is an open-source monitoring project initiated by UC Berkeley, designed to scale to thousands of nodes. Each computer runs a gmond daemon that collects and sends metric data (such as processor speed and memory usage). It is collected from the
Ubuntu:
1. Install Java.
First install Java. Ubuntu installs OpenJDK by default, so you can uninstall it first. Enter this command in the terminal: sudo apt-get purge openjdk*.
1) Download the JDK for Linux from Sun's homepage. I
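A typical way to register a manually unpacked JDK on Ubuntu is via update-alternatives; the install path and version placeholders below are assumptions for illustration:

```
# Unpack the downloaded JDK under /usr/lib/jvm (path and version are placeholders)
sudo mkdir -p /usr/lib/jvm
sudo tar zxvf jdk-XuYY-linux-x64.tar.gz -C /usr/lib/jvm

# Register the new java binary and select it
sudo update-alternatives --install /usr/bin/java java /usr/lib/jvm/jdk1.X.0_YY/bin/java 1
sudo update-alternatives --config java
java -version
```

`update-alternatives --config java` shows all registered JVMs and lets you pick the active one interactively.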
Hadoop mode Introduction
Standalone mode: easy to install, with almost no configuration required, but only for debugging purposes
Pseudo-distribution mode: starts five processes, including namenode, datanode, jobtracker, tasktracker, and secondary
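In pseudo-distributed mode all five daemons run on one host, with HDFS pointed at localhost. A minimal core-site.xml for Hadoop 1.x (the jobtracker/tasktracker daemons above imply 1.x) might look like this; port 9000 is a conventional choice assumed here:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

hdfs-site.xml and mapred-site.xml similarly point `dfs.replication` (usually 1 on a single node) and the jobtracker address at localhost.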
Hadoop's MapReduce environment is complex to program in, so we should try to simplify the process of building a MapReduce project. Maven is a very good automated build tool that frees us from complicated
Recently I deployed Hadoop and found a problem when using the Ambari tool to deploy Hadoop's Hive component; I don't know whether others have encountered it. Problem description: a fully distributed Hadoop 2.0 cluster is built using Ambari
I. Problem symptoms:
When installing Hadoop, an error similar to the following is reported:
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGFPE (0x8) at pc=0x40008026, pid=31393, tid=2283477936
#
# JRE version: 6.0_29-
1. Cacti installation time zone error: Warning: date(): It is not safe to rely on the system's timezone settings. You are *required* to use the date.timezone setting or the date_default_timezone_set() function. In case you used any of those
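The usual fix for this warning is to set a default time zone in php.ini and restart the web server; the zone below is only an example:

```
; php.ini
date.timezone = Asia/Shanghai
```

Alternatively, a script can call `date_default_timezone_set()` at startup, but setting php.ini fixes the warning for every PHP application on the host.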