1. Overview
Today I would like to add a blog post about Flume, which was omitted when explaining the highly available Hadoop platform. This post covers the following:
A brief introduction to Flume NG
Building and running a single-node Flume NG agent
Highly available Flume
Flume is a highly available, highly reliable, open-source, distributed system for collecting large volumes of log data, provided by Cloudera; log data can flow through Flume to a terminal storage destination. "Log" here is a general term that covers files, operation records, and many other kinds of data.
First, Flume basic theory
1.1 Common distributed log collection systems
• Ability to master HBase enterprise-level development and management
• Ability to master Pig enterprise-level development and management
• Ability to master Hive enterprise-level development and management
• Ability to use Sqoop to freely move data between traditional relational databases and HDFS
• Ability to collect and manage distributed logs using Flume
• Ability to master the entire process of analysis, development, and deployment of
Flume is a log collection system provided by Cloudera, with distributed, highly reliable, and highly available characteristics. Flume supports customizing various kinds of data senders in the logging system to collect data, and it provides the ability to process the data simply and write it to a variety of receivers. For example, it can store data in a file system and roll it at a configured time interval; when there is a large amount of log data, the data can be stored in Hadoop for later analysis.
For more sink types, refer to the official Flume User Guide.
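As a concrete illustration of the "write to the file system at a time interval" behaviour just described, here is a minimal single-agent sketch using Flume's file_roll sink; the agent name a1, the netcat port, and the output directory are assumptions chosen for the example, not values from this post.

# file_roll sketch: events arriving on a netcat source are written
# to local files, rolling to a new file every 30 seconds
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

a1.sources.r1.type = netcat
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444

a1.channels.c1.type = memory

a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /tmp/flume-out
a1.sinks.k1.sink.rollInterval = 30

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Setting rollInterval to 0 would disable time-based rolling; the HDFS sink discussed later exposes a similar roll-interval setting.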
3. Prepare installation files
JDK Download Address: http://www.oracle.com/technetwork/java/javase/downloads/index.html
Flume + Hadoop download address:
Environment description: master server IP: 192.168.80.128
1. Prepare the apache-flume-1.7.0-bin.tar file
2. Upload it to the master (192.168.80.128) server
3. Extract apache-flume-1.7.0-bin.tar: tar -zxvf apache-flume-1.7.0-bin.tar
4. Enter Flume's configuration file directory: cd /apache-flume-1.7.0-bin/conf
5. Modify the configuration file
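Step 5 above is cut off in the original; as a sketch of what typically goes into that configuration file for a first single-node test, the following defines one agent (named a1 here as an assumption) with a netcat source, a memory channel, and a logger sink:

# conf/flume-conf.properties — minimal single-node sketch (agent name and port are assumed)
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# netcat source listening on the master's address
a1.sources.r1.type = netcat
a1.sources.r1.bind = 192.168.80.128
a1.sources.r1.port = 44444

# in-memory channel
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# logger sink: prints events to the agent's log, useful for a smoke test
a1.sinks.k1.type = logger

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

The agent can then be started from the Flume home directory with bin/flume-ng agent --conf conf --conf-file conf/flume-conf.properties --name a1 -Dflume.root.logger=INFO,console, and tested by sending a line to port 44444 with telnet or nc.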
Why does data analysis generally use Java instead of the Hadoop, Flume, and Hive APIs to process the related business logic?
Reply content:
Origin:
Hadoop is being used, but because the project is not currently distributed (it only runs against a cluster environment), the business logs have to be moved over manually every time before Hadoop can analyze them. In this case it is better to use distributed Flume together with out-of-the-box HDFS, as described earlier, and avoid these unnecessary manual steps.
Preparation environment:
You must have a ready-to-use version of Hadoop (HDFS).
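To make the "Flume writes straight into HDFS" setup concrete, below is a hedged configuration sketch: a spooling-directory source picks up finished log files, a durable file channel buffers them, and the HDFS sink writes them out, rolled every 10 minutes. The agent name, local paths, and NameNode address (192.168.80.128:9000) are assumptions for illustration and must be adapted to your own cluster.

# Sketch: collect completed log files from a spool directory and write them to HDFS
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# spooling-directory source: each file dropped here is ingested exactly once
a1.sources.r1.type     = spooldir
a1.sources.r1.spoolDir = /var/log/app/spool

# file channel: events survive an agent restart
a1.channels.c1.type          = file
a1.channels.c1.checkpointDir = /var/flume/checkpoint
a1.channels.c1.dataDirs      = /var/flume/data

# HDFS sink: plain text output, one directory per day, rolled every 600 seconds
a1.sinks.k1.type                   = hdfs
a1.sinks.k1.hdfs.path              = hdfs://192.168.80.128:9000/flume/logs/%Y-%m-%d
a1.sinks.k1.hdfs.fileType          = DataStream
a1.sinks.k1.hdfs.writeFormat       = Text
a1.sinks.k1.hdfs.rollInterval      = 600
a1.sinks.k1.hdfs.rollSize          = 0
a1.sinks.k1.hdfs.rollCount         = 0
a1.sinks.k1.hdfs.useLocalTimeStamp = true

a1.sources.r1.channels = c1
a1.sinks.k1.channel    = c1

rollSize and rollCount are set to 0 so that only the time interval triggers a new file, and useLocalTimeStamp lets the %Y-%m-%d escape in the path work without a separate timestamp interceptor.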
1. Hadoop LZO compression dependencies: Unix/Linux systems do not ship with the LZO library by default, so it needs to be installed:
sudo yum install lzo-devel.x86_64
sudo yum install lzo.x86_64
sudo yum install lzop.x86_64
2. Prepare Maven, Ant, GCC, etc.
3. Compile hadoop-lzo: download it from https://github.com/twitter/hadoop-lzo, unzip it into a directory, and run the Maven build (for example, mvn clean package)
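After copying the compiled hadoop-lzo jar and native libraries onto the cluster nodes, the codec typically still has to be registered in core-site.xml. A common snippet is shown below; treat the exact codec list as an assumption to be merged with whatever codecs your cluster already declares.

<!-- core-site.xml: register the LZO codecs provided by hadoop-lzo -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>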
Big Data architecture development, mining, and analysis: Hadoop, HBase, Hive, Storm, Spark, Flume, ZooKeeper, Kafka, Redis, MongoDB, Java, cloud computing, machine learning
1. All hosts need the JDK installed and the JDK environment variables configured.
2. All hosts need SSH installed, with passwordless access between each other (a short sketch of this setup follows the ZeroMQ commands below).
3. Modify the hosts file (/etc/hosts) on every machine so that the machines can reach each other by host name.
4. Install Python 2.6 or above (required by Storm).
5. ZeroMQ:
wget http://download.zeromq.org/zeromq-2.1.7.tar.gz
tar -xzf zeromq-2.1.7.tar.gz
cd zeromq-2.1.7
./configure
make
sudo make install
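As a sketch of step 2 in the list above (passwordless SSH between all hosts), one common approach is shown below; the user and host names are placeholders, not values from this post.

# on each host, generate a key pair once (empty passphrase)
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# push the public key to every other host (repeat per node)
ssh-copy-id user@node1
ssh-copy-id user@node2
# verify: this should print the remote host name without asking for a password
ssh user@node1 hostname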