Apache Twill: the newest member of the Hadoop club

Twill, formerly known as Weave, has recently become one of the newest members of the Apache Incubator. It is designed to simplify the development and operation of applications on YARN/Hadoop. That Hadoop is now a compelling technology solution is hardly in doubt; the project's success was cemented with the release of its version 2.0.

Three most common methods of importing data into HBase, with practical analysis

Data consolidation is critical when using Hadoop, and HBase is widely used for it. In most scenarios you need to transfer data from an existing database or from data files into HBase. The three common approaches are: using the Put method of the HBase API, using the HBase bulk-load tool, and using a custom MapReduce job. The book "HBase Administration Cookbook" describes these three approaches in detail, by Imp ...
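As a sketch of the first of those approaches, the Put call of the HBase client API: this is illustrative only, assuming a running HBase cluster, the `hbase-client` library on the classpath, and a pre-created table; the table, family, and column names here are invented. (This uses the current connection-based client API; the 2013-era API used `HTable` and `Put.add`, but the shape is the same. For large volumes, the bulk-load and MapReduce approaches from the book are preferable, since each Put is an individual RPC.)

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class PutExample {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml from the classpath to locate the cluster.
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("access_log"))) { // hypothetical table
            Put put = new Put(Bytes.toBytes("row-001"));                     // row key
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("url"),        // family, qualifier
                          Bytes.toBytes("/index.html"));                    // cell value
            table.put(put);                                                 // one RPC per Put
        }
    }
}
```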

A comparison of Java open-source logging frameworks

In an application system, the log is an indispensable component: every application error should be traceable in the log file. Some systems produce very small logs while large systems produce quite large ones, yet log files must be easy to read and search, and logging must perform well, or it will degrade the performance of the application itself. Because logging usually involves I ...
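The performance point above is why the frameworks being compared (Log4j, Logback, and the like) all expose configurable levels, handlers, and cheap level checks. As a minimal sketch using only the JDK's built-in java.util.logging, not any particular framework from the article; the logger name and messages are invented for illustration:

```java
import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class LogDemo {

    // Logger name is illustrative; by convention it is the class or package name.
    static final Logger LOG = Logger.getLogger("com.example.app");

    // Stand-in for an expensive diagnostic computation.
    static String buildStateDump() {
        return "state=ok";
    }

    public static void main(String[] args) {
        ConsoleHandler handler = new ConsoleHandler();
        handler.setLevel(Level.FINE);       // the handler must also pass FINE records
        LOG.addHandler(handler);
        LOG.setUseParentHandlers(false);    // avoid duplicate output via the root logger
        LOG.setLevel(Level.FINE);           // capture debug-level detail

        LOG.info("application started");

        // Guard expensive message construction so disabled levels cost almost nothing:
        if (LOG.isLoggable(Level.FINE)) {
            LOG.fine("state dump: " + buildStateDump());
        }
    }
}
```

The `isLoggable` guard is the standard answer to the performance concern: when the level is disabled, the string concatenation and state dump are never evaluated.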

A sweep of 2013's most commonly used NoSQL databases

Within a few years, NoSQL databases have drawn attention for their performance, scalability, flexible schemas, and analytical capabilities. Although relational databases remain a good choice for some use cases, such as structured data and applications requiring ACID transactions, NoSQL has the advantage in the following cases: the stored data is essentially semi-structured or loosely structured; a certain level of performance and scalability is required; the application accessing the data can tolerate eventual consistency. Non-relational databases typically support the following features: flexible ...

Hadoop Copvin: solutions to 45 common problems

In work as in life, some problems are very simple, yet you can search half a day without finding the answer you need; learning and using Hadoop is no different. Here are some common questions about Hadoop cluster setup: 1. In which three modes can a Hadoop cluster run? Single-machine (local) mode, pseudo-distributed mode, and fully distributed mode. 2. What should you pay attention to in single-machine (local) mode? In standalone mode ...
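The three modes differ mainly in configuration. As an illustrative sketch (not from the article), a pseudo-distributed Hadoop 2.x setup typically points the default filesystem at a local HDFS daemon and sets replication to 1; the host, port, and values below are common defaults, not prescriptions:

```xml
<!-- core-site.xml: run against HDFS daemons on this machine (pseudo-distributed) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: a single node can hold only one replica of each block -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

In standalone (local) mode neither file needs these entries: everything runs in a single JVM directly against the local filesystem.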

Facebook's 9 open-source projects of 2013

Facebook is the world's biggest social networking site, and its growth is driven by the power of open source. James Pearce, its head of open source, said that Facebook was built on open source from its first line of PHP code and its first MySQL INSERT statement, and that open source has since been woven into the company's engineering culture. Facebook not only uses open source but also open-sources its internal projects, feeding internal results back to the community; it could be said that this is the attitude a great company should have. By continually open-sourcing itself ...

Entering the 2.0 era, Hadoop is better suited to enterprise applications

"Now is the best time for companies to adopt Hadoop," said Jeff Markham, Hortonworks's chief technology officer, in a speech at the 2013 China Hadoop Technology Summit held at the end of November. At this summit, Hadoop's entry into the 2.0 era was the focus of discussion. Jeff Markham says Hadoop 2.0 offers stronger and broader new features that meet the needs of enterprise users, making up for the shortcomings of Hadoop 1.0. Ha ...

SK Telecom tests the waters with Tajo, an open-source SQL-on-Hadoop query engine

Apache Tajo, a new SQL query engine for Hadoop, has recently won the favor of South Korean telecom operator SK Telecom. Geun-tae Park, senior manager of SK Telecom's data technology laboratory, said: "After extensive research into the currently available data-analysis technologies, we found that the Apache incubator project Tajo can be implemented in Had ...

Netflix open-sources data flow manager Suro

Netflix recently released an open-source tool called Suro that collects event data from multiple application servers and delivers it in real time to target data platforms such as Hadoop and Elasticsearch. Netflix's innovation is expected to become a mainstream big-data technology. Netflix uses Suro to route data from sources to target hosts in real time. Su ...

Migrating from MySQL to MariaDB (CentOS)

After a brief background introduction, this article records the steps I took to migrate from MySQL 5.5.31 to MariaDB 5.5.31 on CentOS 6.4; at the end I note a better way to migrate. 1. Background: MySQL is the world's most popular open-source relational database. In 2008, Sun acquired MySQL; then, in 2010, Oracle bought Sun, and MySQL fell into Oracle's hands. Oracl ...

Intel enables CPU encryption for Hadoop

Chip giant Intel is redoubling its efforts to defend its valuable data-center territory, specifically by developing its own technology to drive data management and analytics implementations such as Hadoop. To ensure that Xeon chips remain data-center administrators' preferred platform for running large Hadoop clusters, Intel announced on Tuesday that it will add a number of new features and technologies to the Intel Distribution for Apache Hadoop, its own open-source software derivative. ...

One to two years ahead of MongoDB: building an enterprise-class NoSQL database

Over the past few years, NoSQL databases have attracted a large number of emerging internet companies with their easy scaling, high performance, high availability, and flexible data models; in China, Taobao, Sina, Jingdong Mall, 360, and Sogou have all tried NoSQL solutions in parts of their stacks. Guangzhou-based SequoiaDB is a startup focused on developing a new distributed NoSQL database; it has already won a first round of angel investment, and its core product is SequoiaDB. The founding team members come from IBM's North America laboratories and have long been engaged in relational database d ...

A brief discussion of MongoDB's three-layer operations

NoSQL has had good momentum recently, and MongoDB is one of its standouts. When learning NoSQL I consulted a lot of material myself and finally decided to start with MongoDB, for two main reasons: 1) I am a simple amateur; for every problem I ask whether there is a simple way to solve it, and would rather stop and think for a long while than immediately grind through with a clumsy method; and Mong ...

A few things you need to know about Hadoop

In today's technology world, big data is a popular IT buzzword. To tame the complexity of processing large amounts of data, Apache developed Hadoop, a reliable, scalable framework for distributed computing. Hadoop is especially well suited to big-data processing tasks: using its distributed file system it can replicate data blocks to the nodes of a cluster reliably and cheaply, so that data is processed on the local machine. Anoop Kumar explains, in ten points, the techniques needed to handle big data with Hadoop. For from HD ...

Open-source SQL-in-Hadoop solutions: where are we?

With Facebook open-sourcing its recently released Presto, the already overcrowded SQL-in-Hadoop market has become even more complex. Several open-source tools are vying for developers' attention: Stinger, created by Hortonworks around Hive; Apache Drill; Apache Tajo; Cloudera Impala; Salesforce's Phoenix (for HBase); and now Facebook's Presto. ...

Why is MongoDB worth 1.2 billion dollars?

If document-database startup MongoDB needs to thank some people for the huge valuation it recently reaped, then Oracle CEO Larry Ellison would rank first on the list. 10gen, the company behind the MongoDB document database, recently renamed itself MongoDB and received 231 million US dollars in financing ...

Hands-on experience: the advantages and applications of Hadoop

In today's technology world, big data is a popular IT buzzword. To tame the complexity of processing large amounts of data, Apache developed Hadoop, a reliable, scalable framework for distributed computing. Hadoop is especially well suited to big-data processing tasks: using its distributed file system it can replicate data blocks to the nodes of a cluster reliably and cheaply, so that data is processed on the local machine. Anoop Kumar explains, in ten points, the techniques needed to handle big data with Hadoop. For the ...

Node.js: counting down 10 amazing Node.js open-source projects

In just a few years, Node.js has gradually developed into a mature development platform that has attracted many developers. Many large, high-traffic sites, such as PayPal, are built with Node.js, and developers can use it to build fast mobile web frameworks. Beyond web applications, Node.js is used in many other ways; this article counts down ten amazing projects built with Node.js in other areas, such as application monitoring, media streaming, remote control, and desktop and mobile applications. 1.N ...

YARN shakes MapReduce's grip on Hadoop

Hadoop has long been regarded as MapReduce running on HDFS (a distributed file system). With YARN, Hadoop 2.0 increases the number of potential applications. Hadoop has always been a general term for a variety of open-source innovations more or less integrated into a unified big-data architecture. Some believe the core of Hadoop is its distributed file system (HDFS), while a series of alternative databases built on HDFS, such as HBase and Cassandra, are shaking that claim. Hadoop used to have a special job ...

Michael Stack, chair of the Apache HBase Project Management Committee: the outlook for HBase

The 2013 China Big Data Technology Conference (BDTC), China's most influential and largest big-data gathering, was held in Beijing on December 5-6, 2013. Dozens of leading companies and nearly 70 talks covered not only the Hadoop ecosystem and technical directions such as stream computing, real-time computing, NoSQL, and NewSQL, but also innovative cases from the internet, finance, telecommunications, transportation, and healthcare sectors, along with big-data laws and regulations and commercial-use policy ...

