As storage technology analyst Greg Schulz has observed, big data keeps growing without limit, and a number of independent storage tools have already appeared on the market to help storage administrators manage this ever-expanding ocean of data. Unsurprisingly, most of them are closely tied to Hadoop.
SGI InfiniteStorage
SGI InfiniteStorage uses virtualization technology to turn storage into a tiered hybrid system that combines high-performance flash with low-cost tape. Throughout, the data stays online, so the tiering remains transparent to the user.
"The SGI InfiniteStorage hardware and software ecosystem builds on SGI's 20 years of experience solving big data problems, and delivers a comprehensive answer to the most demanding data management environments, including weather forecasting, life sciences, manufacturing, media, and education," said Floyd Christofferson, head of storage product marketing at SGI.
Red Hat Storage Server 2.0
According to a recent report from the Linux Foundation, Linux has become a major platform for big data deployments. In that light, it is no surprise that Red Hat occupies a place in the big data storage field. Red Hat Storage Server 2.0 allows data stored and managed in one place to be accessed by a wide variety of enterprise workloads, said Ranga Rangachari, vice president and general manager of Red Hat's storage business unit.
"Given the ongoing growth in data volumes, it is hard for enterprise users to keep up with the resulting demand for dedicated storage," Rangachari explains. "The ideal solution is to keep data in a common enterprise repository and let every type of enterprise workload access it at any time."
With this in mind, Red Hat has partnered with Intel to build a better open source big data stack. As a first step in that larger vision, Red Hat is integrating Intel's recently launched Apache Hadoop distribution with Red Hat Storage Server 2.0 and the company's Linux operating system. In addition, the Hadoop storage plug-in Red Hat has developed is about to be contributed to the open source community, making Red Hat Storage one of the storage options for enterprise Hadoop deployments.
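A storage plug-in of this kind works through Hadoop's pluggable FileSystem API: the plug-in registers a URI scheme in `core-site.xml`, and Hadoop jobs then address the alternative back end like any other filesystem. The snippet below is only an illustrative sketch; the property and class names follow the pattern used by the open source glusterfs-hadoop plug-in, but the exact keys vary by plug-in and Hadoop version.

```xml
<!-- core-site.xml: illustrative sketch of wiring a third-party
     FileSystem implementation into Hadoop. The property and class
     names are assumptions based on the glusterfs-hadoop plug-in. -->
<configuration>
  <!-- Map the glusterfs:// URI scheme to the plug-in's FileSystem class -->
  <property>
    <name>fs.glusterfs.impl</name>
    <value>org.apache.hadoop.fs.glusterfs.GlusterFileSystem</value>
  </property>
  <!-- Optionally make the plug-in the default filesystem for jobs -->
  <property>
    <name>fs.default.name</name>
    <value>glusterfs://server:9000</value>
  </property>
</configuration>
```

Because the change is purely configuration, existing MapReduce jobs can run against the new back end without being rewritten.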
"Red Hat is a dominant force in enterprise big data solutions, and according to IDC the total market will grow to $23.8 billion by 2016," Ashish Nadkarni, an analyst at IDC, told us. "Thanks to its strong infrastructure solutions and application platform, delivered both on premises and through cloud models, Red Hat is one of the few infrastructure providers able to offer a comprehensive big data solution."
EMC Pivotal HD
Speaking of the latest Hadoop distributions, EMC's Pivotal HD is certainly worth mentioning: it integrates Hadoop with EMC's Greenplum massively parallel processing (MPP) database. With a query engine called HAWQ, EMC claims a more-than-hundredfold performance improvement for SQL queries and workloads on Hadoop.
"Hadoop is important: it is the key to unlocking the potential of big data transformation, and by integrating it with Greenplum technology we hope to drive Hadoop's wider adoption," said Scott Yara, senior vice president of EMC Greenplum.