Provisioning infrastructure for big data and the newer fast data architectures is not a cookie-cutter exercise: both require significant adjustments or changes to the hardware and software infrastructure. Fast data architectures differ substantially from big data architectures, and fast data delivers true online transaction processing capabilities. Understanding how big data and fast data needs differ can help you make the right hardware and software choices.

Big data architectures

Compared with how companies traditionally collected data, big data is about gaining deeper insight by analyzing much larger volumes of data, much of which (for example, social media data about customers) is accessible via the public cloud. This data in turn emphasizes fast access over strict consistency, and it has spawned a family of big data tools such as Hadoop. As a result, the following changes and areas of focus in the architecture are common:

- Support for in-house software such as Hadoop and Hive, along with scale-out, cloud-capable hardware for workloads fed by social media and other big data inputs.
- Virtualization and private cloud software that supports existing data architectures.
- Software tools that support large-scale, deep, and ad hoc analysis, and that let data scientists tailor requirements for the enterprise.
- Large-scale expansion of storage capacity, especially for near-real-time analysis.

Fast data architectures

Fast data is a framework for processing streaming sensor and IoT data in near real time. This architecture focuses on fast updates, and it often relaxes the usual restriction that reads must wait until written data has been locked to disk. Whether they use an existing, typical database or a purpose-built Hadoop-related tool, organizations working with this architecture typically need some initial stream analysis of the data.
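To make "initial stream analysis" concrete, here is a minimal sketch, not any particular vendor's implementation: a sliding time window that keeps near-real-time statistics over recent sensor readings. The class name, window size, and simulated temperature readings are all illustrative assumptions.

```python
from collections import deque
import time

class SlidingWindowStats:
    """Near-real-time statistics over the last `window_seconds` of
    sensor readings (a minimal sketch of initial stream analysis)."""

    def __init__(self, window_seconds=60.0):
        self.window_seconds = window_seconds
        self.readings = deque()  # (timestamp, value) pairs, oldest first

    def add(self, value, timestamp=None):
        ts = time.monotonic() if timestamp is None else timestamp
        self.readings.append((ts, value))
        self._evict(ts)

    def _evict(self, now):
        # Drop readings that have fallen out of the time window.
        while self.readings and now - self.readings[0][0] > self.window_seconds:
            self.readings.popleft()

    def mean(self):
        if not self.readings:
            return None
        return sum(v for _, v in self.readings) / len(self.readings)

# Feed simulated IoT temperature readings with explicit timestamps.
stats = SlidingWindowStats(window_seconds=10.0)
for t, temp in [(0, 20.0), (3, 22.0), (12, 24.0)]:
    stats.add(temp, timestamp=t)
print(stats.mean())  # the t=0 reading has expired, so mean of 22.0 and 24.0 -> 23.0
```

The key property mirrored here is that analysis happens on data still in memory, before anything is committed to long-term storage.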
In this newer field, the following architectural changes and areas of focus are common:

- Database software designed for rapid updates and initial stream analysis of incoming data.
- Dramatically increased use of nonvolatile RAM and SSDs for fast data storage (for example, 1 TB of main memory and 1 PB of SSD).
- Software with timing constraints, similar to those of older real-time operating systems.

Integrating fast data with big data architectures

The goal is to combine the two approaches: data is decoupled between fast-response fast data storage and less constrained big data storage. This converged architecture allows data held in the fast data tier to be accessed with big data databases and analysis tools.

This is only a brief overview of typical implementations and the range of options. Major vendors sell a wide variety of software and hardware covering all of the big data architectures and most of the fast data architectures, while open source offerings cover most of the same software areas. Implementing fast data and big data is therefore often a balance between cost and speed, and smart buyers can gain a competitive advantage by adopting an effective architecture.

In the fast data area, smaller vendors such as Redis Labs and GridGain, along with large vendors such as Oracle and SAP, play important roles in both fast data and big data; SAP may be the more appropriate vendor for fast data tools. In hardware, Intel has a strong stake in fast data. Among other traditional big data vendors, Dell's acquisition of EMC has given it both visibility and substance in this space, so Dell may be better positioned than IBM in fast data architectures going forward.
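The decoupling described above can be sketched in a few lines. This is a toy illustration under assumed names (FastTier, BatchStore): a fast in-memory tier absorbs rapid writes and hands complete batches off to a big data store for later deep analysis, so ingest speed is not limited by the slower store.

```python
class BatchStore:
    """Stand-in for a big data store (e.g. HDFS) that accepts whole
    batches of records for later deep or ad hoc analysis."""

    def __init__(self):
        self.batches = []

    def append_batch(self, records):
        self.batches.append(list(records))

class FastTier:
    """In-memory tier for rapid writes; flushes full batches to the
    batch store, decoupling fast ingest from big data storage."""

    def __init__(self, batch_store, batch_size=3):
        self.batch_store = batch_store
        self.batch_size = batch_size
        self.buffer = []

    def write(self, record):
        self.buffer.append(record)  # fast path: append only, no disk wait
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.batch_store.append_batch(self.buffer)
            self.buffer = []

store = BatchStore()
tier = FastTier(store, batch_size=3)
for event in range(7):
    tier.write(event)
tier.flush()  # push the partial final batch
print(store.batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

In a real deployment, the in-memory buffer would live in nonvolatile RAM or an in-memory database, and the batch store would be a distributed file system or big data database that the analysis tools query.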
TechTarget China original content, original link: http://www.searchdatacenter.com.cn/showcontent_92676.htm
©TechTarget China: http://www.techtarget.com.cn
What infrastructure is right for fast and big data architectures?