The Development History and Evolution Trends of Big Data Technology

Keywords: big data, big data technology, development history, trends

The term "big data" came to prominence in 2011, when the McKinsey Global Institute published "Big Data: The Next Frontier for Innovation, Competition, and Productivity." Propelled by Gartner's technology hype cycle and by Viktor Mayer-Schönberger's 2012 book "Big Data: A Revolution That Will Transform How We Live, Work, and Think," the concept of big data swept the world.

Based on 4,495 documents in the Web of Science database that have dealt with the big data concept since 1994, and using the CiteSpace knowledge-mapping tool, an analysis of hot keywords and highly cited literature outlines the development of big data technology from its infancy to maturity.
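To make the methodology concrete, the sketch below is a minimal, hypothetical illustration of the kind of keyword-frequency analysis that CiteSpace automates. The field names and sample records are assumptions for illustration, not the study's actual Web of Science export.

```python
from collections import Counter

# Hypothetical bibliographic records; the actual study loads a
# Web of Science export of 4,495 documents with a keyword field.
records = [
    {"year": 1998, "keywords": ["algorithm", "model", "pattern"]},
    {"year": 2001, "keywords": ["algorithm", "identification"]},
    {"year": 2008, "keywords": ["mapreduce", "hadoop", "cloud computing"]},
]

def hot_keywords(records, start, end, top_n=5):
    """Count keyword frequency within a time window to find research hotspots."""
    counts = Counter()
    for rec in records:
        if start <= rec["year"] <= end:
            counts.update(k.lower() for k in rec["keywords"])
    return counts.most_common(top_n)

print(hot_keywords(records, 1994, 2002))  # embryonic-stage hotspots
print(hot_keywords(records, 2006, 2009))  # parallel/distributed-era hotspots
```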

The period from the 1990s to the early 2000s was the embryonic stage of big data, centered on data mining technology. As data mining theory and database technology matured, a group of business intelligence tools and knowledge management technologies came into use, such as data warehouses, expert systems, and knowledge management systems. At this time, big data research focused mainly on hot keywords such as "algorithm," "model," "pattern," and "identification."

The breakthrough period of big data development was 2003-2006, a phase of free exploration around unstructured data. The explosion of unstructured data, marked by the founding of Facebook in 2004, drove rapid breakthroughs in big data technology, as it produced a flood of unstructured data that was difficult to process with existing tools. The hot keywords of this period were more dispersed, including "system," "network," and "evolution," and highly cited literature was scarce, indicating that academia and industry were rethinking data processing systems and database structures from multiple angles, with no consensus yet in sight.

The years 2006-2009 were a maturation period, during which big data technology converged on parallel computation and distributed systems. Google's Jeff Dean and colleagues began developing the Spanner database (around 2009) as a successor to BigTable. At this stage, the hot keywords of big data research converged again, focusing on "performance," "cloud computing," "MapReduce" (a programming model for parallel computation over large data sets), "Hadoop" (an open-source distributed computing framework), and so on.
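To illustrate the MapReduce idea named above, here is a minimal single-process sketch of the map-shuffle-reduce pattern, using the canonical word-count example. This is only a toy illustration: real Hadoop or Google MapReduce jobs distribute these same three phases across a cluster, and the framework performs the shuffle.

```python
from collections import defaultdict

documents = ["big data needs parallel processing",
             "hadoop made parallel processing mainstream"]

# Map: emit (key, value) pairs from each input record.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group all values by key (handled by the framework in Hadoop).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: combine each key's values into a final result.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # e.g. {'parallel': 2, 'processing': 2, ...}
```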

Since 2010, as smartphones have become ever more widespread, data has become markedly more fragmented, distributed, and stream-oriented, and mobile data has grown dramatically.

In recent years, big data has steadily penetrated every sector of society, blurring the boundaries between big data as a technical field and as an industry, and application innovation has come to be favored even more than the technology itself. Big data technology can have a transformative effect in every field and is becoming the driving force and booster of disruptive innovation across all walks of life.

In May 2013, the McKinsey Global Institute published a study entitled "Disruptive Technologies: Advances That Will Transform Life, Business, and the Global Economy." The report identifies 12 emerging technologies expected to deliver 14 to 33 trillion dollars in economic benefits by 2025. Surprisingly, big data, the most talked-about technology, was not on the list. McKinsey's explanation is that big data has become the cornerstone of many of the 12 technologies that could reshape the world, including the mobile internet, knowledge-work automation, the Internet of Things, cloud computing, advanced robotics, autonomous vehicles, and genomics.

In May 2014, the White House released the research report "Big Data: Seizing Opportunities, Preserving Values." The report encourages the use of data to promote social progress, particularly in areas where markets and existing institutions do not otherwise support it, while calling for appropriate frameworks, structures, and research to protect Americans' firm beliefs in privacy, fairness, and non-discrimination. In April 2014, the World Economic Forum published "The Global Information Technology Report" (13th edition) on the similar theme of "Rewards and Risks of Big Data." The report argues that ICT policies will become even more important in the coming years, accompanied by active debate over issues such as data confidentiality and network control. As the global data industry grows ever more active and technology evolution and application innovation accelerate, governments have come to recognize the significance of big data for promoting economic development, improving public services, enhancing people's welfare, and safeguarding national security.
