Recently, the "Cloud Computing and Big Data" symposium, hosted by the China Electronic Society, the Cloud Computing Experts Committee of the China Electronic Society, and the China Cloud Computing Technology and Industry Alliance, was held at the Jingxi Hotel in Beijing. Wu Jichuan, honorary chairman of the China Electronic Society and former Minister of the Information Industry, and chief economist Zhou of the Ministry of Industry and Information Technology addressed the meeting in turn, congratulating the China Electronic Society on holding a forward-looking, fundamental seminar that grasps the development characteristics of the new generation of information technology. The leaders and guests attending the symposium included the head of the Software and Services Department of the Ministry of Industry and Information Technology, Academician Li Wei, Academician Li Deyi, Academician Ni, vice chairman of the China Electronic Society Rulin, Secretary-General of the China Electronic Society Xu Xiaolan, Deputy Secretary-General of the China Electronic Society Renhua, and experts of the Cloud Computing Experts Committee of the China Electronic Society and the China Cloud Computing Technology and Industry Alliance, more than 50 people in total.
The seminar was chaired by Academician Li Deyi, director of the Cloud Computing Experts Committee of the China Electronic Society. Academician Li Wei; Academician Li Deyi; Dr. Chen Shangyi, chairman of the Technical Committee of Baidu; researcher Qing of the Chinese Academy of Sciences; researcher Changling of the China Mobile Research Institute; and Dr. Yao Hongyu, general manager of Beijing Friends Tianyu System Technology Co., Ltd., each delivered keynote speeches on cloud computing and big data topics.
"Tetrahedron model realizes unstructured data management"
In Li Wei's view, the software industry faces three waves. The first is structured data plus algorithms, whose main basic software is operating systems such as Windows and databases. The second is semi-structured data plus search, which produced search engines such as Baidu and Google and e-commerce companies such as Amazon. The third is unstructured data services plus instant services, represented by unstructured data management systems and integrated services.
In the big data age the data model becomes more important, and data models for intelligent processing, relational retrieval, and knowledge mining need new changes to realize deep extraction and sharing. Since 2009, Academician Li Wei has led a team that developed an advanced unstructured data model, the tetrahedron model, which takes a unified data model as its breakthrough point and combines the features of text, images, graphics, audio, and video to realize a new type of unstructured data management system.
Li Wei believes that in the future 90% of data will be unstructured, and that a new data model for unstructured data is the key to building unstructured data management systems. He therefore proposed the tetrahedron model, whose four facets are the basic attribute facet, the semantic feature facet, the underlying (low-level) feature facet, and the raw data facet. The tetrahedron model has the characteristics of completeness, correlation, integration, and extensibility; it makes it easy to create, maintain, and manage massive unstructured data; it can become a standard for unstructured data models; and the concepts and techniques of group software engineering must be used to generate, manage, and maintain the unstructured data. The tetrahedron model has been extensively tested on 5 million unstructured samples, and in the future it will be further developed with sensor data and group software to improve its handling of unstructured data.
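To make the four facets concrete, the following is a minimal sketch of how a single object might be represented under such a model. This is an illustrative assumption in Python, with hypothetical field names; the article does not describe the schema of Li Wei's actual system.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class TetrahedronItem:
    # Basic attribute facet: descriptive metadata common to any object
    basic_attributes: Dict[str, Any] = field(default_factory=dict)
    # Semantic feature facet: high-level meaning extracted from the object
    semantic_features: Dict[str, Any] = field(default_factory=dict)
    # Underlying feature facet: low-level signal features (e.g. color histogram, MFCCs)
    low_level_features: Dict[str, List[float]] = field(default_factory=dict)
    # Raw data facet: a reference to the original unstructured content
    raw_data_uri: str = ""

# Example: registering a video clip in the model (all values are made up)
clip = TetrahedronItem(
    basic_attributes={"name": "demo.mp4", "media_type": "video", "size_bytes": 1_048_576},
    semantic_features={"topics": ["traffic"], "entities": ["bus", "intersection"]},
    low_level_features={"color_histogram": [0.1, 0.3, 0.6]},
    raw_data_uri="file:///data/demo.mp4",
)
print(clip.basic_attributes["name"], "->", clip.raw_data_uri)
```

The point of the four-facet layout is that metadata, semantics, extracted features, and the raw content are kept together for one object but can each be indexed and queried separately.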
"Broadband is not wide constraints large data development"
Li Deyi: broadband bandwidth is still the bottleneck restricting the development of cloud computing and big data
At the Fourth China Cloud Computing Conference, Li Deyi had already made it clear that "broadband that is not broad" restricts the development of cloud computing. Now, facing big data, Li Deyi still says: "The Internet and cloud computing are the basis on which big data is generated, and broadband is still the bottleneck restricting the development of cloud computing and big data."
Around "What is large data, how to deal with large data", Li DEII the data has a low density value characteristics, demand characteristics are user-driven, and focus on location cloud services as an example, explain the current application of large data in the industry. And through the scientific development and the current large data technology links, as well as the technical limitations of the development of large data, Li Deii said: "Now is not necessarily the first science and technology, it is likely that there is science in technology, science has technology." The scientific value of the big Data age (is being embodied in a step-by-step way). ”
"Big Data brings industry change"
Starting from the characteristics of Baidu's big data, Dr. Chen Shangyi analyzed the technological and industrial changes brought by big data. He argued that the current scale and growth rate of data exceed processing capacity, that new security crises have arisen, and that data intelligence has become a reality, which has changed the thinking behind academic research and technological innovation and radically altered the mode of innovation.
Researcher Qing spoke on three points: the evolution of big data, the development of data mining, and the practice of big data mining as a cloud service. He pointed out that big data mining algorithms are often of high complexity or even NP-hard, so global optimization problems need to be transformed into local optimization problems and efficient parallel strategies need to be adopted, as sketched below. Researcher Changling reviewed the development of big data processing technology, pointed out telecom operators' demand for big data, and finally, using Big Cloud 2.0, described China Mobile's innovations in applying big data in practice. Dr. Yao Hongyu gave a detailed exposition of the opportunities and challenges posed by large user bases, big data, and large systems: large user bases are characterized by explosive growth, bursty usage, volatile demand, and the effects of association networks; big data is characterized by data volume (Volume), data diversity (Variety), data throughput (Velocity), and data value (Value); and large systems are characterized by high availability, management pressure and maintenance cost, linear scaling of performance, and diverse demands. The participants had a lively discussion of cloud computing and big data technologies, business models, and future trends in integration and innovation.
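As a rough illustration of the global-to-local transformation and parallel strategy mentioned above, the sketch below clusters each data partition independently and then clusters the pooled local results. This is a generic divide-and-conquer example in Python, not the actual algorithm discussed at the seminar.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def kmeans_1d(points, k, iters=20):
    """Plain Lloyd's algorithm on 1-D data; it only reaches a local optimum."""
    centers = random.sample(points, k)
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            buckets[nearest].append(p)
        centers = [sum(b) / len(b) if b else centers[i] for i, b in enumerate(buckets)]
    return centers

def partitioned_kmeans(points, k, n_parts=4):
    """Divide and conquer: solve a cheap local problem per partition in parallel,
    then merge by clustering the pooled local centers."""
    parts = [points[i::n_parts] for i in range(n_parts)]
    with ThreadPoolExecutor() as pool:  # partitions are independent of one another
        local_centers = list(pool.map(lambda part: kmeans_1d(part, k), parts))
    pooled = [c for centers in local_centers for c in centers]
    return kmeans_1d(pooled, k)

# Toy data drawn from three well-separated groups
data = [random.gauss(mu, 1.0) for mu in (0, 10, 25) for _ in range(1000)]
print(sorted(partitioned_kmeans(data, k=3)))
```

The approximation gives up the guarantee of a global optimum in exchange for work that can be spread across machines, which is the trade-off described in the talk.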
Big data is affecting the transformation of business models, and the processing, analysis, and integration of big data is becoming an effective way for enterprises to enhance their core competitiveness. In March 2012, the United States government allocated 200 million dollars to launch its "Big Data Research and Development Initiative." It is against this background that this seminar invited leaders from relevant national ministries and eminent academicians in the fields of cloud computing and big data to explore important issues in cloud computing and big data, as a positive step toward promoting the development of big data technology and industrial change in China.
(Responsible editor: Lu Guang)