JMP Sharing: How Big Data Contributes to Great Value

Source: Internet
Author: User
Keywords: big data, value, contribution, acquisition, statistics

Pudong New Area in September: the weather has turned slightly chilly. Early in the morning, in a glass-clad high-end office building in Zhangjiang Hi-Tech Park, Miss Li, a senior staff member of the marketing department, begins her daily warm-up at her desk: she pours a glass of water and spends a quarter of an hour browsing the day's news on the major websites. On the few forums she visits most often, ads for cars, clothing, or travel promotions usually pop up uninvited. Recently, though, she has noticed subtle changes in this long-familiar routine, such as the window that has just popped up: "How does big data analysis help you lock in high-value customers?" What do "big data" and travel forums have to do with each other? The question puzzled her.

What she does not know is that the keywords she has typed into her browser's search box, such as "customer", "promotion", and "communication record", have been captured automatically by a grabber program attached to IE, associated with the corresponding ads, and delivered back to her by an IE plug-in that completes the final ad push. In a server room somewhere in Beijing, analysis software based on statistical methods such as association rules and clustering is constantly processing the data produced by thousands of users like Miss Li.
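To make the idea of association rules concrete, here is a minimal sketch (my own illustration, not the actual system described above) that mines simple "keyword implies ad category" rules from a handful of invented browsing sessions, using the standard support and confidence thresholds:

```python
# Minimal sketch of association-rule matching between search keywords and
# ad categories. The session data below is invented for illustration only.
from itertools import combinations
from collections import Counter

# Each "session" is the set of search keywords one user entered plus the ad
# category that was eventually clicked (tokens ending in "-ads").
sessions = [
    {"customer", "promotion", "travel-ads"},
    {"customer", "communication record", "crm-ads"},
    {"promotion", "travel", "travel-ads"},
    {"customer", "promotion", "crm-ads"},
    {"promotion", "travel", "travel-ads"},
]

min_support = 0.4      # rule must appear in at least 40% of sessions
min_confidence = 0.6   # P(consequent | antecedent) must be at least 60%

n = len(sessions)
item_counts = Counter(item for s in sessions for item in s)
pair_counts = Counter(pair for s in sessions
                      for pair in combinations(sorted(s), 2))

# Derive rules of the form "keyword -> ad category".
for (a, b), count in pair_counts.items():
    support = count / n
    if support < min_support:
        continue
    for antecedent, consequent in ((a, b), (b, a)):
        confidence = count / item_counts[antecedent]
        if confidence >= min_confidence and consequent.endswith("-ads"):
            print(f"{antecedent!r} -> {consequent!r} "
                  f"(support={support:.2f}, confidence={confidence:.2f})")
```

With this toy data, the script surfaces rules such as "promotion" implies "travel-ads", which is exactly the kind of link an ad-pushing plug-in could act on.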

What she may also not have noticed is that big data and analytics are becoming the hottest concepts in the world. A few years ago, Google's chief economist declared that data analyst was becoming the sexiest job of the 21st century. On March 29, 2012, the Obama administration unveiled the Big Data Research and Development Initiative, and the White House Office of Science and Technology Policy (OSTP) set up a big data senior steering group to drive the strategic plan. And then there is this year: as everyone already knows, 2013 is the International Year of Statistics.

The heat of big data

A few months ago, Professor Michael Rappa, an old friend of JMP, testified at a hearing of the United States House of Representatives on the theme "Next Generation Computing and Big Data Analytics". Professor Rappa, founder of the Institute for Advanced Analytics (IAA) at North Carolina State University (NCSU), was appointed last year as academic co-chair of the National Science Foundation's new big data committee. The Obama administration's big data development program is funded through the National Science Foundation.

In domestic academia, thanks to the tireless efforts of Mr. Lingwei, former vice president of Renmin University of China, and other renowned professors and scholars in the field of statistics, the Ministry of Education in 2012 finally separated statistics from mathematics and formally upgraded it to a first-level discipline alongside mathematics, physics, chemistry, computer science and other major disciplines.

Data analysis methods centered on statistics are showing increasing value in academia, business and government. The changes Miss Li sees on her computer are a small token of this trend.

Gartner recently released its 2013 Magic Quadrant for Business Intelligence and Analytics Platforms, which makes clear that big data and analytics are becoming the core of enterprise IT planning.

All these phenomena are telling people that a new technological era seems to be arriving. Some IT professionals are even excited enough to believe that "the third scientific revolution in human history" is coming.

The confusion of big data

The question is: what is big data? And why is everyone talking about it?

"Big, Big Data" is big data. The definition of "big" is constantly refreshing. 10 years ago 1GB data is very big, today, 1000GB is not too big.

The problem lies not in the "big" but in the value. Big data is still only data; without sufficient and effective analysis and application, all of it is rubbish. The New York Times columnist David Brooks argues that the lack of effective analysis is the biggest problem with big data: more data means more correlations, yet many of these correlations are meaningless, and such deceptive associations mislead data managers and users and waste enormous manpower and resources on managing and analyzing the data.
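To make Brooks' point concrete, here is a minimal sketch (my own, not from the article) showing how a collection of purely random, unrelated series will still yield impressively high pairwise correlations just by chance once enough pairs are examined:

```python
# Minimal demonstration that many random, unrelated variables will still
# produce some high pairwise correlations purely by chance.
import numpy as np

rng = np.random.default_rng(0)
n_series, n_points = 200, 30        # 200 unrelated series, 30 observations each
data = rng.normal(size=(n_series, n_points))

corr = np.corrcoef(data)                        # pairwise correlation matrix
upper = corr[np.triu_indices(n_series, k=1)]    # unique pairs only

print(f"pairs examined: {upper.size}")
print(f"pairs with |r| > 0.5: {(np.abs(upper) > 0.5).sum()}")
print(f"strongest spurious correlation: {np.abs(upper).max():.2f}")
```

Every one of the "strong" correlations reported here is meaningless by construction, which is precisely the trap awaiting anyone who mines large data sets without a statistical framework.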

In addition to the traditional rows and columns of numbers or text, IT technology helps people gather more and more information of other types, such as video, voice, pictures and documents. These are called "unstructured data".

Structured and unstructured data multiply every day. Take road video surveillance as an example: Shanghai alone has more than 100,000 cameras, recording pictures and video at every moment. Once a case or incident occurs, the information recorded in the hard disk library becomes important evidence in the investigation and trial process. Although the technology does not yet support it, the industry still expects that in the future a specific figure or face can be found within terabytes or petabytes of video data. This kind of search and analysis technology will be the engine that launches video-based big data applications in the future.
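As a rough illustration of what such a search involves at the most basic level, the sketch below (my own, not from the article; the file name is hypothetical) scans a video clip frame by frame with OpenCV's bundled Haar cascade face detector and reports where faces appear. Scaling this kind of scan to petabytes of footage is exactly the unsolved engineering problem the article refers to.

```python
# Minimal sketch: scan a video file for frames that contain faces.
# Uses OpenCV's bundled Haar cascade detector; the file path is hypothetical.
import cv2

VIDEO_PATH = "camera_0001.mp4"   # hypothetical surveillance clip
SAMPLE_EVERY_N_FRAMES = 25       # roughly once per second at 25 fps

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
frame_idx = 0

while True:
    ok, frame = cap.read()
    if not ok:                   # end of file or unreadable frame
        break
    if frame_idx % SAMPLE_EVERY_N_FRAMES == 0:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            print(f"{len(faces)} face(s) at ~{frame_idx / fps:.1f}s")
    frame_idx += 1

cap.release()
```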

Similarly, analysis and data mining based on voice, photos or text can bring revolutionary breakthroughs in the understanding of data. The problem is that such technologies still remain largely in the laboratory.

Despite the lack of adequate applications, big data is still unstoppable. Not talking about big data seems behind the times, and the term is flying everywhere. As for whether this trend will turn out like the dot-com bubble, or become the third industrial revolution, in Mr. Warwick's eyes it is not important. Vendors in the database, storage and related fields are certainly happy to see it, and enterprise IT managers have one more excuse to apply for budget.

Data value and enterprise data strategy

Data acquisition and storage are still the infrastructure of IT construction. Once a decision is made to launch a "big data strategy", the steady flow of resources it demands can turn the work into a black hole. How can this big data black hole be avoided? Judging from the success stories of leading enterprises around the world and of some small but strong European companies, I believe the application (analysis and business decision-making) should be placed at the center, a corresponding data strategy built around it, and a matching process set up that runs from collecting data and managing data through to the final business decision, rather than gathering data for data's sake. The first step, then, is to establish an application-centric data strategy. When it comes to applications, banks, insurance, automobiles, chemicals and nearly every other industry are already running applications based on data analysis. Taking some typical customers in the JMP software global industry case library as examples:

• Customer purchasing behavior data is analyzed for promotions and related-product recommendations (cross-selling/up-selling)
• Airlines investigate passenger feedback to improve in-flight services (customer retention)
• Pharmaceutical companies analyze clinical trial data to determine the safety and effectiveness of new drugs (new products)
• Car manufacturers analyze maintenance records to improve the reliability of vehicles and key components, enhancing customer satisfaction (customer retention and acquisition) and reducing both customers' cost of ownership and the manufacturer's warranty costs (cost reduction)
• Mobile phone companies forecast handset sales to plan production sensibly and optimize inventory (operations optimization); a minimal forecasting sketch follows this list
• Health authorities use data models to describe, monitor and forecast epidemiological trends
• Banks optimize and improve their customer service processes in order to raise customer satisfaction
• Computer manufacturers conduct market research with customers on different configuration combinations in order to set prices
• Insurance companies dynamically adjust policy pricing according to claims experience to ensure the basic profitability of their products
• Semiconductor companies analyze, model and optimize manufacturing process data to improve processes and raise yield rates, thereby cutting costs and increasing profits
• Food companies use data analysis and market research tools to develop flavors that local customers prefer
• The fast-food industry uses JMP's map analysis tools, combined with demographics, for store location, customer access and supply chain optimization
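For the sales-forecasting case above, here is a minimal sketch (my own illustration with invented monthly figures, not JMP output) of forecasting next month's handset sales with a simple moving average and sizing inventory with a safety stock on top:

```python
# Minimal sketch: forecast next month's handset sales with a moving average
# and derive an inventory target with a simple safety stock.
# The monthly sales figures below are invented for illustration.
from statistics import mean, stdev

monthly_sales = [12_300, 11_800, 13_500, 14_200, 13_900, 15_100]  # units sold

window = 3
forecast = mean(monthly_sales[-window:])    # 3-month moving average
volatility = stdev(monthly_sales)           # crude measure of demand variability
safety_stock = 1.65 * volatility            # ~95% service level, assuming
                                            # roughly normal demand
inventory_target = forecast + safety_stock

print(f"forecast for next month : {forecast:,.0f} units")
print(f"safety stock            : {safety_stock:,.0f} units")
print(f"inventory target        : {inventory_target:,.0f} units")
```

A real planner would of course use richer models (seasonality, promotions, channel data), but the logic of "forecast demand, then buffer against its variability" is the same.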

The value of data can only be obtained when it is applied sufficiently and effectively. Only by establishing the importance of data analysis at the strategic level can an enterprise improve continuously. Take GE as an example: Six Sigma and the corresponding data analysis processes have become part of GE's global strategy and culture, and GE continues to work tirelessly to promote continuous improvement based on data analysis. In high-end aero-engine research and in GE's energy systems business, GE has also adopted the industry's most advanced design of experiments (DOE), as represented by JMP, to further raise its research and development level.
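To give a flavor of what design of experiments looks like in practice, here is a minimal sketch (my own illustration, not GE's or JMP's actual workflow) of a two-level full factorial design with three factors and an invented response, analyzed by ordinary least squares to estimate the main effects:

```python
# Minimal sketch of a 2^3 full factorial design of experiments (DOE):
# three factors at two coded levels (-1/+1), with an invented response,
# analyzed by ordinary least squares to estimate main effects.
import itertools
import numpy as np

factors = ["temperature", "pressure", "catalyst"]
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Invented yields for the 8 runs, in the same run order as `design`.
response = np.array([62.1, 65.3, 70.2, 74.0, 63.0, 66.2, 71.5, 75.8])

# Model matrix: intercept plus the three main effects.
X = np.column_stack([np.ones(len(design)), design])
coefficients, *_ = np.linalg.lstsq(X, response, rcond=None)

print(f"intercept (grand mean): {coefficients[0]:.2f}")
for name, coef in zip(factors, coefficients[1:]):
    # A coefficient is half the -1 -> +1 effect because levels are coded +/-1.
    print(f"{name:<12} effect estimate: {2 * coef:+.2f}")
```

Eight structured runs are enough here to separate the influence of each factor, which is the economy that makes DOE attractive in expensive settings such as engine research.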

Secondly, none of this can be separated from people. With the exponential growth of data analysis needs, statistical and analytical talent is becoming scarce and highly sought after in the workplace. In early March, the Wall Street Journal published its ranking of America's most sought-after careers, with data analysis jobs ranked second on the list. That is the United States; for China, the ranking may be even higher, precisely because such talent is scarcer.

Finally, a data analysis and decision-making process must be established to replace the traditional way of making decisions. This is especially important for Chinese companies. It is not only a matter of implementing the strategy effectively; it also requires enterprises to show the determination and courage to change, and to reflect encouragement and tolerance of change at the institutional level.

In this era where the application is king, the questions enterprises face, whether to invest in infrastructure or in application software, and how much to invest, are in fact an old topic that always comes back to value and price. Big data or cloud computing, however the name changes, the logic remains the same.

JMP is an important business unit of SAS, the world's leading statistical software group, with more than 200,000 users. It is committed to helping enterprise customers worldwide improve quality management, optimize business processes and strengthen product development, and to helping university teachers, students and researchers around the world improve the teaching of statistics-related courses and the results of their scientific research. JMP software is the well-known high-end desktop "statistical discovery engine"; its application areas include business visualization, Visual Six Sigma and continuous improvement (quality management, process optimization), design of experiments (DOE), R&D innovation, exploration and discovery, and teaching and research.
