Interview with Teradata executives: How to handle big data intelligently


Stephen Brobst is chief technology officer of the data warehousing company Teradata, and Martin Willcox is the company's director of platform and solution sales for Europe, the Middle East and Africa. They were recently interviewed by TechTarget and shared their views on the big data technology industry in 2013. The first part is the transcript of Martin Willcox's interview:

SAP believes its in-memory database appliance HANA is a transformative technology. What is your view of in-memory technology?

Willcox: There are two schools of thought in the industry. SAP believes that all data should be kept in memory; others believe that the unit cost of memory cannot keep pace with the growth in data volumes, so storing everything in memory is economically infeasible. In the latter view, you need to combine different storage mechanisms within a tiered architecture.

Teradata also takes the view that you cannot store all of your data in memory, but the difference between us and other vendors is that we use a multi-temperature (hot and cold data) model to automate the movement of data across the storage hierarchy. This is what we call Teradata Intelligent Memory.
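
Teradata Intelligent Memory itself is proprietary, but the general idea of multi-temperature data placement can be sketched in a few lines of Python: frequently accessed blocks are promoted into a small in-memory tier, while cold blocks stay on disk. The tier size, access counters, and promotion rule below are illustrative assumptions, not Teradata's actual mechanism.

```python
from collections import Counter

class TieredStore:
    """Illustrative multi-temperature store: hot blocks in memory, cold blocks on 'disk'.
    Capacity and promotion rule are made-up values for the sketch."""

    def __init__(self, memory_capacity=3):
        self.memory_capacity = memory_capacity  # max blocks kept in the hot tier
        self.memory = {}                        # hot tier (in-memory)
        self.disk = {}                          # cold tier (stand-in for disk storage)
        self.access_counts = Counter()          # tracks how "hot" each block is

    def write(self, key, value):
        # New data starts in the cold tier; it is promoted only if accessed often.
        self.disk[key] = value

    def read(self, key):
        self.access_counts[key] += 1
        if key in self.memory:
            return self.memory[key]             # hot hit: served from memory
        value = self.disk[key]                  # cold hit: read from the disk tier
        self._maybe_promote(key)
        return value

    def _maybe_promote(self, key):
        if len(self.memory) < self.memory_capacity:
            self.memory[key] = self.disk.pop(key)
            return
        # Evict the coldest in-memory block if the requested block is now hotter.
        coldest = min(self.memory, key=lambda k: self.access_counts[k])
        if self.access_counts[key] > self.access_counts[coldest]:
            self.disk[coldest] = self.memory.pop(coldest)
            self.memory[key] = self.disk.pop(key)

store = TieredStore()
store.write("sales_2013_q1", [1, 2, 3])
for _ in range(5):
    store.read("sales_2013_q1")  # repeated access makes this block "hot"
```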

How do you handle "big data" processing intelligently?

Willcox: Some vendors mistakenly believe that big data is a homogeneous problem. We use a two-by-two model that splits it along two axes: the x-axis is data structure, with simply structured data on the left and multi-structured data on the right; the y-axis is the type of analysis, with set-based analysis at the bottom and non-traditional analysis, such as path or graph analysis, at the top.

The latter is naturally iterative. Take market-basket analysis of sales data as an example: "Which products sell best together with bananas?" is a classic question. But if I want to ask "Which products sell best together with bananas and milk?", answering it on a traditional database management system becomes expensive.
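
As a rough illustration of that kind of basket analysis, the following Python sketch counts the items that co-occur with a given anchor set such as {bananas, milk}; the transactions and item names are invented for the example and are not from the interview.

```python
from collections import Counter

# Hypothetical point-of-sale transactions; each row is one shopping basket.
baskets = [
    {"bananas", "milk", "bread"},
    {"bananas", "milk", "cereal"},
    {"bananas", "bread"},
    {"milk", "cereal", "eggs"},
    {"bananas", "milk", "cereal", "eggs"},
]

def co_purchased(baskets, anchor):
    """Count items that appear in baskets containing every item in `anchor`."""
    counts = Counter()
    for basket in baskets:
        if anchor <= basket:               # basket contains the whole anchor set
            counts.update(basket - anchor)  # count the remaining items
    return counts

print(co_purchased(baskets, {"bananas"}))           # the classic single-item question
print(co_purchased(baskets, {"bananas", "milk"}))   # the more expensive multi-item question
```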

In graph processing, our way of handling node relationships also works alongside set-based analysis; the individuals in a social network are one example, where the graph can be used to determine influence.
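
Influence in a social network is commonly estimated with graph centrality measures. The sketch below uses the open-source networkx library as an illustrative stand-in (not the graph engine the interview refers to) to rank the members of a small, invented network by PageRank.

```python
import networkx as nx

# Hypothetical "who follows whom" relationships in a small social network.
follows = [
    ("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
    ("bob", "erin"), ("carol", "erin"), ("erin", "alice"),
]

graph = nx.DiGraph(follows)

# PageRank treats incoming edges from influential nodes as more valuable,
# giving a simple proxy for each member's influence in the network.
influence = nx.pagerank(graph)
for person, score in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.3f}")
```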

So it is performing new kinds of analysis on new data that delivers real value from big data. Otherwise, it is just a buzzword.

What big-data-related trends have you seen across your customer base over the past year?

Willcox: While big data technologies such as Aster Data and Hadoop are spreading around the world, most customers are still watching from the sidelines.

Some of our telecom customers are running interesting trials aimed at better understanding their network data and customer data. Moving data is another area that needs to be better understood, and Aster Data, Hadoop, and SQL-H are all used here. SQL-H supports running analysis against the Hadoop Distributed File System (HDFS) using industry-standard SQL.
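
SQL-H is a Teradata-specific connector, so as a generic illustration of the same pattern, industry-standard SQL issued against data stored in HDFS, the sketch below uses the open-source PyHive client against a Hive endpoint. The host, database, table, and columns are assumptions made up for the example.

```python
from pyhive import hive  # open-source client for HiveServer2; assumed to be installed

# Hypothetical connection details; a real deployment would supply its own
# host, port, and authentication settings.
connection = hive.connect(host="hadoop-edge.example.com", port=10000,
                          database="telco")
cursor = connection.cursor()

# Standard SQL issued against a table whose files live in HDFS.
cursor.execute("""
    SELECT cell_id, COUNT(*) AS dropped_calls
    FROM call_detail_records
    WHERE call_status = 'DROPPED'
    GROUP BY cell_id
    ORDER BY dropped_calls DESC
    LIMIT 10
""")
for cell_id, dropped_calls in cursor.fetchall():
    print(cell_id, dropped_calls)
```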

It is well known that in technology, few people understand much outside their own narrow area of expertise, and the industry has not been very good at explaining which problems were already solved in the past. You will find many people using new technology to re-implement existing solutions. Some Hadoop advocates should take a share of the responsibility for this: some of them (not all) do not really understand how structured data is managed, so a lot of work is being repeated.

When I ask some of the new big data technology vendors about the role of the traditional data warehouse, they often say that it is "still valuable."

Willcox: Yes, and that label is rather dismissive! The data warehouse is still the foundation. Some of the new technologies are fascinating, but in the hands of some proponents they amount to file-based, application-specific data processing, rather like what we did in the 1960s and 1970s.

That is how we did things back then, and it created massive redundancy and inconsistency in the data. It does not work for large organizations with complex data, which is exactly why relational database management systems were invented. We learned that the way to ensure data quality and consistency is to abstract those services up to the level of the database management system, rather than leaving every developer individually responsible for data integrity.

The needs of organizations have not changed in the past 30 years. What about data quality, data consistency, metadata management, and systems management? If you are just running a science project, that may not matter; but once you leave the academic world and have to report to regulators, your data and its quality become very important.

Some people think these new companies will displace the lower-left quadrant of that two-by-two model, the simply structured, set-based workloads with 30 years of engineering behind them. I think that is impossible. They do, however, have a genuine role to play with multi-structured data and non-traditional analysis.

No single technology covers all four quadrants, and that is why we propose a unified data architecture.

(Responsible editor: Lu Guang)
