Groupon, the American group-buying site, has acquired Adku, an e-commerce targeting start-up. Adku uses big data to personalize the shopping experience for users of e-commerce sites such as eBay, Amazon, and Zappos. Adku built its personalized targeting technology in three months, having earlier raised 1.5 ... at AngelPad D ...
According to an IDC survey, the volume of data stored on electronic devices worldwide will grow roughly 30-fold by 2020, reaching 35 ZB (the equivalent of roughly 35 billion 1 TB hard drives). The arrival of the big data wave also brings a new challenge to enterprises: for those that are prepared it is an information gold mine, and the ability to turn big data into valuable information will be an essential skill for the enterprise of the future. Against this backdrop, CSDN conducted a large-scale questionnaire survey of enterprise practitioners and, from thousands of responses, summarized the current state of enterprise data business. Here we will also ...
Earlier this month, Oracle began shipping its big data machine (the Oracle Big Data Appliance), and analysts say this will force major competitors such as IBM, HP, and SAP to come up with Hadoop products that tightly bundle hardware, software, and other tools. On the day of shipment, Oracle announced that its new product would run cloud ...
Overview: Big data technology's ability to discover patterns in large amounts of data clearly offers many opportunities for travel, where companies with big data projects can sell travel products to consumers in new ways. If you are a loyal reader of Tnooz, you will have read about the potential of big data. But as with most emerging technologies, people are confused about what it really means: can it actually deliver a better customer experience and boost sales? This article only scratches the surface, analyzing some potential future applications of big data in the travel industry ...
(Figure: Quaero) The result of the information explosion is that data is becoming as ubiquitous as air, produced by mobile phones, computers, digital cameras, smart meters, and GPS devices. Faced with massive data, companies and government agencies struggle to digest it and are at a loss as to what to do with it. But the problem also means opportunity. Splunk is a start-up based in San Francisco whose software can build indexes for machine ...
How fast is the big data tide rising? IDC estimated that the amount of data produced worldwide in 2006 was 0.18 ZB (1 ZB = 1 million PB), and this year the figure has already reached 1.8 ZB, equivalent to more than 100 GB of hard disk space for almost every person on Earth (a quick back-of-envelope check of that figure follows below). Growth is still accelerating and is expected to reach nearly 8 ZB by 2015. For now, big data processing faces three bottlenecks: large volume, varied formats, and speed; the corresponding solutions are scalability, openness, and next-generation storage technology. Capacity: high scalability ...
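As a sanity check on the per-person figure above, here is a minimal back-of-envelope calculation. The world-population estimate of about 7 billion is an assumption added for illustration, not a number from the article.

```python
# Back-of-envelope check: 1.8 ZB spread across the world's population.
# The ~7 billion population figure is an assumption for illustration.
total_bytes = 1.8e21        # 1.8 ZB (decimal units: 1 ZB = 1e21 bytes)
population  = 7e9
bytes_per_person = total_bytes / population
print(f"{bytes_per_person / 1e9:.0f} GB per person")   # ~257 GB, i.e. "more than 100 GB"
```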
Tian Yun Technology Vice President Reitao: Good afternoon! Welcome back to the conference. This afternoon we turn to more concrete issues. This morning there was some discussion between the Silicon Valley guests and the local speakers, and many challenging questions were raised for the traditional big enterprises, big companies like Oracle and Cisco; some people heard those views, and some who arrived later may not have. Today there is a better opportunity for everyone to express their views. The main topic today is the relationship between cloud computing and big data, and we see that cloud computing came out of the Amazon era of the Internet ...
The great thing about cloud computing is that big data processing no longer requires buying a large server cluster up front; renting servers to handle big data keeps costs under control. As a heavyweight open-source framework for distributed processing, Hadoop has made its mark in big data processing, and companies want to use it to plan their future data-processing blueprints. From EMC and Oracle to Microsoft, almost every high-tech vendor has announced a Hadoop-based big data strategy in the past few months. Today Hadoop has become ...
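Since Hadoop comes up repeatedly in this digest, here is a minimal sketch of the MapReduce model it implements, written for Hadoop Streaming so the mapper and reducer are plain scripts that read stdin and write tab-separated key/value pairs. The script name and the way it is invoked are illustrative assumptions, not details from any of the articles above.

```python
#!/usr/bin/env python
# wordcount_streaming.py - a minimal sketch of Hadoop's MapReduce model.
# Both roles read from stdin and write tab-separated key/value pairs to stdout,
# which is the contract Hadoop Streaming expects of external mapper/reducer scripts.
import sys

def mapper():
    # Emit ("word", 1) for every word on every input line.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop delivers mapper output grouped and sorted by key, so equal words
    # arrive on consecutive lines and can be summed with a running total.
    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")

if __name__ == "__main__":
    # Run as "wordcount_streaming.py map" or "wordcount_streaming.py reduce".
    mapper() if sys.argv[1] == "map" else reducer()
```

A job like this would normally be submitted with the hadoop-streaming jar shipped in the distribution, passing the script as both the mapper and the reducer; the exact jar path varies by Hadoop version, so treat the invocation itself as a sketch as well.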
Whether you maintain your data with relational database systems, hash tables, or other structures, you have surely heard about NoSQL and big data. Companies such as Google, Yahoo, and Amazon are already building or using big data / NoSQL solutions. But apart from some very specific cases, are these big data implementations really that useful? In a recent article, Capgemini consultant Steve even argued that big data can sometimes be a big scam, or at least is far from a panacea that can solve the problems relational ...
As big data becomes an ever bigger trend, individual object storage vendors are trumpeting their technical advantages, claiming they can store and protect large volumes of data more efficiently and faster. "Object storage, especially cloud storage, needs to get past the cheap, complex phase and become an important part of the business and service-provider markets," said Henry Baltazar, senior analyst at the 451 Group. The following is a report on some of the major object storage vendors and their products on the market today. Atmos: Atmos has become ...
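For readers unfamiliar with the object model these vendors share, here is a minimal sketch of how data is addressed in an object store (bucket plus key plus opaque blob). It uses boto3 against an S3-compatible endpoint purely as an illustration; the endpoint, bucket name, and key are placeholder assumptions, and nothing here is specific to Atmos or any vendor in the report.

```python
# A minimal sketch of the object-storage access model: objects are written and
# read whole, addressed by bucket + key rather than by file path or block offset,
# which is part of what lets object stores scale out cheaply.
# The endpoint, bucket, and key below are placeholder assumptions.
import boto3

s3 = boto3.client("s3", endpoint_url="https://object-store.example.com")

s3.put_object(Bucket="demo-bucket", Key="logs/2011-10-19.json",
              Body=b'{"event": "example"}')
obj = s3.get_object(Bucket="demo-bucket", Key="logs/2011-10-19.json")
print(obj["Body"].read())
```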
Technology is developing rapidly today, and storage stands out in particular: in terms of technology, users, and applications, its pace of development outstrips other IT fields. At the same time this brings corresponding problems, and data center and enterprise managers face a dilemma over how to choose a storage array. The usual answers have long been hyped; flash storage, or much-hyped technologies such as WAN optimization, seem to have become ingrained in people's minds. The following seven storage solutions are recommended not because they are technological novelties, but because they are more practical and ...
What is big data? We live in an era of information explosion. Every day we receive all kinds of messages: text messages, email, phone calls ... and every day we also produce all kinds of information: microblog posts, blogs, Frid ... Information of every kind fills the world; how should we handle and make use of such a huge volume of it? The challenges posed by these questions have driven the industry's enthusiasm for "big data" to an unprecedented level. Not long ago, this reporter took part in ... the leading independent enterprise data-integration software ...
Ready or not, the era of big data is coming. IT managers need to do the following five things now to meet the challenges of tomorrow's massive data. By David Linthicum | InfoWorld. Is your big data plan ready? If not, you may want to start thinking about designing one. Today big data is either revered as a key strategic asset of the future or dismissed as overhyped, depending on your point of view; either way, it means that any company's CEO will want to know about IT's big data ...
Red Hat has announced the acquisition of Gluster, which has drawn wide attention as the developer of the GlusterFS open source file system and the Gluster storage platform software stack. With this move, Red Hat has positioned itself as a provider for customers looking for big data solutions such as Apache Hadoop.
There is no doubt that everyone in the world who follows the development of technology is aware of the potential value of "big data" for business; the goal is to relieve the pain caused by the growth of business data as enterprises develop. The reality is that many problems still hinder the development and practical application of big data technology, because a successful technology needs standards against which it can be measured. For now we can measure big data technologies against a few basic elements: streaming, parallelism, summary indexing, and visualization (the streaming and summary-indexing elements are sketched below). Who uses big data? A year ago, some of the main users of the technology ...
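As promised above, here is a minimal sketch of the "streaming" and "summary indexing" ideas: records are processed one at a time and only a small summary is kept, so memory stays constant no matter how large the stream grows. The field names and the simulated event generator are illustrative assumptions, not taken from the article.

```python
# Streaming summarization sketch: one pass over the data, constant memory.
from collections import Counter
import random

def summarize(stream):
    count, total = 0, 0.0
    index = Counter()            # compact summary index: events seen per user
    for user, value in stream:
        count += 1
        total += value
        index[user] += 1
    mean = total / count if count else 0.0
    return count, mean, index.most_common(3)

# Simulated event stream of (user, purchase amount) pairs.
events = ((f"user{random.randint(1, 5)}", random.uniform(1, 100)) for _ in range(10_000))
print(summarize(events))
```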
The tide of cloud computing is surging; how can operators use it to complete their transformation? On October 19, at China Unicom's second cloud computing symposium, Beijing Sky Cloud Technology gave a keynote entitled "Big data creates new opportunities for operator transformation," bringing a new angle of thinking to the operators, media, and cloud computing practitioners present. With the market becoming saturated and new growth slowing, China's three major telecom operators face falling unit prices and declining ARPU, while the impact of the Internet and other industries on the traditional telecom business keeps growing; operators must find new ...
We have all heard the prediction: by 2020, the amount of data stored electronically in the world will reach 35 ZB, 40 times the world's total in 2009. At the end of 2010, according to IDC, global data volume had already reached 1.2 million PB, or 1.2 ZB. If you burned that data onto DVDs, the stack would reach from the Earth to the Moon and back (about 240,000 miles one way). For those inclined to worry that the sky is falling, such enormous numbers may seem like an omen of the end of the world. To ...
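A rough back-of-envelope check of the DVD comparison is below. The per-disc figures (4.7 GB capacity, 1.2 mm thickness) are standard single-layer DVD numbers used here as assumptions; the point is only the order of magnitude, not an exact match to the article's phrasing.

```python
# Back-of-envelope: how tall is a stack of DVDs holding 1.2 ZB?
total_bytes   = 1.2e21          # 1.2 ZB, the end-of-2010 figure cited by IDC
dvd_bytes     = 4.7e9           # capacity of one single-layer DVD (assumption)
dvd_thickness = 1.2e-3          # metres per disc (assumption)

discs    = total_bytes / dvd_bytes
stack_km = discs * dvd_thickness / 1000
print(f"{discs:.2e} DVDs, stack ~{stack_km:,.0f} km")   # ~2.6e11 DVDs, ~306,000 km
# The Earth-Moon distance is roughly 384,000 km, so the stack is indeed on the
# order of the distance to the Moon.
```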
The core of "big data" analysis is to grab value from a lot of detailed data. The cloud computing infrastructure, combined with sophisticated analytical tools, enables organizations to gain the full value of the above, without being limited by extensibility. ESG 2011 surveyed 611 senior IT consumers about their IT consumption intentions for enterprises using public cloud computing services, especially software SaaS. In addition to measuring application levels, ESG investigates current and planned users in terms of the specific application delivery or expected delivery mode-that is, through the SaaS model. Although the business analysis in the IT elimination ...
Introduction: Yahoo CTO Raymie Stata is a key figure leading the company's massive data-analysis engine. IBM and Hadoop are focusing more on massive data, and massive data is subtly changing businesses and IT departments. More and more large enterprise datasets, and all the technologies needed to create them, including storage, networking, analytics, archiving, and retrieval, are considered massive data. This vast amount of information directly drives the development of storage, servers, and security, and it also brings the IT department a series of problems that must be addressed. Information ...
With the development of the Internet, the mobile Internet, and the Internet of Things, no one can deny that we have truly entered the era of massive data. Research firm IDC expects the total volume of data in 2011 to reach 1.8 trillion GB, and analyzing this massive data has become a very important and urgent need. As an Internet data-analysis company, we have been in the thick of massive data analysis for years. Driven by stringent business requirements and data pressure, we have tried almost every feasible approach to big data analysis and finally settled on the Hadoop platform ...