Open source changes the big data and cloud future: five big things to watch from Red Hat in 2013

If cloud computing and big data are the two major trends in current IT development, open source can fairly be called an important booster for both, since a significant share of the innovation behind them comes from the open source community. In 2012, as the industry's only pure open source company to reach $1 billion in revenue, Red Hat proved its allure and potential in the cloud computing and big data era with its outstanding performance. How will Red Hat continue to use open source to change the future of big data and cloud computing in 2013? What surprises will the new year bring to this open source company's users? This article reviews the 201 ...

Red Hat leverages its ecosystem for cloud and big data

With the development of cloud computing, Linux is drawing more and more attention from enterprise users and has even become the preferred operating system for enterprise cloud computing. Red Hat, long focused on open source technology, still adheres to the open source credo and is strengthening cooperation with upstream and downstream partners to build a complete ecosystem and promote open source solutions. It is precisely because of the importance Red Hat attaches to this ecosystem that the company has performed so well in the cloud computing and big data fields in recent years. It is worth reviewing Red Hat's efforts and gains in the ecosystem over the years. Red Hat's KVM ...

Microsoft builds big data platform for China Customs to boost import and export trade

Recently, Mr. Song, who runs a foreign trade enterprise in Guangdong, has clearly noticed that the local customs department's clearance times have shortened considerably and that declaration efficiency at customs windows has improved markedly. It is China's customs staff who, like Mr. Song, feel this change most strongly. Behind it all is the national Customs Dynamic Data Warehouse platform, which applies the latest cloud computing concepts to provide customs with flexible, dynamic, efficient, and manageable IT and data services. There are 46 directly administered customs units (the Guangdong sub-administration, the Tianjin and Shanghai special offices, 41 directly administered customs offices, 2 customs ...)

Sift Science uses big data to guard against online fraud

Existing fraud-prevention systems for online payments and transactions are overly complex yet still fail to stop online fraud effectively, so startups such as Sift Science have begun using machine learning over big data to prevent it. Network security is a promising application field for big data analysis: by screening for attackers' patterns, defenders can act proactively against attacks, rather than only blocking known attack patterns as traditional security defenses do. Active defense: the information security challenges raised by big data ...
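The idea of learning normal behaviour and flagging deviations, rather than matching known attack signatures, can be illustrated with a deliberately tiny sketch. This is not Sift Science's actual model (their system is far more sophisticated); it simply scores a transaction by how far it departs from an account's history, using a hypothetical `anomaly_score` helper:

```python
# Minimal behavioural-anomaly sketch (illustrative only, not
# Sift Science's real model): score a transaction by how many
# standard deviations it sits from the account's historical mean.
from statistics import mean, stdev

def anomaly_score(history, amount):
    """Z-score style distance of `amount` from past transactions."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0
    return abs(amount - mu) / sigma

history = [25.0, 30.0, 27.5, 22.0, 31.0, 28.0]  # past purchase amounts
print(anomaly_score(history, 29.0))   # typical purchase -> low score
print(anomaly_score(history, 950.0))  # outlier purchase -> high score
```

A real system would combine many such signals (device, geography, velocity) in a learned model, but the contrast with a fixed rule list is the point here.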

Dell and Intel launch one-stop big data solution

On the 25th, Dell announced the Dell Hadoop solution based on Intel's Hadoop distribution, together with Dell PowerEdge cloud servers and network architecture, further strengthening its next-generation computing offerings and providing customers with a one-stop big data solution. The Dell Hadoop solution gives customers optimized hardware and software configuration recommendations, simple and fast deployment services, and comprehensive professional support to ensure the high availability and stability of an enterprise Hadoop environment. The collaboration between Dell and Intel has pushed forward the big data era. Close to customers' soft ...

When should enterprises put their "big data" into the "cloud"?

In a world of big data, cloud computing plays an important role, especially for short-term jobs and applications, because much of the data in these areas already lives in the cloud. For most people, cloud computing remains a big, fuzzy, somewhat distant dream. Some equate a big data strategy with putting big data in the cloud, but is that truly visionary, or just a repetition of an opinion heard at an industry meeting? In fact, big data and cloud computing overlap and intersect a great deal, so companies can quite plausibly claim that they are using internal ...

The future of technology in Microsoft's eyes: Big data, artificial intelligence and more natural human-computer interaction

Yesterday, Craig Mundie, senior advisor to Microsoft's CEO, delivered the keynote "Technology Changes the Future" at a packed academic hall at Beihang University, sharing Microsoft's views on future technology trends. Mundie's speech covered roughly three areas: big data, artificial intelligence, and human-computer interaction. Microsoft believes that with the advent of the big data era, people's interactions, devices, social networks, and sensors are generating massive amounts of data. Artificial intelligence techniques such as machine learning can process this data and uncover its latent value. Xbox's Kinect 3D ...

What challenges does big data pose to cloud computing data centers?

Facing the challenges of building a data center in the cloud era, what should be kept in mind as traditional data centers evolve toward cloud computing, whether companies transform existing data centers or build new-generation ones? First, a cloud computing data center, or the cloud transformation of a traditional one, must start with planning: study the overall balance among reliability, efficiency, investment, availability, and manageability metrics; choose the right infrastructure, service levels, and reliability and security levels; and select the best investment strategy while estimating costs accurately. Second, cloud computing data ...

Capacity up to 240TB: HP launches dedicated big data server

On January 11, 2013, HP announced the industry's first server dedicated to big data, designed to help customers explore new business opportunities and save up to $1 million over three years. With the advent of big data software and its benefits, many organizations have tried to deploy these solutions on existing architectures that were never designed for the specific requirements of big data workloads. From a performance and cost standpoint, these early deployments therefore did not achieve the best results. Dan Vesset, vice president of IDC Business Analytics, said: "Hado ...

Cloud storage: quietly slipping into the big data age

Today, as attention focuses on cloud computing, the storage sector has launched its own discussion of cloud storage. Cloud storage is an online storage model that keeps data in a virtualized storage pool provided by a third party. According to IDC, the cloud storage market will exceed $7 billion a year by 2014, compared with about $1.5 billion in 2009. Cloud storage shares cloud computing's characteristics of agility, scalability, and multi-tenancy, and the advantages of this model are obvious: enterprises simply pay for the capacity they actually use and no longer rely on their own ...

Big data growth poses new challenges for data storage systems

"The role of data storage is changing. One factor is that data is becoming more and more complex; a data storage system designed 5-10 years ago must now handle resource diversity, complexity, large capacity, and real-time responsiveness," said Donald Feinberg, vice president at the analyst firm. He argues that while big data means different things to different industries, it basically denotes large volumes of complex, unstructured data. However, for those good at dealing with stru ...

EMC Greenplum adds unstructured data analysis capabilities

EMC today announced a new feature in its Hadoop Data Computing Appliance (DCA) that lets users combine unstructured and structured data analysis platforms. EMC also unveiled the Greenplum Analytics Workbench, a 1,000-node test bed for Apache Hadoop software integration testing. The test bed provides the Hadoop open source community with resources to quickly identify errors, stabilize new versions, and optimize hardware matching ...

How the StorNext series simplifies big data sharing and archiving

Metadata controller appliances announced. As a leading vendor in the backup, recovery, and archiving arena, Quantum showcased its StorNext software and appliances at the 2011 IBC convention. The new appliances combine StorNext's powerful capabilities with market-leading hardware to provide predictable, high-performance file sharing and archiving across metadata controllers, expansion appliances, and disks, along with dedicated configurations for archive-enabled libraries. The new products will reportedly also work with traditional StorNext software and partner hardware ...

Taming big data: a look at EMC Isilon clustered storage

Driven by Intel, the communications bandwidth and computing power of IT systems have followed Moore's law to record highs, doubling every 12-18 months. Meanwhile, IDC's latest "Digital Universe" study predicts that data will grow even faster than Moore's law, reaching 1.8ZB in 2011; over the next 10 years, enterprises will manage 50 times the current amount of data, and file volumes will grow 75 times. Against the backdrop of this rapidly expanding digital universe, the concept of "big data" came into being. In fact, big data and cloud computing are two ...
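The multiples quoted from the IDC study translate into compound annual growth rates. A quick check of the implied arithmetic (the `cagr` helper is just for illustration):

```python
# Compound annual growth rate implied by growing `multiple`-fold
# over `years` years; figures taken from the IDC numbers above.
def cagr(multiple, years):
    return multiple ** (1 / years) - 1

print(f"50x data in 10 years  -> {cagr(50, 10):.1%} per year")
print(f"75x files in 10 years -> {cagr(75, 10):.1%} per year")
# Moore-style doubling every 18 months, for comparison:
print(f"doubling every 18 months -> {cagr(2, 1.5):.1%} per year")
```

Fifty-fold growth over a decade works out to just under 48% per year, and 75-fold file growth to about 54% per year.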

IBM releases DS8870 flash-optimized array targeting big data and cloud storage

IBM recently released a variety of new products and services focused on security, cloud computing, and analytics, adding fresh impetus to its Smarter Computing initiative. The first wave includes Power 770 and 780 servers with IBM's newest POWER7+ microprocessors, PowerVM virtualization software, PowerSC security and compliance software, and new mainframe software products. On the storage side, IBM released new systems and software designed to address the challenges of big data storage performance and, in some cases, to support the popular OpenStack cloud platform. IB ...

A detailed look at big data storage: where problems are most likely to occur

"Big data" usually refers to data sets that are huge and difficult to collect, process, and analyze, as well as data kept in traditional infrastructure for long periods. The "big" here has several meanings: it can describe the size of the organization, and more importantly, it defines the scale of the enterprise's IT infrastructure. The industry has boundless expectations for big data applications. The more business information accumulates, the more value it holds, but we need a way to dig that value out. People's impression of big data may come mainly from the cheapness of storage capacity, but in fact businesses are creating vast amounts of data every day ...

Storage outlook for 2013: Big data and cloud remain the main themes

As the year draws to a close, a great deal has happened in the storage sector: big data quickly became an IT buzzword, and industries derived from big data have boomed. In 2012, big data seemed to come out of nowhere and rapidly occupied the technology press; hybrid cloud storage emerged, NAS storage returned to prominence, and flash technology and converged infrastructure joined the mainstream. It is fair to say the 2012 storage market was exceptionally lively. Here we take stock of the storage field in 2012, look at which areas of the industry made big strides this year, and consider what 2013 ...

How can server room PUE be reduced amid the big data trend?

Today, big data is a hot topic. In its latest report in January, IDC predicted that the global market for big data technology and services will grow at a compound annual growth rate (CAGR) of 31.7%, reaching $23.8 billion by 2016. To support this rapid data explosion, China currently has nearly 540,000 data centers in operation, a number growing at a rapid 18% a year, with a market for data-center-related infrastructure construction that will run into the billions. Such growth inevitably draws industry attention to server room energy consumption and green data centers.
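PUE (Power Usage Effectiveness) is simply total facility power divided by the power reaching IT equipment, so a PUE of 1.0 would mean zero overhead for cooling, lighting, and power distribution. A minimal sketch with illustrative figures (the numbers below are assumptions, not from the article):

```python
# PUE = total facility power / IT equipment power.
# Lower is better; the gap above 1.0 is cooling and
# power-distribution overhead.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

# A typical legacy server room with heavy cooling overhead:
print(pue(total_facility_kw=2000, it_equipment_kw=1000))  # -> 2.0

# The same room after cooling and power-chain improvements:
print(pue(total_facility_kw=1300, it_equipment_kw=1000))  # -> 1.3
```

In the first case, every kilowatt of useful IT load costs a second kilowatt of overhead; the improved room spends only 0.3 kW of overhead per IT kilowatt, which is why PUE reduction is the focus of green data center efforts.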

Oracle pushes new in-memory applications, shrinks its big data appliance

Oracle is planning to release a series of in-memory applications that will keep it at the forefront of competition with SAP. Oracle announced on Tuesday that it will launch more than 10 in-memory applications; a spokesman said the first three will arrive in May, including JD Edwards EnterpriseOne In-Memory Sales Advisor, JD Edwards EnterpriseOne In-Me ...

Typical case analysis of big data stored in the public cloud

Cloud services play an important role in big data applications, especially for short-term tasks or applications whose data already sits in the cloud in large volumes. Cloud services appeal to everyone. When someone tells you their big data strategy is to "store all the data in the cloud," you cannot tell whether they are visionary or simply repeating what experts suggested at an industry meeting. There is no doubt that the big data and cloud paradigms overlap enormously. These intersections are so broad that you can justifiably claim that you are ...
