This week's big data news is rife with industry events and industry anecdotes. Here is our roundup of the week's big data news and events that can't be missed.
1. EMC releases its Hadoop distribution, "Pivotal HD"
EMC released its own Apache Hadoop distribution, Pivotal HD, on February 27, along with a technology called HAWQ, which seamlessly integrates the Greenplum analytic database with the Hadoop distributed architecture.
Pivotal HD gives Apache Hadoop a comprehensive "makeover," and its biggest advantage over other Hadoop distributions (from Cloudera, Intel, and others) is its ability to integrate with the Greenplum database, rather than merely running SQL in Hadoop. With the release of Pivotal HD and HAWQ, EMC has taken a further step into the Hadoop arena, and according to TechTarget it will be an important milestone in EMC's big data strategy.
Pivotal HD can simply be seen as replacing the Greenplum database's POSIX file system with the Hadoop Distributed File System (HDFS); according to EMC, everything a DBA can do in the Greenplum database, Pivotal HD can support. Compared with mainstream Hadoop distributions, Pivotal HD can handle a wider range of big data workloads with significant performance improvements, and can help users cut costs in half.
2. Intel releases its own Apache Hadoop distribution to tackle big data
Intel's big data innovation bakes Hadoop support directly into its chips, aiming to enable better business decisions from massive data and faster identification of potential security threats.
Intel is trying to make the x86 architecture as capable of handling big data workloads as the ARM architecture. Nearly two dozen partners, including Cisco, Dell, and SAP, can help Intel extend its chip-level Hadoop solution for Xeon to public and private clouds. To speed up deployment, Intel also disclosed that it would invest in smaller big data companies such as MongoDB and Guavus Analytics, enriching the data analysis solutions built on Apache Hadoop.
3. Friendster is gone; where is Facebook headed?
At its peak, Friendster was adding 200,000 new users a week, about 20 per minute, and nearly one in three people in Silicon Valley was using it. Later, registrations outgrew what its servers could handle, making the site slow and at times impossible to log into, which left many users dissatisfied. To work around the technical obstacles caused by its exploding user base, Friendster began restricting what users could do in order to lighten the load. In the end, too many spam accounts and too much slowness put it on the road of no return.
Facebook has recently seen a wave of "mass unfriending": many people, including Union Square Ventures founder Fred Wilson, are unfollowing friends and shrinking their Facebook friend lists. A recent Pew Research Center survey found that two-thirds of users have taken extended breaks from Facebook, and nearly 30% of users plan to spend less time on it.
When a product no longer gives users what they want, it meets with their indifference, or is even discarded. Facebook's mistake was making users' news feeds very messy, especially with "sponsored stories" and "likes," so that for many people the feed became a vast ocean in which they could not find what they actually cared about.
What should Facebook do with too much data and messy information? Whether it can improve the user experience through big data technology will determine Facebook's future.
4. Building a brain with memristors
This is an exciting research project. Memristors behave much like the synapses in the human brain and could be used to build an artificial brain. A memristor's working state can be read at the nanometer scale. Scientists say that a computer built on memristor circuits would "remember" what it had previously processed and "freeze" that memory through a power outage. Such a computer could switch off and back on instantly, because none of its components would have to go through a "boot" process to return to their most recent state. It could be said that the memristor heralds the arrival of artificial intelligence.
5. MapR and Google rise in popularity
MapR offers a file system it says is three times faster than the existing Hadoop Distributed File System, and it is open source. Google has been pitching its cloud Compute Engine for high-performance jobs. MapR's MinuteSort benchmark run beat all previous records (including Hadoop's MinuteSort record), and it did so on standard cloud servers.
6. LinkedIn open-sources its Databus
Databus is the tool LinkedIn uses to rapidly propagate data changes across its different storage systems and applications, which makes it very valuable. In addition, a year earlier, LinkedIn open-sourced IndexTank, its customizable indexing engine.
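The idea behind Databus is change-data-capture: the primary store emits an ordered log of changes, and each downstream system (cache, search index, and so on) replays that log from its own checkpoint. The following is a minimal, hypothetical sketch of that pattern; the class and method names are illustrative only, not LinkedIn's actual Databus API.

```python
class ChangeLog:
    """An append-only, ordered log of change events from a primary store."""
    def __init__(self):
        self.events = []  # each event: (sequence_number, (key, value))

    def append(self, payload):
        self.events.append((len(self.events), payload))

    def read_from(self, checkpoint):
        """Return every event at or after the given sequence number."""
        return self.events[checkpoint:]


class Consumer:
    """A downstream system (cache, index, ...) replaying the change log."""
    def __init__(self, name):
        self.name = name
        self.checkpoint = 0  # position in the log, tracked by the consumer
        self.state = {}

    def poll(self, log):
        # Apply each new change locally, then advance the checkpoint.
        for seq, (key, value) in log.read_from(self.checkpoint):
            self.state[key] = value
            self.checkpoint = seq + 1


# Usage: one shared log, two consumers catching up independently.
log = ChangeLog()
log.append(("user:1", "Alice"))
log.append(("user:2", "Bob"))

search_index = Consumer("search-index")
cache = Consumer("cache")

search_index.poll(log)            # catches up on the first two events
log.append(("user:1", "Alicia"))  # the primary store changes again
cache.poll(log)                   # sees all three events

print(search_index.state["user:1"])  # "Alice" until it polls again
print(cache.state["user:1"])         # "Alicia"
```

The point of the per-consumer checkpoint is that each downstream system can fall behind, crash, or be added later and still catch up deterministically, without the primary store ever having to know who its consumers are.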
According to foreign media reports, while the market value of Facebook, Zynga, Groupon, and other social-media stocks has collapsed, shares of the white-collar social network LinkedIn keep rising, closing at $168.55 on February 27, a record high. As a white-collar social network, LinkedIn needs to get even better at big data processing and at building richer profiles of its users.
7. Continuuity opens its free beta to the public
Continuuity is a platform-as-a-service company for building big data applications, founded by former Yahoo vice president Todd Papaioannou and former Facebook engineer Jonathan Gray. This Wednesday, the company opened its beta to developers, offering a cloud-based platform on which developers can test the user experience of Hadoop applications.
Todd Papaioannou, co-founder and chief executive, said that as a startup, Continuuity is trying to ride the next wave of big data applications, and that the tools it offers can greatly improve the scalability of different parts and phases of software under development. In addition, Continuuity's team has extensive experience with big data architectures and applications, and the technology used in its AppFabric platform was pioneered by the company.
8. Location analytics company Placed tells you who browses in stores but shops online
Placed is a startup in the field of location analytics. By tracking phones' location data, the company can tell merchants where shoppers like to go, and which stores are at risk of being "browsed" rather than "bought from."
The company reportedly raised $3.4 million in its first round of financing at the beginning of last year.
9. IBM lands the Korea Meteorological Administration
Weather prediction has always been a natural application for big data and high-performance computing. According to media reports, IBM has provided the Korea Meteorological Administration and the National Mobile Business Center with its latest IBM storage technology, which can record 20 GB per second (the equivalent of 400,000 pages).
10. Virtustream uses Druid to provide cloud analytics services
Virtustream has built its image as an enterprise-class cloud service provider, and its partnership with Metamarkets helps it enter and expand into the big data application market. Essentially, Metamarkets is a company that provides consulting services to users who want to deploy applications on top of Hadoop and Druid.
Virtustream reportedly completed a new $15 million round of financing from Intel in March of last year.