The Evolution of the Data Center

Source: Internet
Author: User
Keywords: data center


The concept of the data center originated in the late 1950s, and by the 1960s the data-processing system had become a reality. From the mainframes of the past to today's cloud-centric technology, physical and technological changes in computing and data storage have taken us down a winding road to the present.

"Data centers are central to modern software technology and play a key role in expanding enterprise capabilities," notes a Wikibon article titled "The Data Center's Past, Present and Future." The concept of the data center originated in the late 1950s, when American Airlines, in collaboration with IBM, created the Sabre passenger reservation system, automating one of the airline's main business areas. In the 1960s, data-processing systems became a reality; they were used to create and manage airline reservation systems, allowing agents anywhere to obtain electronic data in a timely manner, and thus opened the door to the enterprise-class data center.

Since then, physical and technological changes in computing and data storage have taken us on a winding path to the present. Let's briefly review the history of the data center, from the mainframes of the past to today's cloud-centric model, and its impact on IT decisions.

1946

The Electronic Numerical Integrator and Computer (ENIAC), completed in 1946 and used by the U.S. military to calculate artillery firing tables, is regarded as the first general-purpose electronic digital computer.

1950s and 1960s

TRADIC, completed in 1954, was the first transistorized computer, using transistors and diodes rather than vacuum tubes. The emergence of commercial transistorized systems in the 1960s brought a leap forward in mainframe computing, exemplified by IBM's System series.

1971

Intel released the 4004, the first general-purpose programmable processor on the market. It served as a "building block" that engineers could buy and customize with software, enabling many kinds of electronic devices to function differently.

1973

The Xerox Alto became the first desktop computer with a graphical user interface, featuring a high-resolution display, large memory for its time, and special-purpose software.

1977

ARCnet, the first commercial LAN, was put into service at Chase Manhattan Bank in the United States. It supported a data rate of 2.5 Mbps and connected up to 255 computers on a network.

1978

SunGard developed and established the commercial business of disaster recovery.

Note: Before the introduction of the PC server, any IT decision made around the mainframe had to be made at the enterprise level, whether it concerned the operating system, hardware, or applications. Everything in the enterprise ran on a single machine, so IT decisions were inflexible and difficult.

1980s

The personal computer (PC) arrived in 1981, sparking a boom in the microcomputer industry.

Sun Microsystems developed the Network File System (NFS) protocol, allowing client computers to access files over a network much as they would local storage.

Computers were quickly installed wherever they were needed, but with little attention paid to environmental and operating requirements.

Early 1990s

Microcomputers began serving as servers, replacing older mainframes, and the rooms housing them increasingly became known as data centers. Enterprises began building rows of server clusters in-house.

Mid-1990s

The ".com" wave prompted companies to demand fast network connectivity and uninterrupted operation. Businesses began building server rooms that could hold far more equipment — thousands of servers. The data center as a service model became popular during this period.

Note: Thanks to the PC server, IT decisions split into two separate tracks. The server allowed application-based decisions, while hardware (data center) decisions were still made at the enterprise level.

1997

Connectix released a program called Virtual PC. Like SoftPC before it, Virtual PC allowed users to run a copy of Windows on a Mac to work around software incompatibilities.

1999

VMware began selling VMware Workstation, a product similar to Virtual PC. The original version ran only on Windows; later versions added support for other host operating systems.

Salesforce.com pioneered the concept of delivering enterprise applications through a simple website.

2001

VMware released ESX, a bare-metal hypervisor that runs directly on server hardware, with no additional underlying operating system.

2002

Amazon began developing AWS, a set of cloud-based services including storage, computation, and even human intelligence via Amazon Mechanical Turk.

2006

Amazon Web Services began offering businesses IT infrastructure services in the form of web services, an approach now commonly referred to as "cloud computing."

2007

Sun Microsystems introduced the modular data center, changing the fundamental economics of enterprise computing.

2011

Facebook launched the Open Compute Project to promote industry-wide sharing of specifications and practical experience in creating the most energy-efficient and economical data centers.

2012

Surveys showed that 38% of businesses were already using the cloud, and another 28% were planning to start using or expand their use of it.

2013

Telcordia issued generic requirements for telecommunications data center equipment and spaces. The document sets out minimum spatial and environmental requirements for data center equipment and the spaces that house it.

Google invested $7.35 billion in 2013 to build out its network infrastructure. The spending funded a massive expansion of Google's global data center network, possibly the biggest construction project in the history of the data center industry.

Now and in the future

Today's data centers are shifting from an ownership model for infrastructure, hardware, and software to an on-demand subscription and delivery model.

To meet the needs of applications, especially those delivered through cloud computing, today's data center capabilities must match the cloud. Driven by consolidation, cost control, and cloud support, the entire data center industry is now changing. Cloud computing pairs with today's data centers to make resource access more flexible for IT decision-making, but the data center itself remains a complete and independent entity.
