Cloud computing's three major problems and two bottlenecks

The Cloud Security Alliance (CSA) identifies key IT operational areas that organizations should focus on when deploying cloud computing resources: governance and enterprise risk management, legal and contractual issues, electronic discovery procedures, compliance and audit, information lifecycle management, portability and interoperability, business continuity and disaster recovery, data center operations, incident response (notification and remediation), application security, encryption and key management, identity and access management, storage operations, and virtualization.

In a traditional data center, a stable perimeter is built around the infrastructure and data that need protection, and management procedures and controls can be placed at appropriate points. This deployment is easier to manage because organizations control where their servers sit and own all of the physical hardware. In the cloud, however, applications migrate dynamically and organizations share remotely located physical hardware with third parties, so the perimeter blurs and control over security weakens.

The cloud does not rely on dedicated virtual private network (VPN) technology, which means anonymous attackers can reach the same connection points as the legitimate users or administrators of any system. In a traditional computing environment, only a few servers are reachable from the Internet; in a cloud computing environment, most servers are, which obviously enlarges the attack surface. In the cloud, multi-tenancy means that many distinct groups of end users share the same services and/or resources, and these shared environments carry particular risks within a tenant's resource stack: there is a looming danger that groups sharing the cloud will intentionally or unintentionally access each other's private data. Especially in IaaS-based clouds, security researchers have uncovered new vulnerabilities that did not exist in older systems. Because cloud consumers' data sits on shared storage hardware, lax management or malicious attacks can compromise its security. And because application boundaries in the cloud are dynamic, security mechanisms must likewise be dynamic and virtualized, following applications as they migrate at random within the cloud.

Location independence favors the wide availability of cloud services, but it also means that neither users nor cloud vendors, nor the two together, can directly pinpoint where the computing resources of a particular cloud reside. When even the defenders do not know where the data is, how do they protect it? How can cloud vendors locate a user's data (for legal and other purposes)? If a user exits the cloud, how can the vendor securely erase that user's data? The public nature of cloud computing seriously affects the privacy and confidentiality of data: cloud data is typically stored in plain text, and few companies fully understand how sensitive their stored data is. Data loss and leakage is one of the most serious security problems in the cloud.

Data should always be encrypted in storage and in transit, with a separate symmetric key used for data at rest. Protecting users' keys will require some cooperation from cloud vendors, because unlike on dedicated hardware, zeroing a memory buffer does not guarantee that a key is gone: ① memory is managed by the hypervisor and may be persisted; ② the virtual machine may have been captured in a snapshot taken for recovery; ③ virtual machines are continually migrated to different hardware. How to use keys securely in the cloud remains an unresolved issue.
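Client-side envelope encryption is one hedge against these risks: data is encrypted before it ever reaches the cloud, and the key that wraps each per-object data key stays under the consumer's control. Below is a minimal sketch in Python, assuming the third-party cryptography package; the key names and the in-memory upload dictionary are illustrative stand-ins for a real key-management service and object store.

```python
# Minimal envelope-encryption sketch: only ciphertext and a wrapped key
# ever reach the cloud; the master key stays with the client.
from cryptography.fernet import Fernet

# Per-object symmetric data key, used once and kept separate from the data.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"sensitive record")

# Master key held outside the cloud (in practice: an HSM or on-premises KMS).
master_key = Fernet.generate_key()
wrapped_data_key = Fernet(master_key).encrypt(data_key)

# Only these two values are uploaded; neither reveals the plaintext alone.
upload = {"blob": ciphertext, "wrapped_key": wrapped_data_key}

# Retrieval reverses the process: unwrap the data key, then decrypt the blob.
recovered_key = Fernet(master_key).decrypt(upload["wrapped_key"])
assert Fernet(recovered_key).decrypt(upload["blob"]) == b"sensitive record"
```

The split matters because a compromised snapshot or migrated memory image on the provider's side exposes at most the wrapped key; without the master key, which never enters the cloud, the stored data stays opaque.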

Internet accessibility and multi-tenancy complicate the question of how to identify large numbers of different users. Because conventional authentication services tend to rely on shared public resources, resource scaling and multi-tenancy complicate the authentication process. In Active Directory, for example, members of the Everyone group can see and enumerate all kinds of resources.

First-generation clouds required every end user to hold a separate account in each service's database and to log on to each account individually. It is clearly impractical to ask end users to register and manage an individual login account for every Web service they will ever use. The guiding principle of any good identity system is that users should disclose only the minimum identity information needed to complete the service or transaction at hand. Work on a broadly applicable identity scheme is still ongoing.
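Federated identity is the usual answer to the per-service-account problem: one identity provider issues a signed token carrying only the minimal claims each service needs. Here is a minimal sketch, assuming the PyJWT package and a shared demo secret; real deployments verify tokens against the provider's published public keys.

```python
# Federated sign-in sketch: the service trusts a token from one identity
# provider instead of maintaining its own account database.
import jwt  # PyJWT

SHARED_SECRET = "demo-secret"  # illustrative; production uses provider public keys

# The identity provider issues a token with only the minimal required claims.
token = jwt.encode({"sub": "user-123", "tier": "basic"}, SHARED_SECRET, algorithm="HS256")

# Each web service verifies the signature; no per-service password exists.
claims = jwt.decode(token, SHARED_SECRET, algorithms=["HS256"])
print(claims["sub"], claims["tier"])
```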

In the book "The True Extent of Internal Threats", a survey of 3,000 British workers found that 37% had shared privileged corporate information with friends and family, even though 58% of their PCs were shared with, or at least accessible to, others; 21% of laptop/desktop owners had moved company data onto their personal computers, and 14% admitted moving work information onto personal smartphones. 82% of respondents considered internal threats equal to or greater than the threat external attackers pose to the organization. An insider breach may be intentional or unintentional, and may involve a malicious attack or information theft through collusion between insiders and outsiders. Yet so far the literature on cloud computing security says little about the internal security of the cloud. "Defending Internal Threats, Reducing IT Risk" discusses cloud computing, virtualization, and internal threats together: in cloud computing, insiders are no longer just employees inside the firewall but also the internal staff of the providers delivering cloud services to the organization, which adds to the complexity of internal threats. Internal threats clearly extend to virtualization, and when critical servers are virtualized, strict controls are needed to limit privileged users' rights over those virtual servers.

Performance problems with cloud computing

In November 2010, Compuware conducted an independent survey of nearly 700 U.S. and European companies, 378 of them in North America and 100 each in the UK, Germany, and France, and released the resulting Cloud Performance survey on February 19, 2011.

The Cloud Performance survey reports that cloud computing performance relies heavily on every component of the extended delivery chain: data centers, transmission networks (the Internet, WAN, LAN, or physical transport), other service providers, and even end-user devices and browsers.

Compuware's earlier research clearly demonstrated a direct link between application performance and revenue. For example, when a company's site page response time approaches 4 seconds, users grow increasingly frustrated; at 6 seconds, 33% of users give up on the page and move to a competitor's site.

Factors that affect network response time include the total amount of data transferred, WAN bandwidth, round-trip time, the number of application turns, the number of concurrent TCP sessions, server-side latency, and client latency. Riverbed, in its report on unleashing cloud performance and making the cloud's promise a reality, reviews the history of IT development and analyzes the impact of network response time on public cloud performance. A public cloud deployment may or may not be hybrid: an enterprise can split its assets between its own data center and someone else's cloud, but some assets will end up in a more distant public cloud, and moving resources away from users can cause performance problems. Cloud vendors tend to assume that as long as they manage the performance of their own cloud infrastructure, the performance between the cloud and its end users, whether business users or IT users, will not erode technical or business value. That assumption does not hold: cloud services pull users farther from their data, producing greater latency and degraded performance.
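A back-of-the-envelope model makes the point concrete. The sketch below combines the factors just listed; every number in it is an illustrative assumption, not a measurement from the Compuware or Riverbed reports.

```python
# Rough response-time model built from the factors listed above.
def response_time(payload_mb, bandwidth_mbps, rtt_ms, app_turns, server_ms, client_ms):
    transfer = payload_mb * 8 / bandwidth_mbps        # seconds spent on the wire
    turns = app_turns * rtt_ms / 1000.0               # each application turn costs one RTT
    return transfer + turns + (server_ms + client_ms) / 1000.0

# Same 2 MB page over a 50 Mbps link: nearby data center vs. distant public cloud.
near = response_time(2.0, 50, rtt_ms=20,  app_turns=30, server_ms=200, client_ms=100)
far  = response_time(2.0, 50, rtt_ms=120, app_turns=30, server_ms=200, client_ms=100)
print(f"near: {near:.2f} s, far: {far:.2f} s")        # ~1.2 s vs ~4.2 s
```

Under these assumptions, moving the same page from a 20 ms to a 120 ms round trip pushes response time from about 1.2 s past 4 s, the very threshold at which the survey above says users begin to abandon pages; latency, not bandwidth, dominates the difference.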

These risks suggest that cloud services cannot guarantee consumers a high-quality Web experience. Cloud providers have shown performance that drifts off target across geographic sites: in some major cities, end-user response times are ten times slower than in others, and many cloud services show performance problems at the Internet edge, where consumers actually live.

An application that spans cloud boundaries further complicates data storage and transmission. While developing new cloud services, Amazon found that the cheapest way to move large amounts of data was to ship disks, or even whole computers, by overnight delivery. Beyond the WAN bandwidth bottleneck, the cloud's internal network technology can also limit performance: the connectivity among data center nodes, switches, and routers can create bandwidth bottlenecks. Lack of bandwidth is one reason scientists rarely use cloud computing.
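The arithmetic behind shipping disks is straightforward. The sketch below, using assumed link speeds, estimates how long it takes to move 10 TB over typical WAN links versus a courier.

```python
# Time to move 10 TB over various links; link speeds are assumptions.
data_bits = 10 * 1e12 * 8  # 10 TB expressed in bits

for name, mbps in [("20 Mbps WAN", 20), ("100 Mbps WAN", 100), ("1 Gbps link", 1000)]:
    days = data_bits / (mbps * 1e6) / 86400
    print(f"{name}: {days:.1f} days")

print("overnight courier: ~1 day, regardless of volume")
```

At 20 Mbps the transfer takes more than six weeks; a courier delivers the same disks in about a day, which is why bulk import by shipped hardware can beat the network.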

Webtorials' 2010 report on cloud networking and the wide area network (WAN) proposed cloud network designs that do not depend on the Internet. Some definitions of cloud computing assume that cloud services are always delivered over the Internet, but the Internet is not always the most appropriate connection for accessing cloud services.

Availability issues with cloud computing

The availability of services is critical, regardless of which cloud architecture the consumer or service provider chooses.

However, some experts believe that most current discussion of cloud computing fails to grasp that the sheer convenience of the cloud model will unleash a flood of computing demand, and that unleashed demand will put pressure on availability.

Availability in cloud computing is therefore a composite issue: security, performance, and the growing demand for applications with new capabilities all affect it.

According to Forrester Research, SaaS faces major hurdles in four large software segments: operating systems and databases; internal IT management and data management software; legacy and custom process applications; and vertical applications such as securities transaction processing systems. These segments account for 40% of all software investment and are often kept in-house for reasons such as security, existing infrastructure investments, and the need for tight integration with other applications.

Some software is strongly coupled to hardware. An installed operating system, for example, may be tied to the MAC address of the local hardware, must stay resident in memory while the system runs, and consumes substantial hardware resources, including memory and storage. If the operating system were offered as a cloud service, it might be migrated at random across many cloud machines during a single run, which would likely require a structural redesign to decouple it from the hardware; redeveloping a cloud-appropriate operating system would demand enormous investment. Moreover, migrating a large operating system is time-consuming, which would both reduce the elasticity of cloud resources and raise costs for cloud consumers. Finally, the TPM hardware module, which is closely bound to the operating system, would also pose problems. The inherent characteristics of an operating system make it unsuitable for delivery as a cloud service.

To realize some of the benefits of cloud computing, cloud resources such as servers, storage, and networks are virtualized. Under multi-tenancy, applications in a cloud environment may therefore be assigned hardware resources essentially at random, meaning that performance-critical data can be continuously exchanged and migrated between servers across the cloud network. This seriously undermines the system's real-time behavior; the characteristics of cloud computing make it unsuitable for real-time control applications.

At present, the public cloud is particularly ill-suited to CPU-intensive, high-volume applications, whether judged by performance, price, or resource use; it seems better matched to applications with low or bursty resource demands and to non-critical services. By the published figures, in 2008 a user paid Amazon 25.6 dollars in rent for every 1 dollar's worth of CPU actually used. Even allowing for power and other management overhead, renting a public cloud for high-performance, compute-dense, large-data applications is clearly uneconomical. The U.S. Department of Energy's Berkeley Lab compared its workloads, on both performance and price, between its Magellan private cloud, a high-performance-computing testbed, and commercial cloud providers. The results showed that Amazon's EC2 was quite competitive on performance, but at about 20 cents per CPU-hour, whereas the lab ran the same workloads for less than 2 cents per CPU-hour.
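The gap compounds quickly at scale. A trivial calculation using the per-CPU-hour rates quoted above (the workload size is an illustrative assumption):

```python
# Cost comparison at the cited rates; the workload size is assumed.
cpu_hours = 100_000                    # e.g., a month of mid-size HPC computing
ec2_rate, lab_rate = 0.20, 0.02        # dollars per CPU-hour, as quoted above
print(f"EC2:      ${cpu_hours * ec2_rate:>9,.0f}")
print(f"in-house: ${cpu_hours * lab_rate:>9,.0f}")   # roughly a 10x difference
```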

Only in the public cloud, under the SaaS service model, can all of the benefits of cloud computing be realized, so the growth of the SaaS model will inevitably become the touchstone of cloud computing's success. Whether the multi-tenant model suits rapidly changing organizations remains controversial: observers of SaaS ERP in service organizations note that every customer of a multi-tenant SaaS vendor runs exactly the same version of the system, yet one customer may want to modify the current version of the application while others insist on keeping it as it is. The SaaS model also suffers from user data lock-in.

In short, theoretical projections differ greatly from actual results, and the economics of cloud computing still face challenges. The expected benefits will only be truly realized once cloud computing has been refined in practice.

Two major bottlenecks in cloud computing development

Data sovereignty is bound up with laws, regulations, and policy, and especially with international cooperation and trust. The software-as-a-service (SaaS) model pulls computing resources, application development, operations, and management farther away from the cloud consumer and its end users than the traditional computing model does, weakening the user's initiative. Data sovereignty and user initiative are bottlenecks for cloud computing even in countries where information technology (IT) dominates; for countries at an IT disadvantage they are not merely bottlenecks but may raise deeper issues.

History has shown that moving resources away from users creates a series of problems, and cloud computing, by moving resources farther away still, makes these problems more prominent. Cloud computing development therefore faces two major bottlenecks: data sovereignty and consumer initiative. They are called bottlenecks because neither can be solved by the cloud itself.

Data sovereignty concerns cloud users' right to control their data, and even national security. Technology alone cannot resolve it; domestic laws and regulations and international laws and treaties cannot fully resolve it either; it also rests on trust among states, users (consumers and their end users), and suppliers.

One characteristic of cloud computing is that consumers and their end users sit far from the data and computing resources, while resource deployment and data processing are highly automated. This seriously reduces the user's initiative. Changing user requirements have always been a bottleneck in software engineering, and the SaaS service model may make the problem even worse.

To safeguard social stability, economic prosperity, and security, the importance of data sovereignty and consumer initiative must be recognized at the level of national security, and each country should develop its own national cloud computing strategy as part of its national security strategy.

Data sovereignty

Work on the importance of geographic location in the cloud for data sovereignty details the issues and scope of data sovereignty as they relate to the authenticity and geographic location of data stored in the cloud. When data and resources are virtualized and widely distributed, safeguarding data sovereignty demands particular attention in the binding provisions of laws and policies.

For performance, control, and continuity reasons, a cloud SLA may guarantee that data will be stored only in data centers within a particular geographic area (within a given state, time zone, or political boundary, for example). Actually verifying that the cloud service provider meets its contractual geographic obligations, however, is challenging. To cut IT costs, for instance, a provider may intentionally or unintentionally violate the SLA and move data to a foreign data center, potentially exposing that data to a foreign government.
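Requesting a location is the easy half of the problem. As a minimal sketch, assuming an S3-style object store accessed through boto3 (the bucket name and region are illustrative), a consumer can pin data to a region at creation time, but the API only reports where the bucket is homed; replication, backups, and operator access paths stay invisible, which is exactly the verification gap described above.

```python
# Pinning a bucket to one region via an S3-style API (names are illustrative).
import boto3

s3 = boto3.client("s3", region_name="eu-central-1")
s3.create_bucket(
    Bucket="example-sovereign-data",  # hypothetical bucket name
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)

# The API can state where the bucket is homed, but it cannot prove that no
# copy ever leaves the region; that remains a matter of contract and trust.
print(s3.get_bucket_location(Bucket="example-sovereign-data"))
```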

Data sovereignty still raises many unresolved questions, each deserving further study. What is the right way to identify and place known, trusted landmarks? Can governments play this role? Can cloud service providers be motivated to serve as landmarks for their competitors? Can some number of known or unknown honest landmarks be woven into a trusted web?

Where technical solutions fall short, legal remedies that deter and punish wrongdoing tend to be developed only after fraud has occurred and been discovered. Likewise, it remains legally ambiguous how existing legal protections apply to data stored in a geographically agnostic cloud.

Many laws now govern the flow and storage of data across national borders, spanning privacy law, intellectual property law, law enforcement regulation, electronic discovery rules, and information collection statutes. In two Canadian provinces, Nova Scotia and British Columbia, most personal data held by public bodies cannot be moved outside Canada. Australia's National Privacy Principle 9 on cross-border data flows prohibits transmitting personal information abroad unless certain standards are met, including that the foreign jurisdiction's law is sufficiently similar to the national privacy law. Similarly, the EU Data Protection Directive broadly restricts the flow of personal information out of Europe to any country whose domestic law does not provide an adequate level of protection. Although the U.S. Department of Commerce has organized a voluntary mechanism, the Safe Harbor framework, through which American companies certify compliance with these EU rules, its adequacy is frequently criticized. In April 2010, the German data protection authorities issued a resolution requiring German data exporters to press their U.S. Safe Harbor-certified partners harder, in effect questioning whether the Safe Harbor scheme adequately meets EU guidelines; exporters who fail to make that effort may face sanctions. The USA PATRIOT Act allows U.S. federal agencies to obtain data from suppliers without notice to, or consent from, the data's owner, and such data may include trade secrets and sensitive electronic communications. Influenced by U.S. legal powers like the PATRIOT Act, other countries have expressed reservations about storing data in U.S.-based clouds.

Consumer initiative

Cloud computing keeps consumers and their end users far from data and computing resources, and its highly automated resource deployment and data processing severely reduce user initiative. In the SaaS model in particular, third-party software developers replace the consumer's internal IT department in developing application software, and third-party cloud vendors replace the consumer's business units in operating and managing it, which makes it very hard for application developers to understand and act on a specific consumer's experience and needs in a timely, effective way.

Even under traditional computing models, collaboration between IT departments and business units is a chronic problem in most organizations, and only a handful of good companies value and exploit the end-user experience. Because the SaaS model weakens the cloud consumer's role, it is likely to make this state of affairs worse.

Unfortunately, although end-user experience and requirements are among the most important drivers of changing business requirements, the problem is deliberately or unintentionally overlooked in almost every cloud vendor report. Some independent research reports do stress the end-user experience, ranking it first among measures of application performance. But the SaaS model keeps application developers at a distance from consumer organizations, and the third-party vendors operating the software have no direct contact with consumers' end users. Many consumers of the same SaaS service have quite different end-user groups, yet a SaaS provider is unlikely to feel any need to redesign the software's architecture unless a single consumer's user base grows considerably. And it is nearly impossible for SaaS developers to build specialized software for some consumers that the rest do not need.

Collaboration between a SaaS developer and a consumer's business units can be even harder than collaboration between the consumer's own IT and business units, because the SaaS developer is far from those business units and their end users. Even setting aside the effects of virtualization and network bandwidth, then, the SaaS model does not seem appropriate for large applications.

The SaaS model is a disruptive turn in computing. Traditionally, applications might be developed by third-party software companies; under SaaS, the application is also operated and managed by a third party. Weakening the consumer's initiative in this way will seriously affect operations, tactics, and strategy alike.

At the consumer level: traditionally, a handful of large software companies have monopolized the development and marketing of system software and large applications. It is foreseeable not only that a few large SaaS application developers will monopolize application development, but that a few large SaaS cloud providers will monopolize application operation and management as well. The nature of the SaaS model means consumers lose almost all initiative across the entire application lifecycle: development, operation, versioning, and vulnerability patching. Consumers of some small and medium-sized SaaS services can gain considerable benefits, but they fundamentally surrender the ability to control and govern their application software.

At the national strategic level, if foreign SaaS developers and service providers completely monopolize the development and operation of a country's internal application software, the country loses its application software market, including its enormous economic benefits, along with control and governance of that software. The SaaS model can therefore affect a country's economic security, a serious challenge for any country at an IT disadvantage.

In particular, if a country's important information systems adopt the SaaS model, the consequences could be more serious still. A nation's critical infrastructure depends heavily on its information systems; losing the initiative to control and govern important information systems means losing control and governance of the critical infrastructure itself. In other words, weakening consumer initiative is likely to affect not only a country's economic security but, directly, the security of its critical infrastructure.
