Concurrency and parallelism

Source: Internet
Author: User

Concurrency and parallelism are both macroscopic concepts: in both cases, multiple requests are handled over the same stretch of time. The difference is that parallelism means two or more events occur at the same instant, while concurrency means two or more events occur within the same time interval.

In operating systems, concurrency means that within a given time period several programs are all somewhere between having started and having finished, and all of them run on the same processor, but at any single point in time only one program is actually running on that processor.

① A program and a computation no longer correspond one to one: a single copy of a program can give rise to multiple computations.
② Concurrent programs constrain one another. A direct constraint appears when one program needs the computation result of another; an indirect constraint appears when several programs compete for the same resource, such as the processor or a buffer.
③ A concurrent program executes in a stop-and-go fashion, advancing intermittently.
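The three properties above can be illustrated with a minimal Python sketch (the thread names and step counts are illustrative, not from the text): two threads share one interpreter, compete for a lock (an indirect constraint), and advance intermittently as the scheduler interleaves them.

```python
import threading

# Two "programs" (threads) run concurrently: neither runs continuously
# from start to finish; the scheduler interleaves their progress.
log = []
lock = threading.Lock()

def worker(name, steps):
    for i in range(steps):
        with lock:  # indirect constraint: competing for a shared resource
            log.append((name, i))

t1 = threading.Thread(target=worker, args=("A", 3))
t2 = threading.Thread(target=worker, args=("B", 3))
t1.start(); t2.start()
t1.join(); t2.join()

print(len(log))  # 6 entries total, in some interleaved order
```

The exact interleaving differs from run to run; only the per-thread order of steps is guaranteed.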

On a network server, concurrency refers to the number of connections being handled at the same time. For example, if a server has established 1000 TCP connections, i.e. it maintains 1000 sockets, its concurrency is 1000, regardless of whether the server has a single core or 8 or 16 cores. In short, the handling of these 1000 socket connections is still time-shared. If the server takes 1 s to process each socket, it can process 1000 requests per second; if each socket takes 100 ms, it can process 10,000 requests per second.
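The arithmetic above amounts to: throughput equals the number of concurrent connections divided by the per-request processing time. A small sketch with the numbers from the text:

```python
# Throughput = concurrent connections / per-request processing time.
def requests_per_second(concurrent_connections, seconds_per_request):
    return concurrent_connections / seconds_per_request

print(requests_per_second(1000, 1.0))  # 1000.0 requests/s
print(requests_per_second(1000, 0.1))  # 10000.0 requests/s
```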

Let us first lay out a few concepts; once these are clear, concurrency and parallelism are basically clear too.

Session: when we work at a computer and open a window or a Web page, we can call it a "session". Extending this to a Web server, which has to sustain page access for many users, we can say the server manages multiple "sessions".

Number of concurrent connections: a site sometimes reports the error "HTTP Error 503. The service is unavailable", yet refreshing once or twice brings it back; this usually means the site's maximum number of concurrent connections was exceeded. The concurrent connection count measures the ability of a traffic-management device or proxy server to handle its business traffic: it is the maximum number of point-to-point connections the device can handle simultaneously. It reflects the device's access-control capability over multiple connections and its ability to track connection state, and it directly determines the maximum amount of traffic the device can support.

Concurrency, then, can be understood as the maximum number of sessions the server maintains, while parallelism concerns how many sessions are being processed at the same instant. With two processors (or worker processes), the possible degree of parallelism is 2, even while the concurrency is 1000. We can also compare this with the concepts of throughput and bandwidth.
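A tiny sketch of that distinction (the numbers are the ones used in the text): the degree of parallelism is bounded by the number of processors, no matter how many sessions are concurrent.

```python
# Concurrency: sessions the server keeps open at once.
# Parallelism: sessions advancing at the same instant, bounded by processors.
concurrent_sessions = 1000
processors = 2  # e.g. a 2-processor machine, as in the text
max_parallel = min(concurrent_sessions, processors)
print(max_parallel)  # 2
```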

Throughput vs. bandwidth: the two are easy to confuse, and both are measured in Mbps. The English terms make the distinction clearer: throughput versus maximum net bitrate. In computer networking, the bandwidth of a communication link usually means the number of bits per second that can be transmitted over it, which depends on the link's clock rate and channel coding; it is also called the line speed. In this sense, we can say the bandwidth of Ethernet is 10 Mbps. However, we need to distinguish the available bandwidth of the link from the number of bits per second actually transmitted over it, which is the throughput. "Throughput" is the term more commonly used for the measured performance of a system. Because real implementations suffer from various inefficiencies, a pair of nodes connected by a link with 10 Mbps of bandwidth may achieve only 2 Mbps of throughput, meaning an application on one host can send data to the other host at 2 Mbps.
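The practical consequence is easy to see in a transfer-time calculation. The 10 MB file size is an illustrative assumption; the 10 Mbps bandwidth and 2 Mbps throughput figures are from the text:

```python
# Transfer time for a 10 MB file: nominal bandwidth vs actual throughput.
FILE_BITS = 10 * 1_000_000 * 8  # 10 MB expressed in bits

def transfer_seconds(bits, rate_bps):
    return bits / rate_bps

print(transfer_seconds(FILE_BITS, 10_000_000))  # 8.0 s at the 10 Mbps line rate
print(transfer_seconds(FILE_BITS, 2_000_000))   # 40.0 s at 2 Mbps real throughput
```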

Bandwidth can be understood as analogous to parallelism: 10 M bits (0s and 1s) can be on the line at the same time. Throughput is analogous to concurrency: the host actually handles 2 M bits per second. The analogy is not perfect, but if you think it through carefully, the similarity is there.


