Overview of CPUs and Threads, and How to Size a Thread Pool Properly

Source: Internet
Author: User
Tags: thread switching

Topics: physical cores and virtual cores, single-core vs. multi-core CPUs, processes vs. threads, thread switching, thread overhead, serial vs. concurrent vs. parallel execution, choosing the thread count on a multi-core machine, compute-intensive vs. IO-intensive workloads, directions for higher performance, and a summary of thread pool advice from around the web.

Reference links:
- Understanding CPUs, cores, and threads
- How to estimate the thread pool size reasonably
Physical cores
Physical core count = number of CPUs (sockets installed) × number of cores per CPU.

Virtual cores
In the familiar "4 cores, 8 threads" configuration, the 4 cores are physical cores. With Hyper-Threading, each physical core presents itself as two virtual (logical) cores, i.e. two hardware threads per core, for 8 threads in total. To the operating system the machine appears to have 8 cores, but there are really only 4 physical cores. Hyper-Threading lets a single physical core perform thread-level parallel work, but it does not make that core equivalent to two physical cores.

Single-core CPU and multi-core CPU
Both are a single CPU; the difference is how many cores the CPU contains. A multi-core CPU replaces several single-core CPUs, reducing both size and power consumption. A core can execute only one thread at a time.

Process and thread
A process is the smallest unit to which the operating system allocates resources (CPU, memory, disk IO, and so on); a thread is the basic unit of CPU scheduling and dispatch. Resources are allocated to processes, and the threads of a process share its resources.
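As a minimal sketch (assuming a Java runtime, which the thread pool discussion later in this article also assumes), the core count the operating system exposes can be queried like this; on a 4-core, 8-thread machine it typically reports 8:

```java
// Minimal sketch: query the processor count the OS exposes to the JVM.
// On a 4-core / 8-thread hyper-threaded machine this typically prints 8,
// i.e. the logical (virtual) core count rather than the physical core count.
public class CoreCount {
    public static void main(String[] args) {
        int logicalCores = Runtime.getRuntime().availableProcessors();
        System.out.println("Logical processors visible to the JVM: " + logicalCores);
    }
}
```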

The table below contrasts processes and threads:

| Aspect | Process | Thread |
| --- | --- | --- |
| Definition | A running instance of a program; the independent unit of resource allocation and scheduling in the system. | The smallest unit of scheduling and execution within a process. |
| System overhead | Creation, destruction, and switching are expensive: resources must be allocated and reclaimed. | Only a small amount of register state is saved; overhead is small; code runs inside the process's address space. |
| Resource ownership | The basic unit of resource ownership. | Owns almost no resources of its own, only the essentials: a program counter, a set of registers, and a stack. |
| Scheduling | The basic unit of resource allocation. | The basic unit of independent scheduling and dispatch. |
| Isolation | Processes are independent of one another and do not affect each other. | Threads share the resources of their process; they can communicate with and affect one another. |
| Address space | An independent memory address space assigned by the system. | Consists of the thread's stack, registers, and a thread control block (TCB); registers can hold thread-local variables. |
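To make the resource-sharing row concrete, here is a hedged Java sketch (the class and field names are invented for the example): two threads inside one process update the same counter, something two separate processes could not do without explicit inter-process communication.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Two threads in the same process incrementing one shared counter:
// they see and affect the same memory, which separate processes would not.
public class SharedCounterDemo {
    private static final AtomicInteger counter = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 1_000; i++) {
                counter.incrementAndGet();
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("Final count: " + counter.get()); // 2000
    }
}
```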
Thread switching
The CPU assigns each thread a time slice (the amount of time the thread is allowed to run) and switches to another thread when the slice expires. Before the switch, the thread's state is saved so it can be restored the next time the thread receives a time slice. Saving thread A's state, switching to thread B, and later reloading A's state is called a context switch. Context switching consumes a significant amount of CPU time.

Thread overhead
Context switches cost CPU time; creating and destroying threads has its own overhead; and each thread must maintain its own stack, which consumes memory.

Serial, concurrent, parallel
Serial: multiple tasks are executed one at a time, one after another. Analogy: finish dinner first, then watch the game.
Concurrent: multiple threads run on a single core; only one thread runs at any instant and the system keeps switching between them, so they only appear to run simultaneously. Analogy: keep running back and forth between the dining room for a bite of dinner and the living room to watch the game.
Parallel: each thread is assigned its own core, and the threads genuinely run at the same time. Analogy: watch the game while eating.

Choosing the thread count on a multi-core machine
Compute-intensive: the program mostly performs complex logic and computation, so CPU utilization is already high. Do not create too many threads; extra threads only waste resources on context switching.
IO-intensive: the program mostly performs IO, such as disk IO (reading files) or network IO (network requests). Because IO operations block threads, CPU utilization is low, so more threads can be used: when one thread blocks, the CPU switches to another ready thread, improving utilization.

Directions for higher performance
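A small, hedged Java sketch of the concurrent-versus-parallel distinction (the executor names and task bodies are illustrative): a single-threaded executor interleaves its tasks on one worker, while a two-thread pool can run tasks on separate cores at the same time.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Concurrent vs. parallel: one worker thread interleaves tasks,
// two worker threads can truly run them at the same time on a multi-core CPU.
public class ConcurrencyVsParallelism {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> System.out.println(Thread.currentThread().getName() + " running");

        ExecutorService single = Executors.newSingleThreadExecutor(); // concurrency: tasks take turns
        ExecutorService pool   = Executors.newFixedThreadPool(2);     // parallelism: tasks may overlap

        single.submit(task);
        single.submit(task);
        pool.submit(task);
        pool.submit(task);

        single.shutdown();
        pool.shutdown();
        single.awaitTermination(5, TimeUnit.SECONDS);
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```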


One way to improve performance is at the hardware level: faster processors or more cores. The other is at the software level: set the number of threads appropriately for the scenario so CPU utilization improves. (As a rule of thumb, for CPU-intensive work set the thread pool size to about N + 1, and for IO-intensive work to about 2N + 1, where N is the number of CPU cores.)
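A minimal sketch of this rule of thumb, assuming a Java ExecutorService and treating N + 1 and 2N + 1 as starting points to be tuned rather than fixed answers:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Rule-of-thumb pool sizes derived from the logical core count N:
// roughly N + 1 threads for CPU-bound work, roughly 2N + 1 for IO-bound work.
public class PoolSizing {
    public static void main(String[] args) {
        int n = Runtime.getRuntime().availableProcessors();

        ExecutorService cpuBoundPool = Executors.newFixedThreadPool(n + 1);
        ExecutorService ioBoundPool  = Executors.newFixedThreadPool(2 * n + 1);

        System.out.println("CPU-bound pool size: " + (n + 1));
        System.out.println("IO-bound pool size:  " + (2 * n + 1));

        cpuBoundPool.shutdown();
        ioBoundPool.shutdown();
    }
}
```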

The thread count can also be estimated with the formula: thread count = number of CPU cores / (1 - blocking coefficient). Decide whether the task is compute-intensive or IO-intensive: IO-intensive tasks have a larger blocking coefficient, compute-intensive tasks a smaller one. The blocking coefficient can be understood as blocking time / (blocking time + computation time).

A summary about thread pools from the Concurrent Programming Network
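The same formula expressed as a small Java helper; the method name and the example blocking coefficients are assumptions made for illustration:

```java
// threads = cores / (1 - blockingCoefficient),
// where blockingCoefficient = blockingTime / (blockingTime + computeTime), in [0, 1).
public class BlockingCoefficientSizing {
    static int recommendedThreads(int cores, double blockingCoefficient) {
        return (int) Math.ceil(cores / (1 - blockingCoefficient));
    }

    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        // Example: a task that spends 90% of its time blocked on IO.
        System.out.println(recommendedThreads(cores, 0.9)); // e.g. 8 cores -> 80 threads
        // Example: a mostly computational task with almost no blocking.
        System.out.println(recommendedThreads(cores, 0.1)); // e.g. 8 cores -> 9 threads
    }
}
```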

How should a thread pool be used for a business with high concurrency and short task execution times? For a business with low concurrency and long task execution times? And for a business with both high concurrency and long execution times?
- High concurrency, short task execution time: set the number of threads in the pool to about the number of CPU cores + 1, to reduce thread context switching (see the sketch after this list).
- Concurrency not high, long task execution time: distinguish by where the time is spent:
  - If the time is mostly spent on IO operations (IO-intensive), the IO does not consume CPU, so do not let the CPU sit idle: increase the number of threads in the pool so the CPU can handle more work.
  - If the time is mostly spent on computation (compute-intensive), keep the number of threads in the pool small to reduce thread context switching.
- High concurrency, long business execution time: this kind of task needs to be analyzed and optimized step by step. First, see whether some of the business data can be cached; second, add servers to improve hardware capacity; third, configure the thread pool reasonably; finally, a long business execution time is itself a problem that may require splitting and decoupling the business.
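A hedged configuration sketch for the first case (high concurrency, short tasks); the queue capacity, keep-alive time, and rejection policy are illustrative choices, not prescriptions:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch for high concurrency with short tasks: about N + 1 threads,
// a bounded work queue, and a caller-runs fallback so bursts degrade
// gracefully instead of dropping tasks.
public class ShortTaskPool {
    public static void main(String[] args) {
        int n = Runtime.getRuntime().availableProcessors();
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                n + 1,                           // core pool size
                n + 1,                           // maximum pool size
                60, TimeUnit.SECONDS,            // keep-alive (unused here since core == max)
                new ArrayBlockingQueue<>(1_000), // bounded work queue
                new ThreadPoolExecutor.CallerRunsPolicy());

        pool.execute(() -> System.out.println("short task on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```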
