The difference between multicore programming and single-core multithreaded programming


1. Lock contention: On a single core, when one thread acquires the lock it gets the CPU time and the other threads waiting for that lock are blocked; the only cost of using the lock is the time spent locking and unlocking, and the CPU is always kept busy.
On multicore, if two (or more) threads contend for the same lock, cores can be starved: while one thread holds the lock the others wait instead of computing, so execution is effectively serialized even though multiple cores are available.
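A minimal sketch of this effect (assumed example, not from the original article): two threads on a multicore machine that do all of their work inside one shared mutex. Even with two cores available, the critical sections serialize, so the program runs no faster than a single thread would.

```cpp
#include <mutex>
#include <thread>

std::mutex shared_lock;
long shared_counter = 0;

void worker(int iterations) {
    for (int i = 0; i < iterations; ++i) {
        std::lock_guard<std::mutex> guard(shared_lock);  // contention point
        ++shared_counter;                                // all "work" happens under the lock
    }
}

int main() {
    std::thread a(worker, 1000000);
    std::thread b(worker, 1000000);  // mostly waits while 'a' holds the lock, and vice versa
    a.join();
    b.join();
}
```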
2. Why threads are created: On a single-core CPU, client software uses multithreading mainly to push some computation into background threads so that it does not block user interaction (the user interface and the other computation appear to run in parallel), improving the responsiveness the user sees.
On multicore, splitting work into threads is no longer limited to separating user-interface work from other computation. The computation itself is decomposed into multiple threads so that it can be spread across the cores, and the number of compute threads is tied to the number of CPU cores: if there are fewer threads than cores, some cores must sit idle.
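A minimal sketch (assumed example) of decomposing a computation over as many threads as the machine has cores, so no core sits idle. std::thread::hardware_concurrency() reports the number of hardware threads; it can return 0, hence the std::max fallback.

```cpp
#include <algorithm>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> data(10000000, 1.0);
    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> pool;

    const std::size_t chunk = data.size() / cores;
    for (unsigned i = 0; i < cores; ++i) {
        std::size_t begin = i * chunk;
        std::size_t end   = (i + 1 == cores) ? data.size() : begin + chunk;
        pool.emplace_back([&, i, begin, end] {
            // each thread sums its own slice on its own core
            partial[i] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& t : pool) t.join();

    const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    (void)total;  // total == data.size() here
}
```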
3. Load balancing across cores: On a single core there is no load balancing to consider; even if the amount of computation differs greatly between threads, it does not change the program's total computation time. On multicore, the computation must be balanced across the threads assigned to the CPUs, otherwise one core finishes early and idles while another is still working.
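A minimal sketch (assumed example): when work items take very different amounts of time, a fixed 50/50 split can leave one core idle while the other is still busy. A shared atomic index lets each thread claim the next item as soon as it is free, balancing the load dynamically.

```cpp
#include <atomic>
#include <chrono>
#include <thread>
#include <vector>

std::vector<int> items(1000, 0);          // placeholder work items
std::atomic<std::size_t> next_item{0};    // index of the next unclaimed item

void process(std::size_t i) {
    // stand-in for work whose cost varies from item to item
    std::this_thread::sleep_for(std::chrono::microseconds(i % 50));
}

void worker() {
    for (;;) {
        const std::size_t i = next_item.fetch_add(1);  // claim one item
        if (i >= items.size()) return;                 // nothing left to do
        process(i);
    }
}

int main() {
    std::thread a(worker), b(worker);  // both cores stay busy until the work runs out
    a.join();
    b.join();
}
```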
4. Task scheduling policy: On a single core, task scheduling is mainly about time-sharing, with some tasks given priority; the common policies are round-robin time slicing and priority preemption.
On multicore, task scheduling has requirements a single core does not: the running time of the different tasks and the balance of computation across cores must be taken into account. Simple round-robin time slicing and priority preemption are not enough, and the operating system's scheduler does not handle this application-level balancing for you; the programmer has to implement it.
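A minimal sketch (assumed example) of such application-level scheduling: a shared task queue feeding one worker thread per core. Each worker pulls the next task as soon as it finishes the previous one, instead of relying on simple time-slicing or priority preemption to balance the work.

```cpp
#include <algorithm>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class TaskQueue {
public:
    void push(std::function<void()> task) {
        { std::lock_guard<std::mutex> g(m_); q_.push(std::move(task)); }
        cv_.notify_one();
    }
    void close() {                                   // no more tasks will arrive
        { std::lock_guard<std::mutex> g(m_); closed_ = true; }
        cv_.notify_all();
    }
    bool pop(std::function<void()>& task) {          // returns false once closed and drained
        std::unique_lock<std::mutex> g(m_);
        cv_.wait(g, [&] { return closed_ || !q_.empty(); });
        if (q_.empty()) return false;
        task = std::move(q_.front());
        q_.pop();
        return true;
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<std::function<void()>> q_;
    bool closed_ = false;
};

int main() {
    TaskQueue queue;
    std::vector<std::thread> workers;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    for (unsigned i = 0; i < cores; ++i)
        workers.emplace_back([&] {
            std::function<void()> task;
            while (queue.pop(task)) task();          // run tasks until the queue closes
        });

    for (int i = 0; i < 100; ++i)
        queue.push([i] { (void)i; /* per-task work goes here */ });
    queue.close();
    for (auto& t : workers) t.join();
}
```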
5. CPU cache access: The CPU reads and writes its cache in units of cache lines. If two hardware threads work on two different memory locations that sit on the same cache line, then when both threads write to their own locations at the same time they are writing the same cache line, and the writes conflict. This is the false sharing problem.
On a single-core machine there is no false sharing; on a multicore machine there is. The fix is to arrange the data so that memory blocks written by different threads map to different cache lines.
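A minimal sketch (assumed example): two threads each increment their own counter. In the 'Bad' layout the counters share a cache line, so every write invalidates the other core's copy of the line (false sharing). Aligning each counter to its own cache line (64 bytes is a common line size, assumed here) removes the conflict.

```cpp
#include <thread>

struct Bad {
    long a;              // written by thread 1
    long b;              // written by thread 2 -- same cache line as 'a'
};

struct Good {
    alignas(64) long a;  // each counter gets its own cache line
    alignas(64) long b;
};

Good counters;           // switch the type to 'Bad' to observe the slowdown

int main() {
    std::thread t1([] { for (int i = 0; i < 10000000; ++i) ++counters.a; });
    std::thread t2([] { for (int i = 0; i < 10000000; ++i) ++counters.b; });
    t1.join();
    t2.join();
}
```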
6. Priority preemption: On a single-core CPU, priority preemption is a very common task scheduling strategy. On a multicore CPU, the multiple cores mean that low-priority and high-priority tasks can run concurrently, so priority scheduling policies need to be refined further.
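A minimal sketch (assumed example, POSIX-only): a "low-priority" and a "high-priority" busy loop. On a single core with strict priority scheduling the low-priority counter would barely advance; on a multicore machine both advance, because each thread can occupy its own core. Setting real-time priorities with SCHED_FIFO usually requires elevated privileges, so the pthread_setschedparam call is best-effort here and may fail silently.

```cpp
#include <pthread.h>
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

static void set_priority(std::thread& t, int prio) {
    sched_param sp{};
    sp.sched_priority = prio;
    pthread_setschedparam(t.native_handle(), SCHED_FIFO, &sp);  // may fail without privileges
}

int main() {
    std::atomic<long> low_count{0}, high_count{0};
    std::atomic<bool> stop{false};

    std::thread low ([&] { while (!stop) ++low_count;  });
    std::thread high([&] { while (!stop) ++high_count; });
    set_priority(low, 1);
    set_priority(high, 50);

    std::this_thread::sleep_for(std::chrono::seconds(1));
    stop = true;
    low.join();
    high.join();

    std::printf("high=%ld low=%ld\n", high_count.load(), low_count.load());
}
```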
7. Serial, parallel, and distributed computing: Single-core multithreaded programs are still serial algorithms. Multicore programming draws on both parallel computing and distributed computing, and in practice uses ideas from distributed computing more than from pure parallel computing.
Parallel computing = parallel design patterns + parallel algorithms; in terms of complexity, distributed computing > parallel computing > serial computing.
Parallel computing only considers executing the computation in parallel, without regard for the contention between threads that leads to CPU starvation; distributed computing, compared with parallel computing, is better at avoiding that starvation by allocating the computational tasks to the cores in a balanced way.
