Concurrency: two or more separate activities happening at the same time; a single system performs multiple independent tasks simultaneously, rather than running them one after another (sequentially).
Old: a single processor can run only one task at a time, but it creates the illusion of concurrency by switching between tasks many times per second (task switching).
New: multicore processors provide genuine hardware parallelism, so several tasks really do run at the same time; task switching is still used when there are more tasks than cores.
To switch the system from one task to another (that is, to make a context switch), the operating system must save the CPU state and the instruction pointer of the currently running task, decide which task to switch to, and reload the processor state of the task that is about to run. The CPU then loads the new task's instructions and data into the cache, which takes additional time.
Process switching has two halves. Taking the processor away from a running process essentially means saving the intermediate data held in the processor's registers onto that process's private stack, freeing the registers for other processes to use.
Giving the processor to a process is the reverse: the data the process saved on its private stack (the intermediate data from the last time it was suspended) is loaded back into the processor's registers, and the process's resume point (breakpoint) is loaded into the program counter (PC). At that moment the process is running again, that is, it owns the processor.
Multi-process Concurrency:
Inter-process communication is complex and slow for two reasons: the operating system provides protection between processes so that one process cannot modify another's data, and running multiple processes carries fixed overhead, such as the time needed to start each process.
The advantage is that this same protection between processes makes it easier to write safe concurrent code.
Multithreading Concurrency:
All threads in a process share the same address space, so most data is directly visible to every thread and can be passed between threads without copying. Because of this shared address space there is no inter-thread data protection, which reduces the operating system's bookkeeping; the overhead of multithreading is therefore much smaller than that of multiple processes.
The cost of sharing: when data is accessed by multiple threads, the program must guarantee that every thread sees a consistent view of that data.
Threads are a limited resource: too many threads running at once consume significant operating-system resources and slow the whole system down.
Each thread requires its own stack space, so too many threads can exhaust the process's available memory or address space.
For example, each thread typically gets a 1 MB stack (many systems allocate this much by default). In a process with a flat 4 GB (32-bit) address space, 4,096 such threads would consume the entire address space, leaving no room for code, static data, or heap data.
The more threads that are running, the more context switching the operating system has to do.