Sharing a resource among multiple threads of execution is one of the most common situations in concurrent programming. Concurrent applications frequently have several threads reading or writing the same data, or accessing the same file or database connection. To keep these shared resources from producing errors or inconsistent data, we must implement mechanisms that prevent those problems from occurring.
To solve these problems, the concept of a critical section was introduced. A critical section is a block of code that accesses a shared resource and that only one thread may execute at a time.
To help programmers implement critical sections, Java provides synchronization mechanisms. When a thread wants to enter a critical section, it uses one of these mechanisms to find out whether another thread is already executing it. If no other thread is inside, it enters the critical section; otherwise, it is suspended by the synchronization mechanism until the thread inside leaves. If more than one thread is waiting to enter the critical section, the JVM chooses one of them and the rest keep waiting.
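As a quick illustration of the idea, here is a minimal sketch of a critical section guarded by Java's built-in synchronization; the Counter class, its methods, and the iteration counts are illustrative assumptions, not taken from the text above.

// A minimal sketch: the body of each synchronized method is a critical
// section. Only one thread at a time may execute it on a given Counter
// instance; other threads are suspended until the owner leaves.
public class Counter {

    private int value = 0;

    public synchronized void increment() {
        value++;
    }

    public synchronized int getValue() {
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        Counter counter = new Counter();

        // Two threads incrementing the same shared counter.
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                counter.increment();
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Without synchronization, lost updates could make this less than 20000.
        System.out.println("Final value: " + counter.getValue());
    }
}

Running this prints 20000 every time; if the synchronized keyword is removed from increment(), the two threads can interleave their read-modify-write steps and the final value may come out lower.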
In the following sections, we'll walk through the two basic synchronization mechanisms that Java provides:
Introduction to Thread Synchronization basics