Processing transactions concurrently makes far better use of database resources than running them one at a time. But concurrency can also produce several kinds of problems:
1 Dirty reads
Transaction A modifies a record but has not yet committed; transaction B reads the same record. If nothing prevents this, B reads the uncommitted ("dirty") data and continues processing based on a value that may later be rolled back.
2 Non-repeatable reads
A transaction reads some data and, re-reading it later in the same transaction, finds that the data has changed, because another transaction modified and committed it in between the two reads.
3 Phantom reads
A transaction re-runs an earlier query with the same condition, but finds new rows satisfying that condition, inserted and committed by another transaction in the meantime.
These problems caused by concurrent transaction processing are, at bottom, read-consistency problems, and the database must provide a transaction isolation mechanism to solve them.
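The standard form of that mechanism is the four SQL-92 isolation levels, each ruling out more of the anomalies above. The mapping below is the standard's baseline; a real engine (InnoDB, PostgreSQL, etc.) may be stricter at a given level:

```python
# Anomalies each SQL-92 isolation level still permits (standard baseline).
permitted = {
    "READ UNCOMMITTED": {"dirty read", "non-repeatable read", "phantom read"},
    "READ COMMITTED":   {"non-repeatable read", "phantom read"},
    "REPEATABLE READ":  {"phantom read"},
    "SERIALIZABLE":     set(),  # full isolation: none of the anomalies can occur
}

for level, anomalies in permitted.items():
    blocked = {"dirty read", "non-repeatable read", "phantom read"} - anomalies
    print(f"{level}: blocks {sorted(blocked) or 'nothing'}")
```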
This article is from the DBA Sky blog; source: http://9425473.blog.51cto.com/9415473/1671300