How does PHP handle big data, and how do you ensure processing integrity?
For example: transferring funds to 10,000 (1W) users at the same time, how do you guarantee that a single trigger gets every user's transfer credited to their account?
Please explain in detail!
Reply content:
I think what you're hoping for is:
1. Guarantee that the operation for a single user is accurate; if any step goes wrong, roll back immediately and then restart the transfer from the beginning.
2. The operation on an individual user, whether it succeeds or fails, does not affect the operations on other users.
Here is an example.
Say a transfer requires 10 steps; you write 10 interfaces.
Each interface has a return value indicating whether its step succeeded, and these 10 interfaces are placed in a queue (this guarantees the 10 steps of the transfer are processed in order).
When a user transfers funds, run through the steps of this transfer queue, as in the sketch below.
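A minimal sketch of that idea in PHP, assuming each step is a do/undo pair that reports success with a boolean; the function name runTransfer and the step shapes here are illustrative, not an existing API:

```php
<?php
// Run the transfer steps strictly in order; on any failure, undo the
// steps that already completed, in reverse order.

function runTransfer(array $steps, array &$ctx): bool
{
    $completed = [];
    foreach ($steps as $step) {
        if (!$step['do']($ctx)) {            // this step reported failure
            foreach (array_reverse($completed) as $undo) {
                $undo($ctx);                 // roll back what already ran
            }
            return false;
        }
        $completed[] = $step['undo'];
    }
    return true;                             // every step succeeded
}

// Example with two of the ten steps:
$steps = [
    ['do'   => fn(&$c) => ($c['debited'] = true),
     'undo' => fn(&$c) => ($c['debited'] = false)],
    ['do'   => fn(&$c) => ($c['credited'] = true),
     'undo' => fn(&$c) => ($c['credited'] = false)],
];
$ctx = [];
var_dump(runTransfer($steps, $ctx)); // bool(true)
```

Undoing completed steps in reverse order is what lets a half-finished transfer roll back cleanly before it is retried.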
That covers the transfer operation itself; next, how to handle the users.
Treat each user as an object whose information is the object's properties. Put all the objects in a queue, traverse the queue, and do the following for each user (see the sketch after this list):
1. First record the user's current state (so it can be restored if something goes wrong).
2. Run the steps of the transfer queue.
2a. If a step fails, restore the user's initial information and add the user back to the end of the queue.
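The loop below sketches those three points, reusing the hypothetical runTransfer() and $steps from the sketch above; representing users as plain arrays and capping retries are assumptions added for illustration:

```php
<?php
$users = [
    ['id' => 1, 'balance' => 500],
    ['id' => 2, 'balance' => 800],
];

$queue = new SplQueue();
foreach ($users as $user) {
    $queue->enqueue($user);
}

$maxRetries = 3;                 // assumption: cap retries so one
                                 // permanently failing user cannot loop forever
while (!$queue->isEmpty()) {
    $user = $queue->dequeue();
    $snapshot = $user;           // 1. record state first (PHP copies arrays by value)

    if (runTransfer($steps, $user)) {
        continue;                // 2. all transfer steps succeeded
    }

    $user = $snapshot;           // 2a. restore the user's initial information
    $user['retries'] = ($user['retries'] ?? 0) + 1;
    if ($user['retries'] <= $maxRetries) {
        $queue->enqueue($user);  //     back to the end of the queue
    } else {
        error_log("transfer failed for user {$user['id']}");
    }
}
```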
That covers processing integrity; now for the question of speed.
First, to improve read/write speed (I mean with large volumes of data), you can use a NoSQL store such as Redis; Redis has a queue-like data structure (the list), which you can search Baidu for.
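For example, with the phpredis extension a Redis list works as the user queue; the key name transfer:users and the worker stub are arbitrary choices for this sketch:

```php
<?php
function processTransferForUser(int $id): void
{
    // run the transfer-step queue for this user (see the sketches above)
}

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// Producer: push every pending user id onto the list.
foreach (range(1, 10000) as $id) {
    $redis->lPush('transfer:users', (string) $id);
}

// Consumer: pop ids off the other end until the list is empty.
while (($id = $redis->rPop('transfer:users')) !== false) {
    processTransferForUser((int) $id);
}
```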
Second, consider using multiple threads: create 10 workers and split the 10,000 (1W) users so each one handles 1,000 users.
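One caveat: standard PHP builds have no threads, so on the CLI the usual equivalent is forking worker processes with the pcntl extension (POSIX systems only); a sketch of that split:

```php
<?php
function processTransferForUser(int $id): void
{
    // run the transfer-step queue for this user (see the sketches above)
}

$chunks = array_chunk(range(1, 10000), 1000); // 10 chunks of 1,000 users

$pids = [];
foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        exit("fork failed\n");
    }
    if ($pid === 0) {                  // child process: handle one chunk
        foreach ($chunk as $id) {
            processTransferForUser($id);
        }
        exit(0);
    }
    $pids[] = $pid;                    // parent: remember the child
}
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);      // wait for every worker to finish
}
```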
I hope this helps.
Use a queue such as MemcacheQ and work through the transfers gradually.
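For reference, MemcacheQ speaks the memcached protocol, so the classic Memcache extension can act as its client, where set() enqueues a message and get() pops one; the host, port (MemcacheQ's default is 22201), and queue name below are assumptions:

```php
<?php
function processTransferForUser(int $id): void
{
    // run the transfer-step queue for this user (see the sketches above)
}

$mq = new Memcache();
$mq->connect('127.0.0.1', 22201);

// Producer: enqueue a transfer job.
$mq->set('transfers', json_encode(['user_id' => 42, 'amount' => 100]), 0, 0);

// Consumer: drain the queue.
while (($msg = $mq->get('transfers')) !== false) {
    $job = json_decode($msg, true);
    processTransferForUser((int) $job['user_id']);
}
```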