PHP: data interface design

Source: Internet
Author: User
Tags: import, database, rsync
Service A produces about 1,000,000 (100W) log records per day. Service B needs to write an interface that pulls service A's data (assume the 1M records can be read from a database). Design an interface that can fetch the 1M records reasonably quickly, can resume from a checkpoint if the transfer is suddenly interrupted, and is secure.

Reply content:

I've built a similar data-collection service:
service A runs on one server (server A);
service B runs on another server (server B).

The final solution: server A ultimately stores its data in files, and a scheduled task on server A then synchronizes them to server B, either with a script (PHP curl) or, more simply, a direct rsync command.

Server B then scans the file contents, archives the data, deduplicates it, imports it into the database, and so on.
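The scan/deduplicate/import step on server B can be sketched as follows. This is an illustrative Python sketch, not the poster's actual code: the file name and the tab-separated row format are assumptions, and a plain list stands in for the real database inserts.

```python
# Sketch of server B's import step: scan a synced log file, skip duplicate
# lines, and collect the parsed rows for insertion. The row format
# (tab-separated fields) is a hypothetical assumption.

def import_log(path):
    seen = set()
    rows = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line in seen:
                continue                   # skip blank lines and duplicates
            seen.add(line)
            rows.append(line.split("\t"))  # parse assumed tab-separated fields
    return rows                            # stand-in for batch DB insert
```

In a real setup the returned rows would go into a batched `INSERT`, and the dedup key would more likely be a log ID than the whole line.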

1,000,000 (100W) records is still a fairly small amount of data. When I first read it, I thought it said 100M.

I lean toward @sunwenzheng's proposal: use a Redis queue.

Here's how it works:

1. Build a queue service on server B (not on A; your server A may be the primary server, and this keeps load off it).
2. As soon as server A generates a log entry, it pushes the entry to the Redis queue on server B.
3. Server B polls the queue and writes the collected data into the database.

The advantage is that the ordering of the log entries is preserved, and you control how promptly logs are collected. It's more than enough for 1M records a day.
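The push/poll pattern in the three steps above can be sketched as follows. To keep the example self-contained, a `collections.deque` stands in for the Redis list; in production server A would call `LPUSH` and server B would call `BRPOP` (or `RPOP` in a polling loop) against a shared Redis instance.

```python
# Sketch of the queue pattern: producer on server A, consumer on server B.
# A deque simulates the Redis list "logs"; appendleft ~ LPUSH, pop ~ RPOP.
from collections import deque

redis_list = deque()  # stand-in for the Redis list

def produce(entry):
    """Server A: push a log entry as soon as it is generated (LPUSH)."""
    redis_list.appendleft(entry)

def consume_batch(max_items=1000):
    """Server B: pop up to max_items entries, in arrival order, for a batch insert (RPOP)."""
    batch = []
    while redis_list and len(batch) < max_items:
        batch.append(redis_list.pop())
    return batch
```

Because entries are pushed on one end and popped from the other, the consumer sees them in the order they were produced, which is what guarantees the log ordering mentioned above.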

That much data, plus the requirement to resume after interruption... Fortunately it's log data, so real-time delivery shouldn't matter much.
Consider having service A export to a file on a schedule, and having service B download the file directly over FTP/SFTP. FTP is fast enough; if you want it faster, you can use an NFS file share. (All of these support resuming from a breakpoint.)

Does the interface the other side provides have a time parameter? If so, you can make incremental requests: record the time of the last fetch and use it as the condition for the next request.
You can also write the received data to a Redis queue while another process reads from the queue and batch-writes it to the database.
As the posters above said, if files work for you, you can also consider rsync for the synchronization.
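The incremental fetch with a recorded last-fetch time can be sketched as follows. This is an illustrative Python sketch: `fetch_since` is a hypothetical stand-in for the remote API call, and the checkpoint is kept in a local file so an interrupted run resumes from the last timestamp it saved.

```python
# Sketch of checkpoint-based incremental fetching. The checkpoint file name
# is a hypothetical choice; any durable store (file, DB row, Redis key) works.
import os

CHECKPOINT = "last_fetch.txt"

def load_checkpoint():
    """Return the last saved timestamp, or 0 if no fetch has happened yet."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip() or 0)
    return 0

def save_checkpoint(ts):
    with open(CHECKPOINT, "w") as f:
        f.write(str(ts))

def sync(fetch_since):
    """Fetch rows newer than the checkpoint; each row is (timestamp, data)."""
    since = load_checkpoint()
    rows = fetch_since(since)          # e.g. GET /logs?since=<timestamp>
    if rows:
        save_checkpoint(max(ts for ts, _ in rows))
    return rows
```

Because the checkpoint is only advanced after a batch is retrieved, a crash mid-run simply re-fetches from the last saved time on the next attempt.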

With this much data, transferring it through an API is rather inefficient.

* Export the day's data from the database directly to a file (if the volume is too large, export one file per hour, or one file per 100K rows)
* Server B then fetches the files over HTTP (HTTP breakpoint resumption is easy to implement)
* After downloading, unpack the data and import it into the database

The advantage is that this is simple to implement and hard to get wrong.
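HTTP breakpoint resumption works by sending a `Range` header for the bytes not yet downloaded. A minimal Python sketch, assuming a hypothetical export URL; the request object is built but not sent here, since the `Range` computation is the part specific to resuming:

```python
# Sketch of resuming an HTTP download: if a partial file already exists
# locally, ask the server to continue from that byte offset.
import os
import urllib.request

def resume_request(url, local_path):
    """Build a GET request that resumes from the bytes already on disk."""
    req = urllib.request.Request(url)
    offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
    if offset:
        # Standard HTTP/1.1 range header: "give me everything from byte N on".
        req.add_header("Range", f"bytes={offset}-")
    return req, offset
```

The caller would open the local file in append mode and write the response body after it; a server that honors the header replies with `206 Partial Content`.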

You might consider a pagination approach. For example, with 1M rows, process 10K at a time, and move to the next page once a page is done. As for how to drive it, you can fetch the total count first, generate queue tasks from it, and let the server work through the fetch tasks.
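The pagination idea can be sketched as follows. An illustrative Python sketch: `fetch_page` is a hypothetical stand-in for a query like `SELECT ... LIMIT page_size OFFSET offset`, and in practice each page would be a separate queue task rather than one loop.

```python
# Sketch of paginated fetching: get the total count first, then pull the
# data in fixed-size pages (LIMIT/OFFSET style).
def fetch_all(total, page_size, fetch_page):
    rows = []
    for offset in range(0, total, page_size):
        rows.extend(fetch_page(offset, page_size))  # one page per request
    return rows
```

With 1M rows and a page size of 10K, this is 100 independent, retryable requests, which is also what makes resuming after an interruption easy: just restart from the first unfinished page.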
