How can a PHP download run for a long time without being interrupted? Urgent! Urgent!

Source: Internet
Author: User

I'm looking for a PHP method that can keep downloading a website's API files for a long time without being interrupted,
or for software that can run the same download task every day from a scheduled task.
This is mainly for downloading API files.
At the moment I use PHP's copy() function to download the files, but the download breaks after it has been running for about an hour and I don't know why. Do I need to set something else???
Please help me ~~~~~


Replies to the discussion (solutions)

First, if you need to fetch something on the order of 10 million records, don't try to fetch them all in one run.

Fetch 1,000 at a time and measure the execution time. If one batch takes about a minute, set up a cron job (Linux) or a scheduled task (Windows) that runs every 3 minutes.

If you try to process too many items in a single run, you are bound to run into problems.
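
(For illustration, a minimal sketch of the kind of batched script described above, meant to be run every few minutes by cron or the Windows Task Scheduler. The table name, column names, connection details, and offset file below are assumptions, not from this thread.)

<?php
// Sketch: process one batch per scheduled run and remember where the run stopped.
set_time_limit(0);

$batchSize  = 1000;
$offsetFile = __DIR__ . '/offset.txt';                  // progress marker kept between runs
$offset     = is_file($offsetFile) ? (int)file_get_contents($offsetFile) : 0;

$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$rows = $pdo->query("SELECT id, url FROM items ORDER BY id LIMIT $offset, $batchSize");

foreach ($rows as $row) {
    // ... fetch / process one record here ...
}

// Remember the new offset so the next scheduled run continues from here.
file_put_contents($offsetFile, $offset + $batchSize);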

 When I was learning cURL, I wrote a small program to download the database backup from the server, and added a scheduled task that accesses that page to perform the download.

First, if you need to fetch something on the order of 10 million records, don't try to fetch them all in one run.

Fetch 1,000 at a time and measure the execution time. If one batch takes about a minute, set up a cron job (Linux) or a scheduled task (Windows) that runs every 3 minutes.

If you try to process too many items in a single run, you are bound to run into problems.

Hello, thank you for your answers.
Note: each file is not very large, only somewhere between a few megabytes and 30 MB.
The main problem is that there are more than 600 files, so the download runs for a long time and the program gets interrupted.
I previously set up a database to record which files had been downloaded so I could check progress, but it still took several runs to finish everything.
Is there any way to complete the whole job in a single run?
Or is there any software that can do this?



Sorry, I have never heard of software like that... many coders, myself included, prefer to just write code for complicated requirements and let it grind away...

Your current approach isn't great. You can use a database to record download progress: give each download a fixed sequence number, and the next time the job runs, start downloading from the recorded position. Then drive it with a scheduled task, or with wget.
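
(A rough sketch of that idea; the table name, columns, and save path below are assumptions. Each file gets a numbered row, each run only downloads rows not yet marked done, and progress is written back immediately, so a scheduled task can simply re-run the script until everything is finished.)

<?php
// Sketch: resumable download driven by a progress table.
// Assumed schema: files(id, url, done), where done = 0 means "not downloaded yet".
set_time_limit(0);
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

foreach ($pdo->query('SELECT id, url FROM files WHERE done = 0 ORDER BY id') as $row) {
    $data = @file_get_contents($row['url']);       // or use cURL, as in the code later in this thread
    if ($data === false) {
        continue;                                   // leave done = 0 so the next run retries this file
    }
    file_put_contents('/path/to/save/' . $row['id'] . '.xml', $data);

    $upd = $pdo->prepare('UPDATE files SET done = 1 WHERE id = ?');
    $upd->execute([$row['id']]);                    // record progress immediately
}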

Maybe you need set_time_limit(0)?

  When I was learning cURL, I wrote a small program to download the database backup from the server, and added a scheduled task that accesses that page to perform the download.
Hello, thank you for your answer.
I'm downloading an API file from someone else's server, and there is no FTP access.
I just tried your program and it doesn't seem to download anything. I don't know whether I modified it incorrectly. Please take a look:

header("Content-Type: text/html; charset=utf-8");
set_time_limit(0);
$curl = curl_init();
$target_file = 'http://...'; // API file URL (omitted/garbled in the original post)
curl_setopt($curl, CURLOPT_URL, $target_file);
curl_setopt($curl, CURLOPT_VERBOSE, 1);
// (two further curl_setopt option names were garbled in the original post and are omitted here)
curl_setopt($curl, CURLOPT_TIMEOUT, 300);
curl_setopt($curl, CURLOPT_USERPWD, "update:windcms");
$target_path = ''; // local save directory (left empty in the original post)
if (is_dir($target_path)) {
    $outfile = fopen($target_path . '/' . date("Ymd", time()) . '.xml', "w+");
    curl_setopt($curl, CURLOPT_FILE, $outfile);
    $info = curl_exec($curl);
    fclose($outfile);
    $error_no = curl_errno($curl);
    if ($error_no == 0) {
        echo "Download successful!";
    } else {
        echo "Download failed!";
    }
    curl_close($curl);
}

Maybe you need set_time_limit(0)?
Hello, that does help, but the program still gets cut off after running for more than an hour. What else do I need to set?


   When I was learning cURL, I wrote a small program to download the database backup from the server, and added a scheduled task that accesses that page to perform the download.
Hello, thank you for your answer.
I'm downloading an API file from someone else's server, and there is no FTP access.
I just tried your program and it doesn't seem to download anything. I don't know whether I modified it incorrectly. Please take a look:
(cURL code quoted above)
Use readdir to iterate over the directory and run the downloads in a loop, with set_time_limit(0) so the script has no execution time limit. I'd estimate all of your files add up to more than 6 GB.


Maybe you need set_time_limit(0)?
Hello, that does help, but the program still gets cut off after running for more than an hour. What else do I need to set?

Personally I'd say that's normal... PHP really is weak at long-running programs. Of course, I have to stress that this is only my opinion; in theory it could run through smoothly under demanding conditions (a stable network plus enough memory).

Better to split the job up into pieces...


Hello, after the download is complete I still need to import the data into the database, but I don't know when the download has finished. How can I determine that?
Another point: the program is launched by a batch file in the scheduled task, and that process closes itself about 10 seconds after starting the script, yet the download is still interrupted after running for more than an hour. Does this affect the server?



    When I was learning cURL, I wrote a small program to download the database backup from the server, and added a scheduled task that accesses that page to perform the download.
Hello, thank you for your answer.
I'm downloading an API file from someone else's server, and there is no FTP access.
I just tried your program and it doesn't seem to download anything. I don't know whether I modified it incorrectly. Please take a look:
(cURL code quoted above)
Use readdir to iterate over the directory and run the downloads in a loop, with set_time_limit(0) so the script has no execution time limit. I'd estimate all of your files add up to more than 6 GB.
Well, yes, about 5 GB.


Hello, after the download is complete I still need to import the data into the database, but I don't know when the download has finished. How can I determine that?
Another point: the program is launched by a batch file in the scheduled task, and that process closes itself about 10 seconds after starting the script, yet the download is still interrupted after running for more than an hour. Does this affect the server?


This kind of problem isn't about one specific line of code, so I can only describe my thinking; I can't guarantee it will definitely work or that it's the best approach.

You can write two flags, start and end, to the database: one before and one after the copy operation.

Before the PHP script starts copying, check whether the previous run has finished. If it hasn't, exit; if it has, write the start flag and begin copying.

That way there is no problem with a single task running for more than an hour...
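
(For illustration, a minimal sketch of those start/end flags; the job_lock table and its columns are assumptions, not from this thread.)

<?php
// Sketch: start/end flag so a new scheduled run bails out if the previous one is still going.
// Assumed schema: job_lock(id, running, started_at) with a single row where id = 1.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

$row = $pdo->query('SELECT running FROM job_lock WHERE id = 1')->fetch();
if ($row && (int)$row['running'] === 1) {
    exit("Previous run has not finished yet.\n");   // previous run still in progress
}

$pdo->exec('UPDATE job_lock SET running = 1, started_at = NOW() WHERE id = 1');  // start mark

// ... do the copy / download work here ...

$pdo->exec('UPDATE job_lock SET running = 0 WHERE id = 1');                      // end mark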



Hello, after the download is complete I still need to import the data into the database, but I don't know when the download has finished. How can I determine that?
Another point: the program is launched by a batch file in the scheduled task, and that process closes itself about 10 seconds after starting the script, yet the download is still interrupted after running for more than an hour. Does this affect the server?


This kind of problem isn't about one specific line of code, so I can only describe my thinking; I can't guarantee it will definitely work or that it's the best approach.

You can write two flags, start and end, to the database: one before and one after the copy operation.

Before the PHP script starts copying, check whether the previous run has finished. If it hasn't, exit; if it has, write the start flag and begin copying.

That way there is no problem with a single task running for more than an hour...

Hello, the run takes over an hour because there are so many files and I download them in a loop. How do I know where the download stopped, and where should the next run pick up from?

OK, then I don't know how to make a PHP program that downloads 5 GB of files run stably for several hours.

All of the suggestions above are loop-based, incremental approaches. If you insist on grabbing everything in one run, I can't offer any useful advice.

OK, then I don't know how to make a PHP program that downloads 5 GB of files run stably for several hours.

All of the suggestions above are loop-based, incremental approaches. If you insist on grabbing everything in one run, I can't offer any useful advice.
Hello, can wget do the whole download in one run?
I have never used wget.

wget has a resume (breakpoint continuation) feature, although I have never used it that way myself. For more information, see:

http://xinkang120.blog.163.com/blog/static/194668223201188114417360/

For resumable downloads in PHP, you can also find related plug-ins.
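
(For reference, a minimal wget example; the list file and save directory are placeholders. -c resumes a partially downloaded file and -i reads the URLs from a list, so the same command can simply be re-run from a scheduled task.)

wget -c -i urls.txt -P D:\api_files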

If it is a Windows server, run the script from cmd; there is no timeout there, so no disconnection occurs.

Last time I grabbed more than 5,000 Flash game files this way, running it directly on the server.


In addition, if you want the files downloaded as quickly as possible, you can use PHP to collect the captured file URLs into a database, export them as a list, and import that list into Thunder or another download manager for batch downloading. I have tried this method and it works quite well.
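
(As a small sketch of that export step; the table name and output file are assumptions. It writes the collected URLs one per line so a download manager, or wget -i, can consume the list.)

<?php
// Sketch: dump the collected file URLs from the database into a plain text list.
$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$urls = $pdo->query('SELECT url FROM files')->fetchAll(PDO::FETCH_COLUMN);
file_put_contents('urls.txt', implode(PHP_EOL, $urls) . PHP_EOL);
echo count($urls) . " URLs written to urls.txt\n";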

If it is a Windows server, run the script from cmd; there is no timeout there, so no disconnection occurs.

Last time I grabbed more than 5,000 Flash game files this way, running it directly on the server.


In addition, if you want the files downloaded as quickly as possible, you can use PHP to collect the captured file URLs into a database, export them as a list, and import that list into Thunder or another download manager for batch downloading. I have tried this method and it works quite well.

Hello, thank you for your answer.
For the first method, do you mean running the PHP file from cmd? If so, I tried opening the web page from cmd and it still timed out; I don't know whether I'm doing it wrong. If not, how should I do it?
Second, if I use Thunder, how can it run every day? The download links stay the same but the content is updated daily, and Thunder doesn't seem to re-download a file it has already downloaded. How can I set that up?

It looks like Thunder won't work in your situation, since this is a scheduled, recurring task and Thunder doesn't seem to expose any interface that a program could drive.

When you execute the PHP file from cmd there is no timeout, even if php.ini has a time limit configured. A simple way is to add PHP to the system Path (if you don't know how, just Google it), and then write a bat file:

php d:\wwwroot\cmd\task.php
pause


Here task.php is the same file you would normally execute by accessing the web page over HTTP. Save the above as a .bat file; double-click it to run the task, or add it to the scheduled tasks for daily execution.
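
(To register the .bat file as a daily scheduled task from the command line, something like the following should work; the task name, time, and path are placeholders.)

schtasks /create /sc daily /st 03:00 /tn "ApiDownload" /tr "d:\wwwroot\cmd\task.bat"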

     When I was learning cURL, I wrote a small program to download the database backup from the server, and added a scheduled task that accesses that page to perform the download.
This method can be used.

It's better to use shell to do this.

Even on a fiber connection there is no guarantee that a long, multi-file download will finish in a single run.
wget and other command-line clients can detect completed files and skip them (or resume them; resuming generally works from a fixed byte offset).

If you want to automate the download, I recommend keeping a log that records which files completed and which failed.
Loop N times, skipping completed files and re-downloading failed ones, until every record in the log is done.
Or have the log record only the errors, and keep looping over it until the log has 0 entries.
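
(A rough sketch of that error-log loop; the errors.txt file, the save directory, and the downloadOne() helper are hypothetical names for illustration.)

<?php
// Sketch: keep re-trying the URLs listed in an error log until the log is empty.
set_time_limit(0);

function downloadOne($url, $dest) {
    $data = @file_get_contents($url);               // hypothetical helper; cURL would also work
    return $data !== false && file_put_contents($dest, $data) !== false;
}

$maxPasses = 5;
for ($pass = 0; $pass < $maxPasses; $pass++) {
    $pending = array_filter(array_map('trim', @file('errors.txt') ?: []));
    if (!$pending) {
        break;                                      // log has 0 entries: everything succeeded
    }
    $stillFailing = [];
    foreach ($pending as $url) {
        if (!downloadOne($url, 'save/' . basename($url))) {
            $stillFailing[] = $url;                 // keep failures for the next pass
        }
    }
    file_put_contents('errors.txt', implode(PHP_EOL, $stillFailing));
}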
