In practice, we often need content such as news or weather forecasts on our own site. But for a personal site, or one without much backing, we cannot produce all of that ourselves with so much manpower, material, and money. What should we do?
Fortunately, resources on the Internet are shared. We can use a program to automatically fetch pages from other sites and process them for our own use.
How do we do it? The approach a fellow forum user suggested is not feasible. In fact, PHP itself provides this capability through the curl library. See the following code:
<?php
// Fetch the Sina news homepage and save it to a local text file.
$ch = curl_init("http://dailynews.sina.com.cn");
$fp = fopen("php_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the response directly to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);   // do not include the HTTP headers
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
Sometimes an error is reported even though the download completed. I asked around abroad but never got a reply; if it really bothers you, simply prefix the function call with PHP's @ error-suppression operator. Then, with an appropriate analysis of the fetched text (captured into $txt, as in the sketch below), we could scrape Sina's news. It is better not to actually do this, though, to avoid legal disputes! I only want to show you that PHP is very powerful and can do many things!
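As a minimal sketch, not from the original article: instead of writing to a file, the page can be captured into a string with CURLOPT_RETURNTRANSFER and checked for errors before parsing. The regular expression and variable names here are illustrative assumptions only.

<?php
// A sketch: fetch the page into $txt and check for errors before analyzing it.
$ch = curl_init("http://dailynews.sina.com.cn");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body as a string
curl_setopt($ch, CURLOPT_HEADER, 0);         // omit the HTTP headers

$txt = curl_exec($ch);

if ($txt === false) {
    // curl_error() describes the failure (timeout, DNS error, and so on).
    echo "Download failed: " . curl_error($ch);
} else {
    // $txt now holds the raw HTML; extract whatever is needed from it,
    // for example the text of every link (a hypothetical parsing step).
    preg_match_all('/<a[^>]+>([^<]+)<\/a>/i', $txt, $matches);
    print_r($matches[1]);
}

curl_close($ch);
?>

With CURLOPT_RETURNTRANSFER set, curl_exec() returns the page as a string (or false on failure) instead of printing it, which is what makes the error check above possible.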
[This article is copyrighted by the author and osuo. Please credit the author and the source when reprinting.]