How can I capture pages from other sites to supplement my own pages?
In practice we often run into special requirements, such as needing news, weather forecasts, and so on. As a personal website, or one without much manpower, material, or money behind it, we cannot produce all of this content ourselves. What should we do?
Fortunately, resources on the Internet are shared: we can use a program to automatically capture pages from other sites and then process them for our own use.
What the other poster suggested is not feasible. PHP actually provides this functionality itself, through the cURL library. See the following code:
<?php
// Capture the Sina news homepage and save it to a local text file.
$ch = curl_init("http://dailynews.sina.com.cn");
$fp = fopen("php_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);  // write the response body to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);  // do not include HTTP headers in the output
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
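If you want to process the captured page in PHP rather than just save it to a file, cURL can return the page as a string via CURLOPT_RETURNTRANSFER, which you can then handle with ordinary string functions. A minimal sketch, assuming a hypothetical fetch_page() helper and an illustrative <title>-extraction step (the sample HTML is hard-coded so the processing part runs without network access):

```php
<?php
// Fetch a page into a string instead of a file.
// CURLOPT_RETURNTRANSFER makes curl_exec() return the response body.
function fetch_page($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    $html = curl_exec($ch);
    curl_close($ch);
    return $html;
}

// Example processing step: pull the <title> out of the captured HTML.
function extract_title($html) {
    if (preg_match('/<title>(.*?)<\/title>/is', $html, $m)) {
        return trim($m[1]);
    }
    return null;
}

// Demonstrated on a hard-coded sample standing in for a live fetch.
$sample = "<html><head><title>Sample News</title></head><body>...</body></html>";
echo extract_title($sample); // prints "Sample News"
?>
```

Capturing into a string rather than a file is usually what you want when the goal is to extract a fragment (a headline, a forecast) and embed it in your own page.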
Source: http://php.662p.com/thread-504-1-1.html
How can I solve cross-site problems?
Change it as shown above and it will work.
How does one set directory security to prevent cross-site intrusion when multiple websites are deployed on one VPS?
Give each website its own user and its own permissions, so that even if website A is compromised, the attacker cannot cross over into website B. If you are not familiar with configuring this by hand, we recommend installing a host management panel, such as the free N-point host management system; any space it activates is automatically assigned its own permissions.
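On a Linux VPS, the idea behind per-site permissions can be sketched with plain system commands (the user names and paths below are made-up examples; the commands require root, and a hosting panel automates the equivalent of this for you):

```shell
# Sketch: isolate two sites by giving each its own system user.
useradd --no-create-home siteA
useradd --no-create-home siteB

mkdir -p /var/www/siteA /var/www/siteB

# Each site's directory is owned by its own user; "other" users,
# including the other site's user, get no access at all.
chown -R siteA:siteA /var/www/siteA
chown -R siteB:siteB /var/www/siteB
chmod -R 750 /var/www/siteA /var/www/siteB
```

With this layout, code running as siteA (for example, its PHP processes) cannot read or write files under /var/www/siteB, which is exactly the cross-site isolation the question asks about.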