Capture and download all image files referenced in CSS with PHP

Source: Internet
Author: User

The highlight of this article is the relatively complex regular expressions, with patterns nested inside patterns, plus a very handy use of the copy() function.
> NsYta told me that Xiao Xie's recent topics were too basic. I have been too busy lately to come up with a new topic of my own, so here is this one.

1. Capture the images in the CSS:
> 1. First, some preparation:
> Step 1: save the original URL of the CSS file in the $url variable, and save the CSS content to abc.css.
> Because multiple CSS files are often involved, Xiao Xie does not hard-code a single CSS path.
> Instead, the contents of several CSS files can be merged into the one abc.css file.

$data = file_get_contents('abc.css');

> Read the content of the CSS file into the $data variable, then use a regular expression to extract the domain name.
> This is needed because many image paths in CSS are root-relative or relative, such as /img/1.gif and img/1.gif.
> If the original CSS lives at http://www.jb51.net/css/, those two paths point to different locations.

> The first resolves to http://www.jb51.net/img/1.gif, because its path is relative to the site root.
> The second resolves to http://www.jb51.net/css/img/1.gif, because its path is an ordinary relative path.
The code is as follows:
$url = 'http://www.jb51.net/css/';
preg_match('/(.*\/\/.*?)\//', $url, $host);
// This extracts http://www.jb51.net from the URL; don't forget the trailing slash on $url.
// .*? is a lazy match: it matches as little as possible.
$host = $host[1];
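As a quick sanity check, this standalone snippet runs the same pattern on the article's sample URL:

```php
<?php
// Extract the scheme-and-host prefix from the CSS file's URL.
$url = 'http://www.jb51.net/css/';
preg_match('/(.*\/\/.*?)\//', $url, $host);
// The lazy .*? stops at the first slash after "//",
// so the capture is the bare host with no trailing slash.
echo $host[1], "\n"; // http://www.jb51.net
```

PHP's built-in parse_url() would also extract the scheme and host, but the regex keeps the article self-contained.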


2. Create the image storage folder:
> Here Xiao Xie uses is_dir to check whether the folder already exists; if it does, there is no need to create it again.
> Incidentally, the is_file function can check whether a path is a regular file and whether it exists,
> but file_exists() is a better fit for that, as has been discussed on Webmasterworld.com.

if (!is_dir('img')) { mkdir('img'); }

> 3. Obtain the relative address of each image with a regular expression:

$regex = '/url\(\'{0,1}\"{0,1}(.*?)\'{0,1}\"{0,1}\)/';
// The regex must match the image address in three forms:
// url(1.gif), url('1.gif') and url("1.gif").
// All three are legal, so the pattern above extracts 1.gif from any of them.
// \'{0,1} means the single quote may appear once or not at all, and \"{0,1} the same for the double quote.
// The middle must be a lazy match; a greedy one would swallow the closing quote, giving 1.gif' instead of 1.gif.
preg_match_all($regex, $data, $result);
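To see what the pattern captures, here is a self-contained run against a small three-rule stylesheet (the CSS string is invented for illustration):

```php
<?php
// The same url() pattern, exercised on all three quoting styles.
$regex = '/url\(\'{0,1}\"{0,1}(.*?)\'{0,1}\"{0,1}\)/';
$css = 'a{background:url(1.gif)} '
     . 'b{background:url(\'2.png\')} '
     . 'c{background:url("3.jpg")}';
preg_match_all($regex, $css, $result);
// $result[1] holds the first capture group of every match:
// 1.gif, 2.png, 3.jpg
print_r($result[1]);
```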

> 4. Process the images:

> First, loop over the array captured by the first subpattern of the regular expression.
> The first subpattern here means the first pair of parentheses in the regex, and so on.

foreach ($result[1] as $val) { }

> Inside the loop, use regular expressions to classify each path, because complete URLs such as http://www.jb51.net/img/1.gif must also be handled,
> not just /img/1.gif or img/1.gif.
> So check for a full URL first, then distinguish between the /img/1.gif and img/1.gif cases.
The code is as follows:
if (preg_match('/^http.*/', $val)) { $target = $val; }
else if (preg_match('/^\/.*/', $val)) { $target = $host . $val; }
else { $target = $url . $val; }
echo $target . "<br/>\r\n";
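The three branches above can be exercised with one sample of each path type (the cdn.example.com entry is an invented absolute-URL example):

```php
<?php
// Resolve the three kinds of path the CSS may contain.
$url  = 'http://www.jb51.net/css/'; // directory of the CSS file
$host = 'http://www.jb51.net';      // scheme and host only
$targets = [];
foreach (['http://cdn.example.com/a.gif', '/img/1.gif', 'img/1.gif'] as $val) {
    if (preg_match('/^http.*/', $val)) { $target = $val; }            // already absolute
    else if (preg_match('/^\/.*/', $val)) { $target = $host . $val; } // root-relative
    else { $target = $url . $val; }                                   // relative to the CSS dir
    $targets[] = $target;
}
print_r($targets);
```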

> Finally, extract the file name, that is, the 1.gif in /img/1.gif, so the file can be saved.
The code is as follows:
preg_match('/.*\/(.*\.\D+)$/', $val, $name);
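For example, run against a root-relative path:

```php
<?php
// Capture everything after the last slash: the bare file name.
// The greedy leading .* makes the match start after the LAST slash,
// and \D+ expects a non-numeric extension such as gif or png.
preg_match('/.*\/(.*\.\D+)$/', '/img/1.gif', $name);
echo $name[1], "\n"; // 1.gif
```

Note that the pattern requires at least one slash, so a bare name like 1.gif with no path would not match.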

> Then the download can start. Here is where the powerful copy() function comes in.
The code is as follows:
if (!is_file('./img/' . $name[1])) {
    $imgc = file_get_contents($target);
    $handle = fopen('./img/' . $name[1], 'w+');
    fwrite($handle, $imgc);
    fclose($handle);
}

> That is the old way, and it is quite cumbersome. One day, Xiao Xie suddenly discovered the power of copy():
> it can download remote files too, so the whole block above can be replaced by the code below and retired.
The code is as follows:
if (!is_file('./img/' . $name[1])) {
    copy($target, './img/' . $name[1]);
}
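copy() treats a remote URL like any other source path (provided allow_url_fopen is enabled). The sketch below demonstrates the same guard-then-copy pattern on a local temporary file so it runs without a network:

```php
<?php
// Guard-then-copy, demonstrated with a local temp file.
$src = tempnam(sys_get_temp_dir(), 'img');
file_put_contents($src, 'GIF89a');   // stand-in for image bytes
$dst = $src . '.copy';
if (!is_file($dst)) {                // skip files already downloaded
    copy($src, $dst);
}
echo file_get_contents($dst), "\n"; // GIF89a
```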

> 5. Complete source code:

> Fill in $url and save all the CSS content to abc.css.
The code is as follows:
<?php
$url = 'http://www.jb51.net/css/';
$data = file_get_contents('abc.css');
preg_match('/(.*\/\/.*?)\//', $url, $host);
$host = $host[1];
if (!is_dir('img')) { mkdir('img'); }
$regex = '/url\(\'{0,1}\"{0,1}(.*?)\'{0,1}\"{0,1}\)/';
preg_match_all($regex, $data, $result);
foreach ($result[1] as $val) {
    if (preg_match('/^http.*/', $val)) { $target = $val; }
    else if (preg_match('/^\/.*/', $val)) { $target = $host . $val; }
    else { $target = $url . $val; }
    echo $target . "<br/>\r\n";
    preg_match('/.*\/(.*\.\D+)$/', $val, $name);
    if (!is_file('./img/' . $name[1])) {
        copy($target, './img/' . $name[1]);
    }
}
?>
