Requirement Description
Website www.xxx.com, environment: 64-bit Windows with WAMP, distributed across several servers (1 master, 3 slaves). The images/CSS/JS it uses are stored in the www.xxx.com/Public/ directory. The requirement: JS/CSS referenced from the /Public directory should be compressed automatically, which helps page load speed.
Original Solution
Using Minify + .htaccess: rewrite requests for JS/CSS under /Public to Minify's PHP entry script, which then compresses them automatically through the Minify package. This way, programmers and designers don't need any extra steps when uploading CSS/JS. Very convenient.
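The rewrite described above might look like the following .htaccess sketch. The /min/ path and the `f` query parameter follow Minify's usual conventions; the Public path is assumed from the description:

```apache
# Route every CSS/JS request under /Public through Minify's PHP entry point.
RewriteEngine On
# Do not rewrite requests that already target the Minify script.
RewriteCond %{REQUEST_URI} !^/min/
RewriteRule ^Public/(.+\.(?:css|js))$ /min/index.php?f=Public/$1 [L,QSA]
```

With this in place, browsers keep requesting the original /Public/*.css and /Public/*.js URLs and never see the rewrite.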
Requirement Change
Now all the resources under /Public must be moved to a CDN server. The CDN server runs LN (Linux + Nginx) and supports neither PHP nor .htaccess. How can the workload be minimized?
Personal Solution
1) Use the JSMin plug-in for Notepad++ to compress and format the CSS/JS files under /Public by hand. Drawback: while the site is still changing heavily, many JS/CSS files get modified each round; compressing them one by one risks missing files or introducing errors.
2) Build a small tool on top of Minify that scans the JS/CSS files in a given directory, compresses them, and stores the results. Drawback: it is still a manual operation, and writing the tool takes time.
3) Look for a batch compression tool online: nothing suitable found so far.
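Option 2 above (scan a directory, compress, store) can be sketched in a few lines. The whitespace/comment stripping below is only a crude stand-in for a real minifier such as Minify or JSMin, and the directory names are up to the caller:

```python
import os
import re

def crude_minify(text: str) -> str:
    """Very rough minification: drop /* ... */ comments and collapse whitespace.
    A real minifier (JSMin, Minify, ...) should replace this in production --
    this version would, for example, mangle whitespace inside string literals."""
    text = re.sub(r"/\*.*?\*/", "", text, flags=re.DOTALL)  # block comments
    text = re.sub(r"\s+", " ", text)                        # collapse whitespace
    return text.strip()

def minify_tree(src_dir: str, dst_dir: str) -> int:
    """Scan src_dir for .css/.js files and write minified copies under dst_dir,
    preserving the relative layout. Returns the number of files processed."""
    count = 0
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            if not name.endswith((".css", ".js")):
                continue
            src = os.path.join(root, name)
            dst = os.path.join(dst_dir, os.path.relpath(src, src_dir))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            with open(src, encoding="utf-8") as fh:
                minified = crude_minify(fh.read())
            with open(dst, "w", encoding="utf-8") as fh:
                fh.write(minified)
            count += 1
    return count
```

Run against the local /Public copy before each upload, this removes the "file by file" risk mentioned in option 1.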
Link
Google Minify: https://code.google.com/p/minify/
How can we simplify it?
What I'm doing now: a local PHP script copies all the non-CSS/JS resources in the /Public directory to another directory, and fetches each CSS/JS file through http://localhost/min/index.php?f=$filename, saving the compressed output. It's a little rough, but it avoids writing a program on top of the Minify library.
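The approach described above can be sketched roughly as follows. The Minify URL, the `f` parameter, and the directory layout are assumptions taken from the description; the idea is simply "copy everything else verbatim, fetch CSS/JS through the local Minify endpoint":

```python
import os
import shutil
from urllib.request import urlopen

# Assumed local Minify entry point, as in the description.
MINIFY_URL = "http://localhost/min/index.php?f={}"

def build_minify_url(rel_path: str) -> str:
    """URL that returns the compressed version of one CSS/JS file."""
    return MINIFY_URL.format(rel_path.replace(os.sep, "/"))

def export_public(src_dir: str, dst_dir: str, fetch=None) -> None:
    """Copy non-CSS/JS files verbatim; pass CSS/JS through Minify via HTTP.
    `fetch` defaults to a urlopen-based getter and is injectable for testing."""
    fetch = fetch or (lambda url: urlopen(url).read())
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, src_dir)
            dst = os.path.join(dst_dir, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            if name.endswith((".css", ".js")):
                with open(dst, "wb") as fh:
                    fh.write(fetch(build_minify_url(rel)))
            else:
                shutil.copy2(src, dst)  # images and other assets copied as-is
```

The resulting output directory can then be pushed to the CDN as static files, which is exactly what the PHP-less Nginx server needs.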
Reply content:
Why not use grunt or gulp for automatic compression?
Can the CDN pull from the origin? Keep the resources on the origin server and point requests at the CDN; when the CDN doesn't have a resource, it fetches it from the origin. Configure a back-to-origin policy.
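The back-to-origin idea in this reply could look roughly like the Nginx sketch below. The origin host and cache path are placeholders, and a real setup would also cache the proxied response (e.g. with proxy_cache) so the origin is only hit once per file:

```nginx
# Serve /Public assets from the CDN node's local store;
# fall back to the origin server when a file is missing.
location /Public/ {
    root /data/cdn-cache;
    try_files $uri @origin;
}
location @origin {
    proxy_pass http://www.xxx.com;
}
```

This keeps the existing Minify setup on the origin untouched: the CDN only ever sees already-compressed responses.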
I use Yeoman scaffolding.
Similar situation to mine. I wasn't keen on the dynamic-routing approach either, and I also considered writing a plug-in (but in the end I switched to generating the optimized files ahead of time instead of rewriting routes dynamically...).
Unfortunately, this CDN does not support custom processing of the stored content the way CloudFlare does; CloudFlare-style on-the-fly minification would undoubtedly be the fastest and best solution.
If you cannot switch CDNs and the CDN does not support a custom environment, then processing on the local site is the only option. In that case you might as well modify the build the local site uses; genuinely automatic workflows for this do exist.
In short: writing a program takes some time, and turning the task into a one-line command also takes time, yet far more time has been spent asking for help here.
You already know which approach is the most convenient.
But are the respondents really in the same situation as you? If no one has offered a perfect solution in the next few days, I'll spend a few minutes helping you build one. It's really easy.
Challenge.
Try Google's Nginx plug-in: PageSpeed (ngx_pagespeed)!
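A minimal ngx_pagespeed configuration sketch: the module must be compiled into Nginx, and the cache path below is just an example. With the rewrite filters enabled, PageSpeed minifies CSS/JS on the fly, which matches the original no-manual-steps requirement even on a PHP-less server:

```nginx
pagespeed on;
pagespeed FileCachePath /var/ngx_pagespeed_cache;   # must be writable by nginx
pagespeed EnableFilters rewrite_css,rewrite_javascript;
```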