Laravel 5.5 officially recommended Nginx configuration: a learning tutorial

Preface

This article introduces the Nginx server configuration officially recommended for Laravel 5.5 and shares it for your reference. Without further ado, let's take a look at the details.

The Laravel 5.5 documentation publishes the following recommended Nginx server configuration:

server {
    listen 80;
    server_name example.com;
    root /example.com/public;

    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-XSS-Protection "1; mode=block";
    add_header X-Content-Type-Options "nosniff";

    index index.html index.htm index.php;

    charset utf-8;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt  { access_log off; log_not_found off; }

    error_page 404 /index.php;

    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
    }

    location ~ /\.(?!well-known).* {
        deny all;
    }
}

I am not an Nginx expert, and I believe many of my friends are in the same boat, so let's learn some Nginx together :)

1. add_header X-Frame-Options "SAMEORIGIN";

The X-Frame-Options response header tells the browser whether a page may be displayed inside a <frame>, <iframe>, or <object>. A website can use it to ensure that its content is not embedded into other sites, which prevents clickjacking attacks.

X-Frame-Options takes one of three values:

DENY

The page cannot be displayed in a frame at all, not even by pages on the same domain.

SAMEORIGIN

The page can only be displayed in a frame by pages on the same origin.

ALLOW-FROM uri

The page can only be displayed in a frame by pages from the specified origin.

This response header should be quite common. A foreign customer's security team once scanned our projects for vulnerabilities with automated tools, clickjacking among them, and this very setting is what resolved that finding. The three forms of the directive are sketched below.
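
For illustration only (this is not part of the official Laravel config, and example.com is a placeholder), the three forms look like this in Nginx:

add_header X-Frame-Options "DENY";         # never allow framing, not even same-origin
add_header X-Frame-Options "SAMEORIGIN";   # allow framing only by same-origin pages
add_header X-Frame-Options "ALLOW-FROM https://example.com/";  # allow only the listed origin; this value is deprecated and ignored by many modern browsers

Use exactly one of these per server or location block.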

2. add_header X-XSS-Protection "1; mode=block";

XSS (cross-site scripting) is a common web attack. This field tells the browser whether to enable its built-in XSS filter for the current page: 1 enables the filter, and mode=block tells the browser to stop loading the entire page once an XSS attack is detected.
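
A quick sketch of the commonly documented values for this header (only the last line appears in the official Laravel config):

add_header X-XSS-Protection "0";              # disable the browser's built-in XSS filter
add_header X-XSS-Protection "1";              # enable the filter; the browser sanitizes detected payloads
add_header X-XSS-Protection "1; mode=block";  # enable the filter; block the whole page instead of sanitizing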

Reference: an overview of knowledge points from the Xianzhi XSS Challenge

3. add_header X-Content-Type-Options "nosniff";

This response header tells the browser to disable Content-Type sniffing. In many cases the server does not configure the Content-Type correctly, so the browser guesses the type from the data characteristics of the document itself; an attacker can abuse this so that, for example, a response originally meant to be parsed as an image ends up being executed as JavaScript. nosniff forces the browser to trust the declared Content-Type instead.
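
A minimal sketch, assuming the stock mime.types file that ships with Nginx, of pairing nosniff with explicitly declared MIME types so the browser never needs to guess:

# Forbid MIME sniffing, and declare types explicitly so nothing is ambiguous.
# (include/default_type normally live in the http block; add_header can be server-level.)
add_header   X-Content-Type-Options "nosniff";
include      mime.types;                  # the extension-to-type map shipped with Nginx
default_type application/octet-stream;    # safe fallback for unknown extensions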

All three of the protections above are very practical against common attacks, and we recommend using them together. Previously, our server only had the add_header X-Frame-Options "SAMEORIGIN"; line configured.
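
To check that the headers are actually being sent, one simple approach (example.com is a placeholder for your own host) is to inspect the response headers with curl:

# Fetch only the response headers; the three add_header lines should appear.
curl -I http://example.com/
#
# Expected among the output:
#   X-Frame-Options: SAMEORIGIN
#   X-XSS-Protection: 1; mode=block
#   X-Content-Type-Options: nosniff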

4. Do not log requests for favicon.ico and robots.txt

location = /favicon.ico { access_log off; log_not_found off; }
location = /robots.txt  { access_log off; log_not_found off; }

favicon.ico is the website's icon: by default it is the small icon shown on the browser tab and next to the site name in the favorites/bookmarks list.

If no favicon is specified in the HTML head, the browser requests http://xxx.com/favicon.ico by default. If that file does not exist, the request produces a 404 that is recorded in access_log and error_log. These records are unnecessary noise in the log files and can safely be turned off.
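
As an aside, a minimal sketch of declaring the icon explicitly in the HTML head (the path is a made-up placeholder), which avoids the fallback request to the default URL:

<!-- Point the browser at an explicit icon instead of the default /favicon.ico -->
<link rel="icon" type="image/x-icon" href="/images/favicon.ico">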

robots.txt is the file that search-engine crawlers fetch. By industry convention, before a spider crawls a site it first reads this file to learn which directories and files on the site should not be crawled, and in SEO terms a correctly configured robots.txt is genuinely effective. Requests for this file do not need to be logged either, and many websites do not even have one.
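
For illustration, a minimal robots.txt might look like this (the /admin/ path is a made-up example):

# Rules apply to all crawlers
User-agent: *
# Hypothetical example: keep spiders out of an admin area
Disallow: /admin/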

These configurations are useful on most websites, not just under Nginx. I believe Apache has equivalent directives, and if you are using another web server, the settings above are still worth replicating.

Summary

That is all for this article. I hope its content offers some reference and learning value for your study or work. If you have any questions, please leave a message; thank you for your support.
