Website Information Leakage Protection

Source: Internet
Author: User
Tags: contact, google

Do you receive several spam emails every day? A few "harassing" phone calls? A handful of junk text messages? Perhaps you get all three. Without exception, they eat into your time. Have you ever wondered why you keep receiving these "gifts"? The most direct reason is that your personal information has been leaked.

The rapid development of network technology brings convenience to daily life, but it also brings new problems. Website information leakage is one of the main channels of personal information leakage, and the mobile phone numbers, telephone numbers, and email addresses in that information are among the most valuable goods in the network "black industry." A leak is not just an inconvenience to daily life; combined with social engineering, it can cause serious economic losses. Anyone who has written website programs or maintained a personal or commercial website knows the headache of website information leaks, especially leaks of customer information. The root cause is almost always a security problem. Based on my own experience, this article looks at how website information leaks happen, how to delete web snapshots, and how to plug the leaks. You are welcome to join the discussion on website security.

(1) Website information leakage channels

There are three main ways website information leaks out.

1. The boss is "black." Many bosses will be unhappy to read this, but the "black" I mean here is the same as the hacker's "black": some (not all) ISPs that provide virtual hosts and server hosting collect and sell their customers' commercial or personal websites. Reportedly, some sell website source code, while others sell customer information. If you want proof, visit some of the flyovers in Beijing where vendors sell corporate and celebrity directories; the middle class and above are the main targets, for example lists of property owners, lists of managers, and owners' personal details (Figure 1).

Figure 1: Evidence of personal information leakage: "The personal details of 160,000 new car owners in Beijing publicly sold by book dealers." Webpage access address: [url].

2. The hacker is "black." Everyone knows that many hackers now focus on commercial gain (stealing QQ accounts, game accounts, and online bank accounts); incidents such as the "Panda Burning Incense" worm revealed a complete business chain behind them. Websites are the first choice for these interests, because a website is the channel through which web pages can be linked to Trojans. Hackers therefore exploit every security vulnerability they can find to attack a website and gain full control of the site and its server. Once a site is controlled, they analyze its information and decide whether it has commercial value.

3. The search engine is "black." This "black" is different from the traditional kind; you could even call it impressive. Search engines such as Google and Baidu crawl all of the web page files on a site and then create web snapshots of them. You can verify this yourself:

(1) Enter "detailed personal information" and click "Baidu search": more than 2,600,000 records are returned (Figure 2). Searching the same keywords on Google returns about 14,800,000 records (Figure 3).
Figure 2: Baidu search for personal information



Figure 3: Google search for personal information

(2) Modify the search keyword: enter "personal details" and click "Baidu search"; more than 1,150,000 records are returned. Searching the same keyword on Google returns about 15,200,000 records.

Note: A data collector targeting a specific website can narrow the search further with a query such as "site:www.somesite.com contact info". If the website program has no security restrictions, the pages containing contact information can be browsed directly.

(3) Webpage inaccessible. Click one of the links in the "detailed personal information" results and the browser reports "this webpage cannot be accessed" or "the webpage cannot be displayed" (Figure 4). In this case the website administrator has probably deleted or renamed the page.


Figure 4: Webpage access failure

(4) Turn to the search engine's snapshot. Baidu results usually carry a "Baidu snapshot" link, while Google shows a gray "Web snapshot" (cached copy) link. Using Baidu as the example, click "Baidu snapshot" on the same record and the page content is displayed (Figure 5).

Figure 5: Web snapshot

(5) Personal privacy is exposed directly. Sticking with the "detailed personal information" example: even when neither the page nor its snapshot can be opened, the search result summary itself still shows the captured personal data (Figure 6), in this case Yu's ID card number, 362229198309142028, and email address, fengyu419914@126.com.

Figure 6: Personal information displayed in the search results

Note: The country is currently rolling out a network real-name system. A real-name system solves many problems, but once such information leaks and is used by criminals, the consequences are hard to imagine.

(2) Deleting web snapshots

When a search engine crawls a page it keeps a backup, mostly text, and stores the main content of the page. When the page is later deleted or the link goes dead, the snapshot can still be used to view the main content, and because it is mostly text it also loads quickly. This is one of the biggest security risks of website information leakage: snapshots captured by search engines outlive the pages themselves. Search engines do not delete web page snapshots automatically; as long as a page has been indexed, its information remains accessible. So one of the keys to fixing a website information leak is getting the web snapshots deleted.

1. Deleting web snapshots from the Baidu search engine

(1) No ready-made solution online. I first searched the internet for keywords such as "delete web snapshot," "Baidu," and "Baidu snapshot." Most of the results were people asking how to delete a snapshot; I checked them one by one and found no working solution.

(2) Contact Baidu. With no other option, I went straight to the source. Through the "About Baidu" page I found some contact information (Figure 7) and sent an email to webmaster@baidu.com asking for help deleting the web snapshot. Two or three days later the administrator replied that it had been deleted and that the change would take about a week to take effect.
Figure 7: Baidu's contact information

A week later I searched again and found the snapshot still there, so I sent another help email to the Baidu webmaster. Two or three days later I received a reply (Figure 8) that pointed me to a "Baidu search help" page, which I reached through the address given in the mail (Figure 9). Strangely, there is no visible link to this page on the Baidu website itself.

Note: Contacting Baidu directly is a good approach. You can also use the feedback platform Baidu provides to communicate with them; it even mentions prizes, though I have not tried that. Its address is: http://utility.baidu.com/quality/quality_form.php?word=%2E
Figure 8: Email reply
Figure 9: Baidu help

(3) Delete the program file or change the link. Studying the help files shows that the most convenient method is to rename the leaked files on the website, delete them outright, or change the link address. This method, however, takes about a month to take effect.

2. Deleting web snapshots from the Google search engine

(1) Use the Google help files. Deleting web page snapshots from Google is much easier, and the help files describe specific methods. The path is: Google home page -> "Google Daquan" (all Google products) -> "Search Help," which contains plenty of material on deleting web snapshots; the detailed address is https://www.google.com/support/bin/answer.py?answer=61808&hl=zh_CN. Note: I personally feel the authors of this documentation are rigorous and attentive to detail, whereas a lot of work elsewhere stays at the surface and never gets practical.

(2) Use the website administrator (webmaster) tool provided by Google. This is very convenient, but managing a site with it has two preconditions: you must have a Google or Gmail account, and you must add the Google verification identifier inside the <head> of the HTML of the home page of the website whose snapshots you want to delete. Once verification succeeds, you can manage the site's web snapshots (Figure 10). The webmaster tool can also manage robots.txt files, sitemaps, and website links.
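Before moving on to the tool itself (Figure 10), here is a rough sketch of that second precondition. The verification identifier is simply an extra meta tag pasted into the home page's <head>; the tag name and token below are placeholders rather than anything from this article (recent versions of the tool generate a google-site-verification tag, older ones used a different name), so copy whatever the tool actually shows you:

    <head>
        <title>Example home page</title>
        <!-- Placeholder only: paste the exact tag the Google webmaster tool generates for your site -->
        <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
    </head>

After uploading the modified home page, use the tool's verify function; once the site is verified, the removal features described next become available.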
Figure 10: The Google website administrator tool

Description: The Google webmaster tool lets you choose content to remove from Google search results, and a removal stays in effect for six months. You can remove the following:

• A single item on the website: a web page, image, or other file. Remove expired or blocked pages, images, and other documents so that they no longer appear in Google search results.
• A directory and all of its subdirectories: remove every file and subdirectory under a specified directory on the website so that they no longer appear in Google search results.
• The entire website: remove the whole site from Google search results.
• Cached copies in Google search results: remove the cached copy and page description of pages that have expired or that have since had a no-archive meta tag added.

In short, the webmaster tool is convenient because the administrator can freely choose what to remove, whether the entire website, a single link, or cached text (Figure 11), and simply follow the corresponding prompts.
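The "no-archive meta tag" mentioned in the last item also works as a preventive measure, so here is a quick sketch before Figure 11. This is the standard robots meta tag rather than anything shown in the article's figures: a page that carries sensitive details can ask search engines up front not to keep a cached copy, so there is no snapshot to delete later.

    <head>
        <!-- Ask search engines not to store a cached copy (snapshot) of this page -->
        <meta name="robots" content="noarchive" />
        <!-- Stricter variant: also keep the page out of the index entirely -->
        <!-- <meta name="robots" content="noindex, noarchive" /> -->
    </head>

Mainstream engines, including Google and Baidu, document support for the robots meta tag, so pages containing customer details can carry it from the start instead of relying on after-the-fact removal requests.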
Figure 11: Deleting a snapshot with the webmaster tool

Website administrator tool address: https://www.google.com/accounts/ServiceLogin?service=sitemaps&continue=https://www.google.com%2Fwebmasters%2Ftools%2Fsiteoverview%3Fhl%3Dzh_CN%3Fhl%3Dzh_CN&nui=1&hl=zh-CN

(3) Use a robots.txt file. Robots.txt is a plain text file in which the website administrator declares which parts of the site should not be visited by robots, or specifies that a given search engine may only include specified content. When a search robot (also called a search spider) visits a site, it first checks whether robots.txt exists in the site's root directory. If it does, the robot determines its crawling scope from the file's contents; if it does not, the robot simply follows the links. Robots.txt must be placed in the root directory of the site, and the file name must be all lowercase. Writing a robots.txt file is very simple, and there is plenty of information about it online, so I will not repeat it here; only a few common examples are given.

(1) Prohibit all search engines from accessing any part of the website:

    User-agent: *
    Disallow: /

(2) Allow access only to the searchhistory directory:

    User-agent: *
    Allow: /searchhistory/
    Disallow: /

(3) Prohibit all search engines from accessing the 01, 02, and 03 directories of the website:

    User-agent: *
    Disallow: /01/
    Disallow: /02/
    Disallow: /03/

(4) Prohibit access by the BadBot robot:

    User-agent: BadBot
    Disallow: /

(5) Allow access only to the Crawler robot:

    User-agent: Crawler
    Disallow:

    User-agent: *
    Disallow: /

Note: After I added robots.txt to my website, the Google web snapshots disappeared.

(3) Plugging website information leaks

1. In the early stages of development, many programmers do not consider that search engines will automatically crawl the pages of a website, so no access restrictions are put in place. A good approach is to require authorization before a page can be viewed, for example allowing only logged-in users to access certain resources while denying ordinary visitors. Taking ASP as an example, create a file named checklogin.asp containing:

    <%
    If Session("MySystem_LoginUser") = "" Then
        Response.Redirect "Login.asp"
    End If
    %>

and include this file in every page of the website that requires restricted access.

Note: This is only one simple way to implement the idea; there are many better ways to restrict access to network resources, and the program must of course also handle other security issues, such as SQL injection vulnerabilities.

2. Contact Google, Baidu, and the other search engines promptly to have leaked website information deleted.

3. Pay closer attention to the security of pages that carry personal privacy information.

(4) Conclusion

This article has discussed website information leakage, mainly of personal information, and has offered some solutions, in particular how to delete web snapshots once a leak has occurred. Network security is always relative; there is no absolute security, and security depends above all on security awareness. You are welcome to discuss network security with us.
