Using JavaScript to open web pages in a new window to prevent spider crawling
<a href="javascript:void(0)" onclick="locationUrl()">click here</a> (the link can also be an image)
<script>
function locationUrl() {
    var u1 = 'http://www.';
    var u2 = 'baidu.com/hl/';
    var u3 = 'bak_header.php';
    var url = u1 + u2 + u3;
    window.open(url);
}
</script>
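The same idea can be generalized: keep the URL split into fragments so the full address never appears as a single literal string in the page source. A minimal sketch (the helper name buildUrl is my own, not from the original):

```javascript
// Assemble the target URL from separate fragments so the full address
// never appears as one literal string a crawler could extract.
// buildUrl is a hypothetical helper name, not from the original article.
function buildUrl(parts) {
  return parts.join('');
}

var url = buildUrl(['http://www.', 'baidu.com/hl/', 'bak_header.php']);

// In a browser, open the assembled URL in a new window:
if (typeof window !== 'undefined') {
  window.open(url);
}
```

Note that this only hides the link from crawlers that do not execute JavaScript; modern spiders that render pages can still follow it.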
Problem: the web page returns a 500 error when the Baidu spider crawls it.
Solution:
[1]: Check whether the DTC service (Distributed Transaction Coordinator) starts normally. If it does, skip this step. If it fails to start, run msdtc -resetlog from the Start menu's Run box to recreate the log file, restart the machine, and check whether the site works normally. If not, continue.
[2]: Run the following commands in CMD (or from a .bat file):
Choose Start > Run, type cmd, and press Enter to open a Command Prompt window.
Enter cd %windir%\system32\inetsrv to switch to the inetsrv directory under system32.
Enter rundll32 wamreg.dll, CreateIISPackage.
Note: "CreateIISPackage" is case sensitive and must be typed exactly as shown.
Enter regsvr32 asptxn.dll.
Close "Component Services" and re-open it.
[3]: Restart IIS: in "Administrative Tools" > "Services", find IIS Admin, right-click it, and choose "Restart".
Open "Administrative Tools" > "Internet Information Services", find "Default Web Site", right-click it,
and select "Properties". Select the local IP address in the "IP address" field (optional). Then open
"Directory Security" > "Edit" to bring up the "Authentication Methods" dialog box.
// This part is fiddly and hard to describe. If everything above is correct, the remaining problem is
almost certainly here: if these settings are wrong, opening the page either pops up a login dialog
saying you do not have access permission, or displays the line with the error. //
Click "Browse" > "Advanced" > "Find Now", select a usable user (such as the current user) from the list
below, and then confirm. (This step can be skipped to keep the default user.)
In the "Anonymous access" area, leave "Allow IIS to control password" unchecked and enter the selected
user's password (leave it blank if the user has no password). Leave "Basic authentication" unchecked and
check "Integrated Windows authentication". Then uncheck "Anonymous access" itself.
After you click "Apply", a dialog may appear listing "localstart.asp"; select it (and no other files)
and click "OK".
Re-open the "Authentication Methods" dialog box, re-check "Anonymous access", and click "Apply" again.
If the same dialog listing "localstart.asp" appears, select it again and click "OK".
Will pages that robots.txt blocks from spider crawling be deleted from the index?
It depends on what you want to block: a single page, a dynamic page, a folder, or an entire domain. Disallowing a path in robots.txt does not delete anything already indexed; it only tells the spider not to enter those pages. Is the relationship between robots.txt rules and indexing unclear?
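A minimal robots.txt sketch covering the cases mentioned above (the paths are hypothetical examples; the * wildcard in paths is an extension honored by major spiders such as Googlebot and Baiduspider, not part of the original standard):

User-agent: *
Disallow: /private-page.html
Disallow: /*?
Disallow: /secret-folder/

The first rule blocks a single page, the second blocks dynamic (query-string) URLs, and the third blocks a whole folder. To block the entire domain, use "Disallow: /" by itself.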