Discuz Forum SEO Optimization Program

Source: Internet
Author: User


Assumes Discuz has the pseudo-static function enabled and uses the 5.0/5.5 GBK version.

1. Meta and content optimization
2. The duplicate content page problem
3. robots.txt usage and other issues

Updates:
The robots.txt problem in Discuz 5.5
A supplemental note on blocking DZ pages from being indexed

Meta optimization

The Discuz admin panel can set meta information and even add custom head content, but its settings apply to all pages; having the same keywords and description on every page does not help SEO.

Option 1: Delete the meta tags

Modify the header template file templates/default/header.htm: remove the meta keywords and description tags.

These two tags carry very little weight, and Discuz fills them with some useless information; used badly they only do harm, so better to have none than bad ones.

Option 2: Customized meta

This option sets the content page keywords to the post title and the description to the first 100 characters of the body; it also gives the homepage and each forum's list pages their own meta settings (different forums differ; all list pages within the same forum share the same values).

1. Modify the page header template file templates/default/header.htm: change the meta keywords and description tags to the following form:

<meta name="keywords" content="{$metakeywords} $seokeywords" />
<meta name="description" content="$seodescription" />

Here $seokeywords and $seodescription default to the values set in the admin panel; how to customize them is covered below. {$metakeywords} is the Qihoo keyword variable, left in place to deal with later.

2. Content page (viewthread): set keywords to the post title and description to the first 100 characters of the body

2.1 Modify the viewthread.php file:
Above the statement include template('viewthread'); (Update: in dz5.5 it is include template($iscircle ? 'supesite_viewthread' : 'viewthread');), add the line:
require_once DISCUZ_ROOT.'./include/bmt.thread.inc.php';

2.2 Create the file include/bmt.thread.inc.php with the following content:

<?php
if(!defined('IN_DISCUZ')) { exit('Access Denied'); }

$seokeywords = strip_tags($thread['subject']);   // keywords = the post title
(Update: Nethome pointed out that the original code had problems when topic classification is enabled and browsing by category is allowed, so the strip_tags filter was added. It would arguably fit better in viewthread.php, but for easier upgrades it stays here; the subject is very short, so it does not affect efficiency.)

$seodescription = current($postlist);            // description = the first 100 characters of the post body
$seodescription = mb_substr($seodescription['message'], 0, 100, 'gb2312');
$seodescription = htmlspecialchars(strip_tags($seodescription));
?>

* The numbers and some of the functions here apply to the GBK version.

The last line filters the HTML out of the content, otherwise it can break the syntax of the meta tag. strip_tags removes the HTML tags first, but because only the first 100 characters are taken, a tag may already have been truncated, so htmlspecialchars is used to escape what remains; some junk may still get through.
Of course, you can run strip_tags on the full message before truncating, which may be less efficient.

* So adjust the expression here to your own situation.
I currently use preg_replace('/[^\xa1-\xff]/', '', $seodescription), i.e. keep only the Chinese (double-byte) characters of the content, but this loses English keywords.

In addition, the data used here has already been fetched by viewthread.php, so there are no extra database queries; it is only string processing and will not hurt efficiency.
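
To illustrate the two alternatives mentioned above, here is a rough sketch (variable names follow the file above; this is not part of the original modification):

// Variant A: strip tags from the whole message first, then truncate (cleaner, slightly more work on long posts)
$post = current($postlist);
$seodescription = strip_tags($post['message']);
$seodescription = htmlspecialchars(mb_substr($seodescription, 0, 100, 'gb2312'));

// Variant B: after truncating, keep only double-byte (Chinese) characters; drops English keywords, as noted above
$seodescription = preg_replace('/[^\xa1-\xff]/', '', $seodescription);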

3. List page (forumdisplay): set different keywords and description per forum

3.1 Modify the forumdisplay.php file:
Above the statement include template('forumdisplay');, add the line:
require_once DISCUZ_ROOT.'./include/bmt.forum.inc.php';

3.2 Create the file include/bmt.forum.inc.php with the following content:

<?php
if(!defined('IN_DISCUZ')) { exit('Access Denied'); }
$seokeywords = $forum['name'];
$seodescription = $forum['description'];

switch($forum['fid']) {
    case 1:   // this number is the forum ID; different forums get different meta
        $seokeywords = 'key1,key2,...';
        $seodescription = 'xxxx xxxx xxxx';
        break;
    case 2:
        $seokeywords = 'key1,key2,...';
        $seodescription = 'xxxx xxxx xxxx';
        break;
    // ...
}
?>

Update: In general the switch/case can be skipped; the list page keywords default to the forum name and the description to the forum description. Only when a forum needs special keywords and a description do you add a case for it. This saves writing a case statement for every forum and keeps execution fast (I do not know about PHP, but Java's case is optimized and executes quickly).

Implementing this with modified files makes changing each forum's meta a bit inconvenient, which feels somewhat clumsy, heh. But it touches no database tables, adds two standalone files and makes only small edits to two existing files, so upgrading or migrating stays easy.

There may be quite a few case branches, but it should still be much faster than reading the values from the database.
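
If many forums need custom values, an associative-array lookup is a lightweight alternative to a long switch. This is only a sketch, not part of the original modification, and $custom_meta is a made-up name:

// Hypothetical alternative to the switch: one lookup table keyed by forum ID
$custom_meta = array(
    1 => array('keywords' => 'key1,key2,...', 'description' => 'xxxx xxxx xxxx'),
    2 => array('keywords' => 'key1,key2,...', 'description' => 'xxxx xxxx xxxx'),
);
if(isset($custom_meta[$forum['fid']])) {
    $seokeywords    = $custom_meta[$forum['fid']]['keywords'];
    $seodescription = $custom_meta[$forum['fid']]['description'];
}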

4. Homepage meta
Set it in the admin panel. (If steps 2 and 3 are skipped, every page's meta simply matches the homepage's.)

Content optimization
Discuz ships an archiver: its URLs are easy to index, and its pages are relatively clean, with the description drawn from the post body. But the archiver does not parse tags such as [B] and [url]; those tags do nothing there, and showing them verbatim only adds junk to the page.
This section makes two optimizations: remove useless information from the page, and wrap the title in <h1>. Combined with the pseudo-static URLs and the meta optimization above, the result is better than the archiver, so you can disable the archiver in the admin panel and reduce duplicate pages.

Implementation plan

1. Hide unneeded information on the content page (viewthread):
Modify the content page template file templates/default/viewthread.htm:

Wrap the information to be hidden in <!--{if $discuz_uid}--> ... <!--{/if}-->.

More precisely, the information is hidden when visitors (and bots) view the page and shown normally to logged-in users, so normal use is unaffected.
This mainly means the user information column on the left (profile, personal space, homepage, private message links, etc.) and users' signatures (no loss in hiding them: the text and links in signatures do nothing for SEO :P),
as well as the related-threads block. A template sketch follows.
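
A minimal sketch of the idea in viewthread.htm; the inner markup is only a placeholder, not the actual Discuz template code:

<!--{if $discuz_uid}-->
    <!-- user info column / signature / related threads to hide from guests and bots, e.g. -->
    <div class="signatures">$post[signature]</div>
<!--{/if}-->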

2. Wrap the title in an <h1> tag
Modify the content page template file templates/default/viewthread.htm:

Replace <span class="bold">$post[subject]</span><br><br> with
<h1>$post[subject]</h1>

Custom "This post finally by XXX in xxxxxx edit" content

Modify file templates/default/misc.lang.php: Change the following two lines to something you like:

' Post_edit ' => ' \n\n[[i]] This post is last edited by $editor $edittime [i]] ',
' Post_edit_regexp ' => '/\n{2}\[\[i\] This post is finally made by. * * To edit \[\/i\]\]$/s ',

For example, I changed to [site name URL XXX in xxxx edit]. To amuse oneself when the program is boring, hehe
Note that the top and bottom two expressions want to match. To prevent parsing square brackets from being full-width, do not copy and look in the file.

Duplicate copies of content pages

Look at the content page URL: thread-(tid)-(page)-(forumdisplay page).html. The last number is the list page on which this post currently appears. As more posts come in, the post moves from the first list page to the second and so on, and its URL keeps changing. Open a list page and you can see it: links to posts on list page 2 end in 2, those on page 3 end in 3, but few people pay attention to it. I only noticed that last number after finding more and more duplicate pages in the search engines' indexes.

Workaround
Modify the forumdisplay.php file:

Replace the statement $extra = rawurlencode("page=$page$forumdisplayadd"); with
$extra = rawurlencode("page=1$forumdisplayadd");

The page in that statement is the forumdisplay page number; it is now fixed at 1, as if the post were on the first list page.

Functionality lost: when a user edits a post or a moderator manages one, the prompt page offers a jump to the list page or the topic page; choosing the list page now always lands on its first page, regardless of which page you were originally on.

Redirect: 301 redirects for redirect.php

In a Discuz forum you can see links like redirect.php?tid=xxx&goto=lastpost#lastpost, which implement the "newest post, last post, previous topic, next topic" features. This one feature alone can create four copies of the same content page, so these links are 301 (permanently) redirected to the post's static address.

Workaround
Modify the redirect.php file:

Replace the first two require_once DISCUZ_ROOT.'./viewthread.php'; statements with
$bmt_url = 'Location: /thread-'.$tid.'-'.$page.'-1.html';
header('HTTP/1.1 301 Moved Permanently');
header($bmt_url);
Replace the latter two require_once DISCUZ_ROOT.'./viewthread.php'; statements with
$bmt_url = 'Location: /thread-'.$tid.'-1-1.html';
header('HTTP/1.1 301 Moved Permanently');
header($bmt_url);

The dz5.5 quote (reference) link also performs a jump, which can likewise be turned into a 301:

Replace dheader("location: viewthread.php?tid=$post[tid]&page=$page#pid$pid"); with
$bmt_url = 'Location: /thread-'.$post['tid'].'-'.$page.'-1.html#pid'.$pid;
header('HTTP/1.1 301 Moved Permanently');
header($bmt_url);

Note that this redirects straight to the static address without checking whether the pseudo-static function is enabled, so there will be problems if it is not switched on.

Functionality lost: after redirecting to the static address, the #lastpost anchor from the dynamic address no longer applies, so you may have to scroll manually.
??? In practice it does not seem to affect the anchor.
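
If you want the redirect to degrade gracefully when pseudo-static URLs are off, you could guard the 301 with whatever flag reflects your rewrite setting. This is only a sketch; $pseudo_static_enabled is a stand-in for that check, not an actual Discuz variable:

// Sketch: $pseudo_static_enabled is a placeholder for your own rewrite-status check
if($pseudo_static_enabled) {
    $bmt_url = 'Location: /thread-'.$tid.'-'.$page.'-1.html';
} else {
    $bmt_url = 'Location: viewthread.php?tid='.$tid.'&page='.$page;
}
header('HTTP/1.1 301 Moved Permanently');
header($bmt_url);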

Duplicate copies of prompt pages

This problem is closely related to the content pages, heh. If the administrator lets guests browse the forum list but forbids them from viewing content, every content page returns a "no permission" prompt page, yet each has a different URL, which produces serious duplication; the same goes for other unauthorized operations. Prompts such as "this post does not exist" are no different, and in large numbers they also form duplicate pages. Both kinds of prompts are produced by the showmessage function through the nopermission.htm and showmessage.htm templates respectively.

Solution
1. Create another header template file templates/default/header_disbots.htm
Its content is the same as header.htm, but with a meta robots tag added, as follows:

<meta name="robots" content="noindex,nofollow" />

2. Modify the templates/default/nopermission.htm and templates/default/showmessage.htm files:
Replace the {template header} in their first line with {template header_disbots}

Support for the meta robots tag is not all that broad, but it is the less troublesome approach here.
Because these are prompt pages, doing it with a 301 would require passing a lot of information about the destination page, and the modification would be much messier.

III. robots.txt usage

User-agent: *

# Block indexing of one forum
# If you have, say, a chat/off-topic forum that you do not want to restrict for guests, but also do not want search engines to index (so it does not drag down site quality), you can use:
Disallow: /forum-1-

# The number is the ID of the forum to block.
# Note: do not omit the trailing "-", otherwise forums whose IDs merely start with 1 (11, 12, ...) are blocked as well.
[Update] Supplemental method: add a meta robots tag to that forum's content pages.

# Also block duplicate pages
Disallow: /viewthread.php
# This is the dynamic form of the content page; the pseudo-static optimization above already removed many duplicate pages, so the dynamic forms, such as the printable page, must be blocked here.

Disallow: /forumdisplay.php
# Be careful with this one: it is the dynamic form of the list page, covering the digest, activity, poll and other views. I found that the 5.0 static rewriting is not thorough: its previous/next page links are still dynamic, so if this is blocked, content beyond page 10 cannot be reached for indexing. If your internal linking is not rich, do not block this, to avoid hurting indexing.

Hekaiyu said he had no such problem; I checked the official 5.5 and indeed there is none (perhaps my 5.0 was fine all along, or perhaps the official team improved the rewrite regular expressions in the new version). All the better: block it, and things are much cleaner.

# Block other useless content
Disallow: /profile
# User profiles; I do not know why Discuz makes these static too, block them
Disallow: /relatethread
Disallow: /post
Disallow: /blog
Disallow: /member
Disallow: /misc
Disallow: /faq
Disallow: /my
Disallow: /pm
Disallow: /digest
Disallow: /status
# ... and so on
#----- robots.txt end ------
[Update] The entries such as Disallow:post.php in the dz5.5 robots.txt are not standards-compliant; they must be prefixed with "/". >> Detailed description
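
A quick illustration of that fix (the second filename is just an illustrative example):

Disallow: post.php      ->   Disallow: /post.php
Disallow: misc.php      ->   Disallow: /misc.php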

Homepage URL problem

If "Admin panel -> Basic settings -> front page filename" is left unset, it defaults to index.php, so internal links to the homepage take the form http://domain/index.php, while the form we usually give out for the forum homepage or in link exchanges is http://domain/. They are in fact the same page, but a search engine may treat them as two URLs, one with many internal links and one with more external links, and being demoted for that split is not worth it, so it is best to unify them. The target chosen here is the http://domain/ form.

1. In the admin panel, set Basic settings -> front page filename to: /
2. Modify member.php: replace header("Location: {$boardurl}".$indexname); with

if($indexname == '/') {
    header("Location: {$boardurl}");
} else {
    header("Location: {$boardurl}".$indexname);
}

[Update] In dz5.5 this is a dheader() call instead of header().
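
A sketch of the same change for dz5.5, assuming dheader() accepts the same Location string as header():

if($indexname == '/') {
    dheader("Location: {$boardurl}");
} else {
    dheader("Location: {$boardurl}".$indexname);
}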

This modification affects the "clear cookies and return to the homepage" action; without it, the domain would be followed by two slashes. No other problems with using / as the front page filename have been found so far.

End

Besides the file modifications, a few admin panel settings have to match, as mentioned above and summarized here. Mainly: 1. URL static: enable only the normal topic page rewriting; 2. Do not enable the archiver; 3. The background keywords and description settings now apply only to the homepage (if you skip the list page and content page meta changes, they apply to those pages as well); 4. Set the front page filename to / (or, of course, something of your own choosing).
