I. Content
When I first started learning SEO, I often saw experts say that "content is king, and external links are king." Over time I came to understand that search engines like original content, so we should stick to originality as much as possible. Otherwise, even if a search engine indexes the pages, it will slowly drop them again, Baidu especially. My lesson here: while my blog's newly added content had not yet been indexed, I reposted it on other relevant webmasters' sites. After a while, Baidu apparently came to treat my site as a scraper site, and to this day it has not been indexed, even though the Baidu spider visits every day and the site has many Baidu backlinks. I hope other webmasters can learn from my mistake!
II. Page tags
Although the title, keywords, description, alt, and other tags we often use are not as important as they once were, they are still indispensable. Search engines still like these tags, so we should fill them in completely on both the home page and the internal pages.
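As a concrete illustration (not from the original post), here is a minimal Python sketch that audits a page for the tags mentioned above, using only the standard library; the sample HTML is invented:

```python
from html.parser import HTMLParser

class TagAuditor(HTMLParser):
    """Collect the on-page tags discussed above from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.meta = {}            # name -> content for keywords/description
        self.imgs_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
            self.meta[attrs["name"]] = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.imgs_missing_alt += 1

    def handle_data(self, data):
        if self._in_title:
            self.title = data.strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

auditor = TagAuditor()
auditor.feed('<html><head><title>My Blog</title>'
             '<meta name="description" content="An SEO diary">'
             '</head><body><img src="a.png"></body></html>')
print(auditor.title)               # My Blog
print("keywords" in auditor.meta)  # False: the keywords tag is missing
print(auditor.imgs_missing_alt)    # 1 image without alt text
```

Running it on your own pages quickly shows which of the tags above are missing or empty.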
III. PR
A common and controversial topic. I am personally not sure whether other search engines use it, but it certainly matters to Google, since PageRank was proposed by Google in the first place. The higher a site's PR value, the more attention the site receives, which can also be read as a better user experience. PR is not omnipotent, but when two websites are otherwise similar in strength, PR will definitely affect their final rankings.
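Since PR is Google's PageRank, the idea can be made concrete. Below is a minimal, illustrative power-iteration sketch in Python; the damping factor 0.85 is the commonly cited value, and the three-page web is invented:

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank over a dict: page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:                      # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:                             # each link passes an equal share
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Tiny 3-page web: A and C both link to B, B links back to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
print(sorted(ranks, key=ranks.get, reverse=True))  # B first: most inbound weight
```

The page with the most inbound link weight ends up on top, which is exactly why the black-hat methods below all revolve around manufacturing inbound links.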
IV. External links
As mentioned above, "content is king, and external links are king." Clearly, external links are a factor that cannot be ignored in optimization. Based on experiments on some websites, the number of external links in a sense determines a site's final ranking.
V. Website traffic
The larger the traffic, the better the site's optimization, promotion, and external links must be working; the site gradually becomes more welcome to users and is therefore favored by search engines. The traffic here must be real traffic, however, not artificial traffic.
White-hat techniques have always been the foundation for Google optimizers. But what gets published online is fragmented and vague, with no practical use or clear substance. Countless newcomers who dream of becoming masters are still studying hard and telling themselves "Aja, fighting!" Unfortunately, white-hat Google optimization is by nature a slow, long-term craft that does not fundamentally solve the problem; all of it could be called surface-level work. As a result, black-hat Google optimization began to emerge, and it has become the "reserved technique" of many Google optimization companies.
I will not survey the whole field, because there are too many cheating techniques, and the consequence of using them is that one day the site will be punished (K'd, that is, de-indexed) by the search engines. Here I will only describe the currently popular black-hat techniques. (The methods below are what various Google optimization companies are using right now. Please treat this as an internal record and do not repost it everywhere. Also note that some of these techniques, used in moderation, are good methods for white-hat optimization as well.)
1. Bulk friendship links. This is the most common and most practical method, and probably the black-hat method that search engines find hardest to detect.
Someone shouts: "Who can't exchange friendship links? What are you even talking about?!" Don't be fooled by appearances; seeing through them is the basic quality of a technician. Both Google's founders and Baidu's Li Yanhong hold their own patents. What are the core patents? Google's is called PageRank, and Baidu's is called hyperlink analysis.
I will not discuss the patents themselves; that is white-hat territory. Let's look at the black-hat practice:
Basic: exchange friendship links in bulk, whether or not the sites are related. A site optimized by a black hat can carry 300 or more such links.
Intermediate: link purchase. One PR4 site, home-page link, roughly 30 a month. (Since around September 27 or 28, when the irregular "Google dance" PR updates began, PR has not been so authoritative, so the price is relatively low.)
Advanced: exchange links, buy links, whatever works; then use special page code so that all of the site's outbound friendship links pass no weight!
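The post does not say what the "special page code" is; one common guess is simply marking every outbound link rel="nofollow", so the partner sees a normal link while search engines are told to pass no weight. A small Python sketch for checking whether an exchange partner has done this to you (hostnames are invented):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class NofollowChecker(HTMLParser):
    """Flag outbound links that pass no link weight (rel="nofollow")."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.neutered = []          # outbound links that pass no weight

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        outbound = host and host != self.own_host
        if outbound and "nofollow" in attrs.get("rel", ""):
            self.neutered.append(href)

# Audit the partner's page: which of its outbound links are neutered?
checker = NofollowChecker("partner.example")
checker.feed('<a href="http://partner.example/">home</a>'
             '<a rel="nofollow" href="http://you.example/">"friend" link</a>')
print(checker.neutered)   # ['http://you.example/'] : this link passes you nothing
```

If your own URL shows up in that list, the "friendship" link you traded for is worthless.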
Those are roughly the three levels. The effect: before each monthly Baidu and Google update, you will often find that for some very hot keywords (say, a Baidu index of 100,000 or more), the top few results include domains registered only a month ago! And when you open those sites, they are full of ads, viruses, plug-ins, private-server promotions, pornographic ads, and so on!
2. Blogs and soft-text ads
Anyone doing Google optimization knows what soft text is. To put it bluntly, you embed the site name or domain you need to optimize into an article and publish it on sites with high search-engine weight, so that spiders crawl from there to your website. That is the white-hat practice.
What about black hats? Of course they don't have that much time. What they need is speed! What they need is profit!
Basic: starting from the day the payment arrives, register about 1,000 blogs and free information-publishing platforms in batches, then mass-post several ad articles. About one to six days later, the search engines begin to index some of the blogs, and the optimized site instantly gains hundreds or thousands of external links; the site's weight jumps to a very high level. Of course, with higher weight comes a naturally higher ranking.
Intermediate: take Baidu and Google as examples.
Baidu: bring out all kinds of tools, such as mass-mailing and batch-posting tools. What, you say the posts will be deleted? Yes, they will indeed be deleted! But Baidu is the search engine that relies most heavily on manual intervention and manual review, while Google has basically eliminated manual intervention. Don't Baidu's employees rest at night? So post at night, when only the imperfect automated review is watching, and the spider happens to be most active at night anyway. OK, one night is enough! Sending tens of thousands of messages in a night is all too easy for a technician!
Google: Google does not manually review, but Google also has properties it trusts, such as Blogspot! Free? Publishing needs no review, right? And how fast does Google's crawler get there? OK, Blogspot becomes the black hat's next position!
Advanced: this level concerns the soft text itself. Some Google optimization companies have staff writers connected to the editors of certain portal-site columns (or the companies have their own publishing methods and channels). Then things start to look strange: a newly launched product is credited with extremely powerful features and glowing customer experiences and begins to appear in the second- and third-level directories of the major portals, always in the form of soft text; the Google optimization companies will not spend money to buy you actual portal ads! Why talk about soft text at all? What do search engines mainly study? Baidu, typically, pays close attention to the portals, so optimizing through portals works, but if portal optimization is overdone, the site will be K'd! Likewise, information published on a portal naturally carries high weight and is easily recognized.
3. The so-called "smart jump"
An interesting story: around October 25, an engineer left a black-hat Google optimization company, probably in anger, and published the best technique in black-hat circles, one that no search engine had yet detected. Have you ever searched for a site through a search engine, clicked through, and had the site automatically bounce you back to the search engine, run the same keyword search again, and make you click the result a second time before you reached the normal site?
This can be implemented with just a few simple lines of code that anyone can write: detect the visit's referrer, extract the keyword from it, construct a search-engine query URL for that keyword (or for a keyword you want to rank for), and send the browser there. What is the point?
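A minimal sketch of the jump logic just described, written in Python as if it ran server-side; the engine hostnames and the query-parameter name `q` are assumptions, and real engines vary:

```python
from urllib.parse import urlparse, parse_qs, quote_plus

SEARCH_HOSTS = {"www.google.com", "www.baidu.com"}   # engines we react to

def smart_jump_target(referrer, forced_keyword=None):
    """Return a search URL to bounce the visitor back to, or None to serve the page.

    Mirrors the trick described above: if the visitor arrived from a search
    result, extract the keyword (param name 'q' assumed) and send the browser
    back to a fresh search for it, so the visitor must click the result again.
    """
    parts = urlparse(referrer or "")
    if parts.netloc not in SEARCH_HOSTS:
        return None                            # direct visit: show the real site
    keyword = parse_qs(parts.query).get("q", [None])[0]
    keyword = forced_keyword or keyword        # optionally push a chosen keyword
    if not keyword:
        return None
    return f"https://{parts.netloc}/search?q={quote_plus(keyword)}"

print(smart_jump_target("https://www.google.com/search?q=cheap+servers"))
# bounces the visitor back to a fresh search for "cheap servers"
print(smart_jump_target(""))  # None: direct visitors see the page normally
```

Only search-referred visitors are bounced, which is exactly why the trick stayed invisible to casual inspection: typing the URL directly shows a normal site.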
When a search engine ranks results, the first page matters most, because 90% of people find what they want there; so the analysis and monitoring of first-page results is a separate set of algorithms. And what matters most to those algorithms? Credibility, or user loyalty (unofficial terms). That is, when a first-page result is the link a visitor opens first (or when many visitors open the same link), that link gains weight within the first-page results. The smart jump forces every visitor to click the result one extra time, artificially inflating exactly this signal.