[Summary]
Web analytics is still a young discipline, so our understanding of it is prone to all sorts of biases. This article sums up the common misunderstandings about web analytics that I have encountered in my work. This is part three, and it goes into some more detailed areas. For part one, see: Ten Misunderstandings of Web Analytics, and the Alternates (1); for part two, see: Ten Misunderstandings of Web Analytics, and the Alternates (2).
[Body]
The previous article in this series went out on May 1, and now it is already August 1; time flies, and it leaves one quite wistful.
In fact, the first two installments already covered all ten misunderstandings; today only the alternates remain. I call them alternates because they sit in very controversial territory, where I can offer only my own opinion, in the hope of raising a smile. But in the pursuit of knowledge there are no alternates, and I want to spark discussion, even debate, so that we can arrive at real insight and truth.
Alternate Misunderstanding One: Web analytics has standard benchmarks
This is an area of widespread misunderstanding. We often discuss bounce rate and time on site, so many friends ask:
My site's bounce rate is 60%; is that good? Or: the average time on site is 5 minutes; is that okay?
These are actually questions I cannot answer, because web analytics has no so-called standard benchmarks for these key metrics. I can only say that a 60% bounce rate is neither the worst nor the best I have seen, and the same goes for 5 minutes of time on site; beyond that, the isolated numbers alone answer nothing.
The reason web analytics has no standard benchmarks is that sites differ too much from one another. First, their audiences and traffic sources differ; second, their functions differ; and third, their content and design differ.
Therefore, web analytics has no standard benchmarks! You cannot say, for example, that a bounce rate below 60% is good and one above 60% is bad.
Now you will ask a better question:
For sites in the same industry segment, or with heavily overlapping audiences, can basic metrics such as bounce rate, time on site, pageviews per visit (pv/v), and visitor loyalty be compared? For example, Sina and Sohu, Tudou and Ku6, Jingdong Mall and Newegg: can they compare these metrics with each other, pair by pair?
I think they can, but do not conclude that your site is worse just because one of its metrics is worse than someone else's. If Sina's bounce rate is 10% and Sohu's is 15%, should Sohu panic? Not at all; it does not necessarily mean Sohu is worse than Sina. For the same reason as before: Sina's and Sohu's pages are actually very different. Both are portals, and they compete fiercely, but they still differ greatly.
Similarly, the Nike and Adidas sites, or the Intel and AMD sites, belong to the same tier (category), yet they too are very different. The size of these metrics does not by itself show that one site is better or worse than another.
Therefore, I have always insisted: even for sites in the same category, the simple magnitude of a metric cannot tell you whether a site is good or bad.
Then you will ask:
Since the comparison cannot show good or bad, what is the point of comparing at all?!
Of course it has a point! If you know a competitor's values, you can analyze them; knowing where your own values fall short is knowing yourself. As the saying goes, with another person as a mirror you can see your own gains and losses, and the same holds between websites.
However, please do not fall into yet another misunderstanding: that since there are no standard benchmarks, my numbers can be whatever they are, nobody can say my site is good or bad, and I can rest easy.
I believe no friend would think so. :)
If your numbers are so outrageous that they fall outside the normal range, that is a good indication of a problem. For example, if your site's overall bounce rate is above 80% or even 90%, you should pay attention. :) Web analytics loves exactly these kinds of abnormal phenomena; remember that article?
Below are some extreme values from my experience (note that these values are only valid when you analyze with Google Analytics; other WA tools may differ significantly because of their definitions and monitoring methods). If your numbers exceed these values, the site may have a fairly serious problem (but not absolutely!).
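To make this kind of check concrete, here is a minimal sketch in Python; it is not the author's actual list of extremes. The 80% bounce-rate ceiling comes from the paragraph above, while the pages-per-visit floor and all names are invented placeholders:

```python
# A minimal sketch of an "extreme value" sanity check (hypothetical, not
# the author's actual thresholds). The 80% bounce-rate ceiling is from
# the text above; the pages-per-visit floor is a made-up placeholder.

GA_EXTREMES = {
    "bounce_rate":     lambda v: v > 0.80,  # site-wide bounce rate above 80%
    "pages_per_visit": lambda v: v < 1.20,  # hypothetical floor
}

def sanity_check(metrics):
    """Return the names of metrics that fall outside the 'normal' range.

    These are red flags, not benchmarks: a value inside the range says
    nothing about whether the site is good.
    """
    return [name for name, is_extreme in GA_EXTREMES.items()
            if name in metrics and is_extreme(metrics[name])]

print(sanity_check({"bounce_rate": 0.86, "pages_per_visit": 2.5}))
# -> ['bounce_rate']
```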
Finally, once again: because every site is unique, and the metrics themselves cannot be interpreted in isolation, standard benchmarks for web analytics do not exist.
Alternate Misunderstanding Two: Using an automated BI system for web analytics
Can an automated BI (business intelligence) system be used for web analytics? This is a very smart question, and the friends who raise it are resourceful and keen to solve problems. But, in my humble opinion, it is a good idea that is hard to realize.
Web analytics does have some fixed models. Fundamentally it follows the "input -> analysis -> output" pattern, and we would love to hand the analysis link over to a computer's computing power and have it output results directly. It is like IBM's Deep Blue playing chess against a human: the input is the current board, the analysis is the program inside the machine, and the output is the next move. We would like web analytics to work the same way: feed in the site's bounce rate = 65%, pv/v = 2.54, time on site = 187 seconds, revenue = $332,343, conversion rate = 1.34%, and so on (including per-page metrics), let the smart program analyze, and have it output the final result: whether the site is healthy, what its strengths and weaknesses are, where it needs improvement, and what to recommend.
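To see both the appeal and the shallowness of this idea, here is a minimal, purely hypothetical sketch of such a pipeline; every field name, rule, and threshold is invented for illustration, and the canned output is precisely the problem:

```python
from dataclasses import dataclass

@dataclass
class SiteMetrics:
    """The structured 'input' of the pipeline (hypothetical field names)."""
    bounce_rate: float       # e.g. 0.65
    pages_per_visit: float   # e.g. 2.54 (pv/v)
    time_on_site_sec: float  # e.g. 187
    revenue_usd: float       # e.g. 332_343
    conversion_rate: float   # e.g. 0.0134

def analyze(m: SiteMetrics) -> list:
    """Naive rule-based 'analysis' mapping numbers to canned findings.

    Note what never enters the model: the site's design, audience,
    competitors, and other external factors. The output can therefore
    only be generic."""
    findings = []
    if m.bounce_rate > 0.60:
        findings.append("High bounce rate: landing pages may mismatch traffic.")
    if m.conversion_rate < 0.02:
        findings.append("Low conversion rate: review the purchase funnel.")
    return findings or ["No obvious anomalies -- which proves very little."]

print(analyze(SiteMetrics(0.65, 2.54, 187, 332_343, 0.0134)))
```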
Very good, but hard to achieve (apart from some special cases, listed below).
Where is the problem? Input, analysis, and output are all problematic and cannot be fully realized; reality is always that brutal. And of the three, I think the input problem is the hardest to solve.
Why is input so difficult? Let me give an inappropriate example from war (forgive me, I love peace, so this is not a great example). If Russia and the United States were to fight a war, we could simulate the outcome on a computer, but once real shots were fired, would you believe the war would unfold just like the simulation? It could not possibly be the same, because external factors beyond weapons and equipment often matter more than the war process itself, and those external factors are hard to define; many are completely chaotic.
So it is with web analytics. The site metrics that can be monitored are internal factors, with clear definitions and boundaries; unfortunately, internal factors alone are not enough to complete the analysis. The core value of an analyst's work is to correlate the internal factors monitored by the web analytics tools with the site's external factors, and then to dig further to find genuinely valuable insight.
You will ask me: what are a site's external factors? Good question! A site's external factors are the things that change quickly, never stay the same, yet constantly and tangibly affect the site. The ones I can think of include the following:
The design of the site, not just its flows but also its visual design and usability: how would you input these into a BI system?
The site's audience;
The site's industry and business environment;
The site's competitors;
Some unique quirks of China's Internet;
Other external events the site experiences; and so on.
These external things, even if they affect your site only slightly, will change the final analysis results; and since they are neither formatted nor structured, they cannot be fed into a BI system. Hence the input problem.
Because the input problem cannot be solved, analysis becomes very difficult. Until computers built on real human-like neural networks arrive, our machines can only analyze in structured, programmatic ways; if the input is analog, or even fuzzy, they are helpless.
A small digression. Many friends surely like to play the stock market. I once read a book about Wall Street financial analysts called Wall Street Meat. The author says that the predictions of analysts who spend their days doing "input, analysis, and output" with mathematical models tend to diverge from actual results, so he is very skeptical of the approach. I now understand that part of the reason is that many external factors cannot be accurately entered into the models.
Output has problems too. Even when the analysis yields results, making recommendations remains very difficult. A computer can only give general guidance; it cannot tell you how to adjust a call-to-action element, how to modify a flow, or how to rebuild your home page. It simply cannot.
So I venture a prediction: within ten years it will be impossible to do in-depth web analytics with BI; it will still depend on the great human brain.
Of course, nothing is absolute, and BI systems are valuable in a number of situations, such as the following:
When the site's external environment is fairly stable and can be treated as a constant;
Automatic optimization of traffic sources, such as automated SEM campaign systems;
A/B tests or multivariate tests;
Automatically adjusting content according to test results, as Omniture's Test&Target system does;
Automatically customizing content feeds according to analysis of visitors' behavioral profiles.
I will not go into further detail here.
Alternate Misunderstanding Three: Analyzing individual behavior has great significance
I have seen a few tools that record each visitor's mouse trajectory on a page. Each of these tools has its own strengths, and they are powerful. Typically they are aimed at UED (UCD) designers, but are they significant for web analytics?
Web analytics generally uses complete data (that is, unsampled) or large samples to find the converging behavior patterns of site visitors, and to optimize the experience of the most important audiences. It rarely proceeds by studying individual visitors' behavior. In this respect, web analytics differs considerably from website usability analysis.
If you have read Don't Make Me Think, you know that once a site is built, having people who have never used it carry out specified tasks in front of you while you record their behavior is an important method for testing and improving usability. But web analytics rarely takes this approach, that is, analyzing and optimizing a site through the data left behind by a single visitor's visits.
The reason is simple: the behavior of a large population of visitors follows a normal distribution. Some visitors' data may fall in the extreme regions, and analysis based on such data will be badly skewed. For example, one visitor may stay on the site for as long as an hour and visit as many as 100 pages, but that does not mean all visitors do; relying on a single visitor is risky. You might say you could analyze the behavior of several visitors to be more reliable. The problem is that, compared with millions of visitors, the individuals you can analyze are always limited, and the more individuals you analyze, the harder the work becomes.
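A toy example makes the distortion obvious. Assume ten typical visitors plus the single extreme visitor described above (the numbers are invented):

```python
# One extreme visitor (100 pages in a visit) drags the mean far away
# from what the bulk of visitors actually do.

typical_visits = [3, 2, 4, 1, 2, 3, 2, 1, 3, 2]   # pages per visit
outlier = [100]                                    # the 1-hour, 100-page visitor

def mean(xs):
    return sum(xs) / len(xs)

print(mean(typical_visits))            # 2.3   -- representative
print(mean(typical_visits + outlier))  # ~11.2 -- distorted by one visitor
```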
So in my actual work I almost never use the very specific mouse-trajectory monitoring tools. What I do want is a mouse-tracking tool that records all visitors' mouse behavior and uses different colors to represent its density; that would be very useful to us, even more valuable than the heat maps we use today. But no such tool appears to exist.
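If such a tool existed, the core aggregation would not be complicated. Here is a minimal sketch, assuming hypothetical recorded (x, y) mouse positions pooled across all visitors and binned into a coarse grid, whose counts could then drive the color intensity:

```python
from collections import Counter

CELL = 50  # pixels per grid cell

def density_grid(points):
    """Count pooled mouse positions per 50x50-pixel cell."""
    grid = Counter()
    for x, y in points:
        grid[(x // CELL, y // CELL)] += 1
    return grid

# Positions pooled from many visitors, not from one individual.
all_positions = [(120, 80), (130, 85), (640, 400), (125, 90)]
print(density_grid(all_positions).most_common(1))
# -> [((2, 1), 3)]  the 'hottest' cell
```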
Alternate Misunderstanding Four: An optimization plan is the inevitable result of analysis
Web analytics does involve analysis, yes, but analysis is not the whole of web analytics. The main purpose of analysis is to identify problems; analysis by itself is not enough to solve them, or can solve only some of them.
For example, when studying conversion I often find quite clearly that a certain page loses a large number of visitors; but why does this page perform so badly? Sometimes experience immediately suggests a cause and an improvement; sometimes we genuinely do not know why the page is so bad. And even when experience supplies a cause, it is not necessarily the real (or root) cause.
So sometimes (more accurately, most of the time) truly reliable optimization does not come directly from the analysis, but from testing the recommendations you make after the analysis. The recommendations themselves are subjective, but the results after testing are objective (as long as you use scientific methods and processes). A web analytics cycle does not end with analysis; it ends with testing. Testing is king.
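To make "testing is king" concrete, here is a minimal sketch of reading out an A/B test with a standard two-proportion z-test; the traffic and conversion counts are invented for illustration:

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Return (uplift, two-sided p-value) for B's conversion rate vs A's."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# 10,000 visits per variant; B is the page change suggested by analysis.
uplift, p = ab_test(conv_a=130, n_a=10_000, conv_b=170, n_b=10_000)
print(f"uplift={uplift:.4%}, p-value={p:.3f}")  # significant if p < 0.05
```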
So the diagram below is a cliché, but it really is important methodology.
Well, this article is finally finished, and the series is finally complete (there may still be revisions and additions). Thanks to all the friends who have offered encouragement! I hope you will put forward different opinions; discussion is welcome, and so is a good argument!
All rights reserved. When reproducing, please credit the source: http://www.chinawebanalytics.cn/top10-misunderstanding-for-web-analytics-part3/