BY: YSUNCN (Reprints welcome; please credit the original source.)
What is the standard deviation? According to the International Organization for Standardization (ISO) definition, the standard deviation σ is the positive square root of the variance σ², and the variance is the expectation of the squared deviation of a random variable from its own expectation, i.e. $\sigma = \sqrt{E[(X - E[X])^2]}$; this needs no further explanation here.
What is the standard error? I have read some of the literature, including work by well-known authors, and the definitions are not uniform; there are usually two ways to define it:
1. The standard error is the standard deviation of the sample divided by the square root of the sample size, $\mathrm{SE} = s/\sqrt{n}$. PS: some people use the sample standard deviation divided by n as the standard error (that estimate differs, but since the standard error is itself only an estimate of the standard deviation of the sample mean, there is no need to insist that others are wrong);
2. The standard error of a statistic can also be characterized as the standard deviation of its estimation error, namely $\mathrm{SE}(\hat{\theta}) = \sqrt{E[(\hat{\theta} - \theta)^2]}$. A sketch of both definitions follows this list.
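Here is a minimal sketch of both definitions, assuming Python with NumPy (the distribution, sample size, and variable names are illustrative, not from the original post). The first definition computes s/√n from one sample; the second approximates the standard deviation of the estimation error by drawing many samples and taking the standard deviation of their means.

import numpy as np

rng = np.random.default_rng(0)

# Definition 1: standard error of the mean = sample standard deviation / sqrt(n).
sample = rng.normal(loc=10.0, scale=2.0, size=50)
s = sample.std(ddof=1)                  # sample standard deviation (n - 1 in the denominator)
se_formula = s / np.sqrt(sample.size)   # s / sqrt(n)

# Definition 2: the standard deviation of the estimation error, approximated
# empirically as the standard deviation of many sample means.
means = [rng.normal(10.0, 2.0, 50).mean() for _ in range(10_000)]
se_empirical = np.std(means, ddof=1)

print(f"SE from s/sqrt(n):        {se_formula:.4f}")
print(f"SE from repeated samples: {se_empirical:.4f}")
# Both should be close to the true value 2 / sqrt(50) ≈ 0.283.

The two numbers agree because the sample mean is an unbiased estimator, so the standard deviation of its estimation error is just the standard deviation of the sample mean itself.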
The following, adapted from the article "Standard deviation and standard error" in the journal Saradi, is closely related, and I hope it helps everyone.
As a representation of random error (true error), the standard deviation is a statistical average of the magnitude of the random error. In the National Metrology Technical Specifications, its official name is the standard deviation, denoted by the symbol σ. The standard deviation goes by more than ten other names, such as the overall standard deviation, the population standard deviation, the root mean square error, the RMS deviation, the mean square error, the mean variance, the single-measurement standard deviation, the theoretical standard deviation, and so on. The standard deviation is defined as $\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2}$. The value of the sample standard deviation s is used as the estimate of the population standard deviation σ; the formula for the sample standard deviation is $s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2}$.
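As a sketch of the two formulas just given (Python with NumPy assumed; the data values are made up for illustration), the only difference is the denominator: N for the population form, n − 1 for the sample form.

import numpy as np

x = np.array([4.9, 5.1, 5.0, 4.8, 5.2])

# Population standard deviation: divide the summed squared deviations by N.
sigma = np.sqrt(((x - x.mean()) ** 2).sum() / x.size)       # equals x.std(ddof=0)

# Sample standard deviation: divide by n - 1, so s estimates sigma from a sample.
s = np.sqrt(((x - x.mean()) ** 2).sum() / (x.size - 1))     # equals x.std(ddof=1)

print(f"population form: {sigma:.4f}")
print(f"sample form:     {s:.4f}")   # slightly larger for the same data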
In sampling studies (or repeated, equal-precision measurements), the standard deviation of the sample mean is commonly used; it is known as the standard error, or the standard error of the mean (SEM). Because the sample standard deviation s does not directly tell us how far the sample mean x̄ is from the population mean μ, the standard error is essentially a measure of the sampling error between the sample mean and the population mean. Its estimate is $s/\sqrt{n}$, and it reflects the degree of dispersion of the sample means: the smaller the standard error, the closer the sample mean is to the population mean; conversely, the larger it is, the more dispersed the sample means.
The standard deviation is an indicator of variation between individuals: it reflects the dispersion of the sample data about the sample mean and is a measure of the precision of the data. The standard error, by contrast, reflects the variation of the sample mean about the population mean, and hence the size of the sampling error; it is an index of the precision of the result. A sketch of this contrast follows.
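A minimal sketch of this contrast, again assuming Python with NumPy (the parameters are illustrative): as n grows, the sample standard deviation settles near the true σ, while the standard error of the mean keeps shrinking like 1/√n.

import numpy as np

rng = np.random.default_rng(1)
for n in (10, 100, 1_000, 10_000):
    sample = rng.normal(loc=0.0, scale=2.0, size=n)
    s = sample.std(ddof=1)      # precision of the individual data points
    sem = s / np.sqrt(n)        # precision of the estimated mean
    print(f"n={n:>6}  s={s:.3f}  sem={sem:.4f}")

This is why collecting more data tightens the estimate of the mean even though the spread of the individual measurements stays the same.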
Reference: http://mathworld.wolfram.com/StandardError.html