Learning essays -- implementing the entropy weight method in Python

Source: Internet
Author: User

First, introduction to the entropy weight method

Entropy was first introduced into information theory by Shannon, and the concept has since been widely applied in engineering, economics and other fields.

The basic idea of the entropy weight method is to determine objective weights according to the variability of each indicator.

Generally speaking, the smaller the information entropy of an indicator, the greater its variability, the more information it provides, the larger the role it plays in the comprehensive evaluation, and the larger its weight. Conversely, the larger the information entropy of an indicator, the smaller its variability, the less information it provides, and the smaller its role and weight in the comprehensive evaluation.
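This intuition is easy to check numerically: an indicator that takes the same value for every sample carries no information, so its entropy is maximal and its weight comes out as zero. A minimal sketch with made-up data:

```python
import numpy as np

# Two indicators over four samples: the first varies, the second is constant
X = np.array([[1.0, 5.0],
              [2.0, 5.0],
              [3.0, 5.0],
              [4.0, 5.0]])
m = X.shape[0]

p = X / X.sum(axis=0)  # proportion of each sample within its indicator
with np.errstate(divide="ignore", invalid="ignore"):
    plogp = np.nan_to_num(p * np.log(p))  # convention: 0 * ln(0) = 0
e = -plogp.sum(axis=0) / np.log(m)  # entropy per indicator, in [0, 1]

w = (1 - e) / np.sum(1 - e)  # weights from entropies
```

The constant indicator reaches the maximum entropy of 1, so all of the weight goes to the varying indicator: w is [1.0, 0.0].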

Second, steps of weighting by the entropy weight method

1. Standardization of data

Standardize the data for each metric.

Suppose there are m samples and k indicators, and let x_ij denote the value of the j-th indicator for the i-th sample (i = 1, ..., m; j = 1, ..., k). Each indicator is normalized by min-max scaling:

x'_ij = (x_ij - min_i x_ij) / (max_i x_ij - min_i x_ij)

so that every normalized value lies in [0, 1].
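This normalization step can be sketched in NumPy; the 3x2 sample matrix below is invented for illustration:

```python
import numpy as np

# Three samples, two indicators (made-up values)
X = np.array([[100.0,  90.0],
              [ 75.0, 100.0],
              [100.0,  95.0]])

# Min-max scaling applied to each indicator (column) separately,
# mapping every column into [0, 1]
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
```

Each column now contains a 0 at its minimum and a 1 at its maximum; a column with zero range would divide by zero and need special handling.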

2. Computing the information entropy of each indicator

According to the definition of information entropy, the information entropy of the j-th indicator is

E_j = -(1 / ln m) * sum_{i=1..m} p_ij ln p_ij,  where p_ij = x'_ij / sum_{i=1..m} x'_ij.

If p_ij = 0, the term p_ij ln p_ij is defined to be 0.
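A sketch of this step, with the 0 * ln(0) = 0 convention handled by converting NaNs to zeros; the normalized matrix is made up:

```python
import numpy as np

# A made-up normalized matrix: three samples, two indicators
X_norm = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 0.5]])
m = X_norm.shape[0]

p = X_norm / X_norm.sum(axis=0)  # p_ij: proportion within each column
with np.errstate(divide="ignore", invalid="ignore"):
    plogp = p * np.log(p)        # 0 * ln(0) produces NaN here ...
plogp = np.nan_to_num(plogp)     # ... which is defined to be 0
E = -plogp.sum(axis=0) / np.log(m)  # entropy of each indicator, in [0, 1]
```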

3. Determining the weight of each indicator

According to the calculation formula of information entropy, the entropies of the indicators are E_1, E_2, ..., E_k. The weight of each indicator is then obtained from its information entropy:

w_j = (1 - E_j) / sum_{j=1..k} (1 - E_j),  j = 1, ..., k.

The smaller the entropy E_j, the larger the weight w_j.
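Given the entropies from step 2, the weights follow in one line; the entropy values below are hypothetical:

```python
import numpy as np

E = np.array([0.6, 0.9, 0.9])  # hypothetical entropies of three indicators

# Lower entropy -> more variation -> larger weight
w = (1 - E) / np.sum(1 - E)
```

The first indicator, having the lowest entropy, receives weight 2/3, and the weights always sum to 1.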

    # coding=utf-8
    import numpy as np

    # Sample data: each row is a sample, each column an indicator
    # (the remaining rows of the original data set are garbled in the
    # source and are omitted here)
    li = [[100, 90, 100, 84, 90, 100, 100, 100, 100],
          [100, 100, 78.6, 100, 90, 100, 100, 100, 100],
          [75, 100, 85.7, 100, 90, 100, 100, 100, 100],
          [100, 100, 78.6, 100, 90, 100, 94.4, 100, 100],
          [100, 90, 100, 100, 100, 90, 100, 100, 80],
          [100, 100, 100, 100, 90, 100, 100, 85.7, 100]]
    li = np.array(li)  # convert to a matrix

    # Min-max normalization, applied to each indicator (column) separately
    li = (li - li.min(axis=0)) / (li.max(axis=0) - li.min(axis=0))

    m, n = li.shape  # m, n: number of rows and columns of the matrix
    k = 1 / np.log(m)

    yij = li.sum(axis=0)  # axis=0 sums each column, axis=1 each row
    pij = li / yij        # proportion of each sample within its indicator

    with np.errstate(divide="ignore", invalid="ignore"):
        test = pij * np.log(pij)
    test = np.nan_to_num(test)  # replace the NaNs produced by 0 * log(0) with 0

    ej = -k * test.sum(axis=0)      # information entropy of each indicator
    wi = (1 - ej) / np.sum(1 - ej)  # weight of each indicator

