A Python implementation of the approximate entropy and mutual approximate entropy algorithms

Theoretical basis: approximate entropy
    • Definition: approximate entropy is a measure of random complexity. It compares the probability that the patterns formed by m adjacent points of a sequence are close to each other with the probability that the patterns formed by m + 1 adjacent points are close.

    • Purpose: it describes the irregularity of a complex system; the more irregular the time series, the larger its approximate entropy. It reflects the probability that new patterns appear when the dimension changes.

EEG signals are weak, contain noise, are superpositions of many sources, and exhibit chaotic properties; however, the brain activity of the same person is relatively stable, so the approximate entropy of that person's EEG should change little.

    • Proofs and the corresponding geometric interpretation can be found in the paper: wenku.baidu.com/view/4ec89e44b307e87101f696ef.html
Mutual approximate entropy
    • Derived from the definition of approximate entropy: approximate entropy describes how self-similar a single sequence is, while mutual approximate entropy compares the pattern similarity of two sequences; the smaller the entropy value, the more similar the two sequences are.
Analysis of the approximate entropy algorithm
    1. Start from a time series of N points sampled at equal time intervals: u(1), u(2), ..., u(N).

    2. Define the embedding dimension m (usually 2) and the similarity tolerance r; m is the length of the compared vectors and r is the threshold that defines "similarity".

    3. Reconstruct the m-dimensional vectors X(1), X(2), ..., X(N−m+1), where X(i) = [u(i), u(i+1), ..., u(i+m−1)]. The distance between X(i) and X(j) is defined by the maximum difference of the corresponding elements: d[X(i), X(j)] = max over a of |u(i+a−1) − u(j+a−1)|, a = 1, ..., m.

    4. For each i, count the number g of vectors X(j) satisfying d[X(i), X(j)] ≤ r; then C_i^m(r) = g / (N − m + 1) is the matching probability for that i. Take the logarithm of each C_i^m(r) and average over all i to obtain the entropy value φ^m(r).

    5. Repeat steps 3 and 4 with dimension m + 1 to obtain φ^(m+1)(r), and compute the approximate entropy:

ApEn(m, r) = φ^m(r) − φ^(m+1)(r)

Parameter selection: usually m = 2 or m = 3, and r = 0.2 × std, where std is the standard deviation of the original time series. The steps are translated directly into a short sketch below.
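To make the steps concrete, here is a minimal sketch that translates steps 1–5 directly into NumPy. It is illustrative only: the function name apen_direct is not part of the original code, which follows in class form in the next sections.

import numpy as np

def apen_direct(u, m=2, r_coef=0.2):
    """Direct translation of steps 1-5: approximate entropy of the sequence u."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    r = r_coef * np.std(u, ddof=1)  # step 2: tolerance = 0.2 * standard deviation

    def phi(mm):
        # step 3: the N - mm + 1 reconstructed vectors of length mm
        x = np.array([u[i:i + mm] for i in range(n - mm + 1)])
        # step 4: for each X(i), the fraction of X(j) whose maximum
        # element-wise difference from X(i) is within the tolerance r
        c = [np.mean(np.max(np.abs(x - xi), axis=1) <= r) for xi in x]
        return np.mean(np.log(c))

    # step 5: ApEn = phi^m(r) - phi^(m+1)(r)
    return phi(m) - phi(m + 1)

# a periodic series is highly regular, so its approximate entropy is small
print(apen_direct([2, 4, 6, 8, 10] * 17))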

    • The mutual approximate entropy is computed with the same steps as the approximate entropy, except that the distance is computed between a vector X(i) built from sequence A and a vector Y(j) built from sequence B, and the similarity tolerance r is 0.2 times the covariance of the two original sequences; a sketch follows below.
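The change for two sequences can be sketched in the same way. Again this is illustrative only: the function name cross_apen_direct is not from the original article, and the covariance-based tolerance follows the description in the bullet above.

import numpy as np

def cross_apen_direct(u, g, m=2, r_coef=0.2):
    """Sketch of the mutual (cross) approximate entropy between sequences u and g."""
    u = np.asarray(u, dtype=float)
    g = np.asarray(g, dtype=float)
    n = len(u)  # the two sequences are assumed to have the same length
    r = r_coef * np.cov(u, g, ddof=1)[0, 1]  # tolerance: 0.2 * covariance of the two sequences

    def phi(mm):
        x = np.array([u[i:i + mm] for i in range(n - mm + 1)])  # vectors from sequence A
        y = np.array([g[i:i + mm] for i in range(n - mm + 1)])  # vectors from sequence B
        # for each X(i), the fraction of Y(j) within the tolerance r
        c = [np.mean(np.max(np.abs(y - xi), axis=1) <= r) for xi in x]
        c = [ci for ci in c if ci > 0]  # drop zero counts before taking the logarithm
        return np.mean(np.log(c)) if c else 0.0

    return abs(phi(m) - phi(m + 1))

# example: two noisy observations of the same underlying signal
rng = np.random.default_rng(0)
t = np.arange(200)
a = np.sin(0.1 * t) + 0.05 * rng.standard_normal(200)
b = np.sin(0.1 * t) + 0.05 * rng.standard_normal(200)
print(cross_apen_direct(a, b))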
Python implementation: calculating the approximate entropy using the definition given by Pincus
import numpy as np


class BaseApEn(object):
    """Approximate entropy base class."""

    def __init__(self, m, r):
        """
        Initialize.
        :param m: length of the compared sub-sequences, int
        :param r: tolerance coefficient, usually 0.1--0.2
        """
        self.m = m
        self.r = r

    @staticmethod
    def _maxdist(x_i, x_j):
        """Distance between two vectors: the maximum absolute difference of their elements."""
        return np.max(np.abs(np.array(x_i) - np.array(x_j)))

    @staticmethod
    def _biaozhuncha(U):
        """Sample standard deviation of a sequence."""
        if not isinstance(U, np.ndarray):
            U = np.array(U)
        return np.std(U, ddof=1)


class ApEn(BaseApEn):
    """Approximate entropy as defined by Pincus."""

    def _biaozhunhua(self, U):
        """Normalize the data: subtract the mean and divide by the standard deviation."""
        self.me = np.mean(U)
        self.biao = self._biaozhuncha(U)
        return np.array([(x - self.me) / self.biao for x in U])

    def _dazhi(self, U):
        """Similarity threshold: r times the standard deviation (cached on first use)."""
        if not hasattr(self, "f"):
            self.f = self._biaozhuncha(U) * self.r
        return self.f

    def _phi(self, m, U):
        """Entropy value phi^m(r) for the raw data."""
        # list of m-dimensional vectors
        x = [U[i:i + m] for i in range(len(U) - m + 1)]
        # for each template vector, the ratio of vectors within the tolerance
        C = [len([1 for x_j in x if self._maxdist(x_i, x_j) <= self._dazhi(U)]) / (len(U) - m + 1.0)
             for x_i in x]
        # average of the logarithms of the non-zero ratios
        return np.sum(np.log(list(filter(lambda a: a, C)))) / (len(U) - m + 1.0)

    def _phi_b(self, m, U):
        """Entropy value phi^m(r) for normalized data (the threshold is r itself)."""
        x = [U[i:i + m] for i in range(len(U) - m + 1)]
        C = [len([1 for x_j in x if self._maxdist(x_i, x_j) <= self.r]) / (len(U) - m + 1.0)
             for x_i in x]
        return np.sum(np.log(list(filter(lambda a: a, C)))) / (len(U) - m + 1.0)

    def jinshishang(self, U):
        """Approximate entropy of the raw data."""
        return np.abs(self._phi(self.m + 1, U) - self._phi(self.m, U))

    def jinshishangbiao(self, U):
        """Approximate entropy after normalizing the raw data."""
        eeg = self._biaozhunhua(U)
        return np.abs(self._phi_b(self.m + 1, eeg) - self._phi_b(self.m, eeg))


if __name__ == "__main__":
    U = np.array([2, 4, 6, 8, 10] * 17)
    ap = ApEn(2, 0.2)
    print(ap.jinshishang(U))  # compute the approximate entropy

Description

    • The jinshishang function computes the approximate entropy directly from the raw data.

    • The jinshishangbiao function first normalizes the original data and then computes the approximate entropy; see the example below.
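For example, extending the __main__ block above to call both interfaces:

U = np.array([2, 4, 6, 8, 10] * 17)
ap = ApEn(2, 0.2)
print(ap.jinshishang(U))      # approximate entropy of the raw data
print(ap.jinshishangbiao(U))  # approximate entropy after normalization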

Calculating the mutual approximate entropy using the definition given by Pincus
class HuApEn(BaseApEn):
    """Mutual approximate entropy, Pincus definition."""

    def _xiefangcha(self, U, G):
        """
        Covariance of two sequences.
        :param U: sequence 1
        :param G: sequence 2
        :return: covariance, float
        """
        if not isinstance(U, np.ndarray):
            U = np.array(U)
        if not isinstance(G, np.ndarray):
            G = np.array(G)
        if len(U) != len(G):
            raise AttributeError('Parameter error: the two sequences must have the same length!')
        return np.cov(U, G, ddof=1)[0, 1]

    def _biaozhunhua(self, U, G):
        """Normalize both sequences."""
        self.me_u = np.mean(U)
        self.me_g = np.mean(G)
        self.biao_u = self._biaozhuncha(U)
        self.biao_g = self._biaozhuncha(G)
        return (np.array([(x - self.me_u) / self.biao_u for x in U]),
                np.array([(x - self.me_g) / self.biao_g for x in G]))

    def _dazhi(self, U, G):
        """Similarity threshold: r times the covariance of the two sequences (cached on first use)."""
        if not hasattr(self, "f"):
            self.f = self._xiefangcha(U, G) * self.r
        return self.f

    def _phi(self, m, U, G):
        """Entropy value phi^m(r) between the two raw sequences."""
        # template vectors from sequence U
        x = [U[i:i + m] for i in range(len(U) - m + 1)]
        # comparison vectors from sequence G
        y = [G[i:i + m] for i in range(len(G) - m + 1)]
        # conditional probabilities
        C = [len([1 for y_k in y if self._maxdist(x_i, y_k) <= self._dazhi(U, G)]) / (len(U) - m + 1.0)
             for x_i in x]
        return np.sum(np.log(list(filter(lambda a: a, C)))) / (len(U) - m + 1.0)

    def _phi_b(self, m, U, G):
        """Entropy value phi^m(r) for normalized data (the threshold is r itself)."""
        x = [U[i:i + m] for i in range(len(U) - m + 1)]
        y = [G[i:i + m] for i in range(len(G) - m + 1)]
        C = [len([1 for y_k in y if self._maxdist(x_i, y_k) <= self.r]) / (len(U) - m + 1.0)
             for x_i in x]
        return np.sum(np.log(list(filter(lambda a: a, C)))) / (len(U) - m + 1.0)

    def hujinshishang(self, U, G):
        """Mutual approximate entropy of the raw sequences."""
        return np.abs(self._phi(self.m + 1, U, G) - self._phi(self.m, U, G))

    def hujinshishangbiao(self, U, G):
        """Mutual approximate entropy after normalizing the original data."""
        u, g = self._biaozhunhua(U, G)
        return np.abs(self._phi_b(self.m + 1, u, g) - self._phi_b(self.m, u, g))
Approximate entropy calculation using a fast and practical algorithm proposed by Hong Bo
class NewBaseApEn(object):
    """Base class for the fast algorithm."""

    @staticmethod
    def _get_array_zeros(x):
        """Create an n*n zero matrix, where n is the length of x."""
        n = np.size(x, 0)
        return np.zeros((n, n), dtype=int)

    @staticmethod
    def _get_c(z, m):
        """
        Compute the entropy value from the binary distance matrix.
        :param z: binary similarity matrix
        :param m: dimension
        """
        n = len(z[0])
        # probability vector C
        c = np.zeros((1, n - m + 1))
        if m == 2:
            for j in range(n - m + 1):
                for i in range(n - m + 1):
                    c[0, j] += z[j, i] & z[j + 1, i + 1]
        elif m == 3:
            for j in range(n - m + 1):
                for i in range(n - m + 1):
                    c[0, j] += z[j, i] & z[j + 1, i + 1] & z[j + 2, i + 2]
        else:
            raise AttributeError('The value of m is incorrect!')
        data = list(filter(lambda x: x, c[0] / (n - m + 1.0)))
        if not data:  # no non-zero probabilities
            return 0
        return np.sum(np.log(data)) / (n - m + 1.0)


class NewApEn(ApEn, NewBaseApEn):
    """Fast and practical algorithm proposed by Hong Bo et al. for computing the approximate entropy."""

    def _get_distance_array(self, U):
        """Build the binary distance matrix: z[i, j] = 1 if |u(j) - u(i)| <= threshold."""
        z = self._get_array_zeros(U)
        fa = self._dazhi(U)
        for i in range(len(z[0])):
            z[i, :] = (np.abs(U - U[i]) <= fa) + 0
        return z

    def _get_shang(self, m, U):
        """Compute the entropy value from the distance matrix."""
        z = self._get_distance_array(U)
        return self._get_c(z, m)

    def hongbo_jinshishang(self, U):
        """Compute the approximate entropy with the fast algorithm."""
        return np.abs(self._get_shang(self.m + 1, U) - self._get_shang(self.m, U))
Calculating the mutual approximate entropy using the fast and practical algorithm proposed by Hong Bo
class NewHuApEn(HuApEn, NewBaseApEn):
    """Fast and practical algorithm proposed by Hong Bo et al. for computing the mutual approximate entropy."""

    def _get_distance_array(self, U, G):
        """
        Build the binary distance matrix.
        :param U: template data
        :param G: comparison data
        """
        z = self._get_array_zeros(U)
        fa = self._dazhi(U, G)
        for i in range(len(z[0])):
            z[i, :] = (np.abs(G - U[i]) <= fa) + 0
        return z

    def _get_shang(self, m, U, G):
        """Compute the entropy value from the distance matrix."""
        z = self._get_distance_array(U, G)
        return self._get_c(z, m)

    def hongbo_hujinshishang(self, U, G):
        """Public interface: compute the mutual approximate entropy of the two sequences."""
        return np.abs(self._get_shang(self.m + 1, U, G) - self._get_shang(self.m, U, G))
Simple test
if __name__ == "__main__":
    import time
    import random

    U = np.array([random.randint(0, 100) for i in range(1000)])
    G = np.array([random.randint(0, 100) for i in range(1000)])
    ap = NewApEn(2, 0.2)
    ap1 = NewHuApEn(2, 0.2)
    t = time.time()
    print(ap.jinshishang(U))
    t1 = time.time()
    print(ap.hongbo_jinshishang(U))
    t2 = time.time()
    print(ap1.hujinshishang(U, G))
    t3 = time.time()
    print(ap1.hongbo_hujinshishang(U, G))
    t4 = time.time()
    print(t1 - t)
    print(t2 - t1)
    print(t3 - t2)
    print(t4 - t3)
    • Testing shows that the fast algorithm is more than 6 times faster than the definition-based algorithm.

    • References:

    • wenku.baidu.com/view/4ec89e44b307e87101f696ef.html

    • http://blog.sina.com.cn/s/blog_6276ec79010118cx.html

    • 79707169

    • Author: Tianyu tour
    • Source: http://www.cnblogs.com/cwp-bg/
    • This article is copyrighted by the author and Cnblogs. Reposting and exchange are welcome, but please retain this statement and provide a clearly visible link to the original article.