Writing up the AdaBoost algorithm in (what I think is) more detail

I was recently studying the AdaBoost (adaptive boosting) part of machine learning, and the posts I found on CSDN didn't seem to explain it in much detail (though maybe I just searched badly and missed the good ones). To spare others the same trouble, I'm writing this post and will try to lay out the derivation of the algorithm in a way that makes it easier for everyone to understand.

Before introducing the AdaBoost algorithm, let's first define strong and weak learning algorithms in the PAC framework: a weak learning algorithm is one whose error rate is only slightly less than 1/2 (that is, its accuracy is only slightly better than random guessing), while a strong learning algorithm is one that achieves high accuracy and can be run in polynomial time.
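In symbols (my own formalization, not from the original post): for binary labels $y \in \{-1, +1\}$, weak learnability just asks for a positive edge $\gamma$ over random guessing,

$\epsilon = P(G(x) \ne y) = \frac{1}{2} - \gamma, \qquad \gamma > 0,$

and boosting is precisely the result that such weak learners can be combined into a strong one.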

Next I'll introduce the AdaBoost algorithm through an example as I understand it; I believe that working through the calculations will give everyone a deeper understanding of the algorithm. Because inserting formulas on CSDN is a pain, I typeset the steps below in Word. Right, let's start with the iterative process of the AdaBoost algorithm, which will make the later computations much easier to follow.

The iterative process of the AdaBoost algorithm:
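The formula images from the original post were lost, so what follows is a sketch of the standard iteration as given in <Statistical Learning Methods> (the same steps the code below implements). Given training data $(x_1, y_1), \dots, (x_N, y_N)$ with $y_i \in \{-1, +1\}$:

1. Initialize the sample weights uniformly: $D_1 = (w_{11}, \dots, w_{1N})$, $w_{1i} = 1/N$.

2. For $m = 1, 2, \dots, M$:
   (a) fit a weak classifier $G_m(x)$ to the data weighted by $D_m$;
   (b) compute its weighted error rate $e_m = \sum_{i=1}^{N} w_{mi} \, I(G_m(x_i) \ne y_i)$;
   (c) compute its coefficient $\alpha_m = \frac{1}{2} \ln \frac{1 - e_m}{e_m}$;
   (d) update the weights $w_{m+1,i} = \frac{w_{mi}}{Z_m} \exp(-\alpha_m y_i G_m(x_i))$, where $Z_m = \sum_{i=1}^{N} w_{mi} \exp(-\alpha_m y_i G_m(x_i))$ normalizes $D_{m+1}$.

3. Output the final classifier $G(x) = \operatorname{sign}\bigl(\sum_{m=1}^{M} \alpha_m G_m(x)\bigr)$.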

With the iterative process of the AdaBoost algorithm written down, we can use a few known identities to simplify the equations above; the following formula will be used repeatedly.
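Concretely (again following the book, since the original images are gone), the identity used repeatedly is the closed form of the normalizer $Z_m$ and the training-error bound it yields:

$\frac{1}{N} \sum_{i=1}^{N} I(G(x_i) \ne y_i) \;\le\; \frac{1}{N} \sum_{i=1}^{N} \exp(-y_i f(x_i)) \;=\; \prod_{m=1}^{M} Z_m, \qquad f(x) = \sum_{m} \alpha_m G_m(x),$

and substituting $\alpha_m = \frac{1}{2} \ln \frac{1 - e_m}{e_m}$ gives

$Z_m = (1 - e_m) e^{-\alpha_m} + e_m e^{\alpha_m} = 2 \sqrt{e_m (1 - e_m)} \le 1,$

so the training error shrinks exponentially as long as every $e_m < 1/2$, which is exactly what the weak-learner assumption guarantees.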

That's the end of the derivation. Next I'll use a worked example to help you understand how the AdaBoost procedure actually runs. Well, the moment to witness the miracle has arrived.

Okay, here we go. Now that the procedure of the AdaBoost algorithm is understood, walking through code that implements this process will definitely deepen your understanding of the algorithm. OK, let's start.

# coding: utf-8
from __future__ import division
import numpy as np
import scipy as sp
from weakclassify import WEAKC
from dml.tool import sign

class ADABC:
    def __init__(self, X, y, Weaker=WEAKC):
        '''
        Weaker is a class of weak classifier.
        It should have a train(self.W) method that accepts the sample
        weights, and a pred(test_set) method that returns labels in
        {1, -1}. See <Statistical Learning Methods> for details.
        '''
        self.X = np.array(X)
        self.y = np.array(y)
        self.Weaker = Weaker
        self.sums = np.zeros(self.y.shape)
        # initialize the sample weights uniformly: w_1i = 1/N
        self.W = np.ones((self.X.shape[1], 1)).flatten(1) / self.X.shape[1]
        self.Q = 0

    def train(self, M=4):
        '''
        M is the maximal number of weak classifiers
        '''
        self.G = {}
        self.alpha = {}
        for i in range(M):
            self.G.setdefault(i)
            self.alpha.setdefault(i)
        for i in range(M):
            # fit the i-th weak classifier under the current weights;
            # its train() returns the weighted error rate e
            self.G[i] = self.Weaker(self.X, self.y)
            e = self.G[i].train(self.W)
            # classifier coefficient: alpha = 1/2 * ln((1 - e) / e)
            self.alpha[i] = 1 / 2 * np.log((1 - e) / e)
            sg = self.G[i].pred(self.X)
            # reweight the samples and normalize by Z
            Z = self.W * np.exp(-self.alpha[i] * self.y * sg.transpose())
            self.W = (Z / Z.sum()).flatten(1)
            self.Q = i
            if self.finalclassifer(i) == 0:
                print i + 1, "weak classifiers are enough to bring the training error to 0"
                break

    def finalclassifer(self, t):
        '''
        combine weak classifiers 1..t and return how many training
        samples the combination still misclassifies
        '''
        self.sums = self.sums + self.G[t].pred(self.X).flatten(1) * self.alpha[t]
        pre_y = sign(self.sums)
        return (pre_y != self.y).sum()

    def pred(self, test_set):
        test_set = np.array(test_set)
        # note: the code as originally posted predicted on self.X here,
        # which is only right when test_set is the training set
        sums = np.zeros(test_set.shape[1])
        for i in range(self.Q + 1):
            sums = sums + self.G[i].pred(test_set).flatten(1) * self.alpha[i]
        return sign(sums)


Let's try the simplest example from <Statistical Learning Methods>:
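Here is a minimal usage sketch. I'm assuming the data is example 8.1 from the book (ten one-dimensional points), and that the WEAKC decision stump and sign helper imported above are on your path; treat the exact calls as illustrative rather than the original post's lost snippet.

# hypothetical driver code, assuming the example 8.1 data from
# <Statistical Learning Methods> and the imports used above
X = [[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]]   # one feature, ten samples
y = [1, 1, 1, -1, -1, -1, 1, 1, 1, -1]
a = ADABC(X, y)
a.train(4)       # allow at most four weak classifiers
print a.pred(X)  # should reproduce y once the training error reaches 0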
You can see that three weak classifiers are already enough to leave no misclassified points, and the thresholds and weights chosen are close to the book's. In the output, -1 means samples above the threshold are assigned to the negative class and samples below it to the positive class; 1 means the opposite.

Now add some extra data and try it out.
Results: [the output figure from the original post is not preserved in this copy]
The results are basically correct. Note that the figure showed the four sub-classifiers individually, not the final combined classifier.

