The principle of the machine learning perceptron algorithm and its Python implementation

(1) Perceptron model

The perceptron model has input nodes x0 to xn and a corresponding weight vector w0 to wn, where x0 and w0 represent the bias term (x0 is generally fixed at 1). A single output node o applies the sign function as the activation function to the weighted sum of the inputs.
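
As a minimal sketch (the input values and weights below are illustrative, not taken from the original article), the output is the sign of the weighted sum of the inputs plus the bias:

import numpy as np

x = np.array([1.5, -0.3, 2.0])   # input features x1..xn (illustrative values)
w = np.array([0.4, 0.1, -0.2])   # weights w1..wn (illustrative values)
b = 0.5                          # bias term (w0, paired with x0 = 1)

o = np.sign(np.dot(w, x) + b)    # output is +1 or -1 (np.sign returns 0 for an exact 0)
print(o)                         # 1.0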

(2) Perceptron learning rules


Given a training sample x and the initial weight vector w, take their dot product, then pass the weighted sum through the activation function sign() to obtain the predicted output o. The weight vector w is then adjusted according to the error between the predicted output and the target value. This is repeated until w converges to a suitable result.
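
A minimal sketch of one such update step (the learning rate, sample, and label below are illustrative; the +1/-1 labels match the implementation in section (4)):

import numpy as np

eta = 0.01                       # learning rate
w = np.zeros(1 + 2)              # w[0] is the bias, w[1:] are the feature weights
xi = np.array([2.0, -1.0])       # one training sample (illustrative)
target = -1                      # its true label (+1 or -1)

# Predict with the current weights, then nudge them by the scaled error
prediction = np.where(np.dot(xi, w[1:]) + w[0] >= 0.0, 1, -1)
update = eta * (target - prediction)   # zero when the prediction is already correct
w[1:] += update * xi
w[0] += update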

(3) The original form of the algorithm
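
The original form can be sketched as follows (a standard formulation of the perceptron's primal form; the function name and the max_epochs cap are illustrative): repeatedly pick a sample that is misclassified under the current weights, move the separating hyperplane toward it, and stop once every sample is classified correctly.

import numpy as np

def perceptron_original_form(X, y, eta=1.0, max_epochs=100):
    """Primal-form perceptron on +1/-1 labels (illustrative sketch)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:   # sample is misclassified
                w += eta * yi * xi              # move the hyperplane toward the sample
                b += eta * yi
                mistakes += 1
        if mistakes == 0:                       # converged: nothing left to fix
            break
    return w, b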

(4) Python code implementation
import numpy as np


class Perceptron(object):
    """Perceptron classifier.

    Parameters
    ----------
    eta : float
        Learning rate (between 0.0 and 1.0)
    n_iter : int
        Passes over the training dataset

    Attributes
    ----------
    w_ : 1d-array
        Weights after fitting
    errors_ : list
        Number of misclassifications in every epoch
    """

    def __init__(self, eta=0.01, n_iter=10):
        self.eta = eta
        self.n_iter = n_iter

    def fit(self, X, y):
        """Fit training data.

        :param X: 2d array of training samples, shape (n_samples, n_features),
                  e.g. X.shape = (100, 2) means 100 rows and 2 feature columns
        :param y: target values of the training samples
        :return: self

        The weight vector lives in R^(m+1), where m is the number of features
        in the dataset; the extra element self.w_[0] is the bias. np.zeros(count)
        creates an array of count zero elements, e.g. self.w_ = [0. 0. 0.].
        """
        # Initialize the weights and the errors list
        self.w_ = np.zeros(1 + X.shape[1])
        self.errors_ = []

        for _ in range(self.n_iter):
            errors = 0
            for xi, target in zip(X, y):
                # Error between the target and the prediction, scaled by the learning rate
                update = self.eta * (target - self.predict(xi))
                self.w_[1:] += update * xi
                self.w_[0] += update * 1
                errors += int(update != 0.0)
            self.errors_.append(errors)
        return self

    def net_input(self, X):
        """Calculate the net input (the dot product of the inputs and the
        weights, plus the bias).

        Dot product example: {1, 2, 3} . {4, 5, 6} = 1*4 + 2*5 + 3*6 = 32
        Pure-Python equivalent: sum(i * j for i, j in zip(X, self.w_[1:]))
        """
        return np.dot(X, self.w_[1:]) + self.w_[0]

    def predict(self, X):
        """Return the class label (+1 or -1) after the unit step."""
        return np.where(self.net_input(X) >= 0.0, 1, -1)
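
A usage sketch of the class above (the two toy clusters below are made up for illustration; any linearly separable data labelled -1/+1 would do):

import numpy as np

# Two small linearly separable clusters (illustrative toy data)
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],
              [6.0, 6.0], [6.5, 7.0], [7.0, 6.5]])
y = np.array([-1, -1, -1, 1, 1, 1])

ppn = Perceptron(eta=0.1, n_iter=10)
ppn.fit(X, y)

print("weights:", ppn.w_)                 # learned bias and feature weights
print("errors per epoch:", ppn.errors_)   # drops to 0 once the data is separated
print("prediction for [2, 2]:", ppn.predict(np.array([2.0, 2.0])))   # -1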
