Machine Learning and Neural Networks (II): An Introduction to the Perceptron, with a Python Implementation

Source: Internet
Author: User

This article introduces the Perceptron through a combination of theory and code practice. It first presents the Perceptron model, then the Perceptron learning rule (the Perceptron learning algorithm), and finally implements a single-layer Perceptron in Python, so that readers gain a more intuitive understanding.

1. The Single-Layer Perceptron Model

A single-layer perceptron is a neural network with a single layer of computational units. Its structure and function are so simple that it is rarely used for practical problems, but the Perceptron was the first model to propose the ideas of self-organization and self-learning, so it plays a foundational role in the study of neural networks.

1.1 The Perceptron Model

The figure below shows a single-layer Perceptron model. It contains input nodes x0–xn, a weight vector w0–wn (note: x0 and w0 represent the bias term, and generally x0 = 1; the x0 in the figure should read xn), one output node o, and the sgn function as the activation function. (The picture is from the Internet.)


From the figure, the neuron's output is:

o = sgn(w0*x0 + w1*x1 + ... + wn*xn), where sgn(z) = 1 if z >= 0 and -1 otherwise (x0 = 1, so w0 acts as the bias).
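To make the formula concrete, here is a minimal sketch of a single neuron's forward pass; the weight and input values are purely illustrative:

```python
import numpy as np

# Illustrative weights and inputs; x0 = 1, so w0 acts as the bias
w = np.array([0.5, -1.0, 2.0])   # w0, w1, w2
x = np.array([1.0, 3.0, 1.5])    # x0 = 1, x1, x2
z = np.dot(w, x)                 # weighted sum: 0.5 - 3.0 + 3.0 = 0.5
o = 1 if z >= 0 else -1          # sgn activation
print(o)                         # -> 1
```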
2. The Perceptron Learning Rule

As mentioned above, the Perceptron has the ability to learn and adapt. How does it learn? Consider the figure below.


Let's walk through the process:

First, we feed in a training sample x and an initialized weight vector w and take their dot product. The dot product result is then passed through the activation function sgn() to obtain the actual output o. We then adjust the weight vector w according to the error between the actual output o and the desired output d. This repeats until w converges to a suitable result.

Now let's look at how the weight vector w is adjusted based on the difference between the actual output and the desired output. This is the Perceptron learning rule:

w_i <- w_i + η(d - o)x_i    (2.19a)

where d is the desired output and o is the actual output.
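As a quick numeric check of the rule w_i <- w_i + η(d - o)x_i, assume a learning rate of 0.1 and a sample x = [1, 2] with desired output d = 1 but actual output o = -1 (all values here are illustrative):

```python
import numpy as np

eta = 0.1
w = np.array([0.0, 0.0, 0.0])  # w0 (bias), w1, w2
x = np.array([1.0, 2.0])       # one training sample
d, o = 1, -1                   # desired vs. actual output
update = eta * (d - o)         # 0.1 * 2 = 0.2
w[1:] += update * x            # w1 += 0.2*1, w2 += 0.2*2
w[0] += update * 1             # the bias input is x0 = 1
print(w)                       # -> [0.2 0.2 0.4]
```

Since the prediction was too low (o < d), every weight moves in the direction of its input, nudging the next prediction upward.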
One more note: η in formula (2.19a) is the learning rate. It determines the magnitude of each adjustment and is a manually set parameter, usually chosen from experience or by experiment.

3. Implementing the Perceptron in Python

Now that we know the Perceptron model and its learning rule, we can implement it in Python (the code below targets Python 2.7 under Anaconda).

```python
#!/usr/bin/env python
# coding=utf-8
import numpy as np

class Perceptron:
    """A single-layer perceptron classifier.

    eta:     learning rate
    n_iter:  number of passes over the training set
    w_:      weight vector (w_[0] is the bias w0)
    errors_: number of misclassified samples in each pass
    """
    def __init__(self, eta=0.01, n_iter=10):
        self.eta = eta
        self.n_iter = n_iter

    def fit(self, X, y):
        """Train the neuron on samples X with labels y,
        e.g. X = [[1, 2], [4, 5]], y = [-1, 1].
        """
        # Initialize the weight vector; the extra 1 is for the bias w0
        self.w_ = np.zeros(1 + X.shape[1])
        self.errors_ = []
        for _ in range(self.n_iter):
            errors = 0
            # zip(X, y) pairs each sample with its label,
            # e.g. zip(X, y) -> [([1, 2], -1), ([4, 5], 1)]
            for xi, target in zip(X, y):
                # Each iteration uses one sample to update w:
                # update = eta * (d - o), where o is the prediction
                update = self.eta * (target - self.predict(xi))
                # xi is a vector and update a scalar, so update * xi
                # amounts to w1 += update*x1, w2 += update*x2, ...
                self.w_[1:] += update * xi
                self.w_[0] += update * 1
                # Count the misclassified samples
                errors += int(update != 0.0)
            self.errors_.append(errors)
        return self

    def net_input(self, X):
        """z = w0*1 + w1*x1 + w2*x2 + ... + wm*xm, with x0 = 1"""
        return np.dot(X, self.w_[1:]) + self.w_[0] * 1

    def predict(self, X):
        # Equivalent to the sign() function: z >= 0 -> 1, z < 0 -> -1
        return np.where(self.net_input(X) >= 0.0, 1, -1)
```
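Assuming the Perceptron class behaves as described, an end-to-end run on a small, linearly separable toy dataset might look like the following; the class body is restated compactly here so the snippet runs on its own:

```python
import numpy as np

class Perceptron:
    def __init__(self, eta=0.01, n_iter=10):
        self.eta, self.n_iter = eta, n_iter

    def fit(self, X, y):
        self.w_ = np.zeros(1 + X.shape[1])  # w_[0] is the bias
        self.errors_ = []
        for _ in range(self.n_iter):
            errors = 0
            for xi, target in zip(X, y):
                update = self.eta * (target - self.predict(xi))
                self.w_[1:] += update * xi
                self.w_[0] += update
                errors += int(update != 0.0)
            self.errors_.append(errors)
        return self

    def predict(self, X):
        # sgn of the net input: z >= 0 -> 1, z < 0 -> -1
        return np.where(np.dot(X, self.w_[1:]) + self.w_[0] >= 0.0, 1, -1)

# Toy linearly separable data: points with small coordinate sums are -1
X = np.array([[1.0, 1.0], [2.0, 1.0], [4.0, 5.0], [5.0, 6.0]])
y = np.array([-1, -1, 1, 1])

clf = Perceptron(eta=0.1, n_iter=10).fit(X, y)
print(clf.predict(np.array([0.0, 0.0])))  # -> -1
print(clf.predict(np.array([6.0, 6.0])))  # -> 1
print(clf.errors_)                        # error count falls to 0
```

Because the data is linearly separable, the perceptron convergence theorem guarantees that the error count reaches zero after finitely many updates; on non-separable data the weights would keep oscillating.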
