ANN (Artificial Neural Network): sigmoid activation function programming exercise (Python implementation)

Source: Internet
Author: User
Tags: assert

# ----------
# There are two functions to finish:
# First, in activate(), write the sigmoid activation function.
# Second, in update(), write the gradient descent update rule. Updates should
# be performed online, revising the weights after each data point.
# ----------

import numpy as np


class Sigmoid:
    """This class models an artificial neuron with a sigmoid activation
    function.
    """

    def __init__(self, weights=np.array([1])):
        """Initialize weights based on input arguments. Note that no
        type-checking is being performed here for simplicity of code.
        """
        # Cast to a float array so the in-place update in update() works
        # even when a plain Python list is passed in (as in the tests below).
        self.weights = np.array(weights, dtype=float)

        # NOTE: you don't need to worry about these attributes for this
        # programming quiz, but they'll be useful if you want to create
        # a network out of these sigmoid units!
        self.last_input = 0  # strength of last input
        self.delta = 0       # error signal

    def activate(self, values):
        """Takes in @param values, a list of numbers equal in length to the
        weights. @return the output of a sigmoid unit with the given inputs
        based on unit weights.
        """
        # First calculate the strength of the input signal.
        strength = np.dot(values, self.weights)
        self.last_input = strength

        # Modify strength using the sigmoid activation function and return
        # it as the output signal. The logistic function lives in a helper
        # method because the update rule needs it as well.
        return self.logistic(strength)

    def logistic(self, strength):
        return 1 / (1 + np.exp(-strength))

    def update(self, values, train, eta=.1):
        """Takes in a 2D array @param values consisting of a LIST of inputs
        and a 1D array @param train, consisting of a corresponding list of
        expected outputs. Updates internal weights according to gradient
        descent using these values and an optional learning rate, @param eta.
        """
        # For each data point...
        for X, y_true in zip(values, train):
            # Obtain the output signal for that point.
            y_pred = self.activate(X)

            # Compute the derivative of the logistic function at the input
            # strength. Recall: d/dx logistic(x) = logistic(x) * (1 - logistic(x))
            dx = self.logistic(self.last_input) * (1 - self.logistic(self.last_input))
            print("dx: {}".format(dx))
            print('\n')

            # Update self.weights based on the learning rate, signal
            # accuracy, function slope (derivative), and input value.
            delta_w = eta * (y_true - y_pred) * dx * X
            print("delta_w: {} weight before {}".format(delta_w, self.weights))
            self.weights += delta_w
            print("delta_w: {} weight after {}".format(delta_w, self.weights))
            print('\n')


def test():
    """A few tests to make sure that the Sigmoid class performs as expected.
    Nothing should show on the output if all the assertions pass.
    """
    def sum_almost_equal(array1, array2, tol=1e-5):
        return sum(abs(array1 - array2)) < tol

    u1 = Sigmoid(weights=[3, -2, 1])
    assert abs(u1.activate(np.array([1, 2, 3])) - 0.880797) < 1e-5

    u1.update(np.array([[1, 2, 3]]), np.array([0]))
    assert sum_almost_equal(u1.weights, np.array([2.990752, -2.018496, 0.972257]))

    u2 = Sigmoid(weights=[0, 3, -1])
    u2.update(np.array([[-3, -1, 2], [2, 1, 2]]), np.array([1, 0]))
    assert sum_almost_equal(u2.weights, np.array([-0.030739, 2.984961, -1.027437]))


if __name__ == "__main__":
    test()
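To see where the numbers in the assertions come from, it helps to trace the first test case by hand. The following is a minimal verification sketch (not part of the original exercise) that recomputes the first update step with plain NumPy; it should reproduce the first dx and delta_w values shown in the output below.

import numpy as np

# First test case: weights [3, -2, 1], input [1, 2, 3], target 0, eta = 0.1.
x = np.array([1, 2, 3])
w = np.array([3.0, -2.0, 1.0])
eta, y_true = 0.1, 0

strength = np.dot(x, w)                   # 1*3 + 2*(-2) + 3*1 = 2
y_pred = 1.0 / (1.0 + np.exp(-strength))  # sigmoid(2) ~= 0.880797
dx = y_pred * (1 - y_pred)                # logistic derivative ~= 0.104994
delta_w = eta * (y_true - y_pred) * dx * x

print(delta_w)      # ~ [-0.0092478  -0.01849561 -0.02774341]
print(w + delta_w)  # ~ [ 2.9907522  -2.01849561  0.97225659]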
OUTPUT
Running test()...
dx: 0.104993585404

delta_w: [-0.0092478  -0.01849561 -0.02774341] weight before [3, -2, 1]
delta_w: [-0.0092478  -0.01849561 -0.02774341] weight after [ 2.9907522  -2.01849561  0.97225659]

dx: 0.00664805667079

delta_w: [-0.00198107 -0.00066036  0.00132071] weight before [0, 3, -1]
delta_w: [-0.00198107 -0.00066036  0.00132071] weight after [ -1.98106867e-03   2.99933964e+00  -9.98679288e-01]

dx: 0.196791859198

delta_w: [-0.02875794 -0.01437897 -0.02875794] weight before [ -1.98106867e-03   2.99933964e+00  -9.98679288e-01]
delta_w: [-0.02875794 -0.01437897 -0.02875794] weight after [-0.03073901  2.98496067 -1.02743723]

All done!
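The update rule itself is one step of online gradient descent on the squared error E = (1/2) * (y_true - y_pred)^2. By the chain rule, dE/dw = -(y_true - y_pred) * logistic'(strength) * x, so adding eta * (y_true - y_pred) * dx * X to the weights moves them against the gradient. As a quick sanity check (again, not part of the original exercise), repeatedly calling update() on a single training point should drive the unit's output toward the target. A minimal sketch, assuming the Sigmoid class above is in scope:

import numpy as np

unit = Sigmoid(weights=[3, -2, 1])
point = np.array([[1, 2, 3]])
target = np.array([0])

print(unit.activate(point[0]))  # ~0.8808 before training
for _ in range(50):
    unit.update(point, target)  # each call also prints its own diagnostics
print(unit.activate(point[0]))  # noticeably closer to the target 0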

