"Neural Networks for Machine Learning" by Hinton Study notes (i)

1. Why We Need Machine Learning

For some problems it is hard to find explicit rules or to write a program directly. For example, three-dimensional object recognition: we don't know how our brains recognize objects, so we can't write down good rules for the task, and even if we could find reasonable rules, the programming complexity would be very high. Another example is fraudulent credit card transactions: fraudsters constantly update their tricks, so we need a program that can keep updating itself and keep recognizing new scams.

Machine learning solves the above problems well: we don't have to design the rules ourselves, we just have to give the machine enough data. We also don't have to worry about changes in the scenario (such as new fraud tricks); as long as the new data reflects the changes, the learned decision criteria change accordingly. And computational resources are now plentiful, which makes machine learning practical.

There are many machine learning problems in real life: speech recognition (my own field of study), image recognition, and detecting fraudulent credit card transactions.

These are good motivations for using machine learning. So the question is: what is machine learning?

2. What is Machine Learning

Before we talk about machine learning, we should first understand neurons in the human brain. The following is a typical synaptic structure:

Most of our neurons are connected in this way: the axon terminal of one neuron enlarges and attaches to the cell body or dendrites of another neuron. The junction includes the presynaptic membrane, the synaptic cleft, and the postsynaptic membrane. Information transmission between neurons works roughly as follows (this is only a general sketch and may not be rigorous; criticism is welcome, since this is not my field):

After an electrical signal arrives from the previous neuron, mitochondria supply energy to push synaptic vesicles toward the presynaptic membrane. The vesicles fuse with the membrane and release neurotransmitters into the synaptic cleft. The neurotransmitters then diffuse to the postsynaptic membrane (or drift elsewhere); in any case, they must be consumed (bound by receptors, decomposed, or removed in some other way) before the next signal can pass. When receptors in the postsynaptic membrane receive the neurotransmitters, they produce a potential change and generate a pulse signal, so the stimulus is passed onward.

Several points about the synapse deserve attention: neurotransmitters are either excitatory or inhibitory, so neurons are likewise divided into excitatory and inhibitory ones. The number of vesicles (and thus of transmitters) differs between neurons. And a neuron is not connected to a single other neuron; typically one neuron connects to about 10^4 others. Whether a neuron fires depends on the sum of all the neurotransmitter input it receives.

An abstract representation of the relationship between neurons

In addition, our brains have notable features: the brain is partitioned, and different regions have different divisions of labor; the regions start out with the same function and gradually specialize to do different work; and the brain is highly parallel, unlike our traditional computers, which execute instructions serially, one at a time (the Turing model).

3. Some Simple Neuron Models

1. Linear Neurons

Considering the characteristics of synapses, we can get a possible, but also simplest, idealized neuron abstraction:
y = b + \sum_i x_i w_i
where

y denotes the output of the neuron, x_i denotes the output of the i-th neuron connected to it, and w_i denotes the weight of the connection from the i-th neuron.

b is a bias constant, which can be set aside for now.


Linear neurons
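As a minimal sketch in plain Python (the function name is my own, not from the lectures), the linear neuron is just a weighted sum of its inputs plus a bias:

```python
def linear_neuron(x, w, b):
    """Idealized linear neuron: y = b + sum_i x_i * w_i."""
    return b + sum(xi * wi for xi, wi in zip(x, w))

# Two inputs with weights 0.5 and -1.0 and bias 0.1:
y = linear_neuron([2.0, 1.0], [0.5, -1.0], 0.1)  # approximately 0.1
```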

In this simplest model we ignore many properties of real neurons; for example, real neurons may not be linear, and the combination of neurotransmitter effects may not be a simple weighted sum. Why change the model, then? Because a linear output is unbounded: it can be arbitrarily large or arbitrarily small, while the currents our bodies produce are basically constant in size and the potential differences vary only within a limited range. So people have looked for many other functions to model neurons.

2. Binary threshold Neurons

z = b + \sum_i x_i w_i
y = \begin{cases} 1, & \text{if } z \ge 0 \\ 0, & \text{otherwise} \end{cases}


Binary threshold neurons
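A sketch of the same idea in plain Python (function name mine): the neuron outputs 1 exactly when the weighted sum reaches the threshold encoded in the bias. With unit weights and a bias of -1.5, for instance, it computes the logical AND of two binary inputs.

```python
def binary_threshold_neuron(x, w, b):
    """Binary threshold neuron: y = 1 if z >= 0 else 0, where z = b + sum_i x_i * w_i."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1 if z >= 0 else 0

# Logical AND of two 0/1 inputs: fires only when both inputs are 1.
out = binary_threshold_neuron([1, 1], [1.0, 1.0], -1.5)  # z = 0.5 -> 1
```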

3. Rectified Linear neurons

z = b + \sum_i x_i w_i
y = \begin{cases} z, & \text{if } z > 0 \\ 0, & \text{otherwise} \end{cases}


Rectified Linear neurons
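In the same plain-Python sketch (function name mine), the rectified linear neuron passes the weighted sum through unchanged when it is positive and outputs zero otherwise:

```python
def relu_neuron(x, w, b):
    """Rectified linear neuron: y = z if z > 0 else 0, where z = b + sum_i x_i * w_i."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return max(0.0, z)

# Positive z passes through; negative z is clipped to 0.
pos = relu_neuron([1.0], [2.0], 0.5)   # z = 2.5 -> 2.5
neg = relu_neuron([1.0], [-2.0], 0.5)  # z = -1.5 -> 0.0
```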

4. Sigmoid neurons

z = b + \sum_i x_i w_i
y = \frac{1}{1 + e^{-z}}

The sigmoid squashes z into the interval (0, 1), giving a smooth, bounded, differentiable output.
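Continuing the same plain-Python sketch (function name mine), the sigmoid neuron squashes the weighted sum z = b + Σ x_i w_i through y = 1 / (1 + e^(-z)):

```python
import math

def sigmoid_neuron(x, w, b):
    """Sigmoid neuron: y = 1 / (1 + exp(-z)), where z = b + sum_i x_i * w_i."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1.0 / (1.0 + math.exp(-z))

# At z = 0 the sigmoid sits exactly at its midpoint, 0.5:
y = sigmoid_neuron([1.0, -1.0], [1.0, 1.0], 0.0)  # -> 0.5
```

Unlike the binary threshold neuron, the output varies smoothly with the inputs, which is what makes gradient-based learning possible.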
