NeuronDotNet User Manual


  • Introduction to Neural Networks
  • Supervised and Unsupervised Learning
  • Feed-forward Backpropagation Neural Networks (FFBPNN)
  • Backpropagation Algorithm Overview
  • Learning Rate
  • Momentum and Jitter
  • Activation Functions
  • Kohonen Self-Organizing Maps
  • Neighborhood Functions
  • Lattice Topology
  • Kohonen Layer Shape
  • Neural Network Design
  • Applications of Neural Networks
  • Setting Up the Development Environment
  • Overview of NeuronDotNet Classes
  • Some Useful Code Snippets
  • NeuronDotNet Application Examples

 

Introduction to Neural Networks

An artificial neural network attempts to build a mathematical model of the human brain. The human brain consists of billions of nerve cells, called neurons, working in parallel. Information is transmitted from one neuron to another as electrochemical pulses through connections called synapses. A typical neuron is connected to tens of thousands of other neurons through its synapses.

The structure of this network of neurons changes throughout a person's life. The basic structure present at birth is continually modified as connections are created and destroyed through learning. Later changes consist mainly of the strengthening and weakening of synaptic connections (for example, learning to recognize a new friend's face involves changes in synaptic strength). It is the enormous number of neurons, their dense interconnection, and the complex behavior of each individual neuron that make the human brain so remarkable.

An artificial neural network is only an extremely simplified model of the human brain. Its basic building block, the artificial neuron, is a very simple structure designed to mimic a biological neuron. Artificial neurons communicate with one another through artificial synapses.

A typical artificial neural network consists of hundreds of artificial neurons connected by thousands of artificial synapses. To simplify the overall structure, neurons of the same kind are grouped into layers, and two neurons within the same layer are never connected to each other.

The most common structure is the three-layer feed-forward neural network, in which artificial neurons of the same kind are organized into layers. The first layer is the input layer, which receives external input; during face recognition, for example, the bitmap image of a face is loaded into the input layer. Each neuron in the input layer is usually connected to every neuron in the next layer, known as the hidden layer. The output layer receives the signals from the hidden layer and delivers the result to the external environment.
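
To make the flow of data concrete, here is a minimal sketch of one forward pass through such a three-layer network. This is plain C#, not the NeuronDotNet API; the layer sizes, weight values, and the sigmoid activation function are illustrative assumptions only.

    // Minimal forward pass through a 2-3-1 feed-forward network (illustrative values only).
    using System;

    class FeedForwardSketch
    {
        // The sigmoid activation squashes a neuron's weighted input into the range (0, 1).
        static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

        // One layer's outputs: each neuron sums all of its inputs times the corresponding
        // weights, adds a bias, and applies the activation function.
        static double[] Layer(double[] inputs, double[,] weights, double[] biases)
        {
            var outputs = new double[biases.Length];
            for (int j = 0; j < biases.Length; j++)
            {
                double sum = biases[j];
                for (int i = 0; i < inputs.Length; i++)
                    sum += inputs[i] * weights[i, j];
                outputs[j] = Sigmoid(sum);
            }
            return outputs;
        }

        static void Main()
        {
            double[] input = { 0.5, 0.8 };                                    // input layer (2 neurons)
            double[,] inputToHidden = { { 0.1, 0.4, -0.2 }, { 0.3, -0.5, 0.6 } };
            double[] hiddenBias = { 0.0, 0.1, -0.1 };
            double[,] hiddenToOutput = { { 0.7 }, { -0.3 }, { 0.5 } };
            double[] outputBias = { 0.05 };

            double[] hidden = Layer(input, inputToHidden, hiddenBias);        // hidden layer (3 neurons)
            double[] output = Layer(hidden, hiddenToOutput, outputBias);      // output layer (1 neuron)
            Console.WriteLine($"Network output: {output[0]:F4}");
        }
    }

Training adjusts weight and bias values such as those above until the network's outputs match the desired behavior, which is the subject of the following sections.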

By appropriately adjusting the way data is transmitted through the network and the parameter values involved in that transmission, the network's behavior can be made to resemble that of an artificial brain. The process of adjusting these parameters is called training the neural network, and a variety of algorithms exist for doing so.

Supervised and Unsupervised Learning

Neural networks are trained before they are put to use. Training consists of presenting training samples to the network and letting it learn by adjusting its synaptic weights and other parameters. In general, neural network learning can be divided into the following two types:

  • Supervised learning

In supervised learning, the neural network learns from examples. The training set consists of a series of input samples paired one-to-one with their expected outputs. The network adjusts its synaptic weights to learn the relationship between the input and output pairs. A successfully trained network can then produce a suitable output for any valid input.

The goal of supervised learning is: given a set of pairs (x, f(x)), find the function f.
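
As a concrete example, the classic XOR problem can be posed as a supervised training set of (x, f(x)) pairs. The sketch below is plain C# (top-level statements), not the NeuronDotNet API:

    // A supervised training set for XOR: each input vector x is paired with its
    // expected output f(x).
    var trainingPairs = new (double[] Input, double[] ExpectedOutput)[]
    {
        (new[] { 0.0, 0.0 }, new[] { 0.0 }),
        (new[] { 0.0, 1.0 }, new[] { 1.0 }),
        (new[] { 1.0, 0.0 }, new[] { 1.0 }),
        (new[] { 1.0, 1.0 }, new[] { 0.0 }),
    };
    // Training repeatedly presents each pair and nudges the synaptic weights so that
    // the network's actual output moves closer to the expected output.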

The term 'supervised' is derived from the fact that the expected output value provided by the external environment is used to guide the learning process.

The backpropagation algorithm is widely used for supervised learning. Supervised networks play an important role in applications such as the following:

  1. Function modeling and approximation
  2. Data Classification
  3. Pattern Recognition
  • Unsupervised learning

In unsupervised learning, the neurons simply receive sets of input samples from the external environment. It may seem mysterious that a network can learn from inputs alone; however, it can be shown formally that an unsupervised network can build representations of its input that are useful for decision making.

The goal of unsupervised learning is for the network to build a useful representation of the input data.

Unsupervised learning is used in applications such as the following:

  1. Clustering
  2. Dimensionality reduction
  3. Search for patterns in unorganized data

Self-organizing maps (SOMs) are the neural networks most commonly used for unsupervised learning.
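
To give a flavor of how a network can learn from inputs alone, here is a minimal competitive-learning sketch in plain C# (not the NeuronDotNet API). Competitive learning is the core idea behind a self-organizing map: each input is assigned to the neuron whose weight vector is closest to it, and that winner is pulled slightly toward the input, so the weight vectors gradually become a representation of the data. (A full SOM, covered later in this manual, also updates the winner's lattice neighbors.) The sample data, initial weights, and learning rate below are illustrative assumptions.

    using System;

    class CompetitiveLearningSketch
    {
        static double Distance(double[] a, double[] b)
        {
            double sum = 0;
            for (int i = 0; i < a.Length; i++) sum += (a[i] - b[i]) * (a[i] - b[i]);
            return Math.Sqrt(sum);
        }

        static void Main()
        {
            // Unlabeled input samples from the environment (two loose clusters).
            double[][] inputs =
            {
                new[] { 0.10, 0.20 }, new[] { 0.20, 0.10 }, new[] { 0.15, 0.25 },
                new[] { 0.90, 0.80 }, new[] { 0.85, 0.95 }, new[] { 0.80, 0.90 },
            };

            // Two competing neurons, each described by a weight vector.
            double[][] weights = { new[] { 0.5, 0.0 }, new[] { 0.5, 1.0 } };
            const double learningRate = 0.3;

            for (int epoch = 0; epoch < 50; epoch++)
            {
                foreach (var x in inputs)
                {
                    // The winner is the neuron whose weight vector is closest to the input.
                    int winner = Distance(x, weights[0]) < Distance(x, weights[1]) ? 0 : 1;

                    // Pull the winner's weight vector toward the input sample.
                    for (int i = 0; i < x.Length; i++)
                        weights[winner][i] += learningRate * (x[i] - weights[winner][i]);
                }
            }

            // After training, each weight vector has drifted to the center of one cluster.
            Console.WriteLine($"Neuron 0: ({weights[0][0]:F2}, {weights[0][1]:F2})");
            Console.WriteLine($"Neuron 1: ({weights[1][0]:F2}, {weights[1][1]:F2})");
        }
    }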

Feed-forward Backpropagation Neural Networks (FFBPNN)
