Neural Networks and Genetic Algorithms

Source: Internet
Author: User

A neural network (NN) is suited to modeling nonlinear relationships: when a nonlinear mapping between input and output is known to exist, the network can learn it by self-learning from a training set of explicit input-output pairs. Once training has fixed the connection weights, the network can be applied to new inputs.

A genetic algorithm (GA) is suited to optimization problems: it imitates biological evolution and survival of the fittest. It places few restrictions on the problem itself; the main difficulties lie in choosing the chromosome encoding and the evaluation (fitness) function.
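To make the encoding and fitness-function choices concrete, here is a minimal GA sketch. The objective f(x) = x·sin(x) on [0, 10], the 16-bit binary encoding, and all parameters are illustrative assumptions, not from the original text.

```python
import math
import random

random.seed(0)

N_BITS = 16          # chromosome: x encoded as a 16-bit binary string
POP, GENS = 30, 60   # population size and number of generations
LO, HI = 0.0, 10.0   # search interval

def decode(bits):
    """Map a bit string to a real number in [LO, HI]."""
    v = int("".join(map(str, bits)), 2)
    return LO + (HI - LO) * v / (2 ** N_BITS - 1)

def fitness(bits):
    """Evaluation function: here we maximize f(x) = x * sin(x)."""
    x = decode(bits)
    return x * math.sin(x)

def select(pop):
    """Binary tournament selection."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) > fitness(b) else b

def crossover(p1, p2):
    """Single-point crossover."""
    cut = random.randrange(1, N_BITS)
    return p1[:cut] + p2[cut:]

def mutate(bits, rate=1.0 / N_BITS):
    """Flip each bit with a small probability."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    elite = max(pop, key=fitness)  # keep the best chromosome unchanged
    pop = [elite] + [mutate(crossover(select(pop), select(pop)))
                     for _ in range(POP - 1)]

best = max(pop, key=fitness)
print("x =", round(decode(best), 3), "f(x) =", round(fitness(best), 3))
```

On [0, 10] the global maximum of x·sin(x) lies near x ≈ 7.98, and with enough generations the population typically converges to that neighborhood.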

The two can be combined in two ways:

The first, auxiliary combination: GA preprocesses the data, and the NN then solves the problem. In pattern recognition, for example, GA performs feature extraction and the NN performs classification.

The second, cooperative combination: GA and NN handle the problem together. With the network topology fixed, GA determines the connection weights; alternatively, GA optimizes the network structure, and BP training then determines the weights.
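The cooperative combination can be sketched as follows: a GA evolves the connection weights of a network whose 2-2-1 topology is fixed in advance, here on the XOR problem. The task, the topology, and all GA parameters are illustrative assumptions.

```python
import math
import random

random.seed(1)

# XOR: a classic nonlinear mapping for a fixed 2-2-1 network.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_W = 9  # 4 input-hidden weights + 2 hidden biases + 2 hidden-output weights + 1 bias

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    """Fixed topology: 2 inputs -> 2 hidden units -> 1 output."""
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def fitness(w):
    """Negative mean squared error over the training set (GA maximizes it)."""
    return -sum((forward(w, x) - t) ** 2 for x, t in DATA) / len(DATA)

def mutate(w, sigma=0.5):
    """Gaussian perturbation of every weight."""
    return [wi + random.gauss(0, sigma) for wi in w]

pop = [[random.uniform(-1, 1) for _ in range(N_W)] for _ in range(50)]
for _ in range(300):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                  # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(40)]

best = max(pop, key=fitness)
print([round(forward(best, x), 2) for x, _ in DATA])  # compare with the XOR targets 0, 1, 1, 0
```

Note that no gradient information is used at all: the GA treats the network purely as a black-box fitness evaluator, which is exactly what makes this combination attractive when BP is hard to apply.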

First, auxiliary combination methods of NN and GA:

According to the stage at which GA is applied when solving a problem with an NN, three methods can be distinguished:
1. In pattern classification applications, GA performs data preprocessing (feature extraction), and the NN then classifies;
2. During network training, GA selects the network learning parameters or learning rules;
3. GA is used to interpret or analyze the trained neural network.

1. Data preprocessing in pattern classification applications
Generally speaking, the success of feature extraction plays a key role in the results of pattern classification. Using GA to select the training data and features for an NN is a relatively new technique. Kelly and Davis used GA to select a scale factor for each feature of the data set, reducing within-class differences and increasing between-class differences, and thereby greatly improving the network's classification performance. In a complex speech recognition case, Chang and Lippmann not only used GA to compress the original 153 features down to 33, but also used GA to synthesize new features; the feature compression improved the classification performance of the neural network while greatly reducing the amount of computation. Similarly, Guo and Uhr used GA to extract features for an NN that monitors accidents in a power plant, and their simulation results demonstrate the effectiveness of the method. The difference is that their fitness function takes into account the number of selected inputs in addition to the training error.
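A fitness function of the kind attributed above to Guo and Uhr, classification performance penalized by the number of selected inputs, might be sketched like this. The synthetic data, the nearest-centroid classifier, and the penalty weight are all illustrative assumptions.

```python
import random

random.seed(2)

N_FEAT = 8          # total features; only the first two carry class information
LAMBDA = 0.01       # penalty per selected feature, so fitness also accounts
                    # for the number of chosen inputs

def make_sample(label):
    """Synthetic data: features 0-1 depend on the class, the rest are noise."""
    informative = [label + random.gauss(0, 0.3) for _ in range(2)]
    noise = [random.gauss(0, 1) for _ in range(N_FEAT - 2)]
    return informative + noise

train = [(make_sample(y), y) for y in (0, 1) * 30]
test = [(make_sample(y), y) for y in (0, 1) * 30]

def accuracy(mask):
    """Nearest-centroid classifier restricted to the selected features."""
    idx = [i for i in range(N_FEAT) if mask[i]]
    if not idx:
        return 0.0
    cent = {}
    for y in (0, 1):
        rows = [x for x, t in train if t == y]
        cent[y] = [sum(r[i] for r in rows) / len(rows) for i in idx]
    correct = 0
    for x, t in test:
        d = {y: sum((x[i] - c) ** 2 for i, c in zip(idx, cent[y])) for y in (0, 1)}
        correct += (min(d, key=d.get) == t)
    return correct / len(test)

def fitness(mask):
    """Held-out accuracy minus a cost for each selected input."""
    return accuracy(mask) - LAMBDA * sum(mask)

pop = [[random.randint(0, 1) for _ in range(N_FEAT)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:5]
    pop = elite + [[b ^ (random.random() < 0.1) for b in random.choice(elite)]
                   for _ in range(15)]

best = max(pop, key=fitness)
```

The chromosome is simply a bit mask over the features; the penalty term pushes the GA toward masks that keep classification accuracy with as few inputs as possible.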

2. Selecting network learning parameters and learning rules with GA
Adaptively adjusting the parameters of the BP algorithm, such as the learning rate and momentum, with evolutionary ideas was the first attempt at evolving learning rules. Harp et al. encoded the BP parameters together with the network structure in a single chromosome and obtained an optimal combination of the two through evolutionary operations; in essence, this encoding studies the interaction between the learning algorithm and the network structure. Belew et al., with the network structure fixed in advance, used GA to optimize the learning rate and momentum; the results showed that the learning rate found by GA was consistently higher than the empirical value, which they attributed to the relatively small number of training iterations used in the experiments. Such work essentially optimizes the learning algorithm by tuning its parameters; the deeper problem of optimizing the learning rules themselves (i.e., the weight update rules) has been addressed by only a few researchers. Although the Hebb learning rule is widely adopted and serves as the basis of many learning algorithms, recent work by Hancock et al. shows that another learning rule, based on the work of Artola et al., is more effective than an optimized Hebb rule: it can learn more patterns, including both normal and abnormal ones. A method that automatically optimizes the learning rules for a specific network and application is therefore very attractive.
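A minimal sketch of evolving BP-style learning parameters, in the spirit of the Belew et al. work described above: each chromosome holds a (learning rate, momentum) pair, and fitness is the negative training error after a fixed number of epochs. The single-unit network, the toy data, and the parameter ranges are illustrative assumptions.

```python
import math
import random

random.seed(3)

# Linearly separable toy data for a single sigmoid unit.
DATA = [((x1, x2), 1 if x1 + x2 > 1 else 0)
        for x1 in (0, 0.4, 0.8, 1.2) for x2 in (0, 0.4, 0.8, 1.2)]

def train_loss(lr, mom, epochs=25):
    """Train one sigmoid unit by gradient descent with momentum;
    return the final mean squared error on the training data."""
    w = [0.0, 0.0, 0.0]          # two weights plus a bias
    v = [0.0, 0.0, 0.0]          # momentum buffer
    for _ in range(epochs):
        for (x1, x2), t in DATA:
            o = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + w[2])))
            g = (o - t) * o * (1 - o)          # dE/dnet for squared error
            for i, xi in enumerate((x1, x2, 1.0)):
                v[i] = mom * v[i] - lr * g * xi
                w[i] += v[i]
    return sum((1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + w[2]))) - t) ** 2
               for (x1, x2), t in DATA) / len(DATA)

def fitness(ch):
    """A parameter pair is as good as the network it trains."""
    lr, mom = ch
    return -train_loss(lr, mom)

def mutate(ch):
    """Gaussian perturbation, clipped to sensible ranges."""
    lr, mom = ch
    return [min(max(lr + random.gauss(0, 0.3), 0.01), 5.0),
            min(max(mom + random.gauss(0, 0.1), 0.0), 0.95)]

pop = [[random.uniform(0.01, 5.0), random.uniform(0.0, 0.95)] for _ in range(12)]
for _ in range(15):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:4] + [mutate(random.choice(pop[:4])) for _ in range(8)]

best_lr, best_mom = max(pop, key=fitness)
```

Because each fitness evaluation is itself a full (short) training run, this is an expensive outer loop, which is why such studies typically keep the inner networks small.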

The evolution of human learning ability from relatively weak to very powerful suggests that introducing evolutionary mechanisms into neural network learning has great potential. Some studies in the field of artificial life embody the interrelationship between learning and evolution, but most work focuses on how learning guides evolution, or on the relationship between structure training and weight training; the evolution of learning rules has received only limited attention. Chalmers assumed that the weight update depends only on local information (the input signal, the output signal, the training signal, and the current connection weight) and expressed the learning rule as a linear function of these four independent variables and their six pairwise products. The coefficients of these 10 terms, together with one scale parameter, are encoded in the chromosome, and the goal of evolving the learning rule is to determine these coefficients. The networks used in the experiments had no hidden layer and a single output node, with the number of inputs varying from 2 to 7, and candidate learning rules were tested and evaluated on a variety of linearly separable learning tasks. GA eventually rediscovered the well-known delta rule and some of its variants. These simple, preliminary experiments show the potential of evolving learning rules to discover novel, useful ones; however, the constraint on the form of the rule, which allows only products of two variables and excludes products of three or four variables, may also prevent GA from discovering some new learning rules.
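Chalmers' representation of a local learning rule can be written down directly: the weight change is a scaled linear combination of the four local variables and their six pairwise products, and the delta rule falls out as one particular setting of the coefficients. Variable names and the example values below are illustrative.

```python
def delta_w(coeffs, scale, w, x, o, t):
    """Chalmers-style local learning rule: a linear function of the four
    local variables (current weight w, input x, output o, training signal t)
    and their six pairwise products. coeffs has 10 entries; scale plays the
    role of a learning rate. Evolution searches over coeffs and scale."""
    terms = [w, x, o, t,
             w * x, w * o, w * t,
             x * o, x * t, o * t]
    return scale * sum(c * v for c, v in zip(coeffs, terms))

# The well-known delta rule, delta_w = eta * (t - o) * x, is the special case
# with coefficient -1 on the x*o term and +1 on the x*t term (all others 0).
DELTA_RULE = [0, 0, 0, 0, 0, 0, 0, -1, 1, 0]

eta, w, x, o, t = 0.1, 0.5, 1.0, 0.8, 1.0
print(delta_w(DELTA_RULE, eta, w, x, o, t))   # equals eta * (t - o) * x
```

Encoding the 10 coefficients plus the scale in a chromosome and evaluating each candidate rule by how well a network trained with it learns a task set is then a standard GA setup, exactly as in the feature-selection and parameter-tuning examples above.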

3. Using GA to interpret and analyze neural networks
Unlike the use of GA to design NNs, some researchers are trying to use GA to interpret and analyze neural networks. Suzuki and Kakazu analyzed the basins of attraction of an associative memory model: to analyze the phenomenon that the memorization process within a basin of attraction is monotonic, they abstracted the essential feature of this phenomenon into a polynomial function and then used GA to optimize the polynomial's coefficients. Eberhart et al. used GA to analyze the decision surfaces of neural networks, providing a new path for the interpretation of neural networks.
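Optimizing polynomial coefficients with a GA, as in the Suzuki and Kakazu work described above, can be sketched as a simple curve-fitting problem. The target quadratic and all GA parameters here are illustrative assumptions; the original work fitted a polynomial abstraction of the memory process, not this toy data.

```python
import random

random.seed(4)

# Target curve whose coefficients the GA must recover (an illustrative
# stand-in for the polynomial abstraction of the phenomenon under study).
TRUE = [1.0, -2.0, 0.5]                      # f(x) = 1 - 2x + 0.5x^2
XS = [i / 10 for i in range(-20, 21)]
YS = [TRUE[0] + TRUE[1] * x + TRUE[2] * x * x for x in XS]

def fitness(c):
    """Negative squared error between the candidate polynomial and the data."""
    return -sum((c[0] + c[1] * x + c[2] * x * x - y) ** 2
                for x, y in zip(XS, YS))

def mutate(c, sigma=0.2):
    """Gaussian perturbation of each coefficient."""
    return [ci + random.gauss(0, sigma) for ci in c]

pop = [[random.uniform(-3, 3) for _ in range(3)] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:6] + [mutate(random.choice(pop[:6])) for _ in range(24)]

best = max(pop, key=fitness)
print([round(c, 2) for c in best])  # should lie near the target coefficients
```

Least squares would of course solve this particular fit directly; the GA formulation matters when the function being analyzed is not available in closed form, as in the attraction-basin analysis.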
