Using the Open-Source Java Project JOONE for Artificial Intelligence Programming


Introduction
Few programmers have never been drawn to artificial intelligence programming at one point or another. However, many programmers who become interested in AI quickly give up because of the complexity of the algorithms involved. In this article, we will discuss a Java open-source project that can greatly reduce this complexity.
The Java Object Oriented Neural Engine (JOONE) is an open-source project that provides Java programmers with a highly adaptable neural network. The JOONE source code is released under the LGPL. In short, this means that the source code is freely available and you can use JOONE without paying royalties. JOONE can be downloaded from http://joone.sourceforge.net.
JOONE allows you to easily create a neural network from within a Java program. JOONE supports many features, such as multithreading and distributed processing, which means it can take advantage of multiprocessor computers and of multiple computers working together.
Neural Networks
JOONE implements an artificial neural network in Java. An artificial neural network attempts to emulate the function of a biological neural network, the kind of network that makes up the brains of nearly all higher life forms on Earth. A neural network is composed of neurons. Figure 1 shows a diagram of an actual biological neuron.


Figure 1: A biological neuron


As you can see from Figure 1, a neuron is composed of a core cell body and several long connectors, which are called synapses. Neurons are connected to one another through these synapses, and both biological and artificial neural networks transmit signals from one neuron to the next through them.
Using JOONE
In this article, you will see a simple example of how to use JOONE. Neural networks are used in many different application areas; here we will show how to use JOONE to solve a very simple pattern-recognition problem. Pattern recognition is one of the most common applications of neural networks.
In pattern recognition, a pattern is presented to the neural network, and the network must determine whether it recognizes that pattern. The pattern should be able to be distorted to some degree and still be recognized. This is similar to a human's ability to recognize something such as a traffic sign: a person can identify a traffic sign on a rainy day, on a sunny day, or at night. Even though these images may look quite different, the human brain can still determine that they are the same sign.
To program with JOONE, you generally work with two types of objects. Layer objects describe one or more neurons that share similar characteristics. A neural network usually has one or two layers of neurons, and these layers are linked together by synapses, which carry the pattern to be recognized from one layer to the next.
Synapses do more than carry the pattern from one neuron layer to the next. They also apply weights to individual elements of the pattern, and these weights cause certain elements to be emphasized more than others as they are passed to the next layer. These weights form the memory of the neural network; by adjusting the weights stored in the synapses, you change the behavior of the network.
Synapses also play another role in JOONE: they can be thought of as data conduits. Just as synapses carry patterns from one neuron layer to another, specialized versions of the synapse are used to carry patterns into and out of the neural network. The following sections show how a simple neural network, with a single hidden layer, is built and used to recognize a pattern.
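
To make the idea of weighted connections concrete, the following small stand-alone sketch (plain Java, not JOONE code; the weights and pattern values are arbitrary) shows a two-element pattern being scaled by connection weights as it travels to a three-neuron layer:

// Conceptual illustration only -- not part of the JOONE API.
// Each connection between two layers stores a weight; the pattern is
// multiplied by these weights as it passes to the next layer.
public class WeightedConnectionDemo {
    public static void main(String[] args) {
        double[] pattern = { 1.0, 0.0 };        // pattern leaving the first layer
        double[][] weights = {                  // one weight per connection
            { 0.4, -0.7, 0.2 },                 // connections from neuron 0
            { 0.9, 0.1, -0.3 }                  // connections from neuron 1
        };
        double[] next = new double[3];          // values arriving at the next layer
        for (int j = 0; j < next.length; j++) {
            for (int i = 0; i < pattern.length; i++) {
                next[j] += pattern[i] * weights[i][j];
            }
        }
        System.out.println(java.util.Arrays.toString(next));
    }
}

Changing any value in the weights array changes what arrives at the next layer, which is exactly the mechanism a training algorithm exploits.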

Training Neural Networks
For the purposes of this article, we will teach JOONE to recognize a very simple pattern. For this pattern, we will examine a binary Boolean operation: XOR. The truth table for the XOR operation is listed below:

X Y X XOR Y
0 0 0
0 1 1
1 0 1
1 1 0


As you can see from the preceding table, the XOR operation yields true (1) only when X and Y have different values; in all other cases, the result is false (0). By default, JOONE reads its input from text files stored on your system. These text files are read using a special synapse called FileInputSynapse. To train the network on the XOR operation, you must create an input file that contains the data shown above. This file is shown in Listing 1.
Listing 1: Contents of the input file for the XOR problem
0.0;0.0;0.0
0.0;1.0;1.0
1.0;0.0;1.0
1.0;1.0;0.0
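
If you prefer to generate this file from code rather than typing it by hand, a short Java sketch such as the following would produce the same contents (the file name xor.txt is just an example):

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class CreateXorData {
    public static void main(String[] args) throws IOException {
        // Write the XOR truth table in the semicolon-separated format
        // expected by FileInputSynapse (see Listing 1).
        PrintWriter out = new PrintWriter(new FileWriter("xor.txt"));
        out.println("0.0;0.0;0.0");
        out.println("0.0;1.0;1.0");
        out.println("1.0;0.0;1.0");
        out.println("1.0;1.0;0.0");
        out.close();
    }
}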

Now let us analyze a simple program that teaches JOONE to recognize the XOR operation and produce the correct results. We begin with the process of training the neural network. Training consists of presenting the XOR problem to the network and observing the result. If the result is not what was expected, the training algorithm adjusts the weights stored in the synapses. The gap between the actual output of the neural network and the expected output is called the error. Training continues until the error falls below an acceptable level, usually expressed as a percentage, such as 10%. Let us now look at the code that must be used to train the neural network.
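
To make the notion of "error" concrete, here is a small stand-alone Java sketch (an illustration only; JOONE computes its own global error internally) that measures the gap between hypothetical network outputs and the expected XOR outputs using root mean squared error:

// Illustration of one common error measure (root mean squared error).
// The "actual" values are hypothetical network outputs, not JOONE results.
public class ErrorDemo {
    public static void main(String[] args) {
        double[] expected = { 0.0, 1.0, 1.0, 0.0 };     // XOR targets
        double[] actual   = { 0.05, 0.95, 0.92, 0.06 }; // hypothetical outputs
        double sum = 0.0;
        for (int i = 0; i < expected.length; i++) {
            double diff = expected[i] - actual[i];
            sum += diff * diff;
        }
        double rmse = Math.sqrt(sum / expected.length);
        System.out.println("Error: " + rmse);           // about 0.06, below the 10% level
    }
}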
Training begins by constructing the neural network: an input layer, a hidden layer, and an output layer must be created.
// First, create the three layers
input = new SigmoidLayer();
hidden = new SigmoidLayer();
output = new SigmoidLayer();

Each layer is created using JOONE's SigmoidLayer class. A SigmoidLayer produces its output using the sigmoid (logistic) function, which is based on the natural exponential. JOONE also provides other layer types that you may choose to use instead.
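
For reference, the sigmoid function maps any input into the range (0, 1) using f(x) = 1 / (1 + e^-x). A minimal Java sketch of the function itself (independent of JOONE) looks like this:

// The sigmoid function squashes any real input into the range (0, 1).
public class SigmoidDemo {
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(-4.0)); // close to 0
        System.out.println(sigmoid(0.0));  // exactly 0.5
        System.out.println(sigmoid(4.0));  // close to 1
    }
}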
Next, each layer is assigned a name. These names will help you later identify this layer during debugging.
input.setLayerName("input");
hidden.setLayerName("hidden");
output.setLayerName("output");

Each layer must then be sized. We specify the number of "rows" in each layer; the row count corresponds to the number of neurons in that layer.
input.setRows(2);
hidden.setRows(3);
output.setRows(1);

From the code above, we can see that the input layer has two neurons, the hidden layer has three neurons, and the output layer has one neuron. It is significant that the network has two input neurons and one output neuron, because the XOR operator takes two parameters and produces one result.
To use these neuron layers, we must also create synapses. In this example, we need two synapses, which are created with the following code.
// Input -> hidden connection
FullSynapse synapse_IH = new FullSynapse();
// Hidden -> output connection
FullSynapse synapse_HO = new FullSynapse();

Just as with the neuron layers, the synapses can be named to make debugging easier. The following code names the new synapses.
synapse_IH.setName("IH");
synapse_HO.setName("HO");

Finally, we must connect the synapses to the appropriate neuron layers. The following code does this.
// Connect the input layer to the hidden layer
input.addOutputSynapse(synapse_IH);
hidden.addInputSynapse(synapse_IH);
// Connect the hidden layer to the output layer
hidden.addOutputSynapse(synapse_HO);
output.addInputSynapse(synapse_HO);

Now that the neural network has been created, we must create a Monitor object that manages its training. The following code creates a Monitor object.
// Create a Monitor object and set the learning parameters
monitor = new Monitor();
monitor.setLearningRate(0.8);
monitor.setMomentum(0.3);

The learning rate and momentum are parameters that control how training proceeds. JOONE uses the backpropagation learning algorithm; for more information about the learning rate and momentum, refer to descriptions of backpropagation.
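
As a rough illustration of how these two parameters enter the picture, the textbook backpropagation update scales the error gradient by the learning rate and adds a fraction of the previous weight change controlled by the momentum. The sketch below uses made-up values and is not JOONE's internal code:

// Conceptual sketch of a backpropagation weight update.
// All numeric values are hypothetical.
public class WeightUpdateDemo {
    public static void main(String[] args) {
        double learningRate = 0.8;   // the value passed to setLearningRate above
        double momentum = 0.3;       // the value passed to setMomentum above
        double weight = 0.5;         // current weight of some connection
        double gradient = -0.2;      // error gradient for this weight
        double previousDelta = 0.05; // weight change from the previous cycle

        double delta = -learningRate * gradient + momentum * previousDelta;
        weight += delta;
        System.out.println("New weight: " + weight); // 0.5 + 0.16 + 0.015 = 0.675
    }
}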
This monitor object should be assigned to each neuron layer. The following code implements this.
input.setMonitor(monitor);
hidden.setMonitor(monitor);
output.setMonitor(monitor);

Like many Java objects, the JOONE Monitor allows listeners to be added to it. As training progresses, JOONE notifies the listeners about the progress of the training. In this simple example, we use:
monitor.addNeuralNetListener(this);
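
For completeness, a minimal sketch of the listener side of this call is shown below. It assumes the surrounding class implements JOONE's NeuralNetListener interface; the callback names (including JOONE's own "cicleTerminated" spelling) and the getGlobalError() accessor reflect the JOONE releases this article appears to target, so verify them against your version.

// Callbacks required by the NeuralNetListener interface (assumed API;
// check the interface definition in your JOONE release).
public void netStarted(NeuralNetEvent e) {
    System.out.println("Training started");
}

public void cicleTerminated(NeuralNetEvent e) {
    // Called at the end of each training cycle; the Monitor is the
    // event source and reports the current global error.
    Monitor mon = (Monitor) e.getSource();
    System.out.println("Error: " + mon.getGlobalError());
}

public void netStopped(NeuralNetEvent e) {
    System.out.println("Training stopped");
}

public void errorChanged(NeuralNetEvent e) {
    // Nothing to do for this simple example.
}

public void netStoppedError(NeuralNetEvent e, String error) {
    System.out.println("Training stopped on error: " + error);
}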
We must now create the input synapse. As mentioned above, we will use a FileInputSynapse to read a disk file. A disk file is not the only kind of input JOONE can accept; JOONE is very flexible with respect to input sources, and accepting a different input type only requires using a different kind of input synapse. In this example, we will simply use FileInputSynapse, which is first instantiated:
inputStream = new FileInputSynapse();
Next, we must tell FileInputSynapse which columns to use. The file shown in Listing 1 uses the first two columns as input data. The following code sets up the first two columns as the input to the neural network.
// The first two columns contain the input values
inputStream.setFirstCol(1);
inputStream.setLastCol(2);

Next, we must provide the name of the input file, which comes directly from the user interface. An edit control is provided to collect the name of the input file. The following code sets the file name for the FileInputSynapse.
// This is the file name that contains the input data
inputStream.setFileName(inputFile.getText());

As mentioned above, a synapse is simply a data conduit between neuron layers; the FileInputSynapse is the conduit through which data enters the neural network. To make this happen, we must add the FileInputSynapse to the input layer of the neural network.
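
Following the same addInputSynapse() pattern used above to wire the hidden and output layers, attaching the FileInputSynapse to the input layer would look like this (a sketch consistent with the calls already shown):

// Feed the training patterns from the file into the input layer
input.addInputSynapse(inputStream);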
