
Neural network

We used the Pattern Recognition tool in MATLAB R2014a (nprtool) to generate a neural network for our feature and target matrices. [3] The workflow for the general neural network design process has seven primary steps:

  1. Collect data
  2. Create the network
  3. Configure the network
  4. Initialize the weights and biases
  5. Train the network
  6. Validate the network (post-training analysis)
  7. Use the network

We used the MATLAB defaults for the initial weights and biases and tested several sizes for the hidden layer; a hidden layer of 20 neurons gave us the best testing results.
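As a rough command-line equivalent of what nprtool sets up (a sketch only; the random matrices below are stand-ins for our real feature and target data, and the toolbox expects one column per sample), the network can be created like this:

    x = rand(12, 150);                    % stand-in for the (transposed) 12 x 150 feature matrix
    t = full(ind2vec(randi(5, 1, 150)));  % stand-in 5 x 150 one-hot target matrix
    net = patternnet(20);                 % pattern-recognition network, 20 hidden neurons
    net = configure(net, x, t);           % size the input and output layers from the data
    net = init(net);                      % MATLAB-default weight/bias initialization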

Architecture

A neural network is composed of multiple layers of neuron models, and each neuron can have multiple weighted inputs. An example neuron with R inputs is shown below in Figure 2. Each input is weighted (the weights start out random before training), and the sum of the weighted inputs and the bias forms the input to the transfer function f. These weights and biases are adjusted during training [4].


Figure 2. General neuron model.

Each neuron then applies a differentiable transfer function f to this sum to generate its output. Tan-sigmoid transfer functions are commonly used for pattern recognition problems, so the 20 neurons in our hidden layer use the tan-sigmoid (tansig) transfer function.
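To make the neuron model in Figure 2 concrete, the sketch below computes a single hidden neuron's output; the weights, bias, and inputs are random stand-ins, as they would be before training:

    R = 12;            % number of inputs (our 12 features)
    p = rand(R, 1);    % input vector
    w = randn(1, R);   % input weights (random before training)
    b = randn;         % bias
    n = w*p + b;       % weighted sum of the inputs plus the bias
    a = tansig(n);     % tan-sigmoid output, in the range (-1, 1)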


Figure 3. Our output neural network.

Back-propagation training

This type of neural network architecture is called a feedforward network. Feedforward networks often have one or more hidden layers of sigmoid neurons followed by an output layer of linear neurons. We trained ours using a method called back-propagation. Back-propagation is commonly used in image processing applications because the problem is extremely complex (many features) but has a clear solution (a fixed number of classes).

Back-propagation works in small iterative steps. The input matrix is applied to the network, and the network produces an initial output based on the current state of its synaptic weights (before training, this output is essentially random because the weights are random). This output is compared to the target matrix, and a mean-squared error signal is calculated. [5]

The error value is propagated backwards through the network, and small changes are made to the weights in each layer to reduce this error. The whole process is repeated for each training image (9 for each class) and then begins again with the first case; these iterations continue until the error no longer changes.
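The sketch below shows one simplified back-propagation step for a single training sample, using plain gradient descent with a made-up learning rate; the actual training algorithm selected by nprtool differs, but the backward flow of the error through the layers is the same idea:

    p   = rand(12, 1);                % one feature vector (12 features, stand-in values)
    t   = [1; 0; 0; 0; 0];            % one-hot target (class 1 of 5)
    W1  = 0.1*randn(20, 12); b1 = zeros(20, 1);   % hidden layer, 20 neurons
    W2  = 0.1*randn(5, 20);  b2 = zeros(5, 1);    % output layer, 5 classes
    eta = 0.01;                       % learning rate (assumed)

    % Forward pass
    a1 = tansig(W1*p + b1);           % hidden activations
    a2 = W2*a1 + b2;                  % output (linear here for simplicity)
    e  = t - a2;                      % error signal
    mse = mean(e.^2);                 % mean-squared error

    % Backward pass: propagate the error and nudge the weights and biases
    d2 = -2*e / numel(e);             % gradient of the MSE with respect to the output
    d1 = (W2.'*d2) .* (1 - a1.^2);    % chain rule through the tan-sigmoid layer
    W2 = W2 - eta*(d2*a1.');  b2 = b2 - eta*d2;
    W1 = W1 - eta*(d1*p.');   b1 = b1 - eta*d1;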

Testing

The inputs to the neural network are the feature matrix and the target matrix. The feature matrix is the same one used for the SVM: 150 samples by 12 elements (150 images, 12 features). The target matrix identifies the correct class of each image and is 150 samples by 5 elements (150 images, 5 classes). To construct it, we simply coded a 1 in the column of the corresponding class for each image and a 0 in every other column.
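For illustration, such a target matrix can be coded from a vector of class labels as shown below (the labels vector here is a random stand-in; ours came from the known class of each image):

    labels = randi(5, 150, 1);   % stand-in class label (1-5) for each image
    T = zeros(150, 5);           % 150 images by 5 classes
    for i = 1:150
        T(i, labels(i)) = 1;     % 1 in the column of the true class, 0 elsewhere
    end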

We used nprtool to automatically separate the input matrix into the training (60%), validation (20%), and test (20%) matrices. These percentages are consistent with the SVM method.
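The same 60/20/20 split can also be set explicitly on the network from the command line, continuing the network-creation sketch above ('dividerand' is the toolbox's random-split division function):

    net.divideFcn = 'dividerand';       % random division into train/validation/test
    net.divideParam.trainRatio = 0.60;
    net.divideParam.valRatio   = 0.20;
    net.divideParam.testRatio  = 0.20;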

After the data is loaded, MATLAB generates a neural network with our specifications (20 hidden neurons, back-propagation training). We trained the network and generated the confusion matrices and receiver operating characteristic (ROC) curves. The confusion matrices show the results of the neural network for the training, validation, and testing phases, and the ROC curves plot the true positive rate against the false positive rate.
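Continuing the same sketch, training and producing these plots from the command line looks roughly like this (plotconfusion and plotroc are the toolbox plotting functions behind the plots that the nprtool GUI offers):

    [net, tr] = train(net, x, t);   % back-propagation training with validation stopping
    y = net(x);                     % network outputs for all 150 samples
    plotconfusion(t, y);            % confusion matrix
    plotroc(t, y);                  % ROC curves, one per class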

Source: OpenStax, Automatic white blood cell classification using svm and neural networks. OpenStax CNX. Dec 16, 2015. Download for free at http://legacy.cnx.org/content/col11924/1.5