
Sunday, September 26, 2010

NEURAL NETWORKS

ABSTRACT:


Artificial neural networks are a method of computation and information processing that takes advantage of today's technology. Mimicking the processes found in biological neurons, artificial neural networks are used to predict and learn from a given set of data. Neural networks are more robust at data analysis than statistical methods because of their ability to handle small variations in parameters and noise.

The basic element of a neural network is the perceptron. First proposed by Frank Rosenblatt in 1958 at Cornell University, the perceptron has five basic elements: an n-vector input, weights, a summing function, a threshold device, and an output. Outputs are in the form of -1 and/or +1. The threshold device has a setting which governs the output based on the summation of the input vector: if the summation falls below the threshold setting, the output is -1; if the summation exceeds the threshold setting, the output is +1.

A more technical look at a single-neuron perceptron shows that it takes an input vector X of N dimensions. These inputs pass through a weight vector W, also of N dimensions. The summation node computes "a", the dot product of vectors X and W plus a bias. "a" is then passed through an activation function which compares its value to a predefined threshold. If "a" is below the threshold, the perceptron does not fire; if it is above the threshold, the perceptron fires one pulse whose amplitude is predefined.
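As a rough illustration of the single-neuron perceptron described above, here is a minimal sketch in Python (assuming NumPy, a threshold of zero, and hypothetical weights, bias, and input values not taken from the post):

import numpy as np

def perceptron(x, w, bias, threshold=0.0):
    """Single-neuron perceptron: weighted sum plus bias, thresholded to -1 or +1."""
    a = np.dot(x, w) + bias             # summation node: a = X . W + bias
    return 1 if a > threshold else -1   # activation: fire (+1) above threshold, else -1

# Example with a 3-dimensional input (hypothetical values)
x = np.array([0.5, -1.0, 2.0])   # input vector X
w = np.array([0.4, 0.3, 0.6])    # weight vector W
print(perceptron(x, w, bias=-0.5))  # prints 1, since 0.2 - 0.3 + 1.2 - 0.5 = 0.6 > 0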


For more information, visit http://www.enjineer.com/forum
