ADALINE

ADALINE (Adaptive Linear Neuron, later Adaptive Linear Element) is a single-layer neural network. It was developed by Professor Bernard Widrow and his graduate student Ted Hoff at Stanford University in 1960. It is based on the McCulloch-Pitts neuron and consists of a weight vector, a bias, and a summation function.

The difference between ADALINE and the standard (McCulloch-Pitts) perceptron lies in the learning phase: in ADALINE, the weights are adjusted according to the weighted sum of the inputs (the net), whereas in the standard perceptron the net is first passed through the activation (transfer) function and the function's output is used for adjusting the weights.
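To make the distinction concrete, here is a minimal Python sketch of the two update rules (the function and variable names are illustrative, not from the original sources; d is the desired output and eta the learning rate, as defined formally in the Learning Algorithm section below):

    def net(x, w):
        # Weighted sum of the inputs (the net).
        return sum(xj * wj for xj, wj in zip(x, w))

    def adaline_update(x, w, d, eta):
        # ADALINE: the error is measured against the raw net.
        error = d - net(x, w)
        return [wj + eta * error * xj for xj, wj in zip(x, w)]

    def perceptron_update(x, w, d, eta):
        # Perceptron: the net is first passed through the activation
        # function (here a sign function), and the error is measured
        # against that output.
        output = 1.0 if net(x, w) >= 0 else -1.0
        error = d - output
        return [wj + eta * error * xj for xj, wj in zip(x, w)]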

There also exists a multilayer extension known as Madaline.

Learning inside a single-layer ADALINE

Definition

ADALINE is a single-layer neural network with multiple nodes, where each node accepts multiple inputs and generates one output. Given the following variables:

  • x is the input vector
  • w is the weight vector
  • n is the number of inputs
  • θ is some constant (the bias)
  • y is the output

then we find that the output is y=\sum_{j=1}^{n} x_j w_j + \theta. If we further assume that

  • x_{n+1} = 1
  • w_{n+1} = \theta

then the output reduces to the dot product of x and w: y = x \cdot w.
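For illustration, this bias-folding trick can be written as a short Python sketch (the names are illustrative assumptions, not part of the article):

    def adaline_output(x, w, theta):
        # Append the constant input x_{n+1} = 1 and treat the bias
        # theta as the extra weight w_{n+1}.
        x_aug = x + [1.0]
        w_aug = w + [theta]
        # The output is then just the dot product y = x . w.
        return sum(xj * wj for xj, wj in zip(x_aug, w_aug))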

Learning Algorithm

Let us assume:

  • η is the learning rate (some constant)
  • d is the desired output
  • o is the actual output

then the weights are updated as follows: w \leftarrow w + \eta(d-o)x. ADALINE converges in the least-squares sense, minimizing the squared error E = (d-o)^2 [1].
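A minimal training loop implementing this rule might look as follows (a Python sketch; the dataset, initialization, and epoch count are illustrative assumptions, not part of the article):

    import random

    def train_adaline(samples, n_inputs, eta=0.1, epochs=100):
        # LMS (delta) rule: w <- w + eta * (d - o) * x.
        # Each x in samples already includes the constant bias input
        # x_{n+1} = 1, so theta is learned as the last weight.
        w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs + 1)]
        for _ in range(epochs):
            for x, d in samples:
                o = sum(xj * wj for xj, wj in zip(x, w))  # actual output (the net)
                error = d - o
                w = [wj + eta * error * xj for xj, wj in zip(x, w)]
        return w

    # Illustrative usage: learn the noiseless linear target
    # d = 2*x1 - x2 + 0.5 from a small grid of points.
    data = [([x1, x2, 1.0], 2 * x1 - x2 + 0.5)
            for x1 in (-1.0, 0.0, 1.0) for x2 in (-1.0, 0.0, 1.0)]
    print(train_adaline(data, n_inputs=2))  # approaches [2.0, -1.0, 0.5]

Because o is the raw net, this update is exactly gradient descent on E, up to a constant factor of 2 absorbed into \eta.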

References

