Book: Richa Kathuria Karthikeyan, «Backpropagation and it's Modifications»


Publisher: "LAP Lambert Academic Publishing"

Gradient-based methods are among the most widely used error-minimization techniques for training backpropagation networks. The BP training algorithm is a supervised learning method for multi-layered feedforward neural networks. It is essentially a gradient-descent local optimization technique that performs backward error correction of the network weights. It suffers from slow convergence, a tendency to become trapped in local minima, and limited performance. To address these limitations, modifications such as momentum and bias terms and conjugate-gradient methods are used. In the conjugate-gradient algorithms, a search is performed along conjugate directions, which generally produces faster convergence than searching along steepest-descent directions. In this monograph we consider the parity-bit checking problem using the conventional backpropagation method and other methods. A suitable neural network must be constructed and trained properly, and the training dataset is used to train the classification engine ...
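The combination the abstract describes can be sketched in a few lines: plain backpropagation on a parity task, with a momentum term added to the weight updates. This is a minimal illustrative sketch in NumPy, assuming a 3-bit parity problem and one hidden layer; the layer size, learning rate, and momentum coefficient are arbitrary choices for demonstration, not the monograph's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# All 3-bit inputs and their parity targets (1 if an odd number of ones).
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)],
             dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units (an illustrative size).
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

lr, mu = 0.5, 0.9                 # learning rate and momentum coefficient
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

losses = []
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))

    # Backward pass: gradients of the mean-squared error
    # propagated through the sigmoid derivatives.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Momentum update: v <- mu * v - lr * grad;  w <- w + v.
    vW2 = mu * vW2 - lr * (h.T @ d_out) / len(X); W2 += vW2
    vb2 = mu * vb2 - lr * d_out.mean(axis=0);     b2 += vb2
    vW1 = mu * vW1 - lr * (X.T @ d_h) / len(X);   W1 += vW1
    vb1 = mu * vb1 - lr * d_h.mean(axis=0);       b1 += vb1

print(f"initial MSE {losses[0]:.3f} -> final MSE {losses[-1]:.3f}")
```

The momentum term `mu * v` carries a fraction of the previous update into the current one, which damps oscillations across narrow valleys of the error surface and helps the search roll past shallow local minima, the failure modes of plain gradient descent that the abstract mentions.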

Published by "LAP Lambert Academic Publishing" (2012)

ISBN: 9783847379355

See also in other dictionaries:

  • Synaptic weight — In neuroscience and computer science, synaptic weight refers to the strength or amplitude of a connection between two nodes, corresponding in biology to the amount of influence the firing of one neuron has on another. The term is typically used… (Wikipedia)
