back-propagation

(Or "backpropagation") A learning algorithm for modifying a feed-forward neural network which minimises a continuous "error function" or "objective function." Back-propagation is a "gradient descent" method of training in that it uses gradient information to modify the network weights to decrease the value of the error function on subsequent tests of the inputs. Other gradient-based methods from numerical analysis can be used to train networks more efficiently.
Back-propagation makes use of a mathematical trick when the network is simulated on a digital computer, yielding, in just two traversals of the network (once forward and once back), both the difference between the desired and actual output and the derivatives of this difference with respect to the connection weights.
The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
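As an illustration of the gradient descent step described above (the notation below is generic and not part of the FOLDOC entry): each connection weight w_{ij} is moved against the gradient of the error function E, scaled by a learning rate \eta:

    w_{ij} \leftarrow w_{ij} - \eta \frac{\partial E}{\partial w_{ij}}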
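The two-traversal computation can also be made concrete in code. Below is a minimal sketch in Python with NumPy, assuming a single hidden layer, sigmoid activations, and a squared-error function; these choices, along with the layer sizes and learning rate, are illustrative assumptions rather than anything the entry specifies.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Tiny assumed network: 2 inputs -> 2 hidden units -> 1 output.
    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((2, 2))   # input-to-hidden weights
    W2 = rng.standard_normal((1, 2))   # hidden-to-output weights
    eta = 0.5                          # learning rate (assumed value)

    x = np.array([0.0, 1.0])           # one training input
    t = np.array([1.0])                # desired output

    # Forward traversal: compute activations layer by layer.
    h = sigmoid(W1 @ x)                # hidden activations
    y = sigmoid(W2 @ h)                # actual network output

    # Error: squared difference between desired and actual output.
    E = 0.5 * np.sum((y - t) ** 2)

    # Backward traversal: one application of the chain rule yields
    # dE/dW for every connection weight in a single pass.
    delta2 = (y - t) * y * (1 - y)           # output-layer error signal
    delta1 = (W2.T @ delta2) * h * (1 - h)   # hidden-layer error signal
    dW2 = np.outer(delta2, h)                # dE/dW2
    dW1 = np.outer(delta1, x)                # dE/dW1

    # Gradient descent step: move each weight against its gradient.
    W2 -= eta * dW2
    W1 -= eta * dW1

The forward traversal computes the activations and the output error; the backward traversal then propagates the error signal from the output layer back towards the input, producing the derivative of the error with respect to every connection weight.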