Error Weighting in Artificial Neural Networks Learning Interpreted as a Metaplasticity Model
Biomedical Sciences

International Journal of Biomedical Data Mining

Author(s): Diego Andina, Aleksandar Jevtić, Alexis Marcano, J. M. Barrón Adame

Abstract

Many Artificial Neural Network design algorithms or learning methods involve the minimization of an error objective function. During learning, weight values are updated following a strategy that tends to minimize the final mean error in the network's performance. Weight values are classically seen as a representation of the synaptic weights in biological neurons, and their ability to change their values can be interpreted as artificial plasticity, inspired by this biological property of neurons. In that spirit, metaplasticity is interpreted in this paper as the ability to change the efficiency of artificial plasticity, giving more relevance to the weight updates produced by less frequent activations and less relevance to those produced by frequent ones. Modeling this interpretation in the training phase, the hypothesis of an improved training is tested for the Multilayer Perceptron trained with Backpropagation. The results show a much more efficient training while maintaining the Artificial Neural Network's performance.
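The idea in the abstract, that weight updates from less frequent input patterns should be amplified relative to frequent ones, can be sketched as a per-pattern factor multiplying the standard backpropagation update. The weighting function `metaplasticity_factor` below is an illustrative assumption (a stand-in for an inverse-probability estimate), not the function derived in the paper, and the toy XOR setup and all parameter values (`A`, `B`, the learning rate, the network size) are hypothetical choices for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy training set: XOR, four patterns presented with equal frequency.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Multilayer Perceptron with one hidden layer, sigmoid units throughout.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

def metaplasticity_factor(x, A=1.0, B=0.5):
    # Illustrative weighting: the factor grows as the pattern moves away
    # from the origin, standing in for "less frequent activations receive
    # larger updates". This exact form is an assumption for this sketch.
    return A / (B + np.exp(-np.dot(x, x)))

def forward(x):
    h = sigmoid(x @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    return h, o

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

initial_loss = mse(forward(X)[1], y)

lr = 0.5
for epoch in range(3000):
    for i in range(len(X)):
        x, t = X[i], y[i]
        h, o = forward(x)
        # Standard backpropagation deltas for MSE loss and sigmoid units.
        delta_o = (o - t) * o * (1.0 - o)
        delta_h = (delta_o @ W2.T) * h * (1.0 - h)
        # Metaplasticity model: scale the whole update by the pattern factor,
        # so rarer (here: larger-norm) patterns drive larger weight changes.
        f = metaplasticity_factor(x)
        W2 -= lr * f * np.outer(h, delta_o); b2 -= lr * f * delta_o
        W1 -= lr * f * np.outer(x, delta_h); b1 -= lr * f * delta_h

final_loss = mse(forward(X)[1], y)
```

The only change relative to plain backpropagation is the single multiplicative factor `f`; setting `metaplasticity_factor` to return a constant recovers the classical update rule exactly.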

This article was published in Bio-inspired Modeling of Cognitive Tasks and referenced in International Journal of Biomedical Data Mining
