Version 1: Received: 27 February 2023 / Approved: 1 March 2023 / Online: 1 March 2023 (01:57:59 CET)
Version 2: Received: 2 March 2023 / Approved: 3 March 2023 / Online: 3 March 2023 (01:24:54 CET)
Version 3: Received: 8 March 2023 / Approved: 9 March 2023 / Online: 9 March 2023 (02:04:21 CET)
BOUGHAMMOURA, A. A Two-Step Rule for Backpropagation. International Journal of Informatics and Applied Mathematics 2023, doi:10.53508/ijiam.1265832.
Abstract
We present a simplified computational rule for the back-propagation formulas of artificial neural networks. In this work, we provide a generic two-step rule for the back-propagation algorithm in matrix notation, one that incorporates both the forward and backward phases of the computations involved in the learning process. Specifically, this recursive rule efficiently propagates the changes to all synaptic weights in the network, layer by layer. In particular, we use this rule to compute both the up and down partial derivatives of the cost function with respect to all the connections feeding into the output layer.
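The paper's specific two-step rule is not reproduced in this abstract, but the forward and backward phases it reorganizes can be illustrated with the conventional matrix-notation back-propagation sketch below. This is a hypothetical minimal example, assuming a two-layer fully connected network with sigmoid activations and a squared-error cost; the layer sizes, variable names, and cost function are illustrative choices, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))   # 4 samples, 3 input features
Y = rng.standard_normal((4, 2))   # 4 samples, 2 targets
W1 = rng.standard_normal((3, 5))  # input -> hidden weights
W2 = rng.standard_normal((5, 2))  # hidden -> output weights

# Forward phase: propagate activations layer by layer.
A1 = sigmoid(X @ W1)   # hidden-layer activations
A2 = sigmoid(A1 @ W2)  # output-layer activations

# Backward phase: propagate the error signal (delta) back,
# layer by layer, yielding the partial derivatives of the
# cost C = 0.5 * sum((A2 - Y)**2) with respect to each weight matrix.
delta2 = (A2 - Y) * A2 * (1 - A2)         # output-layer error signal
grad_W2 = A1.T @ delta2                   # dC/dW2
delta1 = (delta2 @ W2.T) * A1 * (1 - A1)  # hidden-layer error signal
grad_W1 = X.T @ delta1                    # dC/dW1
```

Each backward step reuses the activations stored during the forward pass, which is what makes the layer-by-layer propagation of weight changes efficient.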
Keywords
Artificial neural networks; back-propagation
Subject
Computer Science and Mathematics, Computational Mathematics
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.