
ON THE GRADIENT OF NEURONETWORK FUNCTION

Abstract

The paper proposes a matrix formula for the gradient ∇_W f(X; W) of a neural network function with respect to the parameter vector W.
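The paper's matrix formula itself is not reproduced on this page. As a hedged illustration of the kind of computation involved, the sketch below (an assumption, not the paper's result) computes the gradient of a one-hidden-layer network f(x; W1, w2) = w2·σ(W1 x) by standard backpropagation, where the Hadamard product named in the keywords appears in the backward step, and verifies it against finite differences:

```python
import numpy as np

# Illustrative only: gradients of a one-hidden-layer network
# f(x; W1, w2) = w2 . sigma(W1 x). The Hadamard (elementwise)
# product appears in the backward pass as w2 * sigma'(W1 x).

def sigma(z):
    return 1.0 / (1.0 + np.exp(-z))

def f(x, W1, w2):
    return w2 @ sigma(W1 @ x)

def grads(x, W1, w2):
    h = sigma(W1 @ x)            # hidden activations
    dw2 = h                      # df/dw2 = sigma(W1 x)
    delta = w2 * h * (1.0 - h)   # Hadamard product: w2 ∘ sigma'(W1 x)
    dW1 = np.outer(delta, x)     # df/dW1 as a matrix (outer product)
    return dW1, dw2

# Finite-difference check of the analytic gradient dW1.
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1 = rng.normal(size=(4, 3))
w2 = rng.normal(size=4)
dW1, dw2 = grads(x, W1, w2)

eps = 1e-6
num = np.zeros_like(W1)
for i in range(4):
    for j in range(3):
        Wp = W1.copy(); Wp[i, j] += eps
        Wm = W1.copy(); Wm[i, j] -= eps
        num[i, j] = (f(x, Wp, w2) - f(x, Wm, w2)) / (2 * eps)

print(np.allclose(dW1, num, atol=1e-5))
```

The check prints True when the analytic backpropagation gradient matches the numerical one, which is the standard way to validate such matrix gradient formulas.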

Keywords

neuronetwork function; neural network; backpropagation algorithm; Hadamard product


DOI

10.20310/1810-0198-2017-22-3-552-557

UDC

519.85

Pages

552-557

References

1. Haykin S. Neural Networks: A Comprehensive Foundation. Moscow: Viliams, 2006.
2. Rumelhart D.E., Hinton G.E., Williams R.J. Learning Internal Representations by Error Propagation // Parallel Distributed Processing. Cambridge, MA: MIT Press, 1986. V. 1. P. 318-362.

Received

2017-03-02

Section of issue

Functional-differential equations and inclusions and their applications to mathematical modeling
