A-Team

Initial Profile
[[File:neuralnet_chart.jpg]]
After the initial profile it is clear that the dot product function consumes 97.94% of the run time. The transpose function also consumes 1.45%, which may seem negligible; however, transpose is called again during back propagation, along with the two rectifiers (activation functions), reluPrime and relu.
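To see why dot dominates, here is a minimal sketch of what these four routines typically look like in a from-scratch network, assuming row-major matrices stored as std::vector<float>; the storage layout and the dimension parameters m, n, k are assumptions, not the project's actual code. dot is a naive triple-loop matrix multiply, so its cost grows with the product of all three dimensions, while transpose, relu and reluPrime are single passes over the data.

#include <algorithm>
#include <cstddef>
#include <vector>

// dot: naive matrix multiply of an (m x n) matrix by an (n x k) matrix.
// The triple loop is why this function dominates the profile.
static std::vector<float> dot(const std::vector<float>& a,
                              const std::vector<float>& b,
                              int m, int n, int k) {
    std::vector<float> c(m * k, 0.0f);
    for (int i = 0; i < m; ++i)
        for (int j = 0; j < k; ++j)
            for (int p = 0; p < n; ++p)
                c[i * k + j] += a[i * n + p] * b[p * k + j];
    return c;
}

// transpose: swaps rows and columns; called in both the forward and backward passes.
static std::vector<float> transpose(const std::vector<float>& a, int rows, int cols) {
    std::vector<float> t(rows * cols);
    for (int i = 0; i < rows; ++i)
        for (int j = 0; j < cols; ++j)
            t[j * rows + i] = a[i * cols + j];
    return t;
}

// relu: rectifier activation, max(0, x) applied element-wise.
static std::vector<float> relu(std::vector<float> z) {
    for (float& v : z) v = std::max(0.0f, v);
    return z;
}

// reluPrime: derivative of the rectifier, 1 where the input is positive, otherwise 0.
static std::vector<float> reluPrime(std::vector<float> z) {
    for (float& v : z) v = (v > 0.0f) ? 1.0f : 0.0f;
    return z;
}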
During training, the back propagation step is where transpose, reluPrime and dot are called together, which is why transpose's 1.45% still matters. A hedged sketch of one such step follows, continuing the helper sketch above; the single hidden layer, the weight names W1 and W2, and the learning-rate parameter are assumptions, not the project's actual identifiers.
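// Back propagation
// Continuing the sketch above: one gradient step for a single hidden layer
// with a squared-error output. Dimensions: X is (batch x in), W1 is (in x hid),
// W2 is (hid x out); a1 = relu(dot(X, W1)) and yHat = dot(a1, W2) come from the
// forward pass. All names here are placeholders.
void backprop(const std::vector<float>& X,
              const std::vector<float>& a1,
              const std::vector<float>& yHat,
              const std::vector<float>& y,
              std::vector<float>& W1, std::vector<float>& W2,
              int batch, int in, int hid, int out, float lr) {
    // Output-layer error: dYhat = yHat - y (element-wise).
    std::vector<float> dYhat(yHat.size());
    for (std::size_t i = 0; i < yHat.size(); ++i) dYhat[i] = yHat[i] - y[i];

    // dW2 = dot(transpose(a1), dYhat) -- transpose and dot are called again here.
    std::vector<float> dW2 = dot(transpose(a1, batch, hid), dYhat, hid, batch, out);

    // Error propagated to the hidden layer:
    // dA1 = dot(dYhat, transpose(W2)) masked element-wise by reluPrime.
    std::vector<float> dA1 = dot(dYhat, transpose(W2, hid, out), batch, out, hid);
    std::vector<float> rp = reluPrime(a1);
    for (std::size_t i = 0; i < dA1.size(); ++i) dA1[i] *= rp[i];

    // dW1 = dot(transpose(X), dA1)
    std::vector<float> dW1 = dot(transpose(X, batch, in), dA1, in, batch, hid);

    // Gradient descent update of the weights.
    for (std::size_t i = 0; i < W1.size(); ++i) W1[i] -= lr * dW1[i];
    for (std::size_t i = 0; i < W2.size(); ++i) W2[i] -= lr * dW2[i];
}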