A-Team

The initial profile makes it obvious that the dot product function consumes 97.94% of our run time. The transpose function consumes another 1.45%, which seems measly; however, transpose is also called during back propagation, along with two rectifiers (activation functions), relu and reluPrime, where reluPrime is a binary activation function:
Relu: f(x) = {x for x > 0, 0 otherwise}
ReluPrime: f'(x) = {1 for x > 0, 0 otherwise}
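
Both rectifiers are cheap, element-wise operations. A minimal C sketch of the two functions as defined above (the double element type and the exact signatures in our framework are assumptions here):

 /* Forward-pass rectifier: returns x for x > 0, 0 otherwise. */
 double relu(double x)
 {
     return x > 0.0 ? x : 0.0;
 }
 
 /* Derivative used during back propagation: 1 for x > 0, 0 otherwise. */
 double reluPrime(double x)
 {
     return x > 0.0 ? 1.0 : 0.0;
 }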
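
For context on why the dot product dominates the profile: a naive matrix product performs O(n*m*p) multiply-adds, while transpose and the two rectifiers are each a single O(n*m) pass over the data. A sketch of such a triple loop follows; the name dotProduct, the row-major layout, and the signature are assumptions for illustration, not our framework's actual code.

 #include <stddef.h>
 
 /* Naive matrix product: out (n x p) = a (n x m) * b (m x p),
    with all matrices stored row-major. */
 void dotProduct(const double *a, const double *b, double *out,
                 size_t n, size_t m, size_t p)
 {
     for (size_t i = 0; i < n; i++) {
         for (size_t j = 0; j < p; j++) {
             double sum = 0.0;
             for (size_t k = 0; k < m; k++)
                 sum += a[i * m + k] * b[k * p + j];
             out[i * p + j] = sum;
         }
     }
 }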