
Activation functions

What is it

Activation functions are non-linear functions applied by the Activation layer. Without them, a stack of Dense layers would collapse into a single linear transformation, so the activation function is what lets the network learn non-linear mappings such as XOR.
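As a rough illustration of where these functions fit, an Activation layer applies the chosen function elementwise in the forward pass and multiplies the incoming gradient by the function's derivative in the backward pass. The sketch below assumes a NumPy-style implementation; the class and method names are illustrative and not necessarily this library's actual API.

```python
# Minimal sketch of an activation layer (illustrative names, not the library's API).
import numpy as np

class ActivationLayer:
    def __init__(self, fn, fn_prime):
        self.fn = fn              # the activation function itself
        self.fn_prime = fn_prime  # its derivative, used during backpropagation

    def forward(self, x):
        self.x = x                # cache the input for the backward pass
        return self.fn(x)         # apply the non-linearity elementwise

    def backward(self, grad_output):
        # chain rule: dL/dx = dL/dy * f'(x)
        return grad_output * self.fn_prime(self.x)
```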

Different functions

Leaky ReLU: ActivationFunctions.leaky_relu

Tanh: ActivationFunctions.tanh
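For reference, these functions are typically defined as below. This is a sketch with a common leak slope of 0.01; the actual ActivationFunctions implementation may use different parameter values.

```python
# Illustrative definitions; ActivationFunctions.leaky_relu and
# ActivationFunctions.tanh in the library may differ in details.
import numpy as np

def leaky_relu(x, alpha=0.01):
    # passes positive inputs through unchanged, scales negative inputs by alpha
    return np.where(x > 0, x, alpha * x)

def tanh(x):
    # hyperbolic tangent, squashes inputs into the range (-1, 1)
    return np.tanh(x)
```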