Layers > Activation

What is it

The Activation layer takes all of its inputs and applies the activation function to each one.

It is what allows the network to learn non-linear functions: without Activation layers, stacked Dense layers compose into a single linear transformation, so the network could only learn linear functions. This makes it an essential part of the network.
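As a minimal sketch of this behavior (not the library's actual code, and assuming a negative slope of 0.01 for the leaky ReLU), the forward pass simply maps the chosen function over every input value:

// Sketch: the Activation layer's forward pass applies the chosen
// activation function to each input element, one by one.
fn leaky_relu(x f64) f64 {
	return if x > 0.0 { x } else { 0.01 * x }
}

fn activation_forward(inputs []f64) []f64 {
	return inputs.map(leaky_relu(it))
}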

How to create an Activation layer

To create a layer, we first need a neural network to add it to, as seen in the XOR example.

We can then call the add_layer method on the neural network, passing the layer we want to add as the argument. In our case we are going to create an Activation layer.

The layer only needs one parameter: the activation function we chose for it. A commonly used one is the leaky ReLU function, as its derivative is easy to compute and does not fade out for large inputs the way a sigmoid's derivative does. Other activation functions can be found here.

model.add_layer(nn.Activation.new(.leaky_relu))
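To see why the leaky ReLU's derivative is convenient, here is a hedged comparison sketch (the function names and the 0.01 slope are illustrative assumptions, not part of the library's API):

import math

// Leaky ReLU's derivative is a constant on each side of zero, so it is
// trivial to compute and never shrinks toward zero.
fn leaky_relu_prime(x f64) f64 {
	return if x > 0.0 { 1.0 } else { 0.01 }
}

// A sigmoid's derivative s(x) * (1 - s(x)) peaks at 0.25 and approaches
// zero as |x| grows, which is the fading out mentioned above.
fn sigmoid_prime(x f64) f64 {
	s := 1.0 / (1.0 + math.exp(-x))
	return s * (1.0 - s)
}

Because leaky_relu_prime never returns zero, gradients keep flowing through the layer during backpropagation, even in networks with many stacked layers.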