Training > Backpropagation

What is it

Backpropagation is a commonly used training algorithm for neural networks. It computes the gradient of the cost function with respect to each weight and bias by propagating the error backwards through the layers, then uses those gradients to adjust the parameters.

Parameters

Several parameters must be specified for the neural network to train successfully.

The learning_rate is a factor that controls the size of the adjustments made to the weights and biases. If it is high, training can be faster but may overshoot the solution. If it is low, training takes more time but can be more precise and approach the solution more safely.

The momentum is a factor that controls how much of the previous adjustment is added to the new adjustment of the weights and biases. In many cases it makes the network converge much faster, as the updates gain momentum towards the solution and are less affected by noise in the data.
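As a rough sketch, the standard gradient-descent-with-momentum update for a weight w at step t can be written as follows (the symbols are illustrative; the exact formulation used by this library may differ):

    \Delta w_t = \mu \, \Delta w_{t-1} - \eta \, \frac{\partial C}{\partial w}, \qquad w \leftarrow w + \Delta w_t

where \eta is the learning_rate, \mu is the momentum, and C is the cost function. With \mu = 0 this reduces to plain gradient descent; a larger \mu lets consistent gradient directions accumulate across steps while oscillating, noisy components tend to cancel out.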

The nb_epochs is the number of epochs the training will last. An epoch is finished when the neural network has seen the whole dataset once. The adjustments computed by backpropagation for each item of the dataset are applied at the end of each epoch.

classifier tells the training algorithm whether it needs to keep track of classification accuracy and other metrics that are convenient for classifiers.

The neural network will print its performance every print_interval epochs.

The cost_function is the cost function that you choose for this training.

training and test are the datasets used for training and testing, respectively.

test_params allows finer control over the frequency of the tests and what is displayed while they run.

struct BackpropTrainingParams {
	learning_rate  f64
	momentum       f64
	nb_epochs      int
	classifier     bool
	print_interval int
	cost_function  CostFunctions
	training       Dataset
	test           Dataset
	test_params    TestParams
}
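As an illustration, the parameters might be filled in like this (a hypothetical sketch: the field values, the `.mse` variant of CostFunctions, and the dataset variables are assumptions, not taken from the library's documentation):

```v
params := BackpropTrainingParams{
	learning_rate:  0.1
	momentum:       0.9
	nb_epochs:      1000
	classifier:     true // track accuracy, since XOR is a classification task
	print_interval: 100 // print performance every 100 epochs
	cost_function:  .mse // hypothetical CostFunctions variant
	training:       training_ds // datasets assumed to be created elsewhere
	test:           test_ds
	test_params:    TestParams{}
}
```

The struct-literal syntax with named fields keeps the configuration self-describing, so each training run documents its own hyperparameters.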