In deep learning, backpropagation is one of the two sub-processes of training: the forward pass computes the network's output, and backpropagation adjusts the network's parameters with respect to the error that output produces.
Backpropagation stands for "backward propagation of errors". It is the algorithm used to train feedforward neural networks by repeatedly adjusting the network's weights to minimize the difference between the network's actual output vector and the desired output vector.
Backpropagation aims to minimize the cost function by adjusting the network's weights and biases. The gradient of the cost with respect to each parameter determines how much that weight or bias is adjusted on each update; the gradients are passed backward through the layers via the chain rule, applying the derivative of each activation function along the way.
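The process above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: the one-hidden-layer architecture, sigmoid activations, mean-squared-error cost, XOR-style toy data, learning rate, and iteration count are all illustrative choices.

```python
import numpy as np

# Toy dataset (XOR): 4 inputs, 1 target each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(0, 1, (2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 0.5                        # learning rate (illustrative)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass: compute activations layer by layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)          # mean-squared-error cost
    losses.append(loss)

    # Backward pass: propagate the error gradient from output to input
    # using the chain rule (sigmoid derivative is s * (1 - s)).
    d_out = 2 * (out - y) / len(X)          # dCost/d(out)
    d_z2 = d_out * out * (1 - out)          # through the output sigmoid
    d_W2 = h.T @ d_z2
    d_b2 = d_z2.sum(axis=0, keepdims=True)
    d_h = d_z2 @ W2.T                       # error sent back to the hidden layer
    d_z1 = d_h * h * (1 - h)                # through the hidden sigmoid
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient-descent update: move each parameter against its gradient.
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

print(f"cost before: {losses[0]:.4f}, after: {losses[-1]:.4f}")
```

Each iteration performs one forward pass, one backward pass, and one weight update; over many iterations the cost steadily decreases as the weights are adjusted in the direction that reduces the error.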