The loss function is an important component of artificial neural networks: it measures the inconsistency between the predicted value (ŷ) and the actual label (y). It is a non-negative quantity, and the robustness of the model increases as the value of the loss function decreases.


Some of the loss functions are:

- mean squared error
- mean absolute error
- mean absolute percentage error
- mean squared logarithmic error
- Kullback-Leibler divergence

See the article for the full list.
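As a rough sketch, the losses listed above can each be written in a few lines of NumPy. The function names, the example arrays, and the epsilon-free handling below are illustrative assumptions, not code from the article; in practice framework implementations (e.g. in Keras or PyTorch) add numerical-stability safeguards.

```python
import numpy as np

# Illustrative NumPy versions of the losses listed above.
# y is the vector of actual labels, y_hat the predicted values.

def mean_squared_error(y, y_hat):
    # Average of squared differences; penalizes large errors heavily.
    return np.mean((y - y_hat) ** 2)

def mean_absolute_error(y, y_hat):
    # Average of absolute differences; less sensitive to outliers than MSE.
    return np.mean(np.abs(y - y_hat))

def mean_absolute_percentage_error(y, y_hat):
    # Error relative to the true value, as a percentage; assumes y has no zeros.
    return np.mean(np.abs((y - y_hat) / y)) * 100

def mean_squared_logarithmic_error(y, y_hat):
    # MSE on log-transformed values; assumes y and y_hat are non-negative.
    return np.mean((np.log1p(y) - np.log1p(y_hat)) ** 2)

def kl_divergence(p, q):
    # p and q are discrete probability distributions; assumes strictly positive entries.
    return np.sum(p * np.log(p / q))

# Example usage with made-up predictions close to the labels.
y = np.array([1.0, 2.0, 3.0])
y_hat = np.array([1.1, 1.9, 3.2])
print(mean_squared_error(y, y_hat))
print(mean_absolute_error(y, y_hat))
```

Note that all of these return a single non-negative scalar, consistent with the definition above: identical predictions and labels give a loss of zero.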


This is a companion discussion topic for the original entry at http://iq.opengenus.org/types-of-loss-function/