A loss function, also known as a cost function, is a function that measures the difference between the actual values and the predicted values of a model. The smaller the value, the better: a model whose loss is near 0 is considered more accurate than a model whose loss is around 1. Two popular loss functions are MSE and CEE, sketched in code after the list below.
✓ Mean Square Error (MSE)
✓ Cross Entropy Error (CEE)
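As a minimal sketch of the two losses (assuming NumPy; the one-hot target and the two candidate predictions are made-up values, and the base-2 logarithm follows the convention used later in this section):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average of squared differences
    return np.mean((y_true - y_pred) ** 2)

def cee(y_true, y_pred, eps=1e-12):
    # Cross entropy error: -sum of actual probabilities times log2 of
    # predicted probabilities; eps guards against log(0)
    return -np.sum(y_true * np.log2(y_pred + eps))

y_true = np.array([0.0, 1.0, 0.0])  # hypothetical one-hot target
good = np.array([0.1, 0.8, 0.1])    # prediction close to the target
bad = np.array([0.6, 0.2, 0.2])     # prediction far from the target

print(mse(y_true, good), mse(y_true, bad))  # ~0.02 vs ~0.35
print(cee(y_true, good), cee(y_true, bad))  # ~0.32 vs ~2.32
```

In both cases the prediction closer to the target yields the smaller loss.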
In thermodynamics, entropy represents 'molecular disorder.' In information theory, however, the term represents 'the degree of uncertainty in information,' or 'the average amount of information about a probabilistic event.' Let's look at the following terms related to CEE.
- Information Quantity
Information quantity is defined by the equation below. The reason for using the base-2 logarithmic function is simply that information theory uses the binary system of 0s and 1s.

$$I(x) = -\log_2 p(x)$$
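For a quick intuition, consider a standard example: a fair coin flip, with probability $\tfrac{1}{2}$, carries exactly one bit of information, while a rarer event carries more:

$$I(\text{heads}) = -\log_2 \tfrac{1}{2} = 1 \text{ bit}, \qquad -\log_2 \tfrac{1}{8} = 3 \text{ bits}$$

The less probable an event is, the more information its occurrence conveys.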
- Cross Entropy Error
Cross entropy is defined as the amount of information that a particular event $X$ carries under two different probability distributions $p$ and $q$. It measures the amount of information in the same event for the two distributions: the average, taken with respect to $p$, of the information quantity measured under $q$.

$$H(p, q) = -\sum_x p(x) \log_2 q(x)$$
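To make that reading concrete, here is a small sketch (the two distributions are made up; that $H(p, q) \ge H(p, p)$, with equality when $q = p$, is a standard property of cross entropy):

```python
import numpy as np

def cross_entropy(p, q):
    # Average, under p, of the information quantity -log2 q(x)
    return -np.sum(p * np.log2(q))

p = np.array([0.5, 0.5])    # hypothetical actual distribution
q = np.array([0.25, 0.75])  # hypothetical model distribution

print(cross_entropy(p, p))  # 1.0 bit: the entropy of p itself
print(cross_entropy(p, q))  # ~1.21 bits: larger, because q differs from p
```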
Like the mean squared error, a larger cross entropy means the model's predictions fit the actual distribution more poorly. Let's look at an example below.
Suppose that the actual distribution assigns the same probability to both events, but the model assigns a higher probability to event Y. The cross entropy of the model is then larger than the entropy of the actual distribution:
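For illustration (the exact model probabilities are assumed here, not given), take an actual distribution $p = \left(\tfrac{1}{2}, \tfrac{1}{2}\right)$ and a model distribution $q = (0.2, 0.8)$ that favors event Y:

$$H(p, q) = -\left(\tfrac{1}{2}\log_2 0.2 + \tfrac{1}{2}\log_2 0.8\right) \approx 1.32 \text{ bits} > 1 \text{ bit} = H(p, p)$$

The gap of about 0.32 bits is the penalty the model pays for mismatching the actual distribution.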
[Click 'details' of a model of your choice, and click on the 'Loss' tab to learn more about the loss function of the selected model.]