# 13. Loss Functions

Loss functions are responsible for calculating some measure of wrongness. We usually use this measure to inform both the neural network and ourselves just how good or bad our predictions $$\hat{y}$$ are compared to the ground truth $$y$$.
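As a concrete illustration (not library code), mean squared error is one such measure of wrongness between predictions and ground truth:

```python
import numpy as np

# Hypothetical ground truth y and predictions y_hat
y = np.array([1.0, 0.0, 1.0, 1.0])
y_hat = np.array([0.9, 0.2, 0.8, 0.6])

# Mean squared error: the average squared distance from the truth
mse = np.mean((y - y_hat) ** 2)
print(mse)  # → 0.0625
```

The closer the predictions are to the truth, the closer this value gets to zero.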

**Loss Functions Status**

| Category | Name | Docs | Forward | Backward |
| --- | --- | --- | --- | --- |
| loss | Categorical Cross Entropy (CCE) | Categorical Cross Entropy (CCE) | | |
| loss | Mean Absolute Error (MAE) | Mean Absolute Error (MAE) | | |
| loss | Mean Squared Error (MSE) | Mean Squared Error (MSE) | | |
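For reference, the standard definitions of the three losses listed above (these are the textbook forms, not necessarily the exact implementation details of the library), for $$n$$ predictions:

$$\mathrm{CCE}(y, \hat{y}) = -\sum_{i} y_i \log(\hat{y}_i)$$

$$\mathrm{MAE}(y, \hat{y}) = \frac{1}{n} \sum_{i=1}^{n} |y_i - \hat{y}_i|$$

$$\mathrm{MSE}(y, \hat{y}) = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$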

## 13.4. Loss Abstraction

Neural Network Loss Functions.

`class fhez.nn.loss.loss.Loss`

Abstract loss class to unify loss function format.

Calculate the gradient of the loss with respect to $$\hat{y}$$ (the backward pass).

`property cache`

Get caching dictionary of auxiliary data.

`disable_cache()`

Disable caching.

`enable_cache()`

Enable caching.

`abstract forward(signal: numpy.ndarray, y: numpy.ndarray, y_hat: numpy.ndarray)`

Calculate loss(es) given one or more truths.

`property inputs`

Get cached input stack.

Neural network backpropagation requires cached inputs to calculate the gradient with respect to x and the weights. This utility method initialises a stack that you can easily append to and pop from, so that the computation can occur in FILO (first-in, last-out) order.
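A minimal sketch of such an input stack (a hypothetical stand-in for illustration, not the fhez source) shows the FILO behaviour: the most recently cached input is the first one popped during backpropagation.

```python
import numpy as np

class InputCache:
    """Hypothetical sketch of an `inputs` property backed by a stack."""

    def __init__(self):
        self._inputs = []

    @property
    def inputs(self):
        # Lazily-initialised stack of cached forward-pass inputs
        return self._inputs

# Forward passes push inputs onto the stack...
cache = InputCache()
cache.inputs.append(np.array([1.0, 2.0]))
cache.inputs.append(np.array([3.0, 4.0]))

# ...and backward passes pop them off in reverse (FILO) order.
last = cache.inputs.pop()  # the most recent input comes off first
```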

`property is_cache_enabled`

Get status of whether or not caching is enabled.

`update()`

Loss functions have no trainable parameters, so this method does nothing.
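Putting the pieces together, the abstraction described above might be sketched as follows. This is a simplified, hypothetical rendering of the interface for illustration, not the fhez implementation itself:

```python
import abc
import numpy as np

class Loss(abc.ABC):
    """Sketch of an abstract loss class unifying the loss function format."""

    def __init__(self):
        self._cache = {}
        self._is_cache_enabled = True

    @property
    def cache(self):
        # Caching dictionary of auxiliary data
        return self._cache

    def enable_cache(self):
        self._is_cache_enabled = True

    def disable_cache(self):
        self._is_cache_enabled = False

    @property
    def is_cache_enabled(self):
        return self._is_cache_enabled

    @abc.abstractmethod
    def forward(self, signal, y, y_hat):
        """Calculate loss(es) given one or more truths."""

    def update(self):
        pass  # loss functions have no trainable parameters

class MSE(Loss):
    """Example concrete loss implementing the abstract interface."""

    def forward(self, signal, y, y_hat):
        return np.mean((y - y_hat) ** 2)

loss = MSE()
result = loss.forward(None, np.array([1.0, 0.0]), np.array([0.5, 0.5]))
# result → 0.25
```

Because every loss subclasses the same abstraction, the rest of the network can call `forward` and `update` uniformly without knowing which loss it is using.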