Functions to calculate different performance metrics.

get_bias() calculates the bias b, i.e. the average difference between the predicted values y and the observed values z:

  bias = mean(y - z)
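The definition above, combined with the example output further down, suggests an implementation along these lines. This is a hypothetical sketch, not the package's actual source; the default of `relative = TRUE` is inferred from the examples:

```r
# Sketch of get_bias(); the real implementation may differ.
get_bias <- function(predicted, observed, relative = TRUE) {
  ok <- !is.na(predicted) & !is.na(observed)  # drop NA pairs
  b <- mean(predicted[ok] - observed[ok])
  if (relative) b / mean(observed[ok]) else b
}

get_bias(c(21.5, 22.2, 19.1), c(20, 20, 20), relative = FALSE)
#> [1] 0.9333333
```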

Usage

get_bias(predicted, observed, ...)

root_mean_squared(predicted, observed, ...)

mean_absolute_error(predicted, observed, ...)

Arguments

predicted

Vector containing the predictions y.

observed

Vector containing the observations z.

...

relative Boolean. If TRUE, the result is returned as a ratio to the mean observation, mean(observed).

Value

m A number giving the relative or absolute value of the metric.

Functions

  • root_mean_squared(): Calculate the square root of the average squared difference between prediction and observation:

    RMSE = sqrt(sum((predicted - observed)^2) / length(predicted))

  • mean_absolute_error(): Calculate the average of the absolute differences between prediction and observation:

    MAE = mean(abs(predicted - observed))
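
The two formulas above can be sketched as follows. These are hypothetical implementations for illustration; as with get_bias(), the `relative = TRUE` default and pairwise NA handling are assumptions based on the examples and the Note below:

```r
# Sketches of the two metrics; the package source may differ.
root_mean_squared <- function(predicted, observed, relative = TRUE) {
  ok <- !is.na(predicted) & !is.na(observed)  # drop NA pairs
  m <- sqrt(mean((predicted[ok] - observed[ok])^2))
  if (relative) m / mean(observed[ok]) else m
}

mean_absolute_error <- function(predicted, observed, relative = TRUE) {
  ok <- !is.na(predicted) & !is.na(observed)  # drop NA pairs
  m <- mean(abs(predicted[ok] - observed[ok]))
  if (relative) m / mean(observed[ok]) else m
}

root_mean_squared(c(21.5, 22.2, 19.1), c(20, 20, 20), relative = FALSE)
#> [1] 1.622755
```

Note that RMSE penalizes large deviations more heavily than MAE, since the differences are squared before averaging.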

Note

NA values are completely ignored.
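
Assuming NA entries are dropped pairwise (an assumption; this page does not specify the exact mechanism), the effect can be reproduced with base R's na.rm argument:

```r
predicted <- c(21.5, NA, 19.1)
observed <- c(20, 20, 20)

# Only the 1st and 3rd pairs contribute: mean of |1.5| and |-0.9|.
mean(abs(predicted - observed), na.rm = TRUE)
#> [1] 1.2
```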

Examples

predicted <- c(21.5, 22.2, 19.1)
observed <- c(20, 20, 20)
get_bias(predicted, observed)
#> [1] 0.04666667
get_bias(predicted, observed, relative = FALSE)
#> [1] 0.9333333

root_mean_squared(predicted, observed)
#> [1] 0.08113774
root_mean_squared(predicted, observed, relative = FALSE)
#> [1] 1.622755

mean_absolute_error(predicted, observed)
#> [1] 0.07666667
mean_absolute_error(predicted, observed, relative = FALSE)
#> [1] 1.533333