This function computes over 40 metrics for assessing the quality of machine learning models. The purpose is to provide a single wrapper that brings all the metrics together and makes it easier to use them for model selection.
Arguments
- Observed
The observed data, in data frame format
- yvalue
The response variable of the estimated model
- modeli
The estimated model (Model = a + bx)
- K
The number of variables in the estimated model to consider
- Name
The name of the model, which must be specified. Supported names are ARIMA; Values, if the model computes fitted values without estimation (as ensembles do); SMOOTH (smooth.spline); Logit; EssemWet, for ensembles based on weights; QUADRATIC polynomial; and SPLINE polynomial.
- Form
The form of the estimated model (LM, ALM, GLM, N-LM, ARDL)
- kutuf
The cutoff for the estimated values (defaults to 0.5 if not specified)
- TTy
The type of response variable (Numeric, or Response for a binary-like outcome)
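As a minimal sketch of how the cutoff interacts with a binary-like response (only the argument names `kutuf` and `TTy` come from the list above; the data and variable names below are invented for illustration), a cutoff typically converts fitted probabilities into class labels before classification metrics are computed:

```r
# Illustrative only -- not package code.
# Fitted probabilities from some binary-response model, and the
# observed 0/1 outcomes they are compared against.
fitted_probs <- c(0.2, 0.8, 0.6, 0.4)
observed     <- c(0L, 1L, 1L, 0L)

cutoff    <- 0.5                             # the documented default for kutuf
predicted <- as.integer(fitted_probs > cutoff)

# Classification accuracy: share of labels that match after thresholding.
accuracy <- mean(predicted == observed)
```

With these toy values every thresholded label matches, so `accuracy` is 1; a different `kutuf` would shift the labels and hence every cutoff-dependent metric in the list below.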
Value
A list with the following components:
Absolute Error of the Model.
Absolute Percent Error of the Model.
Accuracy of the Model.
Adjusted R Square of the Model.
Akaike's Information Criterion AIC of the Model.
Area under the ROC curve (AUC) of the Model.
Average Precision at k of the Model.
Bias of the Model.
Brier score of the Model.
Classification Error of the Model.
F1 Score of the Model.
fScore of the Model.
GINI Coefficient of the Model.
kappa statistic of the Model.
Log Loss of the Model.
Mallow's cp of the Model.
Matthews Correlation Coefficient of the Model.
Mean Log Loss of the Model.
Mean Absolute Error of the Model.
Mean Absolute Percent Error of the Model.
Mean Average Precision at k of the Model.
Mean Absolute Scaled Error of the Model.
Median Absolute Error of the Model.
Mean Squared Error of the Model.
Mean Squared Log Error of the Model.
Model turning point error of the Model.
Negative Predictive Value of the Model.
Percent Bias of the Model.
Positive Predictive Value of the Model.
Precision of the Model.
Predictive Residual Sum of Squares of the Model.
R Square of the Model.
Relative Absolute Error of the Model.
Recall of the Model.
Root Mean Squared Error of the Model.
Root Mean Squared Log Error of the Model.
Root Relative Squared Error of the Model.
Relative Squared Error of the Model.
Schwarz's Bayesian criterion BIC of the Model.
Sensitivity of the Model.
specificity of the Model.
Squared Error of the Model.
Squared Log Error of the Model.
Symmetric Mean Absolute Percentage Error of the Model.
Sum of Squared Errors of the Model.
True negative rate of the Model.
True positive rate of the Model.
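To make a few of the numeric-response components above concrete, here is an illustrative hand computation (the sample data are invented for this sketch; this is not package code):

```r
# Illustrative only -- hand-computing a few of the returned metrics.
observed <- c(3, 5, 2, 8)
fitted   <- c(2.5, 5.5, 2, 7)
err      <- observed - fitted

mae  <- mean(abs(err))                   # Mean Absolute Error
rmse <- sqrt(mean(err^2))                # Root Mean Squared Error
sse  <- sum(err^2)                       # Sum of Squared Errors
# R Square: 1 minus SSE over total sum of squares about the mean.
r2   <- 1 - sse / sum((observed - mean(observed))^2)
```

The wrapper returns these (and the rest of the list above) in one call, which is what makes side-by-side model comparison convenient.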
Examples
library(splines)
library(readr)
# `Data` is assumed to be a data frame containing a response column
# `states` and a predictor column `sequence`.
Model <- lm(states ~ bs(sequence, knots = c(30, 115)), data = Data)
MLMetrics(Observed = Data, yvalue = Data$states, modeli = Model, K = 2,
          Name = "Linear", Form = "LM", kutuf = 0, TTy = "Number")
#> Warning: NaNs produced
#> Warning: actual should be a list of vectors. Converting to a list.
#> Warning: predicted should be a list of vectors. Converting to a list.
#> $`Absolute Error`
#> [1] 460
#>
#> $`Absolute Percent Error`
#> [1] 51
#>
#> $Accuracy
#> [1] 0
#>
#> $`Adjusted R Square`
#> [1] 0.77
#>
#> $`Akaike's Information Criterion AIC`
#> [1] 1000
#>
#> $`Area under the ROC curve (AUC)`
#> [1] 0
#>
#> $`Average Precision at k`
#> [1] 0
#>
#> $Bias
#> [1] 1.7e-16
#>
#> $`Brier score`
#> [1] 8
#>
#> $`Classification Error`
#> [1] 1
#>
#> $`F1 Score`
#> [1] 0
#>
#> $fScore
#> [1] 0
#>
#> $`GINI Coefficient`
#> [1] 0.8
#>
#> $`kappa statistic`
#> [1] 0
#>
#> $`Log Loss`
#> [1] Inf
#>
#> $`Mallow's cp`
#> [1] 3
#>
#> $`Matthews Correlation Coefficient`
#> [1] 0
#>
#> $`Mean Log Loss`
#> [1] -480
#>
#> $`Mean Absolute Error`
#> [1] 2.3
#>
#> $`Mean Absolute Percent Error`
#> [1] 0.25
#>
#> $`Mean Average Precision at k`
#> [1] 0
#>
#> $`Mean Absolute Scaled Error`
#> [1] 0.74
#>
#> $`Median Absolute Error`
#> [1] 1.9
#>
#> $`Mean Squared Error`
#> [1] 8.4
#>
#> $`Mean Squared Log Error`
#> [1] 0.072
#>
#> $`Model turning point error`
#> [1] 110
#>
#> $`Negative Predictive Value`
#> [1] 0
#>
#> $`Percent Bias`
#> [1] -0.1
#>
#> $`Positive Predictive Value`
#> [1] 0
#>
#> $Precision
#> [1] 1
#>
#> $`Predictive Residual Sum of Squares`
#> [1] 0
#>
#> $`R Square`
#> [1] 0.78
#>
#> $`Relative Absolute Error`
#> [1] 0.47
#>
#> $Recall
#> [1] 1
#>
#> $`Root Mean Squared Error`
#> [1] 2.9
#>
#> $`Root Mean Squared Log Error`
#> [1] 0.27
#>
#> $`Root Relative Squared Error`
#> [1] 0.47
#>
#> $`Relative Squared Error`
#> [1] 0.22
#>
#> $`Schwarz's Bayesian criterion BIC`
#> [1] 1000
#>
#> $Sensitivity
#> [1] 0
#>
#> $specificity
#> [1] 0
#>
#> $`Squared Error`
#> [1] 1700
#>
#> $`Squared Log Error`
#> [1] 14
#>
#> $`Symmetric Mean Absolute Percentage Error`
#> [1] 0.21
#>
#> $`Sum of Squared Errors`
#> [1] 1700
#>
#> $`True negative rate`
#> [1] 0
#>
#> $`True positive rate`
#> [1] 0
#>
