mlr3learners (version 0.6.0)

mlr_learners_regr.kknn: k-Nearest-Neighbor Regression Learner

Description

k-Nearest-Neighbor regression. Calls kknn::kknn() from package kknn.

Arguments

Initial parameter values

  • store_model:

    • See note.

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("regr.kknn")
lrn("regr.kknn")
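Hyperparameters can also be supplied directly in the lrn() call; a minimal sketch, assuming mlr3, mlr3learners, and kknn are installed (k and kernel are parameters documented in the Parameters section below):

```r
library(mlr3)
library(mlr3learners)

# Instantiate the learner and set hyperparameters in one call
learner = lrn("regr.kknn", k = 10, kernel = "gaussian")

# The supplied values are stored in the learner's parameter set
print(learner$param_set$values)
```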

Meta Information

  • Task type: “regr”

  • Predict Types: “response”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3learners, kknn

Parameters

  Id           Type       Default   Levels                                      Range
  k            integer    7         -                                           [1, Inf)
  distance     numeric    2         -                                           [0, Inf)
  kernel       character  optimal   rectangular, triangular, epanechnikov,      -
                                    biweight, triweight, cos, inv, gaussian,
                                    rank, optimal
  scale        logical    TRUE      TRUE, FALSE                                 -
  ykernel      untyped    -         -                                           -
  store_model  logical    FALSE     TRUE, FALSE                                 -
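Parameter values can also be changed after construction through the learner's ParamSet, which validates them against the types and ranges above; a minimal sketch, assuming mlr3, mlr3learners, and kknn are available:

```r
library(mlr3)
library(mlr3learners)

learner = lrn("regr.kknn")

# Update individual hyperparameters on an existing learner;
# values outside the documented ranges (e.g. k < 1) are rejected
learner$param_set$set_values(k = 5, distance = 1)
print(learner$param_set$values)
```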

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrKKNN

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrKKNN$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrKKNN$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

References

Hechenbichler, Klaus, Schliep, Klaus (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi:10.5282/ubm/epub.1769.

Samworth, Richard J (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733--2763. doi:10.1214/12-AOS1049.

Cover, Thomas, Hart, Peter (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21--27. doi:10.1109/TIT.1967.1053964.

See Also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Examples

if (requireNamespace("kknn", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("regr.kknn")
print(learner)

# Define a Task
task = tsk("mtcars")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# print the model
print(learner$model)

# importance method
if ("importance" %in% learner$properties) print(learner$importance())

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
