iimi (version 1.2.2)

train_iimi()

Description

Trains an XGBoost (default), Random Forest, or Elastic Net model on user-provided data.

Usage

train_iimi(
  train_x,
  train_y,
  method = "xgb",
  nrounds = 100,
  min_child_weight = 10,
  gamma = 20,
  ntree = 200,
  mtry = 10,
  k = 5,
  ...
)

Value

A trained Random Forest, XGBoost, or Elastic Net model.

Arguments

train_x

A data frame or a matrix of predictors.

train_y

A response vector of labels (needs to be a factor).

method

The machine learning method of choice: Random Forest, XGBoost, or Elastic Net. Default is "xgb" (XGBoost).

nrounds

Maximum number of boosting iterations for the XGBoost model. Default is 100.

min_child_weight

Minimum sum of instance weights needed in a child node for the XGBoost model; larger values make the model more conservative. Default is 10.

gamma

Minimum loss reduction required to make a further partition on a leaf node in the XGBoost model. Default is 20.

ntree

Number of trees in the Random Forest model. Default is 200.

mtry

Number of variables randomly sampled as candidates at each split in the Random Forest model. Default is 10.

k

Number of cross-validation folds. Default is 5.

...

Other arguments passed on to randomForest(), xgboost(), or glmnet().
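
As a sketch of how the method-specific arguments above fit together, the calls below show one invocation per method. Only `method = "xgb"` is confirmed by the Usage signature; the strings `"rf"` and `"en"` for Random Forest and Elastic Net are assumptions, as is the availability of `train_x` and `train_y` prepared as described under Arguments.

```
# Hedged sketch; "rf" and "en" method strings are assumptions,
# only "xgb" appears in the Usage signature above.

# XGBoost (default): boosting rounds and regularization controls apply
model_xgb <- train_iimi(train_x, train_y, method = "xgb",
                        nrounds = 100, min_child_weight = 10, gamma = 20)

# Random Forest: ntree and mtry apply
model_rf <- train_iimi(train_x, train_y, method = "rf",
                       ntree = 200, mtry = 10)

# Elastic Net: k cross-validation folds apply
model_en <- train_iimi(train_x, train_y, method = "en", k = 5)
```

Arguments not relevant to the chosen method (e.g. `ntree` when `method = "xgb"`) are simply ignored by the other learners.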

Examples

if (FALSE) {
df <- convert_rle_to_df(example_cov)
train_x <- df[, -c(1:4)]

# Build the label vector by looking up the diagnostics result
# for each (segment, sample) pair
train_y <- c()
for (ii in 1:nrow(df)) {
  seg_id <- df$seg_id[ii]
  sample_id <- df$sample_id[ii]
  train_y <- c(train_y, example_diag[seg_id, sample_id])
}
train_y <- as.factor(train_y)  # the response must be a factor

trained_model <- train_iimi(train_x = train_x, train_y = train_y)
}