quickSentiment (version 0.1.0)

xgb_model: Train a Gradient Boosting Model using XGBoost

Description

This function trains a model using the xgboost package. It is highly efficient and natively supports sparse matrices, making it ideal for text data. It automatically handles both binary and multi-class classification problems.

Usage

xgb_model(train_vectorized, Y, test_vectorized, parallel = FALSE)

Value

A list containing two elements:

pred

A vector of class predictions for the test set.

model

The final, trained `xgb.Booster` model object.

Arguments

train_vectorized

The training feature matrix (e.g., a `dfm` from quanteda).

Y

The response variable for the training set. Should be a factor.

test_vectorized

The test feature matrix, which must have the same features as `train_vectorized`.

parallel

Logical; if `TRUE`, enables parallel processing during training. Defaults to `FALSE`.

Examples

# Create dummy vectorized data (10 features; test set must share the
# same feature columns as the training set)
set.seed(123)
train_matrix <- matrix(runif(100), nrow = 10)  # 10 documents x 10 features
test_matrix <- matrix(runif(50), nrow = 5)     # 5 documents x 10 features
y_train <- factor(sample(c("P", "N"), 10, replace = TRUE))

# Run model
model_results <- xgb_model(train_matrix, y_train, test_matrix)
print(model_results$pred)
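
Since the Description notes that multi-class problems are handled automatically, a factor with more than two levels can be passed the same way. A minimal sketch, assuming the interface above; the three sentiment labels and the `parallel = TRUE` flag are illustrative:

```r
# Multi-class example: three sentiment levels instead of two
set.seed(42)
train_matrix <- matrix(runif(200), nrow = 20)  # 20 documents x 10 features
test_matrix  <- matrix(runif(80),  nrow = 8)   # 8 documents, same 10 features
y_train <- factor(sample(c("Positive", "Neutral", "Negative"), 20, replace = TRUE))

# parallel = TRUE is optional; see the Arguments section
results <- xgb_model(train_matrix, y_train, test_matrix, parallel = TRUE)

# Predicted classes for the 8 test documents
print(results$pred)

# The trained xgb.Booster object can be inspected or reused
class(results$model)
```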