This function can be used to train or fine-tune a transformer based on the Longformer architecture with the help of the python libraries 'transformers', 'datasets', and 'tokenizers'.
train_tune_longformer_model(
ml_framework = aifeducation_config$get_framework(),
output_dir,
model_dir_path,
raw_texts,
p_mask = 0.15,
val_size = 0.1,
n_epoch = 1,
batch_size = 12,
chunk_size = 250,
full_sequences_only = FALSE,
min_seq_len = 50,
learning_rate = 0.03,
n_workers = 1,
multi_process = FALSE,
sustain_track = TRUE,
sustain_iso_code = NULL,
sustain_region = NULL,
sustain_interval = 15,
trace = TRUE,
keras_trace = 1,
pytorch_trace = 1,
pytorch_safetensors = TRUE
)

This function does not return an object. Instead, the trained or fine-tuned model is saved to disk.
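A minimal sketch of a typical call is shown below. The directory paths, example texts, and parameter values are placeholders for illustration and have to be adapted to your own setup:

# Hypothetical example: fine-tune a stored Longformer model on a small corpus.
# "models/longformer_base" and "models/longformer_tuned" are placeholder paths.
library(aifeducation)

raw_texts <- c(
  "First example document used for fine-tuning.",
  "Second example document used for fine-tuning."
)

train_tune_longformer_model(
  ml_framework = "pytorch",
  output_dir = "models/longformer_tuned",
  model_dir_path = "models/longformer_base",
  raw_texts = raw_texts,
  p_mask = 0.15,
  val_size = 0.1,
  n_epoch = 3,
  batch_size = 12,
  chunk_size = 250,
  full_sequences_only = FALSE,
  min_seq_len = 50,
  learning_rate = 0.03,
  sustain_track = TRUE,
  sustain_iso_code = "DEU",
  sustain_interval = 15,
  trace = TRUE
)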
ml_framework: string Framework to use for training and inference. Use ml_framework="tensorflow" for 'tensorflow' and ml_framework="pytorch" for 'pytorch'.
output_dir: string Path to the directory where the final model should be saved. If the directory does not exist, it will be created.
model_dir_path: string Path to the directory where the original model is stored.
raw_texts: vector containing the raw texts for training.
p_mask: double Ratio determining the number of words/tokens used for masking.
val_size: double Ratio determining the number of token chunks used for validation.
n_epoch: int Number of epochs for training.
batch_size: int Size of the batches.
chunk_size: int Size of every chunk for training.
full_sequences_only: bool TRUE to use only chunks with a sequence length equal to chunk_size.
min_seq_len: int Only relevant if full_sequences_only=FALSE. Determines the minimal sequence length a chunk must have to be included in the training process. The interplay of chunk_size, full_sequences_only, and min_seq_len is illustrated in the sketch after this argument list.
learning_rate: double Learning rate for the Adam optimizer.
n_workers: int Number of workers. Only relevant if ml_framework="tensorflow".
multi_process: bool TRUE if multiple processes should be activated. Only relevant if ml_framework="tensorflow".
sustain_track: bool If TRUE, energy consumption is tracked during training via the python library codecarbon.
sustain_iso_code: string ISO code (Alpha-3 code) for the country. This variable must be set if sustainability should be tracked. A list can be found on Wikipedia: https://en.wikipedia.org/wiki/List_of_ISO_3166_country_codes.
sustain_region: string Region within a country. Only available for the USA and Canada. See the documentation of codecarbon for more information: https://mlco2.github.io/codecarbon/parameters.html
sustain_interval: int Interval in seconds for measuring power usage.
trace: bool TRUE if information on the progress should be printed to the console.
keras_trace: int keras_trace=0 does not print any information about the training process from keras to the console. keras_trace=1 prints a progress bar. keras_trace=2 prints one line of information for every epoch. Only relevant if ml_framework="tensorflow".
pytorch_trace: int pytorch_trace=0 does not print any information about the training process from pytorch to the console. pytorch_trace=1 prints a progress bar.
pytorch_safetensors: bool If TRUE, a 'pytorch' model is saved in safetensors format. If FALSE, or if 'safetensors' is not available, it is saved in the standard pytorch format (.bin). Only relevant for pytorch models.
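The following sketch illustrates how the documented chunking arguments interact. It is an illustration of the behavior described above, not the package's internal implementation, and the token counts are hypothetical:

# Illustrative sketch: a tokenized text is split into chunks of at most
# chunk_size tokens; the trailing, shorter chunk is kept or dropped
# depending on full_sequences_only and min_seq_len.
chunk_size <- 250
full_sequences_only <- FALSE
min_seq_len <- 50

n_tokens <- 620  # hypothetical length of a tokenized document
chunk_lengths <- c(rep(chunk_size, n_tokens %/% chunk_size),
                   n_tokens %% chunk_size)  # 250, 250, 120

keep <- if (full_sequences_only) {
  chunk_lengths == chunk_size   # only full-length chunks survive
} else {
  chunk_lengths >= min_seq_len  # shorter chunks survive above the threshold
}
chunk_lengths[keep]  # 250 250 120: the 120-token chunk passes min_seq_len

With full_sequences_only = TRUE, only the two 250-token chunks would remain; with min_seq_len = 150, the trailing 120-token chunk would be dropped as well.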
Beltagy, I., Peters, M. E., & Cohan, A. (2020). Longformer: The Long-Document Transformer. https://doi.org/10.48550/arXiv.2004.05150
Hugging Face Documentation: https://huggingface.co/docs/transformers/model_doc/longformer#transformers.LongformerConfig
Other Transformer:
create_bert_model(),
create_deberta_v2_model(),
create_funnel_model(),
create_longformer_model(),
create_roberta_model(),
train_tune_bert_model(),
train_tune_deberta_v2_model(),
train_tune_funnel_model(),
train_tune_roberta_model()