
sits (version 1.1.0)

.torch_light_temporal_attention_encoder: Torch module for temporal attention encoder

Description

Defines a torch module for temporal attention encoding.

This implementation is based on the code made available by Vivien Sainte Fare Garnot at https://github.com/VSainteuf/lightweight-temporal-attention-pytorch.

Usage

.torch_light_temporal_attention_encoder(
  timeline,
  in_channels = 128,
  n_heads = 16,
  n_neurons = c(256, 128),
  dropout_rate = 0.2
)

Value

A linear tensor block.


Arguments

timeline

Timeline of input time series.

in_channels

Dimension of the positional encoder.

n_heads

Number of attention heads.

n_neurons

Dimensions of the MLP that processes the output of the attention heads.

dropout_rate

Dropout rate.
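Examples

A minimal sketch of how this constructor might be invoked. Note that this is an internal sits function (hence the `:::` accessor), and the timeline dates and input tensor below are hypothetical; the sketch assumes the torch and sits packages are installed.

```r
library(torch)
library(sits)

# hypothetical timeline: one acquisition every 16 days over a year
timeline <- seq(as.Date("2020-01-01"), as.Date("2020-12-31"), by = "16 days")

# build the temporal attention encoder with the documented defaults
# (internal function, so it is accessed with the triple colon)
encoder <- sits:::.torch_light_temporal_attention_encoder(
  timeline     = timeline,
  in_channels  = 128,
  n_heads      = 16,
  n_neurons    = c(256, 128),
  dropout_rate = 0.2
)

# apply the module to a batch of 8 time series shaped
# (batch, time steps, channels), matching in_channels = 128
x   <- torch_randn(8, length(timeline), 128)
out <- encoder(x)
```

The `in_channels` argument must match the channel dimension of the input tensor, and `n_neurons` sets the widths of the MLP layers applied after the attention heads.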

Author

Charlotte Pelletier, charlotte.pelletier@univ-ubs.fr

Gilberto Camara, gilberto.camara@inpe.br

Rolf Simoes, rolf.simoes@inpe.br

Felipe Souza, lipecaso@gmail.com

References

Vivien Sainte Fare Garnot and Loic Landrieu, "Lightweight Temporal Self-Attention for Classifying Satellite Image Time Series", https://arxiv.org/abs/2007.00586