Description

Defines a torch module for temporal attention encoding.

This implementation is based on the code made available by Vivien Garnot (https://github.com/VSainteuf/lightweight-temporal-attention-pytorch).

Usage
.torch_light_temporal_attention_encoder(
timeline,
in_channels = 128,
n_heads = 16,
n_neurons = c(256, 128),
dropout_rate = 0.2
)
Arguments

timeline      Timeline of the input time series.
in_channels   Dimension of the positional encoder.
n_heads       Number of attention heads.
n_neurons     Dimensions of the MLP that processes the output of the attention heads.
dropout_rate  Dropout rate.

Value

A linear tensor block.
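Details

The sketch below is illustrative and is not the package's internal code: it shows how a timeline can drive a sinusoidal positional encoding of dimension in_channels, assuming positions are measured in days since the first observation and a denominator of 1000 as in the L-TAE paper (the standard transformer formulation uses 10000).

library(torch)

positional_encoding <- function(timeline, d_model = 128) {
  # Positions: days elapsed since the first observation
  pos <- as.numeric(timeline - timeline[1])
  t   <- torch_tensor(pos)$unsqueeze(2)      # shape (T, 1)
  i   <- torch_arange(0, d_model - 1)        # shape (d_model)
  # 1000^x computed as exp(x * log(1000)); paired dimensions share a frequency
  denom <- torch_exp((2 * torch_floor(i / 2) / d_model) * log(1000))
  angle <- t / denom                         # broadcast to (T, d_model)
  pe <- torch_zeros(length(pos), d_model)
  pe[, seq(1, d_model, by = 2)] <- torch_sin(angle[, seq(1, d_model, by = 2)])
  pe[, seq(2, d_model, by = 2)] <- torch_cos(angle[, seq(2, d_model, by = 2)])
  pe
}

# Encoding for a hypothetical monthly timeline; d_model matches in_channels
pe <- positional_encoding(seq(as.Date("2020-01-01"), by = "month", length.out = 12))
pe$shape  # (12, 128): one 128-dimensional position vector per time step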
Author(s)

Charlotte Pelletier, charlotte.pelletier@univ-ubs.fr
Gilberto Camara, gilberto.camara@inpe.br
Rolf Simoes, rolf.simoes@inpe.br
Felipe Souza, lipecaso@gmail.com
References

Vivien Sainte Fare Garnot and Loic Landrieu, "Lightweight Temporal Self-Attention for Classifying Satellite Image Time Series", https://arxiv.org/abs/2007.00586.
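Examples

A minimal usage sketch, not taken from the package documentation. It assumes the sits and torch packages are installed, that this internal constructor is reachable via the triple-colon operator, and that inputs follow the (batch, time, channels) layout of the original PyTorch implementation; the output shape is an assumption based on the L-TAE design, in which each series is reduced to one feature vector whose dimension is the last entry of n_neurons.

library(torch)

# Hypothetical monthly timeline with 12 observations
timeline <- seq(as.Date("2020-01-01"), by = "month", length.out = 12)

# Build the encoder module (internal function, hence sits:::)
encoder <- sits:::.torch_light_temporal_attention_encoder(
  timeline     = timeline,
  in_channels  = 128,
  n_heads      = 16,
  n_neurons    = c(256, 128),
  dropout_rate = 0.2
)

# Apply it to a batch of 8 series, each with 12 time steps and
# 128 channels (matching in_channels)
x   <- torch_randn(8, 12, 128)
out <- encoder(x)
out$shape  # assumed: (8, 128), one encoded vector per series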