attention

Helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth that show how to construct it.
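The core computation the package demonstrates, scaled dot-product self-attention, can be sketched in base R as follows. All names below are illustrative and are not the package's API:

```r
# Minimal sketch of scaled dot-product self-attention in base R.
# Illustrative only; not the attention package's implementation.

softmax_rows <- function(x) {
  # Subtract the row maximum for numerical stability, then normalise rows
  e <- exp(x - apply(x, 1, max))
  e / rowSums(e)
}

set.seed(1)
n <- 3; d <- 4                      # 3 tokens, embedding dimension 4
X   <- matrix(rnorm(n * d), n, d)   # token embeddings
W_Q <- matrix(rnorm(d * d), d, d)   # query projection (illustrative weights)
W_K <- matrix(rnorm(d * d), d, d)   # key projection
W_V <- matrix(rnorm(d * d), d, d)   # value projection

Q <- X %*% W_Q
K <- X %*% W_K
V <- X %*% W_V

scores  <- Q %*% t(K) / sqrt(d)     # scaled dot products, n x n
weights <- softmax_rows(scores)     # each row sums to 1
output  <- weights %*% V            # attention output, n x d
```

Each row of `weights` tells you how much the corresponding token attends to every other token; the output is the weighted mixture of the value vectors.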

CRAN install

The package can be installed from CRAN using:

install.packages('attention')

Preview version

The development version, to be used at your peril, can be installed from GitHub using the remotes package.

if (!require('remotes')) install.packages('remotes')
remotes::install_github('bquast/attention')

Development

Development takes place on the GitHub page.

https://github.com/bquast/attention

Bugs can be filed on the issues page on GitHub.

https://github.com/bquast/attention/issues

Monthly Downloads: 610

Version: 0.4.0

License: GPL (>= 3)

Maintainer: Bastiaan Quast

Last Published: November 10th, 2023

Functions in attention (0.4.0)

- RowMax: Maximum of Matrix Rows
- SoftMax: SoftMax function
- attention: Attention mechanism
- ComputeWeights: SoftMax function