average attention mask #904

Open
vparmain opened this issue Jan 27, 2023 · 1 comment
@vparmain

Describe the requested improvement
In their paper on L-TAE (https://arxiv.org/pdf/2007.00586.pdf), the authors describe a method called the "attention mask", which aims to locate the discriminative sources of data for each class.

Associated sits API function
Is there a way to implement this approach in sits?

@gilbertocamara (Contributor)

Dear @vparmain

The main difference between the TAE and LightTAE methods proposed by Garnot et al. is that LightTAE uses an average attention mask to reduce the number of parameters in the model. Both TAE and LightTAE are available in sits through the sits_tae() and sits_lighttae() functions.

To implement sits_lighttae(), we used the papers by Vivien Garnot and the reference code at
https://github.com/VSainteuf/lightweight-temporal-attention-pytorch

We also used the code made available by Maja Schneider at
https://github.com/maja601/RC2020-psetae
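
As a minimal sketch of what this looks like in practice, both models are trained through the usual sits_train() interface. The example below assumes the samples_modis_ndvi dataset distributed with sits; any labelled samples tibble would work in its place:

```r
library(sits)

# Standard Temporal Attention Encoder classifier
# (full set of attention parameters)
tae_model <- sits_train(
  samples   = samples_modis_ndvi,
  ml_method = sits_tae()
)

# Lightweight Temporal Attention Encoder classifier,
# which uses the average attention mask to reduce the
# number of model parameters
lighttae_model <- sits_train(
  samples   = samples_modis_ndvi,
  ml_method = sits_lighttae()
)
```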

Please let us know if this is what you requested.
