Plot the vector graph of attention-based text visualisation
This repository contains PyTorch implementations of 4 different models for speech emotion classification.
Can we use explanations to improve hate speech models? Our paper, accepted at AAAI 2021, explores that question.
This repository contains various types of attention mechanisms, such as Bahdanau, soft attention, additive attention, and hierarchical attention, in PyTorch, TensorFlow, and Keras.
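As a point of reference for the variants listed above, here is a minimal sketch of additive (Bahdanau-style) attention in PyTorch; the class name, dimensions, and shapes are illustrative assumptions, not taken from the repository itself.

```python
# Minimal sketch of additive (Bahdanau-style) attention; names/shapes are illustrative.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)  # project encoder states
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)  # project decoder state
        self.v = nn.Linear(attn_dim, 1, bias=False)            # scoring vector

    def forward(self, dec_state, enc_outputs):
        # dec_state:   (batch, dec_dim)       current decoder hidden state
        # enc_outputs: (batch, seq, enc_dim)  all encoder hidden states
        scores = self.v(torch.tanh(
            self.W_enc(enc_outputs) + self.W_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                                          # (batch, seq)
        weights = torch.softmax(scores, dim=-1)                 # attention distribution
        context = torch.bmm(weights.unsqueeze(1), enc_outputs)  # (batch, 1, enc_dim)
        return context.squeeze(1), weights
```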
[AAAI SAP 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
IPython notebooks for solving problems such as classification, segmentation, and generation using the latest deep learning algorithms on publicly available text and image datasets.
TensorFlow implementation of Im2Latex
Forex price movement forecast
A TensorFlow 2 (Keras) implementation of DA-RNN (A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction, arXiv:1704.02971)
Enhanced BiLSTM Inference Model for Natural Language Inference
An implementation that integrates a simple but efficient attention block into a CNN + bidirectional LSTM for video classification.
Attention mechanism in Keras, usable like the built-in Dense and RNN layers.
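To illustrate the idea of an attention layer that stacks in Keras much like Dense or an RNN, here is a minimal attention-pooling layer; the layer name and usage are assumptions for illustration, not the repository's actual API.

```python
# Sketch of an attention-pooling layer that can be stacked after a return_sequences RNN.
import tensorflow as tf
from tensorflow.keras import layers

class AttentionPooling(layers.Layer):
    def build(self, input_shape):
        dim = int(input_shape[-1])
        self.w = self.add_weight(name="w", shape=(dim, 1),
                                 initializer="glorot_uniform", trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        # inputs: (batch, timesteps, features)
        scores = tf.einsum("btd,dk->btk", tf.tanh(inputs), self.w)  # (batch, timesteps, 1)
        weights = tf.nn.softmax(scores, axis=1)                     # attention over timesteps
        return tf.reduce_sum(weights * inputs, axis=1)              # (batch, features)

# Example: drop it in after an LSTM, just like any other layer.
model = tf.keras.Sequential([
    layers.Input(shape=(50, 128)),
    layers.LSTM(64, return_sequences=True),
    AttentionPooling(),
    layers.Dense(1, activation="sigmoid"),
])
```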
SERVER: Multi-modal Speech Emotion Recognition using Transformer-based and Vision-based Embeddings
An open-source implementation of grouped-query attention from the paper "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints".
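For context, a rough sketch of grouped-query attention (GQA), in which many query heads share a smaller number of key/value heads; the function name and shapes are illustrative assumptions, not the repository's code.

```python
# Sketch of grouped-query attention: KV heads are shared across groups of query heads.
import torch
import torch.nn.functional as F

def grouped_query_attention(q, k, v):
    # q: (batch, seq, n_heads, head_dim); k, v: (batch, seq, n_kv_heads, head_dim)
    n_heads, n_kv_heads = q.shape[2], k.shape[2]
    group = n_heads // n_kv_heads                        # query heads per KV head
    k = k.repeat_interleave(group, dim=2)                # share each KV head across its group
    v = v.repeat_interleave(group, dim=2)
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))     # (batch, heads, seq, head_dim)
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    weights = F.softmax(scores, dim=-1)
    return (weights @ v).transpose(1, 2)                 # (batch, seq, n_heads, head_dim)
```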
Generating text sequences using attention-based Bi-LSTM
Deep representation of visual and textual descriptions using StackGAN
This repository implements the attention mechanism in TensorFlow, with various examples.
📃 | Deep Text Recognition Implementation using PyTorch
Experiments with Deep Learning for generating music