Text summarization with a Seq2Seq model: a bidirectional LSTM encoder-decoder architecture using pretrained GloVe embeddings.
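A minimal sketch of the architecture described above, written in Keras. All sizes (vocabulary, sequence lengths, latent units) are illustrative placeholders, the embedding weights would normally be loaded from GloVe, and the attention layer from attention.py is omitted for brevity — this is not the exact model behind the published weights.

```python
import numpy as np
from tensorflow.keras.layers import (Input, Embedding, LSTM, Bidirectional,
                                     Dense, Concatenate, TimeDistributed)
from tensorflow.keras.models import Model

VOCAB = 5000     # vocabulary size (illustrative)
EMB = 300        # GloVe 300-d vectors
LATENT = 128     # encoder LSTM units per direction (illustrative)
MAX_TEXT = 80    # max source-text length (illustrative)
MAX_SUM = 15     # max summary length (illustrative)

# --- Encoder: bidirectional LSTM over the embedded source text ---
enc_in = Input(shape=(MAX_TEXT,))
enc_emb = Embedding(VOCAB, EMB, trainable=False)(enc_in)  # weights would come from GloVe
enc_out, fh, fc, bh, bc = Bidirectional(
    LSTM(LATENT, return_sequences=True, return_state=True))(enc_emb)
state_h = Concatenate()([fh, bh])  # merge forward/backward hidden states
state_c = Concatenate()([fc, bc])  # merge forward/backward cell states

# --- Decoder: unidirectional LSTM initialised with the encoder states ---
dec_in = Input(shape=(None,))
dec_emb = Embedding(VOCAB, EMB)(dec_in)
dec_out, _, _ = LSTM(2 * LATENT, return_sequences=True,
                     return_state=True)(dec_emb, initial_state=[state_h, state_c])
outputs = TimeDistributed(Dense(VOCAB, activation="softmax"))(dec_out)

model = Model([enc_in, dec_in], outputs)
model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy")
```

The decoder uses `2 * LATENT` units so its state dimensions match the concatenated forward/backward states of the bidirectional encoder.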
download model weights --> https://drive.google.com/open?id=1d_WCtusv9UW0GjiMfESgzHGeNzJUonJ4
dataset --> https://www.kaggle.com/sunnysai12345/news-summary
GloVe embeddings --> http://nlp.stanford.edu/data/glove.42B.300d.zip
attention.py --> https://github.com/thushv89/attention_keras/blob/master/layers/attention.py
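A sketch of how the downloaded GloVe vectors can be turned into an embedding matrix for the model. The function name and the `word_index` mapping (word -> integer id, e.g. from a Keras `Tokenizer`) are assumptions for illustration; rows for words missing from GloVe are left as zeros.

```python
import numpy as np

def load_glove_matrix(glove_path, word_index, emb_dim=300):
    """Build an embedding matrix from a GloVe text file.

    word_index: dict mapping word -> integer id (hypothetical; e.g. from
    a Keras Tokenizer). Out-of-vocabulary rows stay zero.
    """
    # Row 0 is reserved for the padding token.
    matrix = np.zeros((len(word_index) + 1, emb_dim))
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, vec = parts[0], parts[1:]
            if word in word_index and len(vec) == emb_dim:
                matrix[word_index[word]] = np.asarray(vec, dtype="float32")
    return matrix
```

The resulting matrix can be passed to a Keras `Embedding` layer via `weights=[matrix]` with `trainable=False` so the pretrained vectors are kept fixed during training.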