This paper proposes a self-attention-enhanced
recurrent neural network for the task of poetry
generation in the Hindi language. The proposed framework
combines Long Short-Term Memory (LSTM) with a multi-head
self-attention mechanism. We use the multi-head
self-attention component to improve feature
selection and thereby preserve dependencies over longer
sequence lengths in recurrent neural network architectures.
The paper uses a Hindi poetry dataset to train the
network to generate poems from a given set of input words.
The two LSTM models proposed in the paper are able to
generate poems with meaningful content.
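The architecture described above can be sketched as an embedding layer feeding an LSTM whose hidden states are refined by multi-head self-attention before the output projection. The snippet below is a minimal illustrative sketch in PyTorch, not the authors' implementation; the class name, layer sizes, and head count are assumptions chosen for the example.

```python
import torch
import torch.nn as nn

class AttentiveLSTMPoet(nn.Module):
    """Hypothetical sketch: embedding -> LSTM -> multi-head
    self-attention over the LSTM states -> vocabulary logits."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Self-attention lets every position attend to all LSTM states,
        # helping retain dependencies over longer sequence lengths.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)        # (batch, seq, embed_dim)
        h, _ = self.lstm(x)           # (batch, seq, hidden_dim)
        a, _ = self.attn(h, h, h)     # query = key = value = LSTM states
        return self.out(a)            # (batch, seq, vocab_size)

# Example forward pass with a toy vocabulary of 5000 tokens.
model = AttentiveLSTMPoet(vocab_size=5000)
tokens = torch.randint(0, 5000, (2, 16))   # batch of 2, sequence length 16
logits = model(tokens)
```

At generation time, the next token would typically be sampled from `logits[:, -1]` and appended to the input sequence, repeating until the poem reaches the desired length.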
Keywords: Hindi Poetry Generation, Text Generation, Poetry Generation, Long Short-Term Memory.