1 Introduction
Recurrent neural networks, long short-term memory [13] and gated recurrent [7] neural networks in particular, have been firmly established as state of the art approaches in sequence modeling and transduction problems such as language modeling and machine translation [35, 2, 5]. Numerous efforts have since continued to push the boundaries of recurrent language models and encoder-decoder architectures [38, 24, 15].