Are Recurrent Neural Networks capable of warping time?
The paper ‘Can Recurrent Neural Networks Warp Time?’, by Corentin Tallec and Yann Ollivier, will be presented at ICLR 2018.
This paper explains that...
Paper in Two Minutes: Attention Is All You Need
A paper on a new simple network architecture, the Transformer, based solely on attention mechanisms
The paper Attention Is All You Need, accepted at NIPS 2017,...
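The teaser above mentions that the Transformer is built solely on attention mechanisms. As a rough illustration (a minimal NumPy sketch, not the paper's full multi-head implementation), the core scaled dot-product attention operation can be written as:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the Transformer's core operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # weighted sum of values

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 2))
K = rng.normal(size=(4, 2))
V = rng.normal(size=(4, 2))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one output vector per query position: (3, 2)
```

Each output row is a convex combination of the value vectors, with weights determined by how well the corresponding query matches each key.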
Unity releases ML-Agents v0.3: Imitation Learning, Memory-Enhanced Agents and more
The Unity team has released version 0.3 of its anticipated ML-Agents toolkit. The new release is packed with features, including...
Yoshua Bengio et al. on Twin Networks
The paper “Twin Networks: Matching the Future for Sequence Generation” is written by Yoshua Bengio in collaboration with Dmitriy Serdyuk, Nan Rosemary Ke, Alessandro...