Yikang Shen: Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks (ICLR2019)
Steven Van Vaerenbergh

Published on May 13, 2019

Speaker: Yikang Shen

Paper: Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks

Authors: Yikang Shen, Shawn Tan, Alessandro Sordoni, Aaron Courville

In general, natural language is governed by a tree structure: smaller units (e.g., phrases) are nested within larger units (e.g., clauses). This is a strict hierarchy: when a larger constituent ends, all of the smaller constituents nested within it must also be closed. While the standard LSTM allows different neurons to track information at different time scales, the architecture does not impose such a strict hierarchy. This paper proposes to add this constraint by ordering the neurons: a vector of "master" input and forget gates ensures that when a given unit is updated, all of the units that follow it in the ordering are also updated. To this end, we propose a new RNN unit, ON-LSTM, which achieves good performance on four different tasks: language modeling, unsupervised parsing, targeted syntactic evaluation, and logical inference.
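
To make the master-gate mechanism concrete, here is a minimal PyTorch sketch of a single ON-LSTM cell following the equations in the paper. The class and variable names are my own, and for brevity it omits the chunked downsizing of the master gates that the authors use to reduce parameters; see the paper for the full formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def cumax(x, dim=-1):
    # cumax(x) = cumsum(softmax(x)): a monotonically increasing vector
    # in (0, 1) that acts as a soft boundary between active and
    # inactive neurons, inducing the ordering.
    return torch.cumsum(F.softmax(x, dim=dim), dim=dim)

class ONLSTMCell(nn.Module):
    """Sketch of an ON-LSTM cell: a standard LSTM augmented with
    master input/forget gates that impose an ordering on neurons."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # 4 standard LSTM gates + 2 master gates from one projection
        self.linear = nn.Linear(input_size + hidden_size, 6 * hidden_size)

    def forward(self, x, state):
        h_prev, c_prev = state
        gates = self.linear(torch.cat([x, h_prev], dim=-1))
        f, i, o, g, mf, mi = gates.chunk(6, dim=-1)

        f = torch.sigmoid(f)   # standard forget gate
        i = torch.sigmoid(i)   # standard input gate
        o = torch.sigmoid(o)   # output gate
        g = torch.tanh(g)      # candidate cell values

        f_master = cumax(mf)       # ~1 for high-ranked (long-term) neurons
        i_master = 1 - cumax(mi)   # ~1 for low-ranked (short-term) neurons

        # Overlap region where both master gates are open: apply the
        # standard gates there; outside it the master gates dominate,
        # so erasing/updating a unit also affects all units after it.
        w = f_master * i_master
        f_hat = f * w + (f_master - w)   # effective forget gate
        i_hat = i * w + (i_master - w)   # effective input gate

        c = f_hat * c_prev + i_hat * g
        h = o * torch.tanh(c)
        return h, c
```
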

Presented at ICLR 2019

