Paper: | PS-1A.25 |
Session: | Poster Session 1A |
Location: | H Lichthof |
Session Time: | Saturday, September 14, 16:30 - 19:30 |
Presentation Time: | Saturday, September 14, 16:30 - 19:30 |
Presentation: | Poster |
Publication: | 2019 Conference on Cognitive Computational Neuroscience, 13-16 September 2019, Berlin, Germany |
Paper Title: | Do LSTMs know about Principle C? |
License: | This work is licensed under a Creative Commons Attribution 3.0 Unported License. |
DOI: | https://doi.org/10.32470/CCN.2019.1241-0 |
Authors: | Jeff Mitchell, Nina Kazanina, Conor Houghton, Jeff Bowers, University of Bristol, United Kingdom |
Abstract: | We investigate whether a recurrent network trained on raw text can learn an important syntactic constraint on coreference. A Long Short-Term Memory (LSTM) network that is sensitive to some other syntactic constraints was tested on psycholinguistic materials from two published experiments on coreference. Whereas the participants were sensitive to the Principle C constraint on coreference, the LSTM network was not. Our results suggest that, whether as cognitive models of linguistic processes or as engineering solutions in practical applications, recurrent networks may need to be augmented with additional inductive biases to learn models and representations that fully capture the structures of language underlying comprehension. |
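The abstract does not spell out the testing procedure, but a common way to probe a language model with psycholinguistic materials is to compare per-token surprisal across minimal pairs in which coreference is or is not blocked by Principle C (an R-expression such as a name cannot corefer with a c-commanding pronoun, so "he ... John" blocks coreference while "John ... he" permits it). The sketch below is a minimal illustration of that surprisal probe under stated assumptions: the `LSTMLanguageModel` architecture, vocabulary, and example sentences are hypothetical placeholders rather than the authors' actual model or materials, and the weights are untrained, so the output demonstrates only the mechanics of the measurement.

```python
# Hypothetical sketch of a surprisal probe for Principle C.
# Model, vocabulary, and sentences are illustrative assumptions,
# not the authors' architecture or experimental materials.
import math
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)  # logits over the next token at each position

def surprisal(model, vocab, sentence):
    """Per-token surprisal -log2 P(w_t | w_<t) for a whitespace-tokenized sentence."""
    ids = torch.tensor([[vocab[w] for w in sentence.split()]])
    with torch.no_grad():
        logits = model(ids)
    log_probs = torch.log_softmax(logits, dim=-1)
    # Surprisal of token t+1 given the prefix up to t, converted from nats to bits.
    return [
        -log_probs[0, t, ids[0, t + 1]].item() / math.log(2)
        for t in range(ids.size(1) - 1)
    ]

# Minimal pair in the spirit of Principle C: a pronoun preceding and
# c-commanding a name (coreference blocked) vs. the reverse order
# (coreference allowed). With untrained weights this only shows the
# probing procedure, not a trained model's behavior.
words = "he said that john was tired after john said that he was tired".split()
vocab = {w: i for i, w in enumerate(dict.fromkeys(words))}
model = LSTMLanguageModel(len(vocab))

for sent in ("he said that john was tired", "john said that he was tired"):
    print(sent, "->", [round(s, 2) for s in surprisal(model, vocab, sent)])
```

With a model trained on raw text, sensitivity to Principle C would be expected to surface as a systematic surprisal difference at the critical name or pronoun between the blocked and permitted configurations, mirroring the contrast the human participants showed.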