
Fulltext:
55368.pdf
Embargo:
until further notice
Size:
171.0 KB
Format:
PDF
Description:
publisher's version
Source
Connection Science, 18, 3, (2006), pp. 287-302
Publication type
Article / Letter to editor

Organization
SW OZ DCC PL
Former Organization
SW OZ NICI CO
Journal title
Connection Science
Volume
vol. 18
Issue
iss. 3
Page start
p. 287
Page end
p. 302
Subject
Linguistic Information Processing; Psycholinguistics
Abstract
Connectionist models of sentence processing must learn to behave systematically by generalizing from a small training set. The extent to which recurrent neural networks manage this generalization task is investigated. In contrast to Van der Velde et al. (Connection Sci., 16, pp. 21-46, 2004), it is found that simple recurrent networks do show so-called weak combinatorial systematicity, although their performance remains limited. It is argued that these limitations arise from overfitting in large networks. Generalization can be improved by increasing the size of the recurrent layer without training its connections, thereby combining a large short-term memory with a small long-term memory capacity. Performance can be improved further by increasing the number of word types in the training set.
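The key manipulation described in the abstract, enlarging the recurrent layer while leaving its connections untrained, amounts to a reservoir-style network: random fixed input and recurrent weights serve as short-term memory, and only a small readout is trained. The sketch below illustrates this idea in NumPy, predicting the next word from the current hidden state. All dimensions, the ridge-regression readout, and the toy data are illustrative assumptions, not the paper's actual setup.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration; the paper's dimensions may differ.
n_words, n_rec = 26, 500          # vocabulary size, recurrent units

# Input and recurrent weights are random and stay FIXED: the large
# recurrent layer acts only as short-term memory.
W_in = rng.uniform(-0.1, 0.1, (n_rec, n_words))
W_rec = rng.uniform(-1.0, 1.0, (n_rec, n_rec))
W_rec *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_rec)))  # keep dynamics stable

def collect_states(sentence):
    """Run a sentence (list of word indices) through the fixed recurrent
    layer and return the hidden state after each word."""
    h = np.zeros(n_rec)
    states = []
    for w in sentence:
        x = np.eye(n_words)[w]            # one-hot word input
        h = np.tanh(W_in @ x + W_rec @ h)
        states.append(h.copy())
    return np.array(states)

def train_readout(sentences, ridge=1e-4):
    """Train only the readout (the small long-term memory) by ridge
    regression to predict the next word from the current hidden state."""
    H = np.vstack([collect_states(s)[:-1] for s in sentences])
    Y = np.vstack([np.eye(n_words)[s[1:]] for s in sentences])
    return np.linalg.solve(H.T @ H + ridge * np.eye(n_rec), H.T @ Y)

# Toy usage with random "sentences" of word indices.
toy = [rng.integers(0, n_words, size=6).tolist() for _ in range(50)]
W_out = train_readout(toy)
pred = collect_states(toy[0]) @ W_out     # next-word activations per position

Because only W_out is fitted, enlarging n_rec increases memory capacity without adding trainable parameters in the recurrent dynamics, which is how, per the abstract, overfitting is kept in check while generalization improves.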
This item appears in the following Collection(s)
- Academic publications [202786]
- Electronic publications [100859]
- Faculty of Social Sciences [27100]