Encoding word-order and semantic information using modular neural networks

Abstract

Vector space models have been used successfully for lexical semantic representation. Some of these models rely on the distributional properties of words in large corpora, and have been compared against human performance on semantic similarity and on priming in lexical decision tasks. Neural network models of lexical representation have classically been limited in size due to computational constraints. Recently, associative memory models have been related to semantic space models such as LSA and to Bayesian classifiers. Our goal is to build lexical representations that encode both semantic and word-order information using context-dependent neural networks. The model integrates word-order and semantic information in a way reminiscent of the BEAGLE model. In this work we train a multi-modular neural network on a large corpus and use it to account for reaction times in lexical priming studies.
