Extraordinary Natural Ability: Anagram Solution as an Extension of Normal Reading Ability

Abstract

A recurrent connectionist model of normal word reading, parameterized on an artificial lexicon, can solve anagrams in which the letters of a word have been highly permuted. Distinctive predictions of the model, concerning competition effects in anagram solution and which reorderings should be hard, were supported by two experiments in which human subjects solved English anagrams. The results are consistent with prior work on anagram solution, which suggests that skilled solvers succeed at this unusual task by exploiting their naturally acquired knowledge of word structure. They extend previous modeling efforts by showing how anagram solution ability may be closely related to normal reading ability. Although the model proposed here is not a learning model, we suggest that exploring the dynamics of recurrent neural networks of this kind may offer an avenue by which cognitive theory can address highly extrapolative generalization, in which people excel at tasks that are qualitatively distinct from their training experiences.
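The abstract does not give implementation details, so the following Python fragment is only an illustrative sketch, not the authors' model: it shows one way a recurrent network with lateral inhibition among word units could map a scrambled letter string onto a word in a small artificial lexicon, and how competition arises when several lexicon entries share the anagram's letters. The lexicon, the letter-count input code, and all parameter values are hypothetical.

```python
# Illustrative sketch (hypothetical; not the authors' implementation):
# word units receive bottom-up support from an anagram's letters and
# compete via lateral inhibition until the network settles on one word.

import numpy as np

LEXICON = ["stone", "notes", "onset", "tones", "plant", "reads"]  # toy artificial lexicon
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def letter_vector(s: str) -> np.ndarray:
    """Order-free letter-count code for a string (position information discarded)."""
    v = np.zeros(len(ALPHABET))
    for ch in s:
        v[ALPHABET.index(ch)] += 1.0
    return v

# Bottom-up weights: each word unit is connected to the letters it contains.
W = np.array([letter_vector(w) for w in LEXICON])       # word x letter
W = W / np.linalg.norm(W, axis=1, keepdims=True)        # normalize word codes

def solve_anagram(anagram: str, steps: int = 60, inhibition: float = 0.6) -> str:
    """Settle the recurrent competition dynamics and return the winning word."""
    inp = letter_vector(anagram)
    inp = inp / np.linalg.norm(inp)
    support = W @ inp                                    # letter-overlap input to each word unit
    a = np.full(len(LEXICON), 1.0 / len(LEXICON))        # initial word-unit activations
    for _ in range(steps):
        net = support - inhibition * (a.sum() - a)       # excitation minus lateral inhibition
        a = np.clip(a + 0.1 * (net - a), 0.0, 1.0)       # leaky recurrent update
    return LEXICON[int(np.argmax(a))]

print(solve_anagram("ntalp"))   # -> "plant": only one lexicon word matches these letters
print(solve_anagram("tseon"))   # "stone"/"notes"/"onset"/"tones" all match: strong competition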

