Statistical Learning of Nonadjacencies Predicts On-line Processing of Long-Distance Dependencies in Natural Language


Statistical learning (SL) research aims to clarify the role that associative learning mechanisms may play in language. Understanding how learners process nonadjacent statistical structure is vital to this enterprise, since language requires the rapid tracking and integration of long-distance dependencies. This paper builds on existing nonadjacent SL work by introducing a novel paradigm for studying SL on-line. By capturing the temporal dynamics of the learning process, the new paradigm affords insights into the time course of learning and the nature of individual differences. Across three interrelated experiments, the paradigm and its results are used to bridge the empirical relation between SL and language within the context of nonadjacency learning. Experiment 1 charts the micro-level trajectory of nonadjacency learning and provides an index of individual differences in the new task. These substantial individual differences are then shown to predict participants' on-line processing of complex, long-distance dependencies in natural-language sentences in Experiment 2. Simple recurrent network (SRN) simulations in Experiment 3 closely capture key patterns of human nonadjacency processing, attesting to the efficacy of the associative learning mechanisms that appear foundational to performance in the new, language-linked task.