Syntactic priming in language comprehension allows linguistic expectations to converge on the statistics of the input


Human language is characterized by variability: the way language is used varies depending, for example, on the identity of the speaker or author, the social context, and the surrounding linguistic material. Variability poses formidable challenges to the systems underlying language comprehension, which are known to exploit statistical contingencies in the input to overcome the inherent noisiness of perception; yet we comprehend language with apparent ease. How is this possible? Here we argue that we are able to comprehend language efficiently in part by continuously adapting to the statistics of novel linguistic situations. We argue further that adaptation specifically allows comprehenders' expectations to converge toward the actual statistics of the linguistic input. Concretely, we show that readers can adjust their linguistic expectations in light of recent experience such that (a) previously difficult structures become easier to process and, even more strikingly, (b) previously easy-to-process structures come to incur a processing cost.