Enhancing Methodological Rigor for Computational Cognitive Science: Complexity Analysis


Complexity analysis provides a measure of how well the computations performed by a cognitive model are supported by the constraints of biological computing. We argue that research in computational cognitive science would be greatly aided by treating computational complexity as a primary concern, on the same order of importance as statistical significance testing in experimental work. In this paper, we give a brief review of computational complexity and its application in computer science. We then show how it can be used to incorporate biological constraints abstractly into computational cognitive science research, even in the presence of massive uncertainty about the underlying neuroscience. To ground the discussion, examples are drawn from two recent pieces of work: Xu and Tenenbaum's application of Bayesian inference to word learning, and Forbus and Hinrichs's construction of a large-scale analogical reasoner.
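To make the complexity concern concrete, here is a minimal illustrative sketch (not taken from the paper, and not the authors' analysis): exact Bayesian inference over a hypothesis space H costs on the order of |H| likelihood evaluations per datum, and in a word-learning setting where hypotheses are subsets of an n-object domain, |H| grows as 2^n. The toy "size principle" likelihood below is a standard simplification of the kind of model Xu and Tenenbaum discuss; all names here are invented for illustration.

```python
from itertools import combinations

def posterior(hypotheses, priors, likelihood, data):
    """Exact posterior over all hypotheses.

    One pass over every hypothesis for every datum: O(|H| * |data|).
    The complexity worry is not this loop but the size of |H| itself.
    """
    scores = []
    for h, p in zip(hypotheses, priors):
        score = p
        for d in data:
            score *= likelihood(d, h)
        scores.append(score)
    z = sum(scores)
    return [s / z for s in scores]

# Toy word-learning domain: a "word meaning" is a nonempty subset of the
# objects, so with n = 4 objects there are 2^4 - 1 = 15 hypotheses.
objects = range(4)
hypotheses = [frozenset(c) for r in range(1, 5)
              for c in combinations(objects, r)]
priors = [1 / len(hypotheses)] * len(hypotheses)

def size_principle(d, h):
    # Likelihood of seeing example d if the word denotes set h:
    # uniform over h's members, zero outside h.
    return 1 / len(h) if d in h else 0.0

# After seeing objects 0 and 1 labeled with the word, the smallest
# consistent hypothesis {0, 1} gets the highest posterior.
post = posterior(hypotheses, priors, size_principle, data=[0, 1])
best = hypotheses[max(range(len(post)), key=lambda i: post[i])]
print(len(hypotheses), best)
```

The point of the sketch is the scaling, not the model: adding one object to the domain doubles the hypothesis space, so exhaustive inference is exponential in domain size even though each individual step is cheap.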
