Lesson 2: Modeling Human Cognition

How do we make intelligent inferences about objects, events, and relations in the world, from a quick glance? From infancy to adulthood, how do we learn new concepts and reason about novel situations, from little experience? In this unit, Josh Tenenbaum introduces a framework for addressing these questions based on the creation of generative models of the physical and social worlds that enable probabilistic inference about objects and events.

Part 1 introduces the concept of a generative model: how such a model can provide causal explanations of the world, how uncertainty can be captured by probabilities built into the model, and how it can guide the inferences used to plan and to solve problems. A minimal sketch of this idea appears below.
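As a concrete, purely illustrative example (not taken from the lecture), the Python sketch below defines a toy generative model in which a hidden cause probabilistically produces an observation, and Bayes' rule inverts the model to infer the cause from what is observed. The scenario and all probabilities are hypothetical.

```python
# A toy generative model: a hidden cause ("rain") probabilistically
# produces an observation ("wet grass"). Inference runs the model
# "backwards" with Bayes' rule to explain what we see.
# All probabilities here are illustrative, not from the lecture.

P_RAIN = 0.3                 # prior belief that it rained overnight
P_WET_GIVEN_RAIN = 0.9       # likelihood of wet grass if it rained
P_WET_GIVEN_NO_RAIN = 0.2    # wet grass anyway (sprinkler, dew, ...)

def generative_model(rained: bool) -> float:
    """Return P(grass is wet | rained) -- the forward, causal direction."""
    return P_WET_GIVEN_RAIN if rained else P_WET_GIVEN_NO_RAIN

def posterior_rain_given_wet() -> float:
    """Invert the model with Bayes' rule: P(rained | grass is wet)."""
    joint_rain = P_RAIN * generative_model(True)
    joint_no_rain = (1 - P_RAIN) * generative_model(False)
    return joint_rain / (joint_rain + joint_no_rain)

if __name__ == "__main__":
    print(f"P(rained | wet grass) = {posterior_rain_given_wet():.2f}")
    # ~0.66: the observation makes rain the more probable explanation,
    # but the model's uncertainty keeps the alternative cause in play.
```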

Part 2 illustrates the framework through problem domains in which learning and inference are captured by probabilistic generative models whose knowledge is organized into graphical structures learned from experience. You will see how the framework can be used to model how humans learn new word concepts, infer likely causes of disease, and infer properties of objects; a small sketch of the concept-learning case follows.
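The sketch below is a minimal, hypothetical illustration of Bayesian concept learning in the spirit of Tenenbaum's "number game": a learner scores a small hypothesis space of candidate word meanings against a few examples, using the size principle for the likelihood. The hypotheses, prior, and examples are invented for illustration; this is not the lecture's model.

```python
# A minimal sketch of Bayesian concept learning over a small hypothesis
# space. Each hypothesis is a candidate extension of a new word/concept;
# the "size principle" likelihood favors small hypotheses that still
# contain all of the observed examples.

HYPOTHESES = {
    "even numbers":    {n for n in range(1, 101) if n % 2 == 0},
    "multiples of 10": {n for n in range(1, 101) if n % 10 == 0},
    "powers of 2":     {2, 4, 8, 16, 32, 64},
    "numbers 1-100":   set(range(1, 101)),
}
PRIOR = {h: 1 / len(HYPOTHESES) for h in HYPOTHESES}   # uniform prior

def likelihood(examples, extension):
    """Size principle: examples are sampled uniformly from the true extension."""
    if not all(x in extension for x in examples):
        return 0.0
    return (1 / len(extension)) ** len(examples)

def posterior(examples):
    scores = {h: PRIOR[h] * likelihood(examples, ext)
              for h, ext in HYPOTHESES.items()}
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

if __name__ == "__main__":
    for h, p in sorted(posterior([16, 8, 2, 64]).items(), key=lambda kv: -kv[1]):
        print(f"{h:16s} {p:.3f}")
    # The small, tightly fitting hypothesis ("powers of 2") dominates after
    # a few examples, mirroring how people generalize a new word quickly.
```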

The framework described in Parts 1 and 2 has limitations when it comes to common-sense reasoning about physical and social behavior. Greater reasoning power and flexibility can be achieved by building knowledge into a probabilistic program, much like a computer program, that can simulate behavior in a way that incorporates uncertainty about the world.

Part 3 illustrates model building with probabilistic programs, using examples from intuitive physics, inferring properties of a human face or body from a single visual image, and making inferences about planning, beliefs, and desires. A small simulation-based sketch of the intuitive-physics case appears below.
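As a hedged illustration of the simulation idea, the Python sketch below judges whether a stack of blocks will fall by running many simulations under perceptual noise. The 2-D stability rule and all numbers are gross simplifications chosen for readability; they are not the model from the lecture or the readings.

```python
# A sketch of "simulation-based" intuitive physics: judge whether a stack
# of blocks will fall by adding perceptual noise to the perceived scene
# and running a crude stability check many times.
import random

BLOCK_WIDTH = 1.0
TOWER = [0.0, 0.15, 0.35, 0.30]   # x-offset of each block's center (bottom first)

def topples(offsets):
    """Crude 2-D check: the center of mass of everything above a block
    must sit over that block's top face, or the stack tips."""
    for i in range(len(offsets) - 1):
        above = offsets[i + 1:]
        com = sum(above) / len(above)
        if abs(com - offsets[i]) > BLOCK_WIDTH / 2:
            return True
    return False

def probability_of_falling(offsets, noise_sd=0.1, n_samples=2000):
    """Perturb each block's perceived position, simulate, and repeat."""
    falls = 0
    for _ in range(n_samples):
        noisy = [x + random.gauss(0.0, noise_sd) for x in offsets]
        falls += topples(noisy)
    return falls / n_samples

if __name__ == "__main__":
    print(f"P(tower falls) = {probability_of_falling(TOWER):.2f}")
    # Graded judgments ("looks a bit wobbly") fall out of the fraction of
    # noisy simulations in which the tower topples.
```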


Josh Tenenbaum and colleagues propose that our intuitions about properties like the stability of a stack of objects may derive from “probabilistic programs” in our heads that can simulate, with some uncertainty, the physics that governs how objects behave in space and time. Aspects of these programs are learned from infancy, as a child interacts with the world. (Image courtesy of Josh Tenenbaum, used with permission.)

Unit Activities

Useful Background

  • Introduction to cognitive science
  • Probability and statistics
  • Introduction to machine learning, including probabilistic inference methods

Lesson 2.1: Computational Cognitive Science Part 1


Description: Exploring how humans learn new concepts and make intelligent inferences from little experience. Using probabilistic generative models to reason about the physical and social world, and provide rich causal explanations of behavior.

Instructor: Josh Tenenbaum


Click here for the lesson transcript

Click here for the lesson slides

Lesson 2.2: Computational Cognitive Science Part 2


Description: Exploring how humans learn new concepts and make intelligent inferences from little experience. Using probabilistic generative models to reason about the physical and social world, and provide rich causal explanations of behavior.

Instructor: Josh Tenenbaum


Click here for the lesson transcript

Lesson 2.3: Computational Cognitive Science Part 3


Description: Exploring how humans learn new concepts and make intelligent inferences from little experience. Using probabilistic generative models to reason about the physical and social world, and provide rich causal explanations of behavior.

Instructor: Josh Tenenbaum


Click here for the lesson transcript

Further Study

Additional information about the speaker’s research and publications can be found at his website:

See the tutorial by Tomer Ullman on the implementation of probabilistic models using the Church programming language.

Baker, C. L., R. Saxe, et al. “Action Understanding as Inverse Planning.” (PDF) Cognition 113 (2009): 329–49.

Battaglia, P. W., J. B. Hamrick, et al. “Simulation as an Engine of Physical Scene Understanding.” (PDF) Proceedings of the National Academy of Sciences 110, no. 45 (2013): 18327–32.

Goodman, N. D., and J. B. Tenenbaum. Probabilistic Models of Cognition (e-book).

Kulkarni, T., P. Kohli, et al. “Picture: An Imperative Probabilistic Programming Language for Scene Perception.” (PDF) Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2015): 4390–99.

Perfors, A., J. B. Tenenbaum, et al. “A Tutorial Introduction to Bayesian Models of Cognitive Development.” (PDF) Cognition 120 (2011): 302–21.

Tenenbaum, J. B., C. Kemp, et al. “How to Grow a Mind: Statistics, Structure, and Abstraction.” (PDF) Science 331 (2011): 1279–85.