TLDR

Many real‑life decision‑making problems involve higher‑order structure linking stimuli, actions, and rewards, and it is unknown whether regions like the vmPFC use a stored model of this structure or simply learn values without higher‑order assumptions. To discriminate between these possibilities, we scanned human subjects with fMRI while they performed a simple decision‑making task with higher‑order structure, probabilistic reversal learning. We found that neural activity in the vmPFC was more consistent with a computational model that exploits higher‑order structure than with simple reinforcement learning, suggesting that the vmPFC uses an abstract model of task structure to guide choice and support complex social interactions and abstract strategizing.

Abstract

Many real-life decision-making problems incorporate higher-order structure, involving interdependencies between different stimuli, actions, and subsequent rewards. It is not known whether brain regions implicated in decision making, such as the ventromedial prefrontal cortex (vmPFC), use a stored model of the task structure to guide choice (model-based decision making) or merely learn action or state values without assuming higher-order structure as in standard reinforcement learning. To discriminate between these possibilities, we scanned human subjects with functional magnetic resonance imaging while they performed a simple decision-making task with higher-order structure, probabilistic reversal learning. We found that neural activity in a key decision-making region, the vmPFC, was more consistent with a computational model that exploits higher-order structure than with simple reinforcement learning. These results suggest that brain regions, such as the vmPFC, use an abstract model of task structure to guide behavioral choice, computations that may underlie the human capacity for complex social interactions and abstract strategizing.
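The two hypotheses the study contrasts can be sketched computationally. The sketch below is illustrative and not the paper's actual models, parameters, or task schedule: a model-free delta-rule learner, which tracks each option's value independently, versus a Bayesian learner that knows the task's hidden-state structure (the two options are anticorrelated and the good option occasionally reverses).

```python
import random

def simulate_task(n_trials=200, p_reward=0.8, reversal_every=40, seed=0):
    """Probabilistic reversal learning: one of two options is 'good'
    (rewarded with probability p_reward); the good option switches
    every `reversal_every` trials. All parameters are illustrative."""
    rng = random.Random(seed)
    good, trials = 0, []
    for t in range(n_trials):
        if t > 0 and t % reversal_every == 0:
            good = 1 - good                      # covert reversal
        trials.append((good, rng.random()))      # (hidden state, noise draw)
    return trials, p_reward

def outcome(choice, good, u, p_reward):
    # The chosen option pays off with p_reward if it is currently good,
    # with 1 - p_reward otherwise.
    return 1 if u < (p_reward if choice == good else 1 - p_reward) else 0

def run_rl(trials, p_reward, alpha=0.3):
    """Model-free learner: one value per option, updated by the delta
    rule; no assumption that the options' values are anticorrelated."""
    q, choices = [0.5, 0.5], []
    for good, u in trials:
        c = 0 if q[0] >= q[1] else 1             # greedy choice
        r = outcome(c, good, u, p_reward)
        q[c] += alpha * (r - q[c])               # only the chosen value moves
        choices.append(c)
    return choices

def run_bayes(trials, p_reward, hazard=1 / 40):
    """Model-based learner: infers the hidden 'which option is good'
    state, so every outcome updates beliefs about BOTH options, and a
    hazard rate captures the possibility of a covert reversal."""
    b, choices = 0.5, []                         # b = P(option 0 is good)
    for good, u in trials:
        c = 0 if b >= 0.5 else 1
        r = outcome(c, good, u, p_reward)
        # Likelihood of this outcome under each hidden state
        l0 = p_reward if c == 0 else 1 - p_reward
        if r == 0:
            l0 = 1 - l0
        l1 = 1 - l0
        b = l0 * b / (l0 * b + l1 * (1 - b))     # Bayes' rule
        b = b * (1 - hazard) + (1 - b) * hazard  # allow for a reversal
        choices.append(c)
    return choices
```

The behavioral difference shows up right after a reversal: the Bayesian agent can switch after one or two surprising outcomes, because evidence against the chosen option simultaneously counts in favor of the alternative, whereas the model-free agent must wait for the chosen option's value to decay below the other's.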
