Article

Dopamine neurons encode the better option in rats deciding between differently delayed or sized rewards

Nature Neuroscience volume 10, pages 1615–1624 (2007)

Abstract

The dopamine system is thought to be involved in making decisions about reward. Here we recorded from the ventral tegmental area in rats learning to choose between differently delayed and sized rewards. As expected, the activity of many putative dopamine neurons reflected reward prediction errors, changing when the value of the reward increased or decreased unexpectedly. During learning, neural responses to reward in these neurons waned and responses to cues that predicted reward emerged. Notably, this cue-evoked activity varied with size and delay. Moreover, when rats were given a choice between two differently valued outcomes, the activity of the neurons initially reflected the more valuable option, even when it was not subsequently selected.
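The learning dynamics described above — reward responses waning while cue-evoked responses emerge — are the signature of a temporal-difference prediction error. As an illustrative sketch (not the authors' model; the state layout, learning rate, and function names here are assumptions), a minimal tabular TD(0) simulation of a single cue-then-reward trial reproduces this shift:

```python
# Illustrative TD(0) sketch of a reward prediction error shifting from
# the reward to the predictive cue over learning. All parameters
# (learning rate, trial structure) are hypothetical, for exposition only.

def simulate(n_trials=200, alpha=0.1, reward=1.0):
    """Simulate repeated cue -> reward trials with tabular TD(0)."""
    v_cue = 0.0   # learned value of the cue state, initially zero
    deltas = []   # (cue-evoked error, reward-evoked error) per trial
    for _ in range(n_trials):
        # Cue onset: error is the unexpected jump from baseline (0)
        # to the cue's learned value.
        delta_cue = v_cue - 0.0
        # Reward delivery: error is the reward minus what the cue predicted.
        delta_reward = reward - v_cue
        # Update the cue's value from the reward-time error.
        v_cue += alpha * delta_reward
        deltas.append((delta_cue, delta_reward))
    return deltas

deltas = simulate()
# Early in training the error is concentrated at reward delivery;
# late in training it has transferred to the cue.
```

Under these assumptions, the first trial yields a large reward-time error and no cue response, while by the final trial the cue-evoked error approaches the reward magnitude and the reward-time error approaches zero — the same qualitative pattern the recordings show.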



Acknowledgements

We thank Y. Niv, P. Shepard, G. Morris and W. Schultz for thoughtful comments on this manuscript, and S. Warrenburg at International Flavors and Fragrances for his assistance in obtaining odor compounds. This work was supported by grants from the US National Institute on Drug Abuse (R01-DA015718, G.S.; K01-DA021609, M.R.R.), the National Institute of Mental Health (F31-MH080514, D.J.C.), the National Institute on Aging (R01-AG027097, G.S.) and the National Institute of Neurological Disorders and Stroke (T32-NS07375, M.R.R.).

Author information

Author notes

    • Matthew R Roesch
    •  & Donna J Calu

    These authors contributed equally to this work.

Affiliations

  1. Department of Anatomy and Neurobiology, University of Maryland School of Medicine, 20 Penn Street, HSF-2 S251, Baltimore, Maryland 21201, USA.

    • Matthew R Roesch
    •  & Geoffrey Schoenbaum
  2. Program in Neuroscience, University of Maryland School of Medicine, 20 Penn Street, HSF-2 S251, Baltimore, Maryland 21201, USA.

    • Donna J Calu
  3. Department of Psychiatry, University of Maryland School of Medicine, 20 Penn Street, HSF-2 S251, Baltimore, Maryland 21201, USA.

    • Geoffrey Schoenbaum

Contributions

M.R.R., D.J.C. and G.S. conceived the experiments. M.R.R. and D.J.C. carried out the recording work and assisted with electrode construction, surgeries and histology. The data were analyzed by M.R.R. and G.S., who also wrote the manuscript with assistance from D.J.C.

Corresponding author

Correspondence to Matthew R Roesch.

Supplementary information

PDF files

  1. Supplementary Text and Figures

     Supplementary Figures 1–6 and Data

About this article

Publication history


DOI: https://doi.org/10.1038/nn2013
