Article

Differential effects of dopaminergic manipulations on risky choice


Abstract

Evaluation of risks and rewards associated with different options is facilitated by components of the mesocorticolimbic dopamine (DA) system. Augmenting or reducing DA activity increases or decreases preference for larger, uncertain rewards when reward probabilities decrease within a session. However, manipulations of DA activity may differentially alter risky choice when shifts in the relative value of probabilistic rewards are greater or lesser than those experienced previously. We investigated the effects of amphetamine and the DA antagonist flupenthixol on risk discounting, whereby we altered the manner in which reward probabilities changed. Rats chose between a "Small/Certain" lever (one pellet) and a "Large/Risky" lever that delivered four pellets in a probabilistic manner that changed during a session. Separate groups of rats were trained with a descending (100%, 50%, 25%, 12.5%), ascending (12.5–100%) or mixed (100%, 12.5%, 25%, 50%) order of probabilities associated with the large/risky option. Flupenthixol consistently decreased preference for the large/risky option. In contrast, amphetamine increased preference for the large/risky lever when the probabilities decreased over a session, but reduced preference in the ascending condition. Reductions in normal DA tone consistently bias choice away from larger, probabilistic rewards. In contrast, increases in DA release may disrupt adjustments in behavior in response to changes in the relative value of certain versus uncertain rewards. These findings further clarify the role of DA in mediating risk/reward judgments and how perturbations in DA signaling may interfere with the ability to adjust decision making in response to changes in reward contingencies.
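The objective payoff structure of the task can be illustrated with a short expected-value calculation. This is a sketch for illustration only; the probabilities and pellet magnitudes are taken from the task description above, while the function name is an assumption:

```python
# Sketch of the objective payoffs in the risk-discounting task: the
# Large/Risky lever pays 4 pellets with probability p, the Small/Certain
# lever pays 1 pellet with certainty.
def expected_values(probabilities, risky_magnitude=4, certain_magnitude=1):
    """Return (p, EV_risky, EV_certain) for each probability block."""
    return [(p, risky_magnitude * p, float(certain_magnitude))
            for p in probabilities]

# Descending block order used in one training condition.
descending = [1.0, 0.5, 0.25, 0.125]
for p, ev_risky, ev_certain in expected_values(descending):
    # The risky lever is objectively advantageous only while 4 * p > 1,
    # i.e. in the 100% and 50% blocks; p = 0.25 is the indifference point.
    print(f"p={p:.3f}: EV(risky)={ev_risky:.2f} vs EV(certain)={ev_certain:.2f}")
```

An optimal chooser would therefore shift preference within a session regardless of block order; what differs between the descending, ascending, and mixed conditions is the direction and predictability of that shift.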
... Even amongst healthy populations, amphetamine-induced elevations in DA correlate with risk-taking behavior in the Iowa Gambling Task (IGT; Oswald et al., 2015), a commonly used risk-based decision-making paradigm. Finally, preclinical pharmacological studies in animal models support causal links between DA and risk behavior, in that DA manipulations change risk-based decision-making in rodents (Simon et al., 2011; St Onge et al., 2010). Thus, parsing the role of DA in risk-taking behavior remains vital. ...
... There are two main classes of DA receptors, DA-1-like and DA-2-like receptors (D1R and D2R, respectively), both of which contribute to, or have been associated with, risk-taking behavior in a variety of tasks across species (see Soutschek et al., 2023 for a review of human studies; Ishii et al., 2018; Simon et al., 2011; St Onge et al., 2010). However, the specific contribution of each DA receptor class to risk-taking appears to be mediated by several factors including task specifications (Soutschek et al., 2023; Winstanley and Floresco, 2016). ...
... Moreover, it has been reported that the order in which different effort conditions are presented within a session (ascending or descending) can make a substantial difference to the cognitive demands of the task and to which neural systems are recruited. Indeed, some decision-making tasks using fixed-ratio designs featuring descending rather than ascending effort demand fail to show even the basic expected discounting profile [18], and interventions such as drugs can show very different effects depending on the order of demand presentation [18,21,22]. These findings indicate that in addition to decision-making per se, these tasks present off-target cognitive demands such as the requirement that the subject calculate, infer, or remember the effort associated with two identical stimuli in the presence of changing levels of effort [18]. ...
... It is widely reported that the dopamine (DA) system plays a critical role in regulating effort-related decision making, and systemically administered agents targeting this system have been a consistent focus of efforts aimed at discovering new treatments [18,24–32]. However, systemic manipulations of dopamine can affect many other processes such as timing ability, tolerance to delays of reinforcement, memory, behavioral arousal or resistance to extinction [6,19,21,33–35]. Thus, several studies have attempted to unravel the effects of dopaminergic manipulations on physical effort- and delay-based decision-making, and have attempted to isolate drugs' effects on motivation from their arousal components by adjusting the delays of reinforcement [18–20], comparing different efforts (e.g., repeated lever presses vs. different lever weights) [36] or combining different tasks (e.g., PR vs. Hold-down task) [6]. ...
Article
Full-text available
Effort-based decision-making is impaired in multiple psychopathologies leading to significant impacts on the daily life of patients. Preclinical studies of this important transdiagnostic symptom in rodents are hampered, however, by limitations present in currently available decision-making tests, including the presence of delayed reinforcement and off-target cognitive demands. Such possible confounding factors can complicate the interpretation of results in terms of decision-making per se. In this study we addressed this problem using a novel touchscreen Rearing-Effort Discounting (RED) task in which mice choose between two single-touch responses: rearing up to touch an increasingly higher positioned stimulus to obtain a High Reward (HR) or touching a lower stimulus to obtain a Low Reward (LR). To explore the putative advantages of this new approach, RED was compared with a touchscreen version of the well-studied Fixed Ratio-based Effort Discounting (FRED) task, in which multiple touches are required to obtain an HR, and a single response is required to obtain an LR. Results from dopaminergic (haloperidol and d-amphetamine), behavioral (changes in the order of effort demand; fixed-ratio schedule in FRED or response height in RED), and dietary manipulations (reward devaluation by pre-feeding) were consistent with the presence of variables that may complicate interpretation of conventional decision-making tasks, and demonstrate how RED appears to minimize such variables. Neuropsychopharmacology; https://doi.
... Although there have been no relevant clinical studies, some preclinical studies yielded results in line with our findings. An animal study investigating the effects of amphetamine and dopamine (DA) antagonists on risk discounting found that blockade of D1 or D2 receptors of DA reduced the preference for risky options [27]. Another animal study on the effect of noradrenaline (NA) on risk discounting found that suppression of NA release reduced the preference for risky options [28]. ...
Article
Full-text available
Increasing evidence shows that risk preference is associated with schizophrenia. However, the causality and direction of this association are not clear; therefore, we used Mendelian randomization (MR) to examine the potential bidirectional relationship between risk preference and schizophrenia. Genome-wide association studies (GWAS) summary data on risk preference from 939,908 participants in the UK Biobank and 23andMe were used to identify general risk preference. Data from 320,404 subjects (76,755 cases and 243,649 controls) from the Psychiatric Genomics Consortium were used to identify schizophrenia. The weighted median (WM), the inverse variance weighted (IVW), and the Mendelian randomization-Egger (MR-Egger) methods were used for the MR analysis to estimate the causal effect and detect directional pleiotropy. The GWAS summary data were respectively from two combined samples, containing 939,908 and 320,404 subjects of European ancestry. Mendelian randomization evidence suggested that risk preference was associated with increased onset of schizophrenia (OR = 2.84, 95% CI: 1.77–4.56, P = 1.58 × 10⁻⁵) and that schizophrenia was also associated with raised risk preference (OR = 1.11, 95% CI: 1.07–1.15, P = 7.98 × 10⁻⁸). With the use of large-scale GWAS data, robust evidence suggests an interaction between risk preference and schizophrenia. This also indicates that early identification of and intervention for increased risk preference may improve the prognosis of schizophrenia.
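The inverse variance weighted estimator named above can be sketched as a precision-weighted average of per-variant Wald ratios. This is a minimal fixed-effect illustration, not the pipeline used in the study; the function name and example numbers are assumptions:

```python
import math

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect IVW causal estimate from per-SNP summary statistics.

    Each SNP contributes a Wald ratio (beta_outcome / beta_exposure),
    weighted by the approximate inverse variance of that ratio,
    (beta_exposure / se_outcome) ** 2.
    """
    ratios = [bo / bx for bx, bo in zip(beta_exposure, beta_outcome)]
    weights = [(bx / se) ** 2 for bx, se in zip(beta_exposure, se_outcome)]
    estimate = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    standard_error = math.sqrt(1.0 / sum(weights))
    return estimate, standard_error
```

With a single instrument the estimate reduces to the Wald ratio itself; with many instruments, stronger and more precisely measured SNPs dominate the weighted average.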
... One limitation of the current findings concerns the task designs, in which the contingencies in the RDT, intertemporal choice task, and ROVP task were set to an ascending order (i.e., increasing delays or probability of punishment). This issue is important as, in some cases, the same manipulation can have opposite effects on choice behavior in block-design decision-making tasks such as those used here, depending on whether the contingencies increase or decrease across blocks (St Onge et al., 2010; Orsini et al., 2017b, 2018). Such differences in the results of manipulations depending on the order in which choice contingencies are presented have been interpreted as effects on the ability to adapt choices in the context of contingency changes (i.e., behavioral flexibility). ...
Article
Full-text available
Many individuals undergo mating and/or other aspects of reproductive experience at some point in their lives, and pregnancy and childbirth in particular are associated with alterations in the prevalence of several psychiatric disorders. Research in rodents shows that maternal experience affects spatial learning and other aspects of hippocampal function. In contrast, there has been little work in animal models concerning how reproductive experience affects cost–benefit decision making, despite the relevance of this aspect of cognition for psychiatric disorders. To begin to address this issue, reproductively experienced (RE) and reproductively naïve (RN) female Long-Evans rats were tested across multiple tasks that assess different forms of cost–benefit decision making. In a risky decision-making task, in which rats chose between a small, safe food reward and a large food reward accompanied by variable probabilities of punishment, RE females chose the large risky reward significantly more frequently than RN females (greater risk taking). In an intertemporal choice task, in which rats chose between a small, immediate food reward and a large food reward delivered after a variable delay period, RE females chose the large reward less frequently than RN females. Together, these results show distinct effects of reproductive experience on different forms of cost–benefit decision making in female rats, and highlight reproductive status as a variable that could influence aspects of cognition relevant for psychiatric disorders.
... In addition, the group with preserved reward processing was less likely to be on antipsychotics than the global deficit group. Antipsychotics are posited to exert their effects in part through blocking dopamine D2 receptors [80], and dopamine depletion has been associated with lower effort and difficulty valuing probability costs in preclinical studies [81,82], components that are integral to reward processing. Thus, one possibility is that participants in this group demonstrated relatively intact reward processing in part because their dopamine systems were not influenced by antipsychotics. ...
Article
Full-text available
Reward processing impairments are a key factor associated with negative symptoms in those with severe mental illnesses. However, past findings are inconsistent regarding which reward processing components are impaired and most strongly linked to negative symptoms. The current study examined the hypothesis that these mixed findings may be the result of multiple reward processing pathways (i.e., equifinality) to negative symptoms that cut across diagnostic boundaries and phases of illness. Participants included healthy controls (n = 100) who served as a reference sample and a severe mental illness-spectrum sample (n = 92) that included psychotic-like experiences, clinical high-risk for psychosis, bipolar disorder, and schizophrenia participants. All participants completed tasks measuring four RDoC Positive Valence System constructs: value representation, reinforcement learning, effort–cost computation, and hedonic reactivity. A k-means cluster analysis of the severe mental illness-spectrum samples identified three clusters with differential reward processing profiles that were characterized by: (1) global reward processing deficits (22.8%), (2) selective impairments in hedonic reactivity alone (40.2%), and (3) preserved reward processing (37%). Elevated negative symptoms were only observed in the global reward processing cluster. All clusters contained participants from each clinical group, and the distribution of these groups did not significantly differ among the clusters. Findings identified one pathway contributing to negative symptoms that was transdiagnostic and transphasic. Future work further characterizing divergent pathways to negative symptoms may help to improve symptom trajectories and personalized treatments.
Preprint
Rationale: Psychostimulants, such as amphetamine (AMPH) and methylphenidate (MPH), non-selectively elevate extracellular concentrations of the catecholamine neurotransmitters, dopamine (DA) and norepinephrine (NE), and are common pharmacological strategies used to improve prefrontal cortex (PFC)-dependent cognitive dysfunction. However, this approach can be problematic given AMPH has been shown to increase preference for risky choices in a rodent assay of risk/reward decision making. SK609 is a novel NE reuptake blocker that selectively activates DA D3 receptors without affinity for the DA transporter. SK609 has been shown to improve cognitive performance without increasing psychostimulant-like spontaneous locomotor activity, suggesting SK609 may benefit neurocognitive function without psychostimulant-like side effect liability. Objectives: We compared AMPH, MPH, and SK609 within dose ranges that display their cognitive enhancing properties in a probabilistic discounting task (PDT) of risk/reward decision making behavior to assess their potential to increase risky choice preference. Methods: Rats chose between small/certain rewards delivered with 100% certainty and large/risky rewards delivered with descending probabilities across a session (100–6.25%) following administration of AMPH (0.25–1 mg/kg), MPH (2–8 mg/kg), and SK609 (4 mg/kg). Results: AMPH and MPH increased risky choice behavior at doses previously reported to enhance cognition, whereas SK609 did not. AMPH and MPH also reduced sensitivity to non-rewarded risky choices. Conclusions: These data highlight the combination of NE transporter blockade and selective D3 activation in pro-cognitive action without psychostimulant-like side effect liability. The absence of DA transporter blockade and non-selective dopaminergic activation are beneficial properties of SK609 that differentiate it from the traditional pro-cognitive psychostimulants.
Preprint
Full-text available
Rationale Adolescent cannabis use is linked to later-life changes in cognition, learning, and memory. Rodent experimental studies suggest Δ⁹-tetrahydrocannabinol (THC) influences development of circuits underlying these processes, especially in the prefrontal cortex, which matures during adolescence. Objective We determined how 14 daily THC injections (5 mg/kg) during adolescence persistently impact medial prefrontal cortex (mPFC) dopamine-dependent cognition. Methods In adult Long Evans rats treated as adolescents with THC (AdoTHC), we quantify performance on two mPFC dopamine-dependent reward-based tasks: strategy set shifting and probabilistic discounting. We also determined how acute dopamine augmentation with amphetamine (0, 0.25, 0.5 mg/kg), or specific chemogenetic stimulation of ventral tegmental area (VTA) dopamine neurons and their projections to mPFC, impacts probabilistic discounting. Results AdoTHC sex-dependently impacts acquisition of cue-guided instrumental reward seeking, but has minimal effects on set-shifting or probabilistic discounting in either sex. When we challenged dopamine circuits acutely with amphetamine during probabilistic discounting, we found reduced discounting of improbable reward options, with AdoTHC rats being more sensitive to these effects than controls. In contrast, neither acute chemogenetic stimulation of VTA dopamine neurons nor pathway-specific chemogenetic stimulation of their projection to mPFC impacted probabilistic discounting in control rats, although stimulation of this cortical dopamine projection slightly disrupted choices in AdoTHC rats. Conclusions These studies confirm a marked specificity in the cognitive processes impacted by AdoTHC exposure. They also suggest that some persistent AdoTHC effects may alter amphetamine-induced cognitive changes in a manner independent of VTA dopamine projections to mPFC, or via alterations of non-VTA dopamine neurons.
Article
Full-text available
The basal ganglia (BG) contribute to reinforcement learning (RL) and decision making, but unlike artificial RL agents, they rely on complex circuitry and dynamic dopamine modulation of opponent striatal pathways to do so. We develop the OpAL* model to assess the normative advantages of this circuitry. In OpAL*, learning induces opponent pathways to differentially emphasize the history of positive or negative outcomes for each action. Dynamic DA modulation then amplifies the pathway most tuned for the task environment. This efficient coding mechanism avoids a vexing explore-exploit tradeoff that plagues traditional RL models in sparse reward environments. OpAL* exhibits robust advantages over alternative models, particularly in environments with sparse reward and large action spaces. These advantages depend on opponent and nonlinear Hebbian plasticity mechanisms previously thought to be pathological. Finally, OpAL* captures risky choice patterns arising from DA and environmental manipulations across species, suggesting that they result from a normative biological mechanism.
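The opponent-pathway mechanism described here can be sketched in a few lines. This is a simplified illustration of OpAL-style updates, not the published model code; the learning rate, gain parameters, and function names are assumptions:

```python
def opal_update(G, N, delta, alpha=0.1):
    """One nonlinear Hebbian opponent-actor update for a single action.

    G (direct, 'Go' pathway) and N (indirect, 'NoGo' pathway) weights
    scale their own updates by their current value, so over time G
    specializes in positive prediction errors (delta > 0) and N in
    negative ones -- the nonlinearity the abstract refers to.
    """
    G = G + alpha * G * delta
    N = N + alpha * N * (-delta)
    return G, N

def act_value(G, N, beta_g=1.0, beta_n=1.0):
    """Choice propensity: dopamine is modeled as asymmetric gains
    (beta_g vs. beta_n) amplifying whichever pathway is better tuned
    to the current reward environment."""
    return beta_g * G - beta_n * N

# Repeated positive prediction errors inflate G and deflate N.
G, N = 1.0, 1.0
for _ in range(10):
    G, N = opal_update(G, N, delta=0.5)
```

Raising `beta_g` relative to `beta_n` (high DA) makes choice dominated by the reward-history pathway, which is one way such models produce DA-dependent shifts in risky choice.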
Article
Full-text available
Although previous research has emphasized the beneficial effects of dopamine (DA) on functions of the prefrontal cortex (PFC), recent studies of animals exposed to mild stress indicate that excessive DA receptor stimulation may be detrimental to the spatial working memory functions of the PFC (Arnsten and Goldman-Rakic, 1990; Murphy et al., 1994, 1996a,b, 1997). In particular, these studies have suggested that supranormal stimulation of D1 receptors may contribute to the detrimental actions of DA in the PFC (Murphy et al., 1994, 1996a). The current study directly tested this hypothesis by examining the effects of infusing a full D1 receptor agonist, SKF 81297, into the PFC of rats performing a spatial working memory task, delayed alternation. SKF 81297 produced a dose-related impairment in delayed-alternation performance. The impairment was reversed by pretreatment with a D1 receptor antagonist, SCH 23390, consistent with drug actions at D1 receptors. SCH 23390 by itself had no effect on performance, although slightly higher doses impaired performance (Murphy et al., 1994, 1996a). There was a significant relationship between infusion location and drug efficacy; animals with cannulae anterior to the PFC were not impaired by SKF 81297 infusions. Taken together, these results demonstrate that supranormal D1 receptor stimulation in the PFC is sufficient to impair PFC working memory function. These cognitive data are consistent with recent electrophysiological studies of D1 receptor mechanisms affecting the PFC (Williams and Goldman-Rakic, 1995; Yang and Seamans, 1996). Increased D1 receptor stimulation during stress may serve to take the PFC "off-line" to allow posterior cortical and subcortical structures to regulate behavior, but may contribute to the vulnerability of the PFC in many neuropsychiatric disorders.
Article
Full-text available
Latent inhibition (LI) refers to decrement in conditioning to a stimulus as a result of its prior nonreinforced preexposure. This robust phenomenon has been shown in classical and instrumental conditioning procedures and in many mammalian species, including humans. Development of LI reflects decreased associability of, or attention to, stimuli that predict no significant outcome. The fact that LI reflects attentional processes has become important to neuroscientists who see LI as a convenient tool for measuring the effects of drug treatments and lesions on attention. Data on brain systems studied for their involvement in LI are surveyed. These are presented in sections on noradrenergic, cholinergic, dopaminergic, serotonergic, and septo-hippocampal manipulations. It is concluded that the neural substrates of LI include the mesolimbic dopaminergic system (MDS), the mesolimbic serotonergic system (MSS), and the hippocampus. The preexposed stimulus loses its capacity to affect behavior in conditioning, even though it predicts reinforcement, because the hippocampus inhibits the switching mechanism of the nucleus accumbens via the subiculum-accumbens pathway. This hippocampal action is modulated by the MSS via its interactions with the hippocampal system or MDS, or both.
Article
Full-text available
Dopamine (DA) in the medial prefrontal cortex (PFC) can modulate the short-term retention of information and other executive functions. The present study examined whether administration of a DA D₁ agonist into the PFC could have differential effects on memory retrieval in circumstances in which memory was either excellent or poor. Separate groups of rats were trained on a delayed version of the radial maze task. On the test day, the delay between the phases was either 30 min or 12 hr. Infusions of the D₁ receptor agonist SKF 81297 (0.05, 0.10, or 0.20 μg/0.5 μl) into the PFC before the test phase improved memory retrieval after a 12-hr delay but disrupted performance after a 30-min delay. These data suggest that D₁ receptor activity can exert differential effects over PFC function, depending on the strength of the memory trace. When memory is decremented by an extended delay, activation of PFC DA D₁ receptors by an agonist can improve cognitive function.
Article
Full-text available
Rationale: Inability to tolerate delays to reward is an important component of impulsive behaviour, and has been suggested to reflect dysfunction of dopamine systems. Objectives: The present experiments examined the effects of signalling a delayed, large reward on rats' ability to choose it over a small, immediate reward, and on the response to amphetamine, a dopamine receptor antagonist, and a benzodiazepine. Methods: Three groups of Lister hooded rats were tested on a two-lever discrete-trial delayed reinforcement task in which they chose one pellet delivered immediately or four pellets delivered after a delay. This delay increased from 0 to 60 s during each session. Trials began with illumination of a houselight: in the Houselight group, this remained on during the delay and feeding period. In the No Cue group, the houselight was extinguished at the moment of choice. In the Cue group, a stimulus light was illuminated during the delay. Once trained, the rats were challenged with d-amphetamine (0.3, 1.0, 1.6 mg/kg), chlordiazepoxide (1.0, 3.2, 5.6, 10 mg/kg), α-flupenthixol (0.125, 0.25, 0.5 mg/kg), and various behavioural manipulations. Results: Subjects' choice became and remained sensitive to the delay; the cue speeded learning. Amphetamine decreased choice of the large reinforcer in the No Cue group and increased it in the Cue group. α-Flupenthixol and chlordiazepoxide generally decreased preference for the delayed reinforcer; flupenthixol reduced the cue's effects, but chlordiazepoxide did not interact with the cue condition. Conclusions: Signals present during a delay can enhance the ability of amphetamine to promote choice of delayed rewards.
Article
Full-text available
Damage to various regions of the prefrontal cortex (PFC) impairs decision making involving evaluations about risks and rewards. However, the specific contributions that different PFC subregions make to risk-based decision making are unclear. We investigated the effects of reversible inactivation of 4 subregions of the rat PFC (prelimbic medial PFC, orbitofrontal cortex [OFC], anterior cingulate, and insular cortex) on probabilistic (or risk) discounting. Rats were well trained to choose between either a "Small/Certain" lever that always delivered 1 food pellet, or another, "Large/Risky" lever, which delivered 4 pellets, but the probability of receiving reward decreased across 4 trial blocks (100%, 50%, 25%, and 12.5%). Infusions of the gamma-aminobutyric acid (GABA) agonists muscimol/baclofen into the medial PFC increased risky choice. However, similar medial PFC inactivations decreased risky choice when the Large/Risky reward probability increased over a session. OFC inactivation increased response latencies in the latter trial blocks without affecting choice. Anterior cingulate or insular inactivations were without effect. The effects of prelimbic inactivations were not attributable to disruptions in response flexibility or judgments about the relative value of probabilistic rewards. Thus, the prelimbic, but not other PFC regions, plays a critical role in risk discounting, integrating information about changing reward probabilities to update value representations that facilitate efficient decision making.
Article
Full-text available
One common procedure for obtaining delay-discounting functions consists of a choice between a larger reinforcer that is presented after an increasing delay and a smaller reinforcer that is always presented immediately within session. Repeating the same context of delay presentation (e.g. ascending delay order) in a discrete-choice paradigm, however, may lead to a perseverative response pattern when rats are used as subjects. The purpose of this study was to increase the variability in delay presentation (i.e. ascending and descending delays) in an attempt to reduce a perseverative response pattern and gain tighter control over choice by reinforcer amount and delay. For one group of rats (n = 8), delays to reinforcer presentation were differentially signaled by a flashing houselight and for one group of rats (n = 8) the delays were unsignaled. Effects of delay signal and d-amphetamine on choice were evaluated in both groups. Similar rates of delay discounting and area under the curve (AUC) were observed with both ascending and descending delay presentations and with signaled and unsignaled delays to reinforcement. Increasing the variability in delay order resulted in differences in the choice pattern during 0-s probe sessions. d-Amphetamine had little or no effect on AUC at low doses, but decreased AUC at the highest doses tested, that is, 1.0 and 1.7 mg/kg. Some of the changes in AUC after d-amphetamine administration may have been because of disruption in discrimination of the different food amounts.
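The area-under-the-curve measure used in this study is conventionally computed by normalizing delays and subjective values to [0, 1] and summing trapezoids. A minimal sketch (the function name and defaults are illustrative, not this study's analysis code):

```python
def discounting_auc(delays, values, max_delay=None, max_value=None):
    """Normalized trapezoidal AUC for a discounting function.

    delays: increasing delay values; values: subjective value at each
    delay (e.g., proportion of large-reward choices). AUC near 1 means
    shallow discounting; AUC near 0 means steep discounting.
    """
    max_delay = max_delay if max_delay is not None else max(delays)
    max_value = max_value if max_value is not None else max(values)
    x = [d / max_delay for d in delays]
    y = [v / max_value for v in values]
    auc = 0.0
    for i in range(1, len(x)):
        # Trapezoid between consecutive delay points.
        auc += (x[i] - x[i - 1]) * (y[i] + y[i - 1]) / 2.0
    return auc
```

Because AUC collapses the whole choice curve into one number, a dose that merely disrupts amount discrimination (as suggested for the highest d-amphetamine doses above) can lower AUC without reflecting a genuine change in delay sensitivity.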
Article
Background: Delays between actions and their outcomes severely hinder reinforcement learning systems, but little is known of the neural mechanism by which animals overcome this problem and bridge such delays. The nucleus accumbens core (AcbC), part of the ventral striatum, is required for normal preference for a large, delayed reward over a small, immediate reward (self-controlled choice) in rats, but the reason for this is unclear. We investigated the role of the AcbC in learning a free-operant instrumental response using delayed reinforcement, performance of a previously-learned response for delayed reinforcement, and assessment of the relative magnitudes of two different rewards. Results: Groups of rats with excitotoxic or sham lesions of the AcbC acquired an instrumental response with different delays (0, 10, or 20 s) between the lever-press response and reinforcer delivery. A second (inactive) lever was also present, but responding on it was never reinforced. As expected, the delays retarded learning in normal rats. AcbC lesions did not hinder learning in the absence of delays, but AcbC-lesioned rats were impaired in learning when there was a delay, relative to sham-operated controls. All groups eventually acquired the response and discriminated the active lever from the inactive lever to some degree. Rats were subsequently trained to discriminate reinforcers of different magnitudes. AcbC-lesioned rats were more sensitive to differences in reinforcer magnitude than sham-operated controls, suggesting that the deficit in self-controlled choice previously observed in such rats was a consequence of reduced preference for delayed rewards relative to immediate rewards, not of reduced preference for large rewards relative to small rewards. AcbC lesions also impaired the performance of a previously-learned instrumental response in a delay-dependent fashion.
Conclusions: These results demonstrate that the AcbC contributes to instrumental learning and performance by bridging delays between subjects' actions and the ensuing outcomes that reinforce behaviour.
Article
The capacity to predict future events permits a creature to detect, model, and manipulate the causal structure of its interactions with its environment. Behavioral experiments suggest that learning is driven by changes in the expectations about future salient events such as rewards and punishments. Physiological work has recently complemented these studies by identifying dopaminergic neurons in the primate whose fluctuating output apparently signals changes or errors in the predictions of future salient and rewarding events. Taken together, these findings can be understood through quantitative theories of adaptive optimizing control.
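The prediction-error account summarized here is usually formalized as the temporal-difference (TD) error. A minimal sketch (the discount factor, learning rate, and function names are illustrative):

```python
def td_error(reward, value_next, value_current, gamma=0.95):
    """Temporal-difference prediction error:
    delta = r + gamma * V(s') - V(s).
    A positive delta (outcome better than predicted) corresponds to a
    phasic burst in the dopamine-neuron account; a negative delta
    (worse than predicted) corresponds to a dip below baseline.
    """
    return reward + gamma * value_next - value_current

def td_update(value_current, delta, alpha=0.1):
    """Move the current state's value estimate toward the TD target."""
    return value_current + alpha * delta
```

As a cue comes to reliably predict reward, `value_current` at the cue grows, the error at reward delivery shrinks toward zero, and the error signal transfers to the earliest reliable predictor, matching the recordings these theories were built on.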
Article
Reward-predicting cues evoke activity in midbrain dopamine neurons that encodes fundamental attributes of economic value, including reward magnitude, delay and uncertainty. We found that dopamine release in rat nucleus accumbens encodes anticipated benefits, but not effort-based response costs unless they are atypically low. This neural separation of costs and benefits indicates that mesolimbic dopamine scales with the value of pending rewards, but does not encode the net utility of the action to obtain them.