LETTER Communicated by Misha Tsodyks
Modeling Synaptic Plasticity in Conjunction with the Timing
of Pre- and Postsynaptic Action Potentials
Werner M. Kistler
J. Leo van Hemmen
Physik Department der TU München, D-85747 Garching bei München, Germany
We present a spiking neuron model that allows for an analytic calculation of the correlations between pre- and postsynaptic spikes. The neuron model is a generalization of the integrate-and-fire model and equipped with a probabilistic spike-triggering mechanism. We show that under certain biologically plausible conditions, pre- and postsynaptic spike trains can be described simultaneously as an inhomogeneous Poisson process.

Inspired by experimental findings, we develop a model for synaptic long-term plasticity that relies on the relative timing of pre- and postsynaptic action potentials. Given an input statistics, we compute the stationary synaptic weights that result from the temporal correlations between the pre- and postsynaptic spikes. By means of both analytic calculations and computer simulations, we show that such a mechanism of synaptic plasticity is able to strengthen those input synapses that convey precisely timed spikes at the expense of synapses that deliver spikes with a broad temporal distribution. This may be of vital importance for any kind of information processing based on spiking neurons and temporal coding.
1 Introduction
There is growing experimental evidence that the strength of a synapse is persistently changed according to the relative timing of the arrival of presynaptic spikes and the triggering of postsynaptic action potentials (Markram, Lübke, Frotscher, & Sakmann, 1997; Bell, Han, Sugawara, & Grant, 1997). A theoretical approach to this phenomenon is far from being trivial because it involves the calculation of correlations between pre- and postsynaptic action potentials (Gerstner, Kempter, van Hemmen, & Wagner, 1996). In the context of simple neuron models such as the standard integrate-and-fire model, the determination of the firing-time distribution involves the solution of first-passage time problems (Tuckwell, 1988). These problems are known to be hard, the more so if the neuron model is extended so as to include biologically realistic postsynaptic potentials.
Neural Computation 12, 385–405 (2000) © 2000 Massachusetts Institute of Technology
We are going to use a neuron model, the spike-response model, which is a generalization of the integrate-and-fire model (Gerstner & van Hemmen, 1992; Gerstner, 1995; Kistler, Gerstner, & van Hemmen, 1997). This model combines analytic simplicity with the ability to give a faithful description of "biological" neurons in terms of postsynaptic potentials, afterpotentials, and other aspects (Kistler et al., 1997). In order to be able to solve the first-passage time problem, the sharp firing threshold is replaced by a probabilistic spike-triggering mechanism (section 2). In section 3 we investigate conditions that allow for a description of the postsynaptic spike train as an inhomogeneous Poisson process, if the presynaptic spike trains are described by inhomogeneous Poisson processes too. Using this stochastic neuron model, we can analytically compute the distribution of the first postsynaptic firing time for a given ensemble of input spike trains in two limiting cases: low and high firing threshold (section 4). Finally, in section 5 a model of synaptic plasticity is introduced and analyzed in the context of a neuron that receives input from presynaptic neurons with different temporal precision. We show analytically and by computer simulations that the mechanism of synaptic plasticity is able to distinguish between synapses that convey spikes with a narrow or a broad temporal jitter. The resulting stationary synaptic weight vector favors synapses that deliver precisely timed spikes at the expense of the other synapses so as to produce again highly precise postsynaptic spikes.
2 The Model
We will investigate a neuron model—the spike-response model—that is in many aspects similar to but much more general than the standard integrate-and-fire model (Gerstner & van Hemmen, 1992; Gerstner, 1995). The membrane potential is given by a combination of pre- and postsynaptic contributions, described by a response kernel $\epsilon$ that gives the form of an elementary postsynaptic potential and a kernel $\eta$ that accounts for refractoriness and has the function of an afterpotential, so that

$$h_i(t) = \sum_{j,f} J_{ij}\, \epsilon(t - t_j^f - \Delta_{ij}) - \sum_f \eta(t - t_i^f). \qquad (2.1)$$

Here, $h_i$ is the membrane potential of neuron $i$, $J_{ij}$ is the strength of the synapse connecting neuron $j$ to neuron $i$, $\Delta_{ij}$ is the corresponding axonal delay from $j$ to $i$, and $\{t_j^f,\, f = 1, 2, \ldots\}$ are the firing times of neuron $j$. Causality is respected if both $\epsilon(s)$ and $\eta(s)$ vanish identically for $s < 0$. Depending on the choice for $\epsilon$ and $\eta$, the spike-response model can reproduce the standard integrate-and-fire model or even mimic complex Hodgkin-Huxley-type neuron models (Kistler et al., 1997). A typical choice for $\epsilon$ is $\epsilon(t) = t/\tau_M \exp(1 - t/\tau_M)\, H(t)$, where $\tau_M$ is a membrane time constant and $H$ the Heaviside step function with $H(t) = 1$ for $t > 0$ and $H(t) = 0$ elsewhere. As for the refractory function $\eta$, a simple exponential can be used, for example, $\eta(t) = \eta_0 \exp(-t/\tau_{\mathrm{ref}})\, H(t)$.
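As an illustration of equation 2.1, here is a minimal numerical sketch that evaluates the membrane potential for given spike times, using the alpha-shaped EPSP kernel and exponential afterpotential quoted above. All function names and parameter values are our own illustrative choices, not part of the original model specification.

```python
import numpy as np

def epsilon(s, tau_m=1.0):
    """Alpha-shaped EPSP kernel eps(s) = (s/tau_m) exp(1 - s/tau_m) H(s)."""
    s = np.asarray(s, dtype=float)
    out = np.zeros_like(s)
    mask = s > 0
    out[mask] = (s[mask] / tau_m) * np.exp(1.0 - s[mask] / tau_m)
    return out

def eta(s, eta_0=1.0, tau_ref=1.0):
    """Exponential afterpotential eta(s) = eta_0 exp(-s/tau_ref) H(s)."""
    s = np.asarray(s, dtype=float)
    out = np.zeros_like(s)
    mask = s > 0
    out[mask] = eta_0 * np.exp(-s[mask] / tau_ref)
    return out

def membrane_potential(t, J, delays, pre_spikes, post_spikes):
    """h_i(t) of equation 2.1: weighted, delayed EPSPs minus afterpotentials.

    J[j], delays[j] -- weight and axonal delay of the synapse from neuron j
    pre_spikes[j]   -- firing times of presynaptic neuron j
    post_spikes     -- firing times of the postsynaptic neuron i
    """
    h = 0.0
    for J_j, D_j, spikes_j in zip(J, delays, pre_spikes):
        h += J_j * epsilon(t - np.asarray(spikes_j) - D_j).sum()
    h -= eta(t - np.asarray(post_spikes)).sum()
    return h
```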
Equation 2.1 could also be generalized so as to account for short-term depression and facilitation if the constant coupling $J_{ij}$ is replaced by a function of the previous spike arrival times at the synapse from neuron $j$ to neuron $i$ (Kistler & van Hemmen, 1999). In this context, however, we will concentrate on situations where only a single spike is transmitted across each synapse, so that effects of short-term plasticity do not come into action.
The crucial point for what follows is the way in which new firing events are defined. Instead of using a deterministic threshold criterion for spike release, we describe the spike train of the postsynaptic neuron as a stochastic process with the probability density of having a spike at time $t$ being a nonlinear function $\nu$ of the membrane potential $h(t)$. If we neglect the effect of the afterpotential, $\eta = 0$, the spike train of neuron $i$ is an inhomogeneous Poisson process with rate $\nu = \nu(h_i)$ for every fixed set of presynaptic firing times $\{t_j^f,\, j \neq i\}$.

Spike triggering in neurons is more or less a threshold process. In order to mimic this observation, we have chosen the rate function $\nu(h)$ to be

$$\nu(h) = \nu_{\max}\, H(h - \vartheta), \qquad (2.2)$$

where $\nu_{\max}$ is a positive constant and $\vartheta$ the firing threshold. We note that $\nu_{\max}$ is not the maximum firing rate of the neuron, which is given by the form of $\eta(t)$, but a kind of reliability parameter. The larger $\nu_{\max}$, the faster the neuron will fire after the threshold $\vartheta$ has been reached from below. The role of the parameter $\nu_{\max}$ is further discussed in section 4.
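The threshold rate function of equation 2.2 makes simulation straightforward: on a fine time grid, each bin of width $dt$ fires with probability $\nu(h(t))\, dt$. A minimal sketch, assuming a uniform grid and $\nu_{\max}\, dt \ll 1$ (our naming throughout):

```python
import numpy as np

def sample_spikes(h, t_grid, nu_max=1.0, theta=0.5, rng=None):
    """Draw an inhomogeneous Poisson spike train with rate nu(h), eq. 2.2.

    h -- membrane potential sampled on the uniform grid t_grid.
    Returns the spike times (bin-discretized approximation).
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = t_grid[1] - t_grid[0]
    rate = np.where(np.asarray(h) > theta, nu_max, 0.0)   # eq. 2.2
    fired = rng.random(len(t_grid)) < rate * dt           # valid for rate*dt << 1
    return t_grid[fired]
```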
3 Noisy Input
For a xed set of presynaptic spikes at times ftf
j,j6=igand a negligible after-
potential (g=0), the membrane potential of neuron iat time tis uniquely
determined and given by
hi(t)=X
j,f
Jij2(t¡tf
j¡Dij). (3.1)
In this case, the spike train of neuron iis a Poisson process with rateºmax
for those time intervals with hi(t)>#and no spikes in between.
Things become more complicated if we include an afterpotential after
each spike of neuron iand if the arrival times of the presynaptic spikes are
no longer xed but become random variables too. We will not discuss the
rst point because we are mainly interested in the rst spike of neuron iafter
a period of inactivity. As a consequence of the second point, the membrane
388 Werner M. Kistler and J. Leo van Hemmen
potential hi(t)is no longer a xed function of time, but a stochastic process
because every realization of presynaptic ring times ftf
j,j6=igresults in a
different realization of the time course of the membrane potential hi(t)as
given through equation 3.1. The composite process is thus doubly stochastic
in the sense that in a rst step, a set of input spike trains is drawn from
an ensemble characterized by Poisson rates ºi. This realization of input
spike trains then determines the output rate function, which produces in a
second step a specic realization of the postsynaptic spike train. The natural
question is how to char acterize the n ature of the resulting p ostsynaptic sp ike
train.
For an inhomogeneous Poisson process with rate function $\nu$, the expectation of the number $k$ of events in an interval $[T_1, T_2]$ is

$$E\{k \mid \nu\} = \int_{T_1}^{T_2} dt\, \nu(t). \qquad (3.2)$$

If the rate function $\nu$ is replaced by a stochastic process, we have to integrate over all possible realizations of $\nu$ (with respect to a measure $\mu$) in order to obtain the (unconditional) expectation of the number of events,

$$E\{k\} = \int d\mu(\nu)\, E\{k \mid \nu\} = \bigl\langle E\{k \mid \nu\} \bigr\rangle_\nu = \int_{T_1}^{T_2} dt\, \bar{\nu}(t), \qquad (3.3)$$

with $\langle \cdot \rangle_\nu = \int \cdot\, d\mu(\nu)$ and $\bar{\nu}(t) = \langle \nu(t) \rangle_\nu$. Hence, the composite process has the same expectation of the number of events as an inhomogeneous Poisson process with rate function $\bar{\nu}(t)$, which is just the expectation of $\nu(t)$. Nevertheless, the composite process is not equivalent to a Poisson process. This can be seen by calculating the probabilities of observing $k$ events within $[T_1, T_2]$. For a Poisson process with fixed rate function $\nu$, this probability equals

$$\mathrm{prob}\{k \text{ events in } [T_1, T_2] \mid \nu\} = \frac{1}{k!} \left[ \int_{T_1}^{T_2} dt\, \nu(t) \right]^k \exp\!\left[ -\int_{T_1}^{T_2} dt\, \nu(t) \right], \qquad (3.4)$$
whereas for the composite process, we obtain

$$\mathrm{prob}\{k \text{ events in } [T_1, T_2]\} = \bigl\langle \mathrm{prob}\{k \text{ events in } [T_1, T_2] \mid \nu\} \bigr\rangle_\nu \qquad (3.5)$$

$$= \sum_{n=0}^{\infty} \frac{(-1)^n}{k!\, n!} \left\langle \left[ \int_{T_1}^{T_2} dt\, \nu(t) \right]^{n+k} \right\rangle_\nu = \sum_{n=0}^{\infty} \frac{(-1)^n}{k!\, n!} \int_{T_1}^{T_2} dt_1 \ldots \int_{T_1}^{T_2} dt_{n+k}\, \bigl\langle \nu(t_1) \ldots \nu(t_{n+k}) \bigr\rangle_\nu. \qquad (3.6)$$

The last expression contains expectations of products of the rate function $\nu$ evaluated at times $t_1, \ldots, t_{n+k}$. Since the membrane potential $h$ is a continuous function of time, the rate function is (at least) piecewise continuous, and $\nu(t_i)$ and $\nu(t_j)$ with $t_i \neq t_j$ are in general not statistically independent. The calculation of correlations such as $C(t_i, t_j) := \langle \nu(t_i)\, \nu(t_j) \rangle - \langle \nu(t_i) \rangle \langle \nu(t_j) \rangle$ is thus far from being trivial (Bartsch & van Hemmen, 1998).
In the following we approximate the composite process by an inhomogeneous Poisson process with rate function $\bar{\nu}$. This approximation can be justified for large numbers of overlapping excitatory postsynaptic potentials (EPSPs), when, due to the law of large numbers (Lamperti, 1966), the membrane potential as a sum of independent EPSPs, and thus the rate function $\nu(t)$, is almost always close to its expectation value $\bar{\nu}(t)$. The approximation is equivalent to neglecting correlations in the rate function $\nu$. That is, to good approximation (as we will see shortly), we are allowed to assume $\nu(t_1), \ldots, \nu(t_{k+n})$ to be independent for $t_i \neq t_j$, $i \neq j$. In doing so, we can replace the expectation of the product by the product of expectations under the time integrals of equation 3.5 and obtain

$$\mathrm{prob}\{k \text{ events in } [T_1, T_2]\} = \sum_{n=0}^{\infty} \frac{(-1)^n}{k!\, n!} \int_{T_1}^{T_2} dt_1 \ldots \int_{T_1}^{T_2} dt_{n+k}\, \bar{\nu}(t_1) \cdots \bar{\nu}(t_{n+k}) = \frac{1}{k!} \left[ \int_{T_1}^{T_2} dt\, \bar{\nu}(t) \right]^k \exp\!\left[ -\int_{T_1}^{T_2} dt\, \bar{\nu}(t) \right], \qquad (3.7)$$

which is the probability of observing $k$ events for an inhomogeneous Poisson process with rate function $\bar{\nu}$ that is simply the expectation of the process $\nu$. The efficacy of this approximation remains to be tested. This is done throughout the rest of this article by comparing theoretical results based on this approximation with numerical simulations.
We now calculate $\bar{\nu}$ for neuron $i$. We assume that the spike train of each presynaptic neuron, labeled $j$, can be described by an inhomogeneous Poisson process with rate function $\nu_j$. A description of spike trains of cortical neurons in terms of inhomogeneous Poisson processes is appropriate for time windows below approximately 1 second (Rieke, Warland, de Ruyter van Steveninck, & Bialek, 1997), which is sufficient for our purpose. With these preliminaries we can calculate the first and the second moment of the distribution of the resulting membrane potential.

The average postsynaptic potential generated by a single presynaptic neuron with rate function $\nu_j$ is

$$\left\langle \sum_f \epsilon(t - t_j^f) \right\rangle = (\nu_j * \epsilon)(t). \qquad (3.8)$$

Here $*$ denotes convolution, that is, $(f * g)(t) = \int dt'\, f(t')\, g(t - t')$. The second moment is (Kempter, Gerstner, van Hemmen, & Wagner, 1998, appendix A)

$$\left\langle \left[ \sum_f \epsilon(t - t_j^f) \right]^2 \right\rangle = \left[ (\nu_j * \epsilon)(t) \right]^2 + \left( \nu_j * \epsilon^2 \right)(t). \qquad (3.9)$$
Returning to equation 3.1, we see that the expectation value $\bar{h}_i$ of the membrane potential and the corresponding variance $\sigma_i^2$ are given by

$$\bar{h}_i(t) = E\{h_i(t)\} = \sum_j J_{ij}\, (\nu_j * \epsilon)(t) \qquad (3.10)$$

and

$$\sigma_i^2(t) = \mathrm{Var}\{h_i(t)\} = \sum_j J_{ij}^2\, (\nu_j * \epsilon^2)(t). \qquad (3.11)$$

Because of the central limit theorem, the membrane potential at a certain fixed time $t$ has a gaussian distribution if there are sufficiently many EPSPs that superimpose at time $t$ (for an estimate by the Berry-Esseen inequality we refer to Kempter et al., 1998). Equipped with the gaussian assumption, we can calculate $\bar{\nu}(t)$,

$$\bar{\nu}(t) = \int dh\, G[h - \bar{h}_i(t), \sigma_i(t)]\, \nu(h) = \frac{\nu_{\max}}{2} \left( 1 + \operatorname{erf}\!\left[ \frac{\bar{h}_i(t) - \vartheta}{\sqrt{2 \sigma_i^2(t)}} \right] \right), \qquad (3.12)$$

with $G$ being a normalized gaussian, $G(h, \sigma) = (2\pi\sigma^2)^{-1/2} \exp[-h^2/(2\sigma^2)]$. We note that this is the first time that we have used explicitly the functional dependence (see equation 2.2) of the firing rate $\nu$ on the membrane potential $h$.

The result so far is that if the membrane potential is a sum of many EPSPs, so that we can assume $h(t)$ to be gaussian distributed and the membrane potentials for different times to be independent, and if the effect of the afterpotential is negligible, then the postsynaptic spike train is given by an inhomogeneous Poisson process with rate function $\bar{\nu}$ as given by equation 3.12.
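Equations 3.10 to 3.12 translate directly into a few lines of numerics. The sketch below assumes the alpha-shaped EPSP of section 2, approximates the convolutions by the rectangle rule on a uniform time grid, and uses scipy's erf; all names and parameter values are ours.

```python
import numpy as np
from scipy.special import erf

def mean_rate(t_grid, rates, J, nu_max=1.0, theta=0.5, tau_m=1.0):
    """Averaged output rate nu_bar(t) of equations 3.10-3.12.

    rates[j] -- presynaptic rate function nu_j sampled on t_grid
    J[j]     -- weight of synapse j
    """
    dt = t_grid[1] - t_grid[0]
    s = t_grid - t_grid[0]
    eps = np.where(s > 0, (s / tau_m) * np.exp(1.0 - s / tau_m), 0.0)
    h_bar = np.zeros_like(t_grid)
    var = np.zeros_like(t_grid)
    for J_j, nu_j in zip(J, rates):
        h_bar += J_j * np.convolve(nu_j, eps)[:len(t_grid)] * dt        # eq. 3.10
        var += J_j**2 * np.convolve(nu_j, eps**2)[:len(t_grid)] * dt    # eq. 3.11
    var = np.maximum(var, 1e-12)                 # guard against division by zero
    return 0.5 * nu_max * (1.0 + erf((h_bar - theta) / np.sqrt(2.0 * var)))  # eq. 3.12
```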
4 First Passage Time
In the previous section we discussed an approximation that allows us to replace the doubly stochastic process that describes the postsynaptic spike train by an inhomogeneous Poisson process with a rate function $\bar{\nu}$ that is independent of the specific realization of the presynaptic input. This approximation allows us to calculate the distribution of the first postsynaptic spike analytically.

The probability of observing no spikes in the interval $[t_0, t]$ is

$$\mathrm{prob}\{\text{no spike in } [t_0, t]\} = \exp\!\left[ -\int_{t_0}^{t} dt'\, \bar{\nu}(t') \right], \qquad (4.1)$$

and the density for the first firing time after time $t_0$ is

$$p_{\mathrm{first}}(t) = \frac{d}{dt} \bigl( 1 - \mathrm{prob}\{\text{no spike in } [t_0, t]\} \bigr) = \bar{\nu}(t) \exp\!\left[ -\int_{t_0}^{t} dt'\, \bar{\nu}(t') \right]. \qquad (4.2)$$
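Given samples of $\bar{\nu}(t)$ on a grid (for instance from the mean_rate sketch above), equation 4.2 is nearly a one-liner; the cumulative integral is again a rectangle-rule approximation:

```python
import numpy as np

def first_spike_density(t_grid, nu_bar):
    """p_first(t) of equation 4.2 for a rate function sampled on t_grid."""
    dt = t_grid[1] - t_grid[0]
    cum = np.cumsum(nu_bar) * dt          # int_{t0}^{t} dt' nu_bar(t')
    return nu_bar * np.exp(-cum)
```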
We are going to study the first-spike distribution (first passage time) in a specific context. We consider a single neuron having $N$ independent synapses. Each synapse $i$ delivers a spike train determined by an inhomogeneous Poisson process with rate function $\nu_i(t) = (2\pi\sigma_i^2)^{-1/2} \exp[-t^2/2\sigma_i^2]$, so that on average, one spike arrives at each synapse with a temporal jitter $\sigma_i$ around $t = 0$.

We can calculate the averaged rate function $\bar{\nu}$ for this setup,

$$\bar{\nu}(t) = \frac{\nu_{\max}}{2} \left\{ 1 + \operatorname{erf}\!\left[ \frac{\sum_{i=1}^{N} J_i\, \bar{\epsilon}_i(t) - \vartheta}{\sqrt{2 \sum_{i=1}^{N} J_i^2\, \overline{\epsilon^2}_i(t)}} \right] \right\}, \qquad (4.3)$$

with expectation and variance being given as convolutions,

$$\bar{\epsilon}_i(t) = (\nu_i * \epsilon)(t), \quad \text{and} \quad \overline{\epsilon^2}_i(t) = (\nu_i * \epsilon^2)(t). \qquad (4.4)$$

The resulting density of the first postsynaptic spike is shown in Figure 1.
Figure 1: Plots to compare the density function for the first postsynaptic spike as reconstructed from simulations (solid line) with the analytical approximation using the averaged rate function of equation 4.3 (dotted line). The dashed line represents the mean rate $\bar{\nu}$. (A, B) $N = 100$ synapses with strength $J_i = 1/N$ receive spike trains determined by an inhomogeneous Poisson process with rate function $\nu_i = (2\pi\sigma^2)^{-1/2} \exp[-t^2/2\sigma^2]$ and $\sigma = 1$. The EPSPs are described by alpha functions $t/\tau\, \exp(1 - t/\tau)$ with time constant $\tau = 1$, so that the maximum of the membrane potential amounts to $h = 1$ if all spikes were to arrive simultaneously. The postsynaptic response is characterized by $\nu_{\max} = 1$ and $\vartheta = 0.5$ in A and $\vartheta = 0.75$ in B. Increasing the threshold obviously improves the temporal precision of the postsynaptic response, but the overall probability of a postsynaptic spike is decreased. (C, D) The same plots as A and B but for $N = 10$ instead of 100 synapses. The synaptic weights have been scaled so that $J_i = 1/10$. The central approximation that asserts statistical independence of the membrane potential at different times produces surprisingly good results for as little as $N = 10$ overlapping EPSPs.

4.1 High Threshold. We calculate the density of the first postsynaptic spike explicitly for two limiting cases. First, we assume that the probability of observing at least one postsynaptic spike is very small,

$$\int dt\, \bar{\nu}(t) \ll 1. \qquad (4.5)$$

In this case, the neuron will fire, if it will fire at all, in the neighborhood of the maximum of the postsynaptic potential. We expand the terms involving the membrane potential in the neighborhood of the maximum at $t_0$ to leading order in $t$,

$$\frac{\sum_{i=1}^{N} J_i\, \bar{\epsilon}_i(t) - \vartheta}{\sqrt{2 \sum_{i=1}^{N} J_i^2\, \overline{\epsilon^2}_i(t)}} = h_0 - h_2\, (t - t_0)^2 + \mathcal{O}(t - t_0)^3. \qquad (4.6)$$

The averaged rate function $\bar{\nu}$ can thus be written

$$\bar{\nu}(t) = \frac{\nu_{\max}}{2} \left\{ 1 + \operatorname{erf}\!\left[ h_0 - h_2\, (t - t_0)^2 + \mathcal{O}(t - t_0)^3 \right] \right\} = \bar{\nu}_0 - \bar{\nu}_2\, (t - t_0)^2 + \mathcal{O}(t - t_0)^3, \qquad (4.7)$$

with

$$\bar{\nu}_0 = \frac{\nu_{\max}}{2} \left[ 1 + \operatorname{erf}(h_0) \right], \quad \text{and} \quad \bar{\nu}_2 = \frac{\nu_{\max}\, h_2}{\sqrt{\pi}}\, e^{-h_0^2}. \qquad (4.8)$$
It turns out that for $-0.5 < h_0 < 0.5$, the averaged rate function $\bar{\nu}$ can be approximated very well by the clipped parabola

$$\bar{\nu}(t) \approx \begin{cases} \bar{\nu}_0 - \bar{\nu}_2\, (t - t_0)^2, & \text{for } |t - t_0| < \sqrt{\bar{\nu}_0/\bar{\nu}_2}, \\ 0, & \text{elsewhere.} \end{cases} \qquad (4.9)$$

Using this approximation, we can calculate the integral over $\bar{\nu}$ and obtain the distribution of the first postsynaptic spike,

$$p_{\mathrm{first}}(t) = \left[ \bar{\nu}_0 - \bar{\nu}_2\, (t - t_0)^2 \right] \exp\!\left[ \frac{\bar{\nu}_2}{3}\, (t - t_0)^3 - \bar{\nu}_0\, (t - t_0) - \frac{2\bar{\nu}_0}{3} \sqrt{\frac{\bar{\nu}_0}{\bar{\nu}_2}} \right] \times H\!\left( \sqrt{\frac{\bar{\nu}_0}{\bar{\nu}_2}} - |t - t_0| \right). \qquad (4.10)$$
For $h_0 < -0.5$, however, a gaussian function is the better choice,

$$\bar{\nu}(t) \approx \bar{\nu}_{\mathrm{post}}\, G(t - t_0, \sigma_{\mathrm{post}}), \qquad (4.11)$$

with $\bar{\nu}_{\mathrm{post}} = \sqrt{\pi \bar{\nu}_0^3 / \bar{\nu}_2}$ and $\sigma_{\mathrm{post}} = \sqrt{\bar{\nu}_0 / (2 \bar{\nu}_2)}$. Since $\bar{\nu}_{\mathrm{post}} \ll 1$, we have

$$p_{\mathrm{first}}(t) \approx \bar{\nu}(t). \qquad (4.12)$$
4.2 Low Threshold. We now discuss the case where the postsynaptic neuron fires with probability near to one, that is,

$$1 - \int dt\, \bar{\nu}(t) \ll 1. \qquad (4.13)$$

In this case, the neuron will fire approximately at time $t_0$ when the membrane potential crosses the threshold, $h(t_0) = \vartheta$. We again expand the terms containing the membrane potential to leading order in $(t - t_0)$,

$$\frac{\sum_{i=1}^{N} J_i\, \bar{\epsilon}_i(t) - \vartheta}{\sqrt{2 \sum_{i=1}^{N} J_i^2\, \overline{\epsilon^2}_i(t)}} = h_1\, (t - t_0) + \mathcal{O}(t - t_0)^2, \qquad (4.14)$$

and assume that the averaged rate function $\bar{\nu}$ is already saturated outside the region where this linearization is valid. We may therefore put

$$\bar{\nu}(t) = \frac{\nu_{\max}}{2} \left\{ 1 + \operatorname{erf}\!\left[ h_1\, (t - t_0) \right] \right\}, \qquad (4.15)$$
and obtain for the density of the first postsynaptic spike,

$$p_{\mathrm{first}}(t) = \bar{\nu}(t) \exp\!\left[ -\bar{\nu}(t)\, (t - t_0) - \frac{\nu_{\max}}{2\sqrt{\pi}\, h_1}\, e^{-h_1^2 (t - t_0)^2} \right]. \qquad (4.16)$$

Outside the neighborhood of the threshold crossing, that is, for $|t - t_0| \gg h_1^{-1}$, $p_{\mathrm{first}}$ can be approximated by

$$p_{\mathrm{first}}(t) \approx \nu_{\max}\, e^{-\nu_{\max}(t - t_0)}\, H(t - t_0), \qquad |t - t_0| \gg h_1^{-1}. \qquad (4.17)$$

Figure 2: Approximation of the averaged rate function $\bar{\nu}$ (left) and the density of the first postsynaptic spike $p_{\mathrm{first}}$ (right) for various threshold values. The solid line gives the numerical result; the dashed line represents the analytic approximation. (A, B) Approximation for a low threshold value ($\vartheta = 0.5$) as in equations 4.15 and 4.16. (C, D) Approximation by a clipped parabola for $\vartheta = 0.65$, as in equations 4.9 and 4.10. (E, F) Gaussian approximation for high threshold values ($\vartheta = 0.8$); see equations 4.11 and 4.12. The number of synapses is $N = 100$; the other parameters are as in Figure 1.
4.3 Optimal Threshold. As can be seen in Figure 2, the width of the first-spike distribution decreases with increasing threshold $\vartheta$. The temporal precision of the postsynaptic spike can thus be improved by increasing the firing threshold $\vartheta$. The overall firing probability of the postsynaptic neuron, however, drops to zero if the threshold exceeds the maximum value of the membrane potential. The trade-off between the precision of the postsynaptic response and the reliability, that is, the overall firing probability of the postsynaptic neuron, is illustrated in Figure 3. There is an optimal threshold where the expectation value of the precision, that is, the product of reliability and precision, reaches its maximum (Kempter et al., 1998).

Figure 3: Reliability (A) and precision (B) of the postsynaptic firing event as a function of the threshold $\vartheta$. The neuron receives input through $N = 100$ synapses; for details, see the caption to Figure 1. Reliability is defined as the overall firing probability $\bar{\nu}_{\mathrm{post}} = \int dt\, p_{\mathrm{first}}(t)$. Precision is the inverse of the length of the interval containing 90% of the postsynaptic spikes, $\Delta t = t_2 - t_1$ with $\int_{-\infty}^{t_1} dt\, p_{\mathrm{first}}(t) = \int_{t_2}^{\infty} dt\, p_{\mathrm{first}}(t) = 0.05\, \bar{\nu}_{\mathrm{post}}$. (C) demonstrates that there is an optimal threshold in the sense that $\bar{\nu}_{\mathrm{post}}/\Delta t$ exhibits a maximum at $\vartheta \approx 0.6$. The short and the long dashed lines represent the results obtained through the high- and the low-threshold approximations, respectively; see equations 4.18-4.20.
A straightforward estimate demonstrates the functional dependence of precision and reliability on the various parameters. For high threshold values, the density $p_{\mathrm{first}}$ for the first postsynaptic spike can be approximated by a gaussian function with variance $\bar{\nu}_0 / 2\bar{\nu}_2$. The precision of the firing time is thus proportional to

$$\Delta t^{-1} \propto \sqrt{\bar{\nu}_2/\bar{\nu}_0} = \left( \frac{2 h_2\, e^{-h_0^2}}{\sqrt{\pi}\, [1 + \operatorname{erf}(h_0)]} \right)^{1/2}, \qquad (4.18)$$

which depends solely on the time course of the membrane potential and is independent of $\nu_{\max}$. The reliability of the postsynaptic response amounts to

$$\bar{\nu}_{\mathrm{post}} = \sqrt{\pi \bar{\nu}_0^3 / \bar{\nu}_2} = \nu_{\max} \left( \frac{\sqrt{\pi}\, [1 + \operatorname{erf}(h_0)]^3}{8\, h_2\, e^{-h_0^2}} \right)^{1/2}, \qquad (4.19)$$

which is proportional to $\nu_{\max}$.

For low threshold, $p_{\mathrm{first}}(t)$ is dominated by an exponential decay $\propto e^{-\nu_{\max} t}$, resulting in a constant precision,

$$\Delta t^{-1} \approx \frac{\nu_{\max}}{3.00}, \qquad (4.20)$$

and a reliability close to unity.
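As a numerical companion to equations 4.18 and 4.19, the sketch below evaluates the two expressions from the expansion coefficients $h_0$ and $h_2$ of equation 4.6; for equation 4.18 it returns the bracketed expression, that is, precision up to the proportionality constant. Parameter values are hypothetical.

```python
import numpy as np
from scipy.special import erf

def precision_reliability(h0, h2, nu_max=1.0):
    """Precision (eq. 4.18, up to a constant) and reliability (eq. 4.19)."""
    gauss = np.exp(-h0**2)
    erf_term = 1.0 + erf(h0)
    precision = np.sqrt(2.0 * h2 * gauss / (np.sqrt(np.pi) * erf_term))
    reliability = nu_max * np.sqrt(np.sqrt(np.pi) * erf_term**3 / (8.0 * h2 * gauss))
    return precision, reliability
```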
5 Synaptic Long-Term Plasticity

There is experimental evidence (Markram et al., 1997) that the strength of synapses between cortical neurons is changed according to the relative timing of pre- and postsynaptic spikes. We develop a simple model of synaptic long-term plasticity that mimics the experimental observations. Using the densities for the first postsynaptic spike calculated in the previous section, we find analytic expressions for the stationary synaptic weights that result from the statistics of the presynaptic spike train.
5.1 Modeling Synaptic Plasticity. Very much in the thrust of the spike-response model, we describe the change of a synaptic weight as the linear response to pre- and postsynaptic spikes. Spike trains are formalized by sums of Dirac delta functions, $S(t) = \sum_f \delta(t - t^f)$, with a delta function for each firing time $t^f$. The most general ansatz up to and including terms bilinear in the pre- and postsynaptic spike train for the change of the synaptic weight $J$ is

$$\frac{d}{dt} J(t) = S_{\mathrm{pre}}(t) \left[ \delta_{\mathrm{pre}} + \int dt'\, \kappa_{\mathrm{pre}}(t')\, S_{\mathrm{post}}(t - t') \right] + S_{\mathrm{post}}(t) \left[ \delta_{\mathrm{post}} + \int dt'\, \kappa_{\mathrm{post}}(t')\, S_{\mathrm{pre}}(t - t') \right]. \qquad (5.1)$$

This is a combination of changes that are induced by the presynaptic ($S_{\mathrm{pre}}$) and the postsynaptic ($S_{\mathrm{post}}$) spike train. Furthermore, $\delta_{\mathrm{pre}}$ ($\delta_{\mathrm{post}}$) is the amount by which the synaptic weight is changed if a single presynaptic (postsynaptic) spike occurs. This non-Hebbian contribution to synaptic plasticity, which is well documented in the literature (Alonso, Curtis, & Llinás, 1990; Nelson, Fields, Yu, & Liu, 1993; Salin, Malenka, & Nicoll, 1996; Urban & Barrionuevo, 1996), is necessary to ensure that the fixed point of the synaptic strength can be reached from very low initial values and even zero postsynaptic activity (take $0 < \delta_{\mathrm{pre}} \ll 1$) and to regulate postsynaptic activity (take $-1 \ll \delta_{\mathrm{post}} < 0$). Finally, $\kappa_{\mathrm{pre}}(s)$ ($\kappa_{\mathrm{post}}(s)$) is the additional change induced by a presynaptic (postsynaptic) spike that occurs $s$ milliseconds after a postsynaptic (presynaptic) spike. This modification of synaptic strength depends critically on the timing of pre- and postsynaptic spikes (Dan & Poo, 1992; Bell et al., 1997; Markram et al., 1997). Causality is respected if the kernels $\kappa$ vanish identically for negative arguments.

In order to restrict the synaptic weights to the interval [0, 1], we treat synaptic potentiation and depression separately and assume that synaptic weight changes due to depression are proportional to the current synaptic weight $J(t)$, whereas changes due to potentiation are proportional to $[1 - J(t)]$, so that

$$\frac{d}{dt} J(t) = [1 - J(t)] \left[ \frac{d}{dt} J(t) \right]^{\mathrm{LTP}} - J(t) \left[ \frac{d}{dt} J(t) \right]^{\mathrm{LTD}} \qquad (5.2)$$

with

$$\left[ \frac{d}{dt} J(t) \right]^{\mathrm{LTP/D}} = S_{\mathrm{pre}}(t) \left[ \delta^{\mathrm{LTP/D}}_{\mathrm{pre}} + \int dt'\, \kappa^{\mathrm{LTP/D}}_{\mathrm{pre}}(t')\, S_{\mathrm{post}}(t - t') \right] + S_{\mathrm{post}}(t) \left[ \delta^{\mathrm{LTP/D}}_{\mathrm{post}} + \int dt'\, \kappa^{\mathrm{LTP/D}}_{\mathrm{post}}(t')\, S_{\mathrm{pre}}(t - t') \right], \qquad (5.3)$$

and $\delta^{\mathrm{LTP/D}}_{\mathrm{pre/post}},\, \kappa^{\mathrm{LTP/D}}_{\mathrm{pre/post}}(t') \geq 0$.
We are interested in the net change $\langle \Delta J \rangle$ of the synaptic weight integrated over some time interval and averaged over an ensemble of presynaptic spike trains,

$$\langle \Delta J \rangle = \left\langle \int dt\, \frac{d}{dt} J(t) \right\rangle. \qquad (5.4)$$

If synaptic weights change only adiabatically, that is, if the change of the synaptic weight is small as compared to $J$ and $(1 - J)$, we have

$$\langle \Delta J \rangle = [1 - J(t)]\, \bigl\langle \Delta J^{\mathrm{LTP}} \bigr\rangle - J(t)\, \bigl\langle \Delta J^{\mathrm{LTD}} \bigr\rangle, \qquad (5.5)$$

with

$$\bigl\langle \Delta J^{\mathrm{LTP/D}} \bigr\rangle = \left\langle \int dt \left[ \frac{d}{dt} J(t) \right]^{\mathrm{LTP/D}} \right\rangle = \delta^{\mathrm{LTP/D}}_{\mathrm{pre}}\, \bar{\nu}_{\mathrm{pre}} + \delta^{\mathrm{LTP/D}}_{\mathrm{post}}\, \bar{\nu}_{\mathrm{post}} + \int dt \int dt'\, \nu(t; t')\, \kappa^{\mathrm{LTP/D}}(t; t'), \qquad (5.6)$$

where $\bar{\nu}_{\mathrm{pre,post}} = \int dt\, \nu_{\mathrm{pre,post}}(t)$ is the expected number of pre- and postsynaptic spikes, and $\kappa^{\mathrm{LTP/D}}(t; t') = \kappa^{\mathrm{LTP/D}}_{\mathrm{pre}}(t - t') + \kappa^{\mathrm{LTP/D}}_{\mathrm{post}}(t' - t)$ is the change of the synaptic weight due to a presynaptic spike at time $t$ and a postsynaptic spike at time $t'$. In passing we note that $\nu(t; t')\, dt\, dt'$ is the joint probability of finding a presynaptic spike in the interval $[t, t + dt)$ and a postsynaptic spike in $[t', t' + dt')$.
5.2 Stationary Weights. In the following we concentrate on calculating the stationary synaptic weights defined by

$$\langle \Delta J \rangle = 0 \iff J = \frac{\bigl\langle \Delta J^{\mathrm{LTP}} \bigr\rangle}{\bigl\langle \Delta J^{\mathrm{LTP}} \bigr\rangle + \bigl\langle \Delta J^{\mathrm{LTD}} \bigr\rangle}. \qquad (5.7)$$

To this end, we return to the situation described at the beginning of section 4, where a neuron receives input through $N$ independent synapses, each spike train being described by an inhomogeneous Poisson process with rate function $\nu_i$, $1 \leq i \leq N$. We study a single synapse and assume that this synapse conveys (on average) one action potential with a gaussian distribution centered around $t = 0$, so that $\nu(t) = G(t, \sigma_{\mathrm{pre}})$. The synaptic weight of this synapse is denoted by $J$. Furthermore, we assume that the presynaptic volley of action potentials triggers at most one postsynaptic spike, as is the case, for instance, if postsynaptic action potentials are followed by strong hyperpolarizing afterpotentials. This assumption relieves us from the need to discuss the influence of the shape of the afterpotential $\eta$, because the postsynaptic spike statistics is now completely described by the first-passage-time density $p_{\mathrm{first}}$.

We discuss the case where the firing time of a single presynaptic neuron and the postsynaptic firing time are virtually independent. This might be considered as an approximation of the case where either the corresponding synaptic weight is small or the number of overlapping EPSPs needed to reach the firing threshold is large. The joint probability $\nu_i(t, t')$ of the firing of the presynaptic neuron $i$ and the first postsynaptic spike is then simply the product of the probabilities,

$$\nu_i(t, t') = \nu_i(t)\, p_{\mathrm{first}}(t'). \qquad (5.8)$$
We investigate a learning rule that is inspired by the observation that a synapse is weakened (depressed) if the presynaptic spike arrives a short time after the postsynaptic spike and is strengthened (potentiated) if the presynaptic spike arrives a short time before the postsynaptic spike and, thus, is contributing to the postsynaptic neuron's firing (Markram et al., 1997). We can mimic this mechanism within our formalism by setting

$$\kappa^{\mathrm{LTP}}(t; t') = \epsilon^{\mathrm{LTP}} \exp\!\left[ -(t' - t)/\tau^{\mathrm{LTP}} \right] H(t' - t), \qquad (5.9)$$

and

$$\kappa^{\mathrm{LTD}}(t; t') = \epsilon^{\mathrm{LTD}} \exp\!\left[ -(t - t')/\tau^{\mathrm{LTD}} \right] H(t - t'), \qquad (5.10)$$

with $\epsilon^{\mathrm{LTP/D}} > 0$ and $\tau^{\mathrm{LTP/D}}$ as time constants of the "synaptic memory" (see Figure 4). Our choice of adopting exponentials to describe the kernels $\kappa^{\mathrm{LTP}}$ and $\kappa^{\mathrm{LTD}}$ is motivated by both biological plausibility and the need to keep the mathematical analysis as simple as possible.

Figure 4: Synaptic plasticity based on the timing of pre- and postsynaptic spikes. The kernels $+\kappa^{\mathrm{LTP}}$ (solid line) and $-\kappa^{\mathrm{LTD}}$ (dashed line) describe the modification of synaptic strength as it is caused by the combination of a postsynaptic spike at time 0 and a presynaptic spike arriving at time $t$. See equations 5.9-5.10 with $\epsilon^{\mathrm{LTP}} = \epsilon^{\mathrm{LTD}} = 0.1$ and $\tau^{\mathrm{LTP}} = \tau^{\mathrm{LTD}} = 1$.
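To make the learning rule concrete, here is a small event-based sketch that applies equations 5.2, 5.9, and 5.10 to given lists of pre- and postsynaptic spike times. The non-Hebbian parameters follow the values quoted in the caption of Figure 5 ($\delta^{\mathrm{LTP}}_{\mathrm{pre}} = 0.001$, $\delta^{\mathrm{LTD}}_{\mathrm{post}} = 0.01$, the other two zero); everything else about the implementation is our own choice.

```python
import numpy as np

def kappa_ltp(s, eps_ltp=0.1, tau_ltp=1.0):
    """kappa^LTP for a presynaptic spike s before a postsynaptic one (eq. 5.9)."""
    return eps_ltp * np.exp(-s / tau_ltp) if s > 0 else 0.0

def kappa_ltd(s, eps_ltd=0.1, tau_ltd=1.0):
    """kappa^LTD for a presynaptic spike s after a postsynaptic one (eq. 5.10)."""
    return eps_ltd * np.exp(-s / tau_ltd) if s > 0 else 0.0

def update_weight(J, pre_spikes, post_spikes,
                  d_ltp_pre=0.001, d_ltd_post=0.01):
    """Weight change of one volley according to eq. 5.2 (soft bounds on [0, 1])."""
    dJ_ltp = d_ltp_pre * len(pre_spikes)          # non-Hebbian term delta_pre^LTP
    dJ_ltd = d_ltd_post * len(post_spikes)        # non-Hebbian term delta_post^LTD
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dJ_ltp += kappa_ltp(t_post - t_pre)   # pre before post: potentiation
            dJ_ltd += kappa_ltd(t_pre - t_post)   # pre after post: depression
    return J + (1.0 - J) * dJ_ltp - J * dJ_ltd
```

Iterating update_weight over many input volleys drives $J$ toward the stationary value of equation 5.7, since potentiation is scaled by $(1 - J)$ and depression by $J$.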
With these prerequisites, we can calculate the mean potentiation and depression due to the correlation of pre- and postsynaptic spikes (cf. equation 5.6) for the limiting cases of high and low threshold. If we approximate the postsynaptic firing by a gaussian distribution (cf. equation 4.12),

$$p_{\mathrm{first}}(t) = \bar{\nu}_{\mathrm{post}}\, G(t - t_0, \sigma_{\mathrm{post}}), \qquad (5.11)$$

we find

$$\int dt \int dt'\, \nu(t; t')\, \kappa^{\mathrm{LTP}}(t; t') = \epsilon^{\mathrm{LTP}} \int_{-\infty}^{\infty} dt \int_{t}^{\infty} dt'\, G(t, \sigma_{\mathrm{pre}})\, \bar{\nu}_{\mathrm{post}}\, G(t' - t_0, \sigma_{\mathrm{post}})\, e^{-(t' - t)/\tau^{\mathrm{LTP}}}$$
$$= \frac{1}{2}\, \bar{\nu}_{\mathrm{post}}\, \epsilon^{\mathrm{LTP}} \exp\!\left[ \frac{1}{2} \left( \frac{\sigma}{\tau^{\mathrm{LTP}}} \right)^2 - \frac{t_0}{\tau^{\mathrm{LTP}}} \right] \operatorname{erfc}\!\left[ \frac{\sigma}{\sqrt{2}\, \tau^{\mathrm{LTP}}} - \frac{t_0}{\sqrt{2}\, \sigma} \right], \qquad (5.12)$$

and

$$\int dt \int dt'\, \nu(t; t')\, \kappa^{\mathrm{LTD}}(t; t') = \frac{1}{2}\, \bar{\nu}_{\mathrm{post}}\, \epsilon^{\mathrm{LTD}} \exp\!\left[ \frac{1}{2} \left( \frac{\sigma}{\tau^{\mathrm{LTD}}} \right)^2 + \frac{t_0}{\tau^{\mathrm{LTD}}} \right] \operatorname{erfc}\!\left[ \frac{\sigma}{\sqrt{2}\, \tau^{\mathrm{LTD}}} + \frac{t_0}{\sqrt{2}\, \sigma} \right], \qquad (5.13)$$
with $\sigma^2 = \sigma_{\mathrm{pre}}^2 + \sigma_{\mathrm{post}}^2$. Using equation 5.7, we are thus able to calculate the stationary weight of a synapse that conveys a single action potential with a gaussian distribution with variance $\sigma_{\mathrm{pre}}^2$, given the postsynaptic spike at $t = t_0$ with variance $\sigma_{\mathrm{post}}^2$. This result is illustrated in Figure 5. If the variance of both pre- and postsynaptic spike is small, that is, $\sigma^2 = \sigma_{\mathrm{pre}}^2 + \sigma_{\mathrm{post}}^2 \ll 1$, the stationary weight is dominated by the kernels $\kappa^{\mathrm{LTP/D}}$: the weight is close to zero if the presynaptic spike occurs after the postsynaptic one ($t_0 < 0$) and close to unity if the presynaptic spike arrives a short time before the postsynaptic action potential ($t_0 > 0$). For large variances, this dependence on the timing of the spikes is smoothed, and the stationary weight is determined by the rates of pre- and postsynaptic spikes.

Figure 5: Stationary synaptic weights. (A) Three-dimensional plot of the stationary synaptic weight for high threshold as a function of $\sigma$ and $t_0$, where $\sigma^2 = \sigma_{\mathrm{pre}}^2 + \sigma_{\mathrm{post}}^2$ is the sum of the variances of pre- and postsynaptic firing time and $t_0$ the time between the arrival of the presynaptic spike and the firing of the postsynaptic action potential. (B) Contour plot of the same function as in A. (C, D) Plot of the stationary synaptic weight in the case of low threshold $\vartheta$ as a function of the standard deviation $\sigma_{\mathrm{pre}}$ of the presynaptic firing time and the time $t_0$ when the expectation of the membrane potential reaches the threshold from below. The parameters used to describe synaptic plasticity are $\delta^{\mathrm{LTD}}_{\mathrm{post}} = 0.01$, $\delta^{\mathrm{LTP}}_{\mathrm{pre}} = 0.001$, $\delta^{\mathrm{LTD}}_{\mathrm{pre}} = \delta^{\mathrm{LTP}}_{\mathrm{post}} = 0$, $\epsilon^{\mathrm{LTP}} = \epsilon^{\mathrm{LTD}} = 0.1$, $\tau^{\mathrm{LTP}} = \tau^{\mathrm{LTD}} = 1$; see equations 5.2, 5.9, and 5.10.
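The high-threshold stationary weight shown in Figure 5 can be sketched numerically from equations 5.12, 5.13, and 5.7. The snippet below uses the parameter values from the caption of Figure 5 and scipy's erfc; the function names are ours.

```python
import numpy as np
from scipy.special import erfc

def corr_terms(t0, sigma, nu_post, eps=0.1, tau=1.0):
    """Correlation integrals of eqs. 5.12 (LTP) and 5.13 (LTD)."""
    ltp = 0.5 * nu_post * eps * np.exp(0.5 * (sigma / tau)**2 - t0 / tau) \
        * erfc(sigma / (np.sqrt(2) * tau) - t0 / (np.sqrt(2) * sigma))
    ltd = 0.5 * nu_post * eps * np.exp(0.5 * (sigma / tau)**2 + t0 / tau) \
        * erfc(sigma / (np.sqrt(2) * tau) + t0 / (np.sqrt(2) * sigma))
    return ltp, ltd

def stationary_weight(t0, sigma, nu_pre=1.0, nu_post=1.0,
                      d_ltp_pre=0.001, d_ltd_post=0.01):
    """Fixed point J of eq. 5.7 with the non-Hebbian terms of eq. 5.6."""
    ltp, ltd = corr_terms(t0, sigma, nu_post)
    dJ_ltp = d_ltp_pre * nu_pre + ltp       # <Delta J^LTP>
    dJ_ltd = d_ltd_post * nu_post + ltd     # <Delta J^LTD>
    return dJ_ltp / (dJ_ltp + dJ_ltd)
```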
We obtain a similar result for low threshold if we approximate the postsynaptic firing time by an exponential distribution (cf. equation 4.17):

$$\int dt \int dt'\, \nu(t; t')\, \kappa^{\mathrm{LTP}}(t; t') = \epsilon^{\mathrm{LTP}} \int_{-\infty}^{\infty} dt \int_{\max(t, t_0)}^{\infty} dt'\, G(t, \sigma_{\mathrm{pre}})\, \nu_{\max}\, e^{-\nu_{\max}(t' - t_0)}\, e^{-(t' - t)/\tau^{\mathrm{LTP}}}$$
$$= \frac{\epsilon^{\mathrm{LTP}}\, \nu_{\max}}{\nu_{\max} + 1/\tau^{\mathrm{LTP}}} \left\{ \frac{1}{2} \exp\!\left[ \frac{1}{2} \left( \frac{\sigma_{\mathrm{pre}}}{\tau^{\mathrm{LTP}}} \right)^2 - \frac{t_0}{\tau^{\mathrm{LTP}}} \right] \operatorname{erfc}\!\left[ \frac{\sigma_{\mathrm{pre}}}{\sqrt{2}\, \tau^{\mathrm{LTP}}} - \frac{t_0}{\sqrt{2}\, \sigma_{\mathrm{pre}}} \right] + \frac{1}{2} \exp\!\left[ \frac{1}{2} \left( \sigma_{\mathrm{pre}}\, \nu_{\max} \right)^2 + t_0\, \nu_{\max} \right] \operatorname{erfc}\!\left[ \frac{\sigma_{\mathrm{pre}}\, \nu_{\max}}{\sqrt{2}} + \frac{t_0}{\sqrt{2}\, \sigma_{\mathrm{pre}}} \right] \right\}, \qquad (5.14)$$

and

$$\int dt \int dt'\, \nu(t; t')\, \kappa^{\mathrm{LTD}}(t; t') = \frac{\epsilon^{\mathrm{LTD}}\, \nu_{\max}}{\nu_{\max} - 1/\tau^{\mathrm{LTD}}} \left\{ \frac{1}{2} \exp\!\left[ \frac{1}{2} \left( \frac{\sigma_{\mathrm{pre}}}{\tau^{\mathrm{LTD}}} \right)^2 + \frac{t_0}{\tau^{\mathrm{LTD}}} \right] \operatorname{erfc}\!\left[ \frac{\sigma_{\mathrm{pre}}}{\sqrt{2}\, \tau^{\mathrm{LTD}}} + \frac{t_0}{\sqrt{2}\, \sigma_{\mathrm{pre}}} \right] - \frac{1}{2} \exp\!\left[ \frac{1}{2} \left( \sigma_{\mathrm{pre}}\, \nu_{\max} \right)^2 + t_0\, \nu_{\max} \right] \operatorname{erfc}\!\left[ \frac{\sigma_{\mathrm{pre}}\, \nu_{\max}}{\sqrt{2}} + \frac{t_0}{\sqrt{2}\, \sigma_{\mathrm{pre}}} \right] \right\}. \qquad (5.15)$$
5.3 Synaptic Weights as a Function of the Input Statistics. We saw in the previous section that the stationary synaptic weights can be calculated as a function of the parameters characterizing the distribution of the pre- and postsynaptic spikes. The synaptic weights, on the other hand, determine the distribution of the postsynaptic spike. If we are interested in the synaptic weights that are produced by given input statistics, we thus have to solve a self-consistency problem.

The self-consistency problem can be solved numerically for the limiting cases of low and high threshold, where the distribution of the postsynaptic firing time is completely characterized by only a few parameters. That is, starting from a given vector $J$ of synaptic couplings, we calculate the parameters that characterize the postsynaptic firing time using the results of section 4 in the limiting cases of high and low firing threshold. In a second step, we calculate the corresponding stationary synaptic weight vector $\tilde{J}$ as it is determined by the statistics of pre- and postsynaptic firing times (cf. section 5.2). Finally, we solve the self-consistency equation $J = \tilde{J}$ using standard numerics.
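The procedure just described is a fixed-point iteration. The sketch below alternates between the postsynaptic spike statistics implied by the current weights and the stationary weights implied by those statistics; post_stats and stationary_weights stand for the calculations of section 4 and section 5.2, whose concrete form depends on the threshold regime, so they are left as assumed callables here.

```python
import numpy as np

def solve_self_consistency(J0, post_stats, stationary_weights,
                           tol=1e-8, max_iter=10_000):
    """Solve J = J_tilde(J) by damped fixed-point iteration.

    post_stats(J)             -- parameters of the postsynaptic firing-time
                                 distribution for weight vector J (section 4)
    stationary_weights(stats) -- stationary weight vector of eq. 5.7 (section 5.2)
    """
    J = np.asarray(J0, dtype=float)
    for _ in range(max_iter):
        J_new = stationary_weights(post_stats(J))
        if np.max(np.abs(J_new - J)) < tol:
            return J_new
        J = 0.5 * (J + J_new)     # damping stabilizes the iteration
    raise RuntimeError("self-consistency iteration did not converge")
```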
In the simple case where all synapses deliver spikes with a common statistics, the stable solution to the self-consistency equation is unique because the mean postsynaptic firing time is a monotone function of the synaptic weight. But if the synapses are different, say, with respect to the mean spike arrival time, the solution in general will no longer be unique.
We illustrate the solution of the self-consistency problem by means of an example and consider a neuron that receives spikes via $N$ independent synapses. Again, each presynaptic spike train is described by an inhomogeneous Poisson process with a gaussian rate $G(t, \sigma_i)$ centered at $t = 0$. In contrast to our previous examples, the synapses differ with respect to the temporal precision $\sigma_i$ with which they deliver their action potentials.

Figure 6A shows the resulting synaptic weights if we start with two groups of presynaptic neurons that deliver action potentials with $\sigma_1 = 0.1$ and $\sigma_2 = 1.0$, respectively. The numbers of neurons contained in group 1 and group 2 are $n_1 = 20$ and $n_2 = 80$, respectively, but the results turned out to be insensitive to the actual numbers. (A more balanced ratio of synapses would shift the curve $J_1(\vartheta)$ slightly to the left.) We compare the results obtained from the unique solution of the self-consistency equation with simulations of a straightforward implementation of the synaptic dynamics given in equation 5.2. The number of iterations required for the synaptic weights to settle down at their stationary values depends on the amplitude of the synaptic weight change described by $\delta^{\mathrm{LTP/D}}$ and $\epsilon^{\mathrm{LTP/D}}$ and on the amount of postsynaptic activity and, thus, on the firing threshold. With the parameters used in Figure 6, stationarity is usually reached in fewer than 1000 iterations.
Finally, a brief discussion of the results shown in Figure 6 is in order. For a high firing threshold, the synaptic weights of both groups are close to their maximum value because otherwise the neuron would not fire at all. This is due to the constant presynaptic potentiation $\delta^{\mathrm{LTP}}_{\mathrm{pre}} > 0$. For very low threshold, the postsynaptic firing is triggered by the first spikes that arrive at the neuron. These spikes stem from presynaptic neurons with a broad firing-time distribution (group 2). Spikes from group 1 neurons thus arrive mostly after the postsynaptic neuron has already fired, and the corresponding synapses are depressed accordingly. This explains why synapses delivering precisely timed spikes are weaker than group 2 synapses, unless the firing threshold exceeds a certain critical value.

The most interesting finding is that in the intermediate range of the firing threshold, synapses that deliver precisely timed spikes are substantially stronger than synapses with poor temporal precision. Furthermore, there is an optimal threshold at $\vartheta \approx 0.2$ where the ratio of the synaptic strengths of precise and imprecise synapses is largest. The firing of the postsynaptic neuron is thus driven mainly by spikes from group 1 neurons, and the jitter of the postsynaptic spikes is minimal (see Figure 6B).

Figure 6: (A) Synaptic weights for a neuron receiving input from two groups of synapses. One group ($n_1 = 20$) delivers precisely timed spikes ($\sigma_{\mathrm{pre}} = 0.1$) and the other one ($n_2 = 80$) spikes with a broad distribution of arrival times ($\sigma_{\mathrm{pre}} = 1.0$). The synapses are subject to a mechanism that modifies the synaptic strength according to the relative timing of pre- and postsynaptic spikes; see equation 5.2. The upper trace shows the resulting synaptic weight for the group of precise synapses; the lower trace corresponds to the second group. The solid lines give the analytic result for either the low-threshold ($\vartheta < 0.4$) or the high-threshold ($\vartheta > 0.4$) approximation. The dashed lines show the results of a computer simulation. The parameters for the synaptic plasticity are the same as in Figure 5. (B, C, D) Precision $\Delta t^{-1}$, reliability $\bar{\nu}_{\mathrm{post}}$, and efficiency $\bar{\nu}_{\mathrm{post}}/\Delta t$ as a function of the threshold $\vartheta$ for the same neuron as in A. See Figure 3 for the definition of $\bar{\nu}_{\mathrm{post}}$ and $\Delta t$.
6 Discussion

We have presented a stochastic neuron model that allows for a description of pre- and postsynaptic spike trains by inhomogeneous Poisson processes, provided that the number of overlapping postsynaptic potentials is not too small. In view of the huge number of synapses that a single neuron carries on its dendritic tree and the number of synchronous excitatory postsynaptic potentials required to reach threshold—$10^4 \ldots 10^5$ synapses and several tens of EPSPs—the approximations used seem to be justified. Since firing of a neuron is triggered by a large number of synchronous presynaptic action potentials, the postsynaptic firing time and the arrival time of a single presynaptic spike can be assumed to be independent, and the correlation of pre- and postsynaptic firing times is readily calculated.

In addition to the spike-triggering mechanism, we have developed a model for synaptic long-term plasticity based on the relative timing of pre- and postsynaptic firing events. The dynamics of the synaptic weights can be analyzed in connection with the former neuron model. In particular, stationary synaptic weights can be calculated as limit states for a given ensemble of presynaptic spike trains. It turns out that this mechanism is able to tune the synaptic weights according to the temporal precision of the spikes they convey. Synapses that deliver precisely timed action potentials are favored at the expense of synapses that deliver spikes with a large temporal jitter. This learning process does not rely on an external teacher signal; it is unsupervised. Furthermore, there is only one free parameter that has to be fixed: the firing threshold $\vartheta$. It is conceivable that a small, local network of inhibitory interneurons could be entrusted with setting the firing threshold.

The result concerning the tuning of synaptic weights has two consequences that are open to experimental verification. First, it shows that neurons are able to tune their synaptic weights so as to select inputs from those presynaptic neurons that provide information that is meaningful in the sense of a temporal code. Second, this tuning ensures the preservation of temporally encoded information because noisy synapses are depressed and the neuron is thus able to fire its action potential with high temporal precision. We believe that this mechanism has clear relevance to any information processing based on temporal spike coding.
Acknowledgments
W. M. K. gratefully acknowledges financial support from the Boehringer Ingelheim Fonds.
References
Alonso, A., Curtis, M. de, & Llinás, R. (1990). Postsynaptic Hebbian and non-Hebbian long-term potentiation of synaptic efficacy in the entorhinal cortex in slices and in the isolated adult guinea pig brain. Proc. Natl. Acad. Sci. USA, 87, 9280–9284.

Bartsch, A. P., & van Hemmen, J. L. (1998). Correlations in networks of spiking neurons. Unpublished manuscript. Munich: Physics Department, Technical University of Munich.

Bell, C. C., Han, V. Z., Sugawara, Y., & Grant, K. (1997). Synaptic plasticity in a cerebellum-like structure depends on temporal order. Nature, 387, 278–281.

Dan, Y., & Poo, M. (1992). Hebbian depression of isolated neuromuscular synapses in vitro. Science, 256, 1570–1573.

Gerstner, W. (1995). Time structure of the activity in neural network models. Phys. Rev. E, 51, 738–758.

Gerstner, W., & van Hemmen, J. L. (1992). Associative memory in a network of "spiking" neurons. Network, 3, 139–164.

Gerstner, W., Kempter, R., van Hemmen, J. L., & Wagner, H. (1996). A neuronal learning rule for sub-millisecond temporal coding. Nature, 384, 76–78.

Kempter, R., Gerstner, W., van Hemmen, J. L., & Wagner, H. (1998). Extracting oscillations: Neuronal coincidence detection with noisy periodic spike input. Neural Comput., 10, 1987–2017.

Kistler, W. M., Gerstner, W., & van Hemmen, J. L. (1997). Reduction of the Hodgkin-Huxley equations to a single-variable threshold model. Neural Comput., 9, 1015–1045.

Kistler, W. M., & van Hemmen, J. L. (1999). Short-term synaptic plasticity and network behavior. Neural Comput., 11, 1579–1594.

Lamperti, J. (1966). Probability: A survey of the mathematical theory. New York: Benjamin.

Markram, H., Lübke, J., Frotscher, M., & Sakmann, B. (1997). Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs. Science, 275, 213–215.

Nelson, P. G., Fields, R. D., Yu, C., & Liu, Y. (1993). Synapse elimination from the mouse neuromuscular junction in vitro: A non-Hebbian activity-dependent process. J. Neurobiol., 24, 1517–1530.

Rieke, F., Warland, D., de Ruyter van Steveninck, R., & Bialek, W. (1997). Spikes—Exploring the neural code. Cambridge, MA: MIT Press.

Salin, P. A., Malenka, R. C., & Nicoll, R. A. (1996). Cyclic AMP mediates a presynaptic form of LTP at cerebellar parallel fiber synapses. Neuron, 16, 797–803.

Tuckwell, H. C. (1988). Introduction to theoretical neurobiology (Vol. 2). Cambridge: Cambridge University Press.

Urban, N. N., & Barrionuevo, G. (1996). Induction of Hebbian and non-Hebbian mossy fiber long-term potentiation by distinct patterns of high-frequency stimulation. J. Neurosci., 16, 4293–4299.
Received August 3, 1998; accepted April 29, 1999.