Intra-ripple frequency accommodation in an inhibitory network model for hippocampal ripple oscillations

  • Natalie Schieferstein,

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Software, Visualization, Writing – original draft, Writing – review & editing

E-mail: natalie.schieferstein@freenet.de (NS); r.kempter@biologie.hu-berlin.de (RK)

    Affiliations Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany, Bernstein Center for Computational Neuroscience, Berlin, Germany

  • Tilo Schwalger,

    Roles Conceptualization, Formal analysis, Methodology, Supervision, Writing – review & editing

    Affiliations Bernstein Center for Computational Neuroscience, Berlin, Germany, Institute for Mathematics, Technische Universität Berlin, Berlin, Germany

  • Benjamin Lindner,

    Roles Conceptualization, Formal analysis, Funding acquisition, Methodology, Supervision, Writing – review & editing

    Affiliations Bernstein Center for Computational Neuroscience, Berlin, Germany, Department of Physics, Humboldt-Universität zu Berlin, Berlin, Germany

  • Richard Kempter

    Roles Conceptualization, Formal analysis, Funding acquisition, Methodology, Project administration, Supervision, Writing – review & editing

E-mail: natalie.schieferstein@freenet.de (NS); r.kempter@biologie.hu-berlin.de (RK)

    Affiliations Institute for Theoretical Biology, Department of Biology, Humboldt-Universität zu Berlin, Berlin, Germany, Bernstein Center for Computational Neuroscience, Berlin, Germany, Einstein Center for Neurosciences, Berlin, Germany

Abstract

Hippocampal ripple oscillations have been implicated in important cognitive functions such as memory consolidation and planning. Multiple computational models have been proposed to explain the emergence of ripple oscillations, relying either on excitation or inhibition as the main pacemaker. Nevertheless, the generating mechanism of ripples remains unclear. An interesting dynamical feature of experimentally measured ripples, which may advance model selection, is intra-ripple frequency accommodation (IFA): a decay of the instantaneous ripple frequency over the course of a ripple event. So far, only a feedback-based inhibition-first model, which relies on delayed inhibitory synaptic coupling, has been shown to reproduce IFA. Here we use an analytical mean-field approach and numerical simulations of a leaky integrate-and-fire spiking network to explain the mechanism of IFA. We develop a drift-based approximation for the oscillation dynamics of the population rate and the mean membrane potential of interneurons under strong excitatory drive and strong inhibitory coupling. For IFA, the speed at which the excitatory drive changes is critical. We demonstrate that IFA arises due to a speed-dependent hysteresis effect in the dynamics of the mean membrane potential, when the interneurons receive transient, sharp wave-associated excitation. We thus predict that the IFA asymmetry vanishes in the limit of slowly changing drive, but is otherwise a robust feature of the feedback-based inhibition-first ripple model.

Author summary

The hippocampus plays a central role in the acquisition and consolidation of explicit memories. During sleep or rest, hippocampal neurons replay recently acquired memories, while the neuronal network exhibits high-frequency oscillations, so-called ripples. To study the potential function of ripples, their generation mechanism needs to be clarified. Here we analyze a model network of inhibitory interneurons as a potential ripple pacemaker. We derive an analytical approximation of the ripple dynamics that explains why the ripple frequency tends to decay over the course of an event (intra-ripple frequency accommodation, IFA). Studying dynamical phenomena such as IFA advances model selection and enables an understanding of how rhythmic brain activity contributes to cognitive functions.

Introduction

Sharp wave-ripples (SPW-R) are a prominent rhythmic signature of neuronal activity in the hippocampus. Hippocampal SPW-Rs are embedded in a larger system of thalamo-cortical rhythms [1, 2] and have been associated with the replay of behaviorally relevant neuronal activity [3–5]. SPW-Rs may thus be important for cognitive functions such as memory consolidation [6–11] and planning [4, 12–14]. To probe their potential functional role, we first need to understand the generation mechanism of SPW-Rs.

SPW-Rs are brief (50–100 ms) periods of elevated, highly synchronized neuronal activity, which can be measured in the local field potential (LFP) of the hippocampus in vivo [15–17], as well as in vitro [18–22]. In CA1 the LFP sharp wave (SPW) is most prominent in the stratum radiatum layer and is thought to reflect a current sink due to elevated excitatory synaptic transmission from the CA3 Schaffer collaterals. The LFP ripple oscillation (150–250 Hz in vivo, [16]; 210 ± 16 Hz in vitro, [19]) is strongest in stratum pyramidale of CA1 and is thought to reflect inhibitory synaptic currents, and potentially rhythmic excitatory action potentials [23, 24]. It is believed that the local CA1 network can generate ripples in isolation [19, 25, 26].

Various computational models have been put forward to explain ripple generation, relying either on excitation or inhibition as the main pacemaker (see the Discussion section “A note on mixed models”). Excitation-first models assume that SPW and ripple oscillations are generated jointly by the sparsely connected pyramidal cell network, either via axonal gap junctions and antidromic spike propagation [27, 28] or via supralinear dendritic integration [29].

Inhibition-first models posit that the interneuron network (e.g. in CA1) produces ripples in response to transient excitatory input due to a SPW event, which is generated by a separate mechanism upstream (e.g. in CA3) [30–32]. Inhibition-first models can be further subdivided into two classes, which we will call here feedback-based and perturbation-based, hinting at the mechanism by which ripples emerge: In the feedback-based models [16, 33–40] the recurrent synaptic coupling between interneurons is strong such that for sufficiently strong excitatory drive the network undergoes a bifurcation and can, in theory, produce sustained ripple oscillations as long as the drive remains strong enough. The perturbation-based model [41], on the other hand, assumes weak coupling and cannot produce sustained oscillations. Here ripples emerge as a transient ringing effect in response to a perturbation in the external drive.

Experiments remain inconclusive as to which of the proposed mechanisms is the most plausible for ripple generation [17, 20, 42]. Previous analyses have focused on the average frequency and duration of ripple events as well as their dependency on pharmacological or optogenetic manipulation [17, 21, 43]. We propose that taking into account the transient dynamical features of spontaneous ripples can advance model selection. The instantaneous ripple frequency typically decays over the course of a ripple event. This intra-ripple frequency accommodation (IFA) has been observed in vivo and in vitro and across different animals, brain states, and measurements [9, 25, 37, 43–46]. It is therefore an interesting question which of the proposed ripple models can account for IFA, and under which assumptions.

So far, only the feedback-based inhibition-first ripple model has been shown to reproduce IFA in exemplary numerical simulations [37]. Here we use a theoretical mean-field approach and extensive numerical simulations to explain the mechanism of IFA, show that IFA is robust with respect to parameter variation, and predict that IFA depends on the time course of the external, SPW-associated drive. These insights about IFA can be used to distinguish between the feedback-based and the perturbation-based inhibition-first ripple mechanisms.

Results

The feedback-based inhibitory ripple model assumes that a network of synaptically coupled CA1 interneurons, such as parvalbumin-expressing (PV+) basket cells, acts as a delayed negative feedback loop and thus creates fast ripple oscillations when stimulated with excitatory, sharp wave-associated drive from CA3. This idea was first expressed by [16, 33], and has been formally studied in computational models of varying complexity [34–39]. Simulations of a biophysically detailed version of this model by [37] revealed that it can reproduce intra-ripple frequency accommodation (IFA) in response to time-dependent, sharp wave-like drive.

To explain the mechanism of IFA analytically, we first demonstrate that IFA is preserved in a spiking network model of reduced complexity (the network used in [34] with all-to-all coupling). We then turn to a mean-field approximation of the network dynamics, which enables us to study the ripple dynamics and IFA as a function of the time course of the excitatory sharp wave-like drive.

Ripples and IFA in a spiking neural network model

We model the CA1 interneuron network as a homogeneous network of N leaky integrate-and-fire (LIF) neurons with membrane time constant τm, capacitance C, and resting potential Eleak. The membrane potential vi of a unit i evolves according to the stochastic differential equation given in Eq (1), where the derivative with respect to time t is abbreviated by a dot. Whenever the membrane potential crosses a spike threshold Vthr, a spike is emitted and the membrane potential is reset instantaneously to a reset potential Vreset. For simplicity there is no absolute refractory period. All interneurons receive the same external, excitatory drive Iext(t) and an independent Gaussian white noise input ξi(t), with zero mean 〈ξi(t)〉 = 0 and unit noise intensity 〈ξi(t)ξj(t′)〉 = δij δ(t − t′), scaled by the noise strength parameter σV. The network is fully connected via inhibitory pulse coupling of strength J and with a synaptic delay Δ (see Methods for details and default values of parameters).

The empirical population activity rN(t) in a small time interval [t, t + Δt) is defined as the number of spikes nspk(t, t + Δt) emitted by the population (Methods, Eq (15)), divided by the size of the population and the time step Δt:

rN(t) = nspk(t, t + Δt) / (N Δt).     (2)

For plotting purposes, rN is smoothed with a narrow Gaussian kernel (Methods, Eq (16)). In the following, we will illustrate that the key network dynamics in response to constant drive Iext and time-dependent, sharp wave-like drive Iext(t), as presented by [37], are preserved in this model, i.e., there are fast oscillations in the ripple range and there is IFA.
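As a concrete illustration, the following minimal Python sketch integrates such a fully connected inhibitory LIF network with the Euler-Maruyama method and computes the empirical population rate of Eq (2). It is not the implementation used for the figures: all parameter values, the exact placement of τm, C, and σV in the update, and the modeling of the pulse coupling as an instantaneous voltage jump per presynaptic spike are assumptions for illustration; the paper's defaults are given in Methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- illustrative parameters (NOT the defaults from Methods) ---
N        = 1000        # number of interneurons
tau_m    = 10e-3       # membrane time constant [s]
C        = 100e-12     # membrane capacitance [F]
E_leak   = -60e-3      # resting potential [V]
V_thr    = -52e-3      # spike threshold [V]
V_reset  = -67e-3      # reset potential [V]
sigma_V  = 2e-3        # noise strength [V]
J        = 0.2e-3      # inhibitory pulse-coupling strength [V per presynaptic spike]
delay    = 1.0e-3      # synaptic delay [s]
I_ext    = 0.55e-9     # constant external drive [A]

dt       = 0.05e-3     # integration time step [s]
T_sim    = 0.3         # total simulated time [s]
n_steps  = int(T_sim / dt)
d_bins   = int(round(delay / dt))

v     = np.full(N, E_leak)     # membrane potentials
n_spk = np.zeros(n_steps)      # population spike count per time bin

for k in range(n_steps - 1):
    # delayed, all-to-all inhibitory feedback: each spike hyperpolarizes
    # every neuron by J, one synaptic delay after it was emitted
    inhibition = -J * (n_spk[k - d_bins] if k >= d_bins else 0.0)
    drift = (-(v - E_leak) + (tau_m / C) * I_ext) / tau_m
    noise = sigma_V * np.sqrt(dt / tau_m) * rng.standard_normal(N)
    v = v + dt * drift + noise + inhibition
    spiked = v >= V_thr
    n_spk[k + 1] = spiked.sum()
    v[spiked] = V_reset        # instantaneous reset, no refractory period

# empirical population rate r_N (Eq (2)), smoothed with a narrow Gaussian kernel
r_N = n_spk / (N * dt)
sig_k  = 0.5e-3                                    # kernel width [s]
t_k    = np.arange(-4 * sig_k, 4 * sig_k + dt, dt)
kernel = np.exp(-t_k**2 / (2 * sig_k**2))
kernel /= kernel.sum()
r_N_smooth = np.convolve(r_N, kernel, mode="same")
```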

Dynamics for constant drive.

The network dynamics for constant drive Iext are illustrated in Fig 1. At low Iext, the network is in a steady-state with units firing asynchronously and irregularly at an overall low rate funit (Fig 1A, left). As Iext increases, the network activity begins to exhibit coherent oscillations (Fig 1A, middle). In the following, we will refer to the dominant frequency of this population oscillation as the network frequency fnet (Methods, black triangle markers in Fig 1B). For a biologically reasonable set of parameters (Methods) the network frequency lies within the ripple range for a large range of external drives (Fig 1B, gray band). For 0.3 nA ≲ Iext ≲ 0.9 nA, the unit activity underlying the network ripple oscillation is sparse and irregular, i.e. the average unit firing rate funit is lower than the frequency fnet of the network oscillation (saturation s = funit/fnet < 1) and the coefficient of variation (CV) of interspike intervals is around 0.5 (Fig 1B, bottom panels). We hence refer to this state as sparse synchrony [37, 47].

Fig 1. Constant-drive dynamics of the spiking network.

(A) Four dynamical regimes depending on the external drive: asynchronous irregular state, sparse synchrony, full synchrony, multiple spikes (left to right, N = 10,000). Top: population rate rN, middle: raster plot showing spikes of 30 example units, and histogram of membrane potentials v (normalized as density, threshold and reset marked by horizontal dashed lines), bottom: power spectral density of the population rate. (B) Top: network frequency fnet (black) and mean unit firing rate funit (blue) for a range of constant external drives Iext. Grey band marks approximate ripple frequency range (140–220 Hz). Red markers indicate the critical input level (Hopf bifurcation) and the associated network and unit frequency, as resulting from linear stability analysis, see section Linear Stability Analysis in Methods, Eq (70), and [34]. Linestyle indicates network size (N ∈ {10², 10³, 10⁴}). Middle: saturation s = funit/fnet. Bottom: coefficient of variation of interspike intervals, averaged across units.

https://doi.org/10.1371/journal.pcbi.1011886.g001

In the mean-field limit N → ∞, the transition to this oscillatory state is sharp (a supercritical Hopf bifurcation) and the transition point can be determined by a linear stability analysis of the stationary state [34]. In a simulated spiking network of finite size, the transition to the oscillatory state is not perfectly sharp because of inevitable fluctuations. Nevertheless, at the bifurcation point, both the network frequency and the unit firing rate are well predicted by the mean-field theory if the network is large enough (Fig 1B, red markers: at the critical drive, fnet = 305 Hz, funit = 16 Hz, s = 0.05).

For sufficiently strong drive, the network reaches a state of full synchrony (s ≈ 1), with units firing regularly and at the same average frequency as the population rhythm (Fig 1A, “full synch”). If the drive increases beyond this level, units spike several times per cycle (s > 1, Fig 1A, “multiple spikes”), which increases the CV of interspike intervals. Since the firing rates in that regime are too high to be biologically plausible, we focus in the following on the dynamical regime between the onset of oscillations (bifurcation point) and the point of full synchrony (Fig 1).

We note that in this reduced model (and also in [34]) the network frequency is a decreasing function of the external drive (Fig 1B). It is generally not straightforward to infer how the network frequency changes with the external drive Iext: On the one hand, individual neurons reach the spiking threshold faster when the drive is stronger. On the other hand, the resulting increase in spiking activity leads to a stronger delayed feedback inhibition that pushes all membrane potentials further away from the threshold. The network frequency thus results from a self-consistency condition. In fact, the shape of the network frequency curve can be different (increasing or even non-monotonic) depending on the details of the model network architecture (see also Discussion and Fig A in S2 Appendix). The mechanism of IFA that we demonstrate in the following is independent of the shape of the network frequencies under constant drive.

Dynamics for time-dependent drive.

During a sharp wave (SPW) event, which is thought here to be generated in CA3 (cf. models introduced by [30–32]), pyramidal cells in CA3 transiently increase their firing rates [43, 48]. Here we model the resulting feedforward input Iext(t) to the CA1 interneuron network in a simplified form: as a symmetric, piecewise linear double ramp (Fig 2A, bottom; see also Eq (17) in Methods, and Discussion). This SPW-like drive elicits a transient ripple event in the network (Fig 2A). We measure the instantaneous frequency of the population activity either by taking a windowed Fourier transform (wavelet spectrogram) and finding the peak in power in each time step, or by taking the inverse of the distances between consecutive peaks in the population activity, which provides frequencies for a discrete set of time points (Fig 2A, top, solid line vs white dots).
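The following Python sketch illustrates a double-ramp drive in the spirit of Eq (17) and the discrete, peak-to-peak frequency estimate; the ramp timings, amplitudes, and the peak-detection threshold are placeholders rather than the values used in the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def double_ramp(t, t_start=0.02, t_rise=0.04, t_plateau=0.02,
                I_base=0.1e-9, I_peak=0.9e-9):
    """Symmetric, piecewise linear double-ramp drive (in the spirit of
    Eq (17) in Methods): baseline -> linear rise -> plateau -> linear fall."""
    t1 = t_start
    t2 = t1 + t_rise
    t3 = t2 + t_plateau
    t4 = t3 + t_rise
    I = np.full_like(t, I_base, dtype=float)
    up = (t >= t1) & (t < t2)
    I[up] = I_base + (I_peak - I_base) * (t[up] - t1) / t_rise
    I[(t >= t2) & (t < t3)] = I_peak
    down = (t >= t3) & (t < t4)
    I[down] = I_peak - (I_peak - I_base) * (t[down] - t3) / t_rise
    return I

def instantaneous_freq(r_smooth, dt, min_height=None):
    """Discrete instantaneous-frequency estimate: inverse of the distance
    between consecutive peaks of the (smoothed) population rate
    (white dots in Fig 2A). Returns peak midpoints [s] and frequencies [Hz]."""
    peaks, _ = find_peaks(r_smooth, height=min_height)
    t_peaks = peaks * dt
    f_inst = 1.0 / np.diff(t_peaks)
    t_mid = 0.5 * (t_peaks[:-1] + t_peaks[1:])
    return t_mid, f_inst
```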

Fig 2. Transient dynamics of the spiking network and IFA.

(A) Example simulation showing a transient ripple with IFA. From bottom to top: SPW-like drive (Eq (17)); histogram of membrane potentials (normalized), horizontal dotted lines: threshold and reset potential; raster plot showing spike times of 10 example units; population rate exhibiting transient ripple oscillation; wavelet spectrogram indicating instantaneous power (blue-yellow colorbar) for a frequency range of 0–400 Hz. Solid curve: continuous estimate of instantaneous frequency based on wavelet spectrogram with gray scale indicating maximal instantaneous power. Dotted line: cutoff frequency for peak detection fmin = 70 Hz. Red lines in scalebars: power threshold (see Methods). White dots: discrete estimate of instantaneous frequency based on peak-to-peak distance in population rate. Network size N = 10,000. (B) Quantification of IFA. Top: Grey dots: discrete instantaneous frequency estimates from 50 repetitions of the simulation shown in (A) with different noise realizations. Grey line: linear regression line with negative slope (χIFA = −3.04 Hz/ms) indicating IFA (see Methods, Eq (18)). Black line: asymptotic frequencies (cf. Fig 1B, top). Bottom: The same SPW-like drive was applied in all 50 simulations. Network size N = 10,000. (C) Dependency of IFA slope χIFA on the slope of the external drive for different network sizes (color coded). (D) Instantaneous (dots) vs asymptotic (black lines) network frequencies (top) for piecewise linear drives (bottom) of decreasing slopes (left to right). Color/linestyle indicates network size N. Thin, colored linear regression lines illustrate decreasing strength of IFA for shallower drive (regression slopes summarized in C). Asymptotic network frequencies are derived via interpolation of the constant-drive results shown in Fig 1B, top. The asymptotic frequencies for N = 10³ and N = 10⁴ are nearly identical (dash-dotted and solid lines). Note that the drive is identical in panels A, B, and in the left panel of D.

https://doi.org/10.1371/journal.pcbi.1011886.g002

There is no established definition of IFA. The central aspect of IFA is the decrease (accommodation) of the instantaneous frequency. Over the course of an experimentally measured ripple event (20–100 ms) the instantaneous ripple frequency typically decreases by about 10–60 Hz (rough visual estimate of peak-to-trough distances seen in [9, 25, 37, 43–46]), corresponding to linear slopes of about −0.1 to −3 Hz/ms. We observe such a decreasing trend of the instantaneous frequency in the model even though the external stimulus is perfectly symmetric (Fig 2A, top, see also [37]), i.e. the network exhibits intra-ripple frequency accommodation (IFA). We quantify IFA by linear regression over the discrete instantaneous frequency estimates (Fig 2A, white dots). This quantification of IFA is applied to results from many simulations with different noise realizations and the same SPW-like drive (Fig 2B; Methods, Eq (18)). A negative regression slope (here χIFA = −3.01 Hz/ms) indicates IFA.
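A minimal sketch of this quantification, assuming the pooled (time, frequency) pairs from all repetitions are already available, is the following; the demonstration values are made up.

```python
import numpy as np
from scipy.stats import linregress

def ifa_slope(t_ms, f_hz):
    """Linear regression of pooled instantaneous-frequency estimates against
    time (cf. Eq (18) in Methods). A negative slope chi_IFA (Hz/ms) indicates IFA."""
    fit = linregress(np.asarray(t_ms), np.asarray(f_hz))
    return fit.slope, fit.intercept

# example with made-up numbers: frequency decaying from ~200 Hz to ~170 Hz over 40 ms
t_demo = np.linspace(0.0, 40.0, 20)          # [ms]
f_demo = 200.0 - 0.75 * t_demo               # [Hz]
chi_IFA, _ = ifa_slope(t_demo, f_demo)       # -> approximately -0.75 Hz/ms
```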

In the model, the decrease in instantaneous frequency is not strictly linear over the entire event. In the central portion of the simulated ripple event, where power in the ripple band is large, the ripple frequency decreases monotonically. At the end of the event, however, we observe a small increase of the frequency, albeit with low power (the underlying peaks in the population rate are small) (Fig 2A and 2B top). Similar non-monotonic trajectories of the ripple frequency have been observed in experimentally measured ripples and still been classified as exhibiting IFA [44].

The transient dynamics of the instantaneous frequency in this model can be understood best by comparing it to the asymptotic frequency that the network would settle into if the drive remained constant at the instantaneous value of the drive (Fig 2B, grey dots vs black line). Naturally, this asymptotic frequency follows the same symmetry as the external drive and thus provides a useful reference frame. For our double ramp drive (Methods, Eq (17)), we observe that during the plateau phase (i.e. constant drive) the instantaneous frequency quickly approaches the asymptotic frequency. However, during the rising phase of the drive the instantaneous frequency tends to be higher than the asymptotic reference. During the falling phase it is lower, thus creating the overall IFA asymmetry.

Varying the slope of the external double ramp drive we see that the IFA asymmetry is speed-dependent (Fig 2C and 2D): If the external drive changes more slowly (smaller slope), the network frequency response becomes more symmetric; for very small slopes the instantaneous frequencies approach the symmetric, asymptotic reference frequencies, and IFA vanishes.

In what follows, we aim at understanding the mechanism behind the observations illustrated for constant drive in Fig 1 and time-dependent drive in Fig 2. Further simulations indicate that the network dynamics varies only slightly when the network size is varied by two orders of magnitude (Figs 1B, 2C and 2D). We hence hypothesize that IFA is preserved in the mean-field limit of an infinitely large network, and will use a mean-field approach to explain the generating mechanism of IFA.

Gaussian-drift approximation of ripple dynamics in the mean-field limit

To facilitate notation in the mathematical analysis, we rescale all voltages to units of the distance between threshold and rest such that the new spiking threshold is at VT = 1 and the resting potential is at EL = 0. The single-unit stochastic differential equation (previously Eq (1)) then takes the dimensionless form given in Eq (3), with rescaled external excitatory current IE, inhibitory synaptic strength K, and noise intensity D; see also section Dimensionless equations in the Methods, Eqs (20)–(22), where we provide details on the rescaling and values of default parameters.
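Written out, the voltage rescaling is the affine map (shown here for orientation; the full dimensionless parameter definitions are in Methods, Eqs (20)–(22)):

```latex
V \;=\; \frac{v - E_{\mathrm{leak}}}{V_{\mathrm{thr}} - E_{\mathrm{leak}}},
\qquad
v = E_{\mathrm{leak}} \;\mapsto\; E_L = 0,
\qquad
v = V_{\mathrm{thr}} \;\mapsto\; V_T = 1 .
```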

In the mean-field limit N → ∞, the dynamics of the density of membrane potentials p(V, t) is described by the Fokker-Planck equation (FPE), Eqs (4)–(6) (see Methods, Eq (23), for details). The population rate r(t) defined in Eq (6) represents the mean-field limit of the population activity rN(t) in Eq (2) of the spiking neural network. Compared to the classical application of the FPE [49, 50] there are two essential differences: First, the FPE is nonlocal because of the resetting rule. This rule imposes an absorbing boundary condition at the threshold (Eq (23e)) and a source of probability at the reset point; the latter source can be imposed by a jump condition for the derivative of the density at the reset point (Eq (23g)) that matches the derivative at the threshold point, see e.g. [51–55]. Second, the FPE (4) is nonlinear because the current I(t) depends on the probability density p(V, t) through the population rate r(t) in Eq (6).
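For orientation, a generic current-based form consistent with this description is sketched below; the precise normalization of the drift, diffusion, feedback, and flux terms is fixed by Eqs (4)–(6) and Eq (23) in Methods and may differ from this sketch:

```latex
\tau_m\,\partial_t p(V,t) \;=\; \partial_V\!\Big[\big(V - I(t)\big)\,p(V,t)\Big] \;+\; D\,\partial_V^2\, p(V,t),
\qquad
I(t) \;=\; I_E(t) \;-\; \mathrm{const}\cdot r(t-\Delta),
```

with an absorbing boundary p(VT, t) = 0 and the population rate r(t) given by the probability flux through the threshold, r(t) ∝ −∂V p(V, t)|V=VT.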

Stable stationary solutions of this nonlocal, nonlinear FPE correspond to asynchronous irregular spiking [34, 51]. It has been shown that an oscillatory (i.e. periodic) solution r(t) emerges via a supercritical Hopf bifurcation when the external drive IE exceeds a critical value [34, 51]. This network oscillation well reproduces the coherent stochastic oscillation of the population activity rN(t) in the finite-size spiking neural network (Fig 1). The network frequency at the onset of oscillations (i.e. at the Hopf bifurcation, where the stationary solution loses stability) is well predicted analytically by a linear stability analysis [34] (see also Methods, Eq (70) and Fig 1B, red markers). However, this analytical prediction quickly breaks down further away from the bifurcation where the oscillation dynamics becomes strongly nonlinear (see power spectral densities with higher harmonics in Fig 1A). In this regime neither an exact periodic solution of the FPE nor an approximation of its frequency is known. We will thus introduce a simplified approach that allows us to approximate the network frequencies at strong drive beyond the bifurcation.

Our approach can be motivated by two observations from the spiking network simulations: (a) In the relevant regime between sparse and full synchrony, units spike at most once per cycle of the population rhythm (funit ≤ fnet, Fig 1B). This property will allow us to approximate the time course of a population spike in rN(t) using a first-passage-time ansatz, neglecting the reset mechanism. (b) In between population spikes, the population rate rN(t) is close to zero, and the strong inhibitory feedback pushes the bulk of the membrane potential distribution significantly below threshold (Fig 1A). In those periods the absorbing boundary condition at threshold does not have any significant impact on the dynamics of p(V, t). Observation (b) is illustrated in more detail in Fig 3A, which shows the spiking network simulations from Fig 1 (Iext = 0.55 nA) with rescaled voltages.

Fig 3. Illustration of the Gaussian-drift approximation.

Comparison of oscillation dynamics in the spiking network simulation (A) and the Gaussian-drift approximation (B) at constant drive (IE = 4.24). (Ai) Spiking network simulation. Empirical population rate rN (top), and density of membrane potentials (bottom, dimensionless voltage), exhibiting coherent stochastic oscillations with (weak) finite size fluctuations. (Aii) The average cycle of the oscillation dynamics in Ai (computed for 21 bins of 0.24 ms each). Top: population rate; middle: density of membrane potentials, orange marker: local minimum of mean membrane potential; bottom: standard deviation of membrane potential distribution, dashed line: theoretical asymptote in the absence of boundary conditions (Eq (26)). (Aiii) Snapshots of the membrane potential density over the course of the average cycle shown in (Aii). (Bi) Gaussian-drift approximation (numerical integration of DDE (8) with reset condition). Top: population rate, bottom: mean membrane potential μ(t) (black line) with Gaussian density of membrane potentials p(V, t) painted on in the background for better comparison with Ai. (Bii) Zoom into one oscillation cycle (black bar in Bi). Top: population rate r, middle: density of membrane potentials p(V, t) with mean μ(t), bottom: constant standard deviation of the Gaussian density. The mean membrane potential μ(t) (black line) starts in each cycle at μmin (orange) and rises up until μmax (cyan) at time toff, at which point the population spike ends. In a phenomenological account for the single unit reset, μ is reset instantaneously to μreset (yellow). From there μ declines back towards μmin. (Biii) Snapshots of the membrane potential density p(V, t) over the course of one cycle (Bii). Colors mark t = 0 ∼ T (orange), and t = toff (cyan/yellow). Dotted lines in all voltage panels mark spike threshold VT = 1 and reset potential VR = 0. Note that in the theoretical approximation the spike threshold VT = 1 is no longer an absorbing boundary.

https://doi.org/10.1371/journal.pcbi.1011886.g003

These two observations motivate a considerable simplification of the mean-field dynamics: Without the boundary conditions at threshold and reset, the FPE (4) can be solved analytically. In the long-time limit its solution becomes a simple Gaussian, independent of the initial condition (Methods, Eq (24)). We thus approximate the density of membrane potentials p(V, t) as a Gaussian with constant standard deviation (Eq (7); see Fig 3B), and the only time-dependent quantity is the mean membrane potential μ(t), which evolves according to Eq (8a) (see Fig 3Bi and 3Bii, black lines, cf. Methods Eq (25)). Because in the considered regime spikes are mainly driven by the mean input rather than membrane potential fluctuations, the population rate can be well approximated by the drift part of the probability current across the threshold, while diffusion-mediated spiking is ignored (Methods, Eqs (28) and (29); see also [56–59]):

r(t) = [dμ/dt]+ p(VT, t).     (8b)

Thus, in our approximate dynamics without reset, the population rate r(t) is given by the membrane potential density at threshold, scaled by the speed at which the mean membrane potential approaches the threshold. Whenever the mean membrane potential is decreasing (dμ/dt < 0), the drift current at the threshold is negative, and hence, the rate is clipped to 0 ([x]+ ≔ (x + |x|)/2, see Fig 3Bi and 3Bii, top).

Because the single-neuron dynamics of integrate-and-fire neurons includes a reset mechanism after the spike threshold has been reached, we add a phenomenological account for such a reset to our population-level description: At the end of a population spike (at time toff), the mean membrane potential is pushed down (“reset”) by an amount that is proportional to the fraction of neurons that have participated in the population spike (Fig 3Biii, t = 3.62 ms; see Methods Eq (49) for details).

The two coupled Eqs (8a) and (8b) are equivalent to a single delay differential equation (DDE; Methods, Eq (31)). This DDE (including the reset condition) governs the dynamics of the mean of the Gaussian membrane potential distribution, and the resulting drift-based population rate. In the following, we will therefore refer to the DDE dynamics (with the phenomenological reset condition) as the Gaussian-drift approximation.
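For constant drive, the Gaussian-drift dynamics can be integrated numerically as in the following Python sketch. The numerical values of the feedback strength, the standard deviation of the Gaussian density, and the size of the phenomenological reset step are assumptions for illustration and stand in for the paper's expressions (Eqs (25), (26), and (49) in Methods); the sketch is meant to show the structure of the delayed dynamics, not to reproduce Fig 3 quantitatively.

```python
import numpy as np
from math import erf, sqrt, pi

# --- illustrative parameters (dimensionless voltages, V_T = 1, V_R = 0) ---
tau_m   = 10e-3      # membrane time constant [s]
delta   = 1.2e-3     # synaptic delay [s]
K_fb    = 5.0        # strength of delayed inhibitory feedback (assumed form and value)
sigma   = 0.25       # std of the Gaussian density (stands in for Eq (26))
V_T, V_R = 1.0, 0.0
I_E     = 4.24       # constant dimensionless drive (as in Fig 3)

dt      = 0.01e-3
n_steps = int(0.05 / dt)          # 50 ms of simulated time
d_bins  = int(round(delta / dt))

mu = np.zeros(n_steps)            # mean membrane potential
r  = np.zeros(n_steps)            # drift-based population rate [1/s]
mu[0] = -1.0                      # start well below threshold
prev_dmu = 0.0

def gauss_at_threshold(m):
    """Gaussian density p(V_T, t) with mean m and standard deviation sigma."""
    return np.exp(-(V_T - m) ** 2 / (2 * sigma ** 2)) / (sqrt(2 * pi) * sigma)

for k in range(n_steps - 1):
    r_del = r[k - d_bins] if k >= d_bins else 0.0
    # Eq (8a)-like update: leak, drive, and delayed inhibitory feedback
    dmu = (-mu[k] + I_E - K_fb * tau_m * r_del) / tau_m
    # Eq (8b): drift-based rate, clipped to zero when mu decreases
    r[k] = max(dmu, 0.0) * gauss_at_threshold(mu[k])
    mu[k + 1] = mu[k] + dt * dmu
    # end of a population spike: mu has just passed its local maximum mu_max
    if prev_dmu > 0.0 and dmu <= 0.0:
        # fraction of units above threshold = saturation s (cf. Eq (48))
        s = 0.5 * (1.0 - erf((V_T - mu[k]) / (sqrt(2.0) * sigma)))
        mu[k + 1] -= s * (V_T - V_R)   # phenomenological reset (cf. Eq (49))
    prev_dmu = dmu
```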

The main differences between the spiking network model (or the exact mean-field dynamics given by the FPE (4)) and the Gaussian-drift approximation are illustrated in Fig 3 for the case of constant drive. In the spiking network model, the membrane potential density p(V, t) is not strictly Gaussian (Fig 3Aiii). Because of the fire-and-reset mechanism, p(V, t) changes in shape during the oscillation cycle, becoming at times even bimodal (Fig 3Aiii vs Fig 3Biii). Still, we see that our Gaussian assumption (b) is justified: Whenever the membrane potential density is subthreshold in between population spikes, it becomes more Gaussian, and its standard deviation approaches the theoretical asymptote derived in the absence of boundary conditions (Eq (26); Fig 3Aii, bottom, and Fig 3Aiii, first 3 snapshots).

The advantage of our Gaussian-drift approximation is that it reduces the FPE with complex boundary conditions to a simpler DDE with oscillatory solutions that can be studied analytically. In the following, we will first consider the dynamics for constant drive, then extend the Gaussian-drift approximation to understand the transient response to time-dependent drive, and hence explain the emergence of IFA.

Analysis of oscillation dynamics for constant drive.

A T-periodic solution of the Gaussian-drift model, Eq (8), must have a mean membrane potential μ(t) = μ(t + T) that oscillates between two local extrema μmin and μmax (Fig 3Bi). Whenever the mean membrane potential increases, a positive population rate r is produced; when the mean membrane potential decreases the population rate is clipped to 0 (Eq (8b)). Let us consider a single cycle of this oscillatory dynamics (Fig 3Bii): The moment when μ reaches its local maximum μmax (Fig 3Bii, cyan circle) is of special importance as it marks the end of the population spike. We will refer to this time as toff: Since the inhibitory feedback II(t) is proportional to the delayed population rate r(t − Δ) (Eq (8a)), it follows that this feedback ceases exactly Δ after the end of the population spike, i.e. II(toff + Δ) = 0. It is convenient to define this time as the end of a cycle, i.e. the beginning of the next one. The mean membrane potential at this time is close to its local minimum (see Methods, Step 2) and will be denoted as μmin ≔ μ(toff + Δ). The period T can then be split into the upstroke time toff needed for the mean membrane potential to rise from μ(t = 0) = μmin towards μ(toff) = μmax, and the downstroke time Δ during which the mean membrane potential is pushed back down to μ(T) = μmin due to the delayed inhibitory feedback (Fig 3Bii). In Methods we derive, through a series of heuristic approximations, analytical expressions for the local extrema μmax (Eq (38)) and μmin (Eq (50)) of the mean membrane potential oscillation as a function of the external drive IE. Using these expressions, we obtain an analytical formula for toff (Eq (47)) and hence for the network frequency:

fnet = 1/T = 1/(toff + Δ).     (9)

Apart from the network frequency, the Gaussian-drift approximation also allows an intuitive understanding of the mean unit firing rate. When the population spike ends at time toff, the suprathreshold portion of the Gaussian density corresponds to the fraction of units that have spiked in the given cycle (the saturation s, Methods, Eq (48)). The mean unit firing rate can thus be inferred as:

funit = s · fnet.     (10)

In the following, we compare our analytical Gaussian-drift approximations for the network frequency and mean membrane potential dynamics to the spiking network simulations for a range of external drives IE.
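A small numerical illustration of Eqs (9) and (10), assuming a Gaussian density with standard deviation sigma and example values for mu_max, t_off, and the delay (the paper's closed-form expressions for these quantities are Eqs (38), (47), and (50) in Methods):

```python
from math import erf, sqrt

def saturation(mu_max, V_T=1.0, sigma=0.25):
    """Fraction of the Gaussian density above threshold at the end of the
    population spike (cf. Eq (48)); equals the fraction of units that spiked."""
    return 0.5 * (1.0 - erf((V_T - mu_max) / (sqrt(2.0) * sigma)))

def network_and_unit_frequency(t_off, delay, mu_max, sigma=0.25):
    """Network frequency f_net = 1/(t_off + Delta) (Eq (9)) and mean unit
    firing rate f_unit = s * f_net (Eq (10))."""
    f_net = 1.0 / (t_off + delay)
    s = saturation(mu_max, sigma=sigma)
    return f_net, s * f_net

# purely illustrative numbers: upstroke of 4 ms, delay of 1.2 ms, mu_max = 1.1
f_net, f_unit = network_and_unit_frequency(t_off=4e-3, delay=1.2e-3, mu_max=1.1)
# f_net ~ 192 Hz; f_unit = s * f_net with saturation s ~ 0.66
```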

Performance evaluation of the Gaussian-drift approximation.

Our theory captures the dependence of the network frequency fnet and the mean unit firing rate funit on the external drive IE, including the transition from sparse to full synchrony for increasing external drive (Fig 4, top). Our analytically derived expression for the saturation s predicts this transition (Fig 4, middle), since s is a monotonically increasing function of μmax (Eq (48)), which in turn increases monotonically with the external drive IE (Eq (38), Fig 4, bottom). We can also estimate analytically the point of full synchrony, i.e. the amount of external drive that is required for single units to fire approximately at the frequency of the network rhythm (s ≈ 1; Methods, Eq (51); Fig 4, vertical dashed line). The Gaussian-drift approximation slightly overestimates the point of full synchrony, but correctly predicts its parameter dependencies: For stronger coupling and/or larger noise, stronger external drive is required to reach full synchrony (Fig B in S1 Appendix).

Fig 4. Analytical approximation of the oscillation dynamics for constant drive.

Comparison of dynamics in theory (full lines) and spiking network simulation (dashed lines, N = 10,000). Top: Network frequency (black triangles) and mean unit frequency (blue circles). Red markers: Hopf bifurcation. Vertical lines indicate the range for which the theory applies (see Methods, Eq (52)). Middle: Saturation s increases monotonically with the drive (Eq (48)). Bottom: characterization of the underlying mean membrane potential dynamics via local maximum μmax (cyan, Eq (38)), local minimum μmin (orange, Eq (50)) and population reset μreset (yellow, Eq (49)). Default parameters (see Methods).

https://doi.org/10.1371/journal.pcbi.1011886.g004

The theory shows that the amplitude μmaxμmin of the oscillatory mean membrane potential grows with increasing drive (Fig 4, bottom). This is mainly due to a strong decrease of the periodic minimum μmin (Fig 4, bottom, solid orange line), which we also observe in the spiking network simulation (Fig 4, bottom, dashed orange line): stronger external drive increases the amplitude of the population spikes (Fig 1A), resulting in stronger recurrent inhibitory feedback and thus a stronger hyperpolarization of the membrane potentials. The quantities μmax and μreset are pertinent to the Gaussian-drift approximation and have no direct counterpart in the spiking neural network model.

The range of applicability of our theory is defined by our two assumptions (see Methods for details, Eq (52)): (a) units should spike at most once per cycle (i.e. s ≤ 1), and (b) in between population spikes the bulk of the membrane potential distribution should be subthreshold. The resulting range for the external current IE covers a large part of the regime of sparse synchrony up to the point of full synchrony (Fig 4, vertical lines). We confirmed with numerical simulations that for strong enough drive the Gaussian-drift approximation works for a wide parameter regime w.r.t. noise, coupling strength, and synaptic delay (S1 Appendix).

At low drive, close to the bifurcation, this theory breaks down (see also Methods, section Numerical analysis of oscillation dynamics). This is to be expected from a purely drift-based approximation. The dynamics of the spiking network close to its supercritical Hopf bifurcation is largely fluctuation-driven. Such dynamics cannot be captured by focusing only on the oscillation of the mean membrane potential, which has vanishing amplitude as the drive approaches its critical value. This limitation of the theory does not pose a problem, since (a) the fluctuation-driven dynamics around the Hopf bifurcation has already been studied in depth by [34] and (b) our main goal here is to explain the IFA dynamics, which happens in the strongly mean-driven regime.

So far we have studied the case of constant drive describing the asymptotic dynamics of sustained ripple oscillations observed in the long-time limit, after initial conditions have been forgotten. This will be emphasized from here on by adding a superscript “∞”. The simulations in Fig 2 already suggested that IFA emerges due to a deviation of the transient dynamics from these asymptotic dynamics. We will now use the established Gaussian-drift approximation to understand why. In a first step we will maintain the assumption of constant drive and study the transient dynamics that can be induced by perturbations of the initial condition (Fig 5A and 5B). With piecewise constant drive as a simplistic model of a SPW-like drive we then demonstrate the core mechanism of IFA: a hysteresis of the mean membrane potential dynamics (Fig 5B and 5C). We then extend the same formalism to piecewise linear drive, as a more realistic approximation of SPW-like drive, and in order to demonstrate that the IFA asymmetry is preserved even when the drive is fully symmetric (Fig 6). The Gaussian-drift approximation for linear drives allows us to study the dependence of IFA on the slope of the drive. We demonstrate that the hysteresis effect causing IFA is speed-dependent (Fig 7)—a novel, testable prediction of the feedback-based inhibition-first ripple model matching the simulation results from the spiking network (Fig 2D).

Fig 5. Transient dynamics and IFA for piecewise constant external drive.

(Ai, Aii) Dynamics under constant drive depending on the initial mean membrane potential. Top: Population rate. Bottom: Constant external drive (green) and mean membrane potential (gray) with initial value μmin (orange marker). Dotted horizontal lines mark spike threshold VT = 1 and reset potential VR = 0. (Aiii) Direct comparison of the first oscillation cycles in Ai/Aii (gray shaded area) with the asymptotic cycle dynamics. Orange and cyan horizontal lines mark the asymptotic values of μmin and μmax, respectively. Left: shorter cycle for an initial μmin above its asymptotic value. Middle: asymptotic period for an initial μmin at its asymptotic value. Right: longer cycle for an initial μmin below its asymptotic value. The color of the population rate curve (left, right) expresses the difference in cycle length as a difference in instantaneous frequency (colorbar in B). (B) Difference between the instantaneous frequency of a cycle with constant drive IE and initial condition μmin, and the asymptotic frequency, for a range of external drives IE and initial mean membrane potentials μmin. Black line: asymptotic μmin (cf. Fig 4, bottom, orange line). Markers indicate example cycles shown in C. Arrows indicate convergence to the asymptotic dynamics after one cycle. If the drive changes after each cycle (dotted lines), the seven examples lead to the trajectory shown in C. Cycles 2 and 6 are also shown in Aiii (left, right), together with their common asymptotic reference dynamics. (C) IFA for piecewise constant drive with symmetric step heights. Shaded areas mark oscillation cycles. Bottom: The external drive is increased step-wise, up to the point of full synchrony (green staircase). As in A, lines in all panels indicate the asymptotic dynamics associated to the external drive of the respective cycle. Circular markers indicate transient behavior. Cyan: μmax. Orange: μmin. Reset not marked to enhance readability. Gray line: trajectory of the mean membrane potential. Middle: Population rate. Top: the instantaneous network frequency (markers) is first above and then below the respective asymptotic network frequencies (black line). Same colorbar as B. All quantities are derived analytically from the Gaussian-drift approximation. Vertical axis labeled “voltage, drive” in panels A and C applies to membrane potential and external drive.

https://doi.org/10.1371/journal.pcbi.1011886.g005

Fig 6. Transient dynamics and IFA for piecewise linear external drive.

(A) Exemplary transient dynamics during rising vs falling phase of the external drive (“up” vs “down”), given a fixed reference drive (green dot). Bottom: external drive (green lines), and trajectories of mean membrane potential (gray lines) depending on initial conditions (orange dots). Dotted horizontal lines mark the reference drive, spike threshold VT = 1, and reset potential VR = 0. Top: Population rate. Color (colorbar in B) indicates the difference in cycle length (shaded area) compared to the asymptotic reference (middle panel). Left: shorter cycle for m > 0 and an initial μmin above the asymptotic reference (orange dot vs orange line). Middle: asymptotic period for constant drive and asymptotic initial μmin. Right: longer cycle for m < 0 and an initial μmin below the asymptotic reference. (B) Difference between instantaneous and asymptotic frequency for a range of reference drives and initial mean membrane potentials μmin. Left: linearly increasing drive with slope m = +0.4/ms. Right: linearly decreasing drive with slope m = −0.4/ms. Black line: asymptotic μmin for constant drive. White line: initial membrane potential μmin for which the instantaneous frequency equals the asymptotic frequency. Stars mark the examples shown in A. Round markers and arrows indicate the trajectory shown in C for piecewise linear drive, numbered by cycle. White space where either no asymptotic oscillations occur, or (bottom left) no transient solution exists (see Methods, Eq (64)). (C) IFA for symmetric, piecewise linear (SPW-like) drive. Shaded areas mark oscillation cycles. Bottom: The external drive is increased up to the point of full synchrony (green line). Colored lines indicate asymptotic dynamics. Gray line: mean membrane potential trajectory μ(t). Markers quantify transient behavior. Cyan: μmax. Orange: μmin. Reset not marked to enhance readability. Middle: Population rate. Top: the instantaneous network frequency (markers) is first above, then below the respective asymptotic network frequencies (black line). Same colorbar as B. Dashed lines: plateau phase of variable length with constant drive, during which the network settles into the asymptotic dynamics. The IFA slope χIFA was derived for an assumed plateau length of 20 ms (as in Fig 2). All quantities are derived analytically. Vertical axis labeled “voltage, drive” in panels A and C applies to membrane potential and external drive.

https://doi.org/10.1371/journal.pcbi.1011886.g006

Fig 7. IFA is speed-dependent.

Transient dynamics for SPW-like drive with slope m = ±0.4/ms (A, cf. Fig 6), m = ±0.2/ms (B), or m = ±0.1/ms (C). Top panels (i): difference between instantaneous and asymptotic network frequency for the possible combinations of external reference drive and initial mean membrane potential μmin, shown separately for positive (top) and negative (bottom) slope of the drive. Trajectories shown in (ii) are overlaid. Bottom panels (ii): Example trajectories through the space shown in (i) for consecutive cycles under SPW-like drive with slope ±m. Top: instantaneous (colored markers) vs asymptotic (black solid line) network frequency as predicted by the theory. Grey dots indicate instantaneous frequencies in spiking network simulations (cf. Fig 2D, N = 10,000). Black dashed line illustrates asymptotic network frequencies from spiking network simulations (cf. Fig 1B, N = 10,000). See Table 1 for a quantitative comparison of simulation and theory. Middle: population rate as predicted by the Gaussian-drift approximation. Bottom: instantaneous vs asymptotic μmin (orange) and μmax (cyan). Gray line: trajectory of mean membrane potential μ(t). Green lines show SPW-like external drive (Eq (17)), green dots mark reference drive of each cycle. The difference between instantaneous and asymptotic network frequencies (IFA) becomes less pronounced for smaller slope (C vs A, see also Table 1).

https://doi.org/10.1371/journal.pcbi.1011886.g007

Analysis of oscillation dynamics for piecewise constant, sharp wave-like drive.

Even for constant drive, there is transient dynamics if the initial mean membrane potential deviates from the asymptotic minimum μmin∞. Let us assume that a cycle starts with an initial mean membrane potential μmin ≠ μmin∞. We only require that μmin be sufficiently subthreshold, such that the initial population rate is close to zero. What will be the period of the first cycle and how long does it take until the asymptotic dynamics is reached?

First we note that, independent of μmin, the mean membrane potential will rise towards the asymptotic maximum μmax∞, which can be shown to be independent of the initial condition (Methods, Eq (38)). Thus, only the duration of the first upstroke will be influenced by the initial condition, and the asymptotic dynamics is reached immediately thereafter (Fig 5Ai and 5Aii).

The duration of the first upstroke depends on the distance that the mean membrane potential has to travel, from its initial value μmin to the next peak μmax∞. For μmin = μmin∞ the upstroke has its asymptotic length and hence the length of the first cycle is equal to the asymptotic period T. Correspondingly, the instantaneous frequency is equal to the asymptotic network frequency fnet∞ (Fig 5Aiii, middle). The upstroke takes less time if the mean membrane potential starts at a higher value (μmin > μmin∞, Fig 5Ai and 5Aiii, left), and more time otherwise (Fig 5Aii and 5Aiii, right). Hence the period of the first cycle will be either shorter or longer, which can be rephrased as an instantaneous frequency that is higher or lower than the asymptotic frequency fnet∞. Fig 5B illustrates the instantaneous frequency of the first cycle for different combinations of (constant) drive IE and initial condition μmin (red: instantaneous frequency is higher than asymptotic frequency; blue: instantaneous frequency is lower).

Once the mean membrane potential has reached its first peak, it will follow the asymptotic dynamics, settling into μmin∞ at the end of the first cycle, and all subsequent cycles will come at the asymptotic frequency associated to the external drive IE (Fig 5Ai and 5Aii, convergence indicated by arrows in Fig 5B). A change of initial condition can thus only introduce a transient deviation from the asymptotic dynamics in a single cycle. In particular, this implies that stimulation with a square pulse cannot induce IFA in this model (see also Discussion and Fig A in S2 Appendix, panels a5–d5).

What if we change the external drive after each cycle (green line in Fig 5C)? Then the initial mean membrane potential of each cycle i will be the asymptotic minimum associated to the drive of the previous cycle, i.e. the mean membrane potential dynamics exhibits a history dependence (or hysteresis). Now recall that the asymptotic minimum is a monotonically decaying function of the drive (except for very strong drive close to the point of full synchrony, Fig 5B, black line). Thus, if the external drive increases stepwise, each cycle starts with an initial mean membrane potential above the asymptotic minimum associated to that cycle’s drive, hence the instantaneous frequency is above its asymptotic value (Fig 5B, trajectory through red area: instantaneous frequency above the asymptotic fnet∞). Vice versa, if the external drive decreases stepwise, each cycle starts with an initial mean membrane potential below the asymptotic minimum associated to that cycle’s drive, hence the instantaneous frequency is below its asymptotic value (Fig 5B, trajectory through blue area: instantaneous frequency below the asymptotic fnet∞). In summary: a rising drive yields instantaneous frequencies above the asymptotic reference, and a falling drive yields instantaneous frequencies below it.

Thus, if we approximate the transient change in drive during a sharp wave as a simple, piecewise constant function that first increases after each cycle, and then decreases (Fig 5C, green line), we observe IFA: The asymptotic network frequency associated to the drive in each cycle describes a reference curve that follows the same symmetry as the drive (Fig 5C, top, solid black line). However, the instantaneous network frequency (Fig 5C, top, round markers) is asymmetric over time, as it is above the asymptotic network frequencies during the rising phase of the external drive, and below during the falling phase. The theory thus describes the relationship between instantaneous and asymptotic frequencies that was already described for the spiking network simulations in Fig 2D.

The piecewise constant shape of the drive may not be realistic, but serves to illustrate the core mechanism of IFA: a hysteresis in the oscillation amplitude of the mean membrane potential. A drawback is that this simple model for SPW-like drive is not symmetric in time, since the drive changes after each cycle, and the cycle length increases due to IFA. To show that the IFA asymmetry does not rely on an asymmetry in the drive, we adapted the Gaussian-drift approximation to incorporate time-dependent linear drive.

Analysis of oscillation dynamics for piecewise linear, sharp wave-like drive.

Following the same approach as before, we approximate the transient dynamics of the mean membrane potential in a cycle i with initial value μmin and a drive that changes linearly around a reference level with slope m (Eq (11)). The dynamics is quantified in terms of the peak of the mean membrane potential, its reset value, and the value that is reached at the end of cycle i (and may thus be the initial membrane potential of the next cycle i + 1). Most importantly, the duration of the upstroke is inferred, and from that the instantaneous network frequency (see Methods, Eqs (56)–(66)). In agreement with our theoretical approximation, which is anchored to the end of the population spike, the reference drive is chosen as the value of the drive at the end of the population spike, Eq (12).

The analysis is now a little more complex, since each cycle depends on three parameters (the reference drive, the slope m, and the initial mean membrane potential μmin), in contrast to only two parameters for piecewise constant drive. We will demonstrate, however, that the essential findings from the basic case of piecewise constant drive still hold, i.e. that IFA is generated by the same hysteresis in the transient dynamics of the mean membrane potential that we have uncovered before (Fig 5 vs Fig 6). To illustrate this by an example, we fix the slope of the linear drive to m = ±0.4/ms. We then compare how the transient dynamics deviates from the asymptotic dynamics, depending on whether the drive is increasing (m = +0.4/ms) or decreasing (m = −0.4/ms) (Fig 6).

The principal insight from the case of constant drive still holds for the case of time-dependent drive, except for a small range of initial values μmin: If the initial mean membrane potential is (sufficiently) larger than the asymptotic reference μmin∞, the period is shorter than the asymptotic reference T, hence the instantaneous frequency is above fnet∞ (Fig 6A, left; Fig 6B, red area). In contrast, if μmin is smaller than the asymptotic reference μmin∞, the period is longer than the asymptotic reference T, hence the instantaneous frequency is below fnet∞ (Fig 6A, right; Fig 6B, blue area). Exceptions from this “rule” (Fig 6B, area between black and white lines) occur because for time-dependent drive an asymptotic initial value μmin = μmin∞ (black line) no longer implies the asymptotic period T and frequency fnet∞ (white line, see Methods for details). We note, however, that these exceptions from the rule occur only in a small portion of the state space that is rarely visited in a given ripple event, as we will see in the following.

What can we say about the dynamics of consecutive cycles i, i + 1, … that occur if the drive rises or falls continuously with slope m? At the end of each cycle the mean membrane potential is close to the asymptotic reference μmin∞ (Methods, Eq (66)). Thus we observe the same hysteresis as before: if the drive increases (m > 0, Fig 6B, left), trajectories of consecutive cycles will lie in the upper right half of the parameter space, where every cycle starts with an initial mean membrane potential that is higher than its asymptotic reference (μmin > μmin∞), and hence has an instantaneous frequency that is (mostly) higher than the asymptotic reference (red color code). Vice versa, as the drive decreases (m < 0, Fig 6B, right), trajectories will lie in the lower left half (μmin < μmin∞) where instantaneous frequencies are mostly lower than their asymptotic reference (blue color code). Hence even a symmetric, piecewise linear double ramp drive (Methods, Eq (17)) induces the IFA asymmetry (Fig 6C): During the rising phase of the drive the instantaneous frequencies are above the asymptotic reference, and during the falling phase they lie below (Fig 6C, top: markers vs black line; note cycle 5 as the only exception from the above “rule”). The IFA asymmetry thus does not rely on asymmetry in the input. Linear regression over the (semi-)analytically estimated instantaneous frequencies yields an IFA slope of −2.6 Hz/ms, which is close to the spiking network simulation (Fig 2B).

Interestingly, the last cycle i = 7 in Fig 6C has a reference drive that lies below the range in which the constant-drive theory applies, hence there is no asymptotic reference for the network frequency (empty marker, Fig 6C, top). This means that at the end of the sharp wave the network can sustain one more ripple cycle at a level of drive that in the beginning would be insufficient to trigger ripples (see cycle 1 in Fig 6C). The transient ripple is thus not only asymmetric in its instantaneous frequency (IFA), but also with respect to the level of drive at which it starts and ends.

We have established that IFA occurs in response to transient, sharp wave-like drive, independent of its symmetry, due to a hysteresis effect in the amplitude of the oscillatory mean membrane potential. Varying the slope m in our double ramp model for SPW-like drive (Methods, Eq (17)) we find that this hysteresis is speed-dependent (Fig 7): If the drive changes more slowly, the transient dynamics approaches the asymptotic dynamics, and the IFA asymmetry is reduced (χIFA → 0 for |m| → 0, Fig 7A–7C; Methods, Eqs (56)–(66)). The theory thus explains the speed-dependence of IFA that we already observed in the spiking network simulations (Fig 2D). The theoretically predicted instantaneous frequencies and IFA slopes are in excellent agreement with the corresponding observations in the spiking neural network simulation if the slope of the drive is sufficiently steep (Fig 7Aii and 7Bii, colored vs grey markers, average relative error ϵ: 11–13%, Table 1). The discrepancies between theory and simulation for slow drive (Fig 7Cii, colored vs grey markers) are mainly due to the discrepancies in the estimate of the asymptotic reference frequencies for constant drive (Fig 4).
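To illustrate the speed-dependence numerically, the Python sketch below drives the Gaussian-drift dynamics (with the same assumed parameters as in the earlier sketch, which are not the paper's values) using symmetric double ramps of different slopes and fits an IFA slope to the resulting peak-to-peak frequency estimates. It is meant to reproduce the qualitative trend of Fig 7, not the quantitative values reported in Table 1.

```python
import numpy as np
from math import erf, sqrt, pi
from scipy.signal import find_peaks
from scipy.stats import linregress

# --- assumed parameters of the Gaussian-drift sketch (dimensionless voltage) ---
tau_m, delta = 10e-3, 1.2e-3
K_fb, sigma  = 5.0, 0.25
V_T, V_R     = 1.0, 0.0
dt           = 0.01e-3

def gauss_at_threshold(m):
    return np.exp(-(V_T - m) ** 2 / (2 * sigma ** 2)) / (sqrt(2 * pi) * sigma)

def double_ramp(t, I_base, I_peak, slope_per_s, t_start, plateau):
    """Symmetric double ramp: rise with +slope, plateau, fall with -slope."""
    rise = (I_peak - I_base) / slope_per_s
    t1, t2 = t_start, t_start + rise
    t3, t4 = t2 + plateau, t2 + plateau + rise
    I = np.full_like(t, I_base)
    up = (t >= t1) & (t < t2)
    I[up] = I_base + slope_per_s * (t[up] - t1)
    I[(t >= t2) & (t < t3)] = I_peak
    dn = (t >= t3) & (t < t4)
    I[dn] = I_peak - slope_per_s * (t[dn] - t3)
    return I

def run_gaussian_drift(I_of_t):
    """Integrate the delayed Gaussian-drift dynamics with phenomenological reset."""
    d_bins = int(round(delta / dt))
    n = len(I_of_t)
    mu, r = np.zeros(n), np.zeros(n)
    mu[0], prev_dmu = -1.0, 0.0
    for k in range(n - 1):
        r_del = r[k - d_bins] if k >= d_bins else 0.0
        dmu = (-mu[k] + I_of_t[k] - K_fb * tau_m * r_del) / tau_m   # cf. Eq (8a)
        r[k] = max(dmu, 0.0) * gauss_at_threshold(mu[k])            # cf. Eq (8b)
        mu[k + 1] = mu[k] + dt * dmu
        if prev_dmu > 0.0 and dmu <= 0.0:                           # end of population spike
            s = 0.5 * (1.0 - erf((V_T - mu[k]) / (sqrt(2.0) * sigma)))
            mu[k + 1] -= s * (V_T - V_R)                            # cf. Eq (49)
        prev_dmu = dmu
    return r

def ifa_slope_for(slope_per_ms):
    t = np.arange(0.0, 0.25, dt)
    I = double_ramp(t, I_base=0.5, I_peak=5.0,
                    slope_per_s=slope_per_ms * 1e3, t_start=0.02, plateau=0.02)
    r = run_gaussian_drift(I)
    peaks, _ = find_peaks(r, height=0.05 * r.max())
    t_pk = peaks * dt
    f_inst = 1.0 / np.diff(t_pk)                     # [Hz]
    t_mid_ms = 0.5 * (t_pk[:-1] + t_pk[1:]) * 1e3    # [ms]
    return linregress(t_mid_ms, f_inst).slope        # chi_IFA in Hz/ms

for m in (0.4, 0.2, 0.1):
    print(f"drive slope {m:.1f}/ms -> chi_IFA ~ {ifa_slope_for(m):.2f} Hz/ms")
```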

Table 1. IFA in theory and simulation.

Quantification of the IFA slope χIFA in the spiking network simulations and the theoretical approximations shown in Fig 7Aii–7Cii for different slopes m of the external SPW-like drive. The error ϵ (Eq (67)) quantifies the average relative deviation of the theoretically predicted instantaneous network frequencies (colored markers in Fig 7) from the simulation results (grey dots in Fig 7).

https://doi.org/10.1371/journal.pcbi.1011886.t001

The speed-dependence of IFA is an important prediction of the feedback-based inhibitory ripple model that can be tested in experiments: Optogenetic stimulation of PV+ basket cells can trigger ripple oscillations in the LFP [21] or induce ripple-modulated spiking activity [43]. Increasing and decreasing the intensity of the light pulse could mimic the piecewise linear, double ramp drive studied here. The model predicts that the IFA asymmetry is reduced if the optogenetic light stimulus changes more slowly.

Discussion

We studied the dynamics of hippocampal ripple oscillations in a feedback-based inhibition-first model. By reducing the model complexity we derived an analytical approximation of the mean-field dynamics in the regime of strong drive and strong coupling. For constant drive, our theory (i) yields an approximation of the asymptotic network frequencies and mean unit firing rates far beyond the Hopf bifurcation, (ii) captures the transition from sparse to full synchrony, and (iii) reveals an increase of the mean membrane potential oscillation amplitude for increasing levels of external drive. For a fast-changing, sharp wave-like drive we then show that a speed-dependent hysteresis effect in the trajectory of the mean membrane potential produces an IFA-like asymmetry of the instantaneous ripple frequency compared to the asymptotic frequencies. Our analytical approach presents a substantial advancement over previous exemplary numerical demonstrations of IFA [37] as it allows a mechanistic understanding of the phenomenon and its parameter dependencies. Our derivation shows that IFA is an intrinsic feature of the feedback-based inhibitory ripple model and that IFA occurs for any fast-enough sharp wave-like drive, independent of other parameter choices. The speed-dependence of IFA is a new prediction that can be tested experimentally.

Simplifying assumptions

To achieve an analytical treatment of the spiking network dynamics, we have made a number of simplifying assumptions. Our resulting reduced model with full recurrent connectivity is a special case of the network analyzed in [34] with sparse recurrent connectivity. We demonstrate in Figs 1 and 2 that the reduced model exhibits ripples and IFA similar to a biologically more realistic model [37] that includes random sparse connectivity, correlations in the background noise, synaptic filtering, absolute refractoriness, local pyramidal cells, and conductance-based synapses. Let us discuss these differences in more detail:

Pyramidal cells. Inhibition-first models posit that the interneuron network generates ripple oscillations and entrains the spiking of local CA1 pyramidal cells. Thus we neglect pyramidal cells in our reduced model—we regard them as a non-essential ingredient. We have no a priori reason to believe that the dynamics of IFA would be qualitatively altered by the presence of pyramidal cells: Numerical simulations of the feedback-based model including pyramidal cells have shown that (i) the inhibitory network can entrain the local pyramidal cell network, and (ii) IFA can occur in the spiking activity of both the excitatory and inhibitory population [37]. The influence of pyramidal cells is most relevant when it comes to determining the boundaries between the parameter spaces for ripple and gamma oscillations [37], or the relation between neuronal spiking activity and LFP [24, 38].

Absolute refractoriness. If an absolute refractory period is added to the reduced model in its default parameter setting (Table 2), the membrane potential distribution becomes bimodal and the network activity oscillates with alternating clusters of active neurons [60]—a state that is beyond the scope of our theory. Unimodality of the membrane potential distribution can easily be recovered by increasing the noise intensity. In the regime of sparse synchrony the network frequency is then virtually independent of the refractory period [60]. The refractory period only leads to a slight delay of the point of full synchrony, which also occurs at a lower network frequency due to the decrease in mean unit firing rate.

Conductance-based synapses. A noteworthy qualitative difference in the network dynamics of our reduced model compared to the model in [37] comes from the use of current-based instead of conductance-based synapses. In our network with current-based synapses, an increase in external drive, and thus in recurrent feedback, leads to an increase of the hyperpolarization of the membrane potentials, and the achieved membrane voltages can be quite low (< −100 mV in Fig 1A; μmin in Fig 4). In contrast, in a network with conductance-based synapses, the decrease in μmin would still be monotonic, but bounded by the inhibitory reversal potential, and thus confined to a biologically more realistic range. Moreover, inhibitory feedback in a conductance-based model compresses the membrane potential distribution around the reversal potential. To accurately capture the resulting oscillation dynamics, it would thus be necessary to capture not only the first, but also the second moment of the moving membrane potential distribution. We developed such an extension of the Gaussian-drift approximation for a toy model of pulse-coupled oscillators with linear phase-response curve, which mimics the effect of an inhibitory reversal potential [60]. We find again that IFA is caused by a hysteresis effect, but now with respect to two variables: the mean and the variance of the membrane potential distribution. However, a direct analytical treatment of the conductance-based LIF network of [37] is difficult and may be an interesting subject for future investigations.

Short term plasticity and adaptation. There are many more biological complexities that are neglected not only in our reduced model, but also in the reference simulations by [37]. There is evidence that PV+ interneurons in CA3 exhibit short-term synaptic depression [61]. In principle, short term depression or facilitation could easily be incorporated in our Gaussian-drift approximation by making the synaptic strength K cycle-dependent. Spike frequency adaptation would be another factor relevant for ripple dynamics. However, PV+ interneurons in CA1—some of the prime candidates for a potential inhibition-first ripple generator—do not exhibit strong adaptation [62].

Performance and limitations of the Gaussian-drift approximation

Our reduced network model allows a Gaussian-drift approximation of its oscillation dynamics that can be treated analytically. We added a new phenomenological account for the reset mechanism on the population level, which improves the accuracy and range of applicability of the Gaussian-drift approximation (Fig 9A vs 9B). We would like to stress that this phenomenological reset is not necessary to capture the most important qualitative features of the network, namely the transition from sparse to full synchrony for constant drive and the hysteresis effect leading to IFA for time-dependent drive.

The differences between the simulations of spiking units and the theory have been analyzed in detail for the case of constant drive (S1 Appendix): there was a good quantitative match for a large parameter range, and a slight decrease in performance towards high noise intensities. For a slow-enough time-dependent drive, any approximation error in the asymptotic frequencies at constant drive (Fig 7Cii, top, solid vs dashed line) also contributes to an error in the instantaneous frequencies (Fig 7Cii, top, colored vs gray markers). Furthermore, the trajectory of the instantaneous frequencies in the spiking network simulations is influenced by the non-Gaussian shape of the membrane potential distribution, which cannot be captured by the theory. An example is the “wiggle” (local maximum) in the simulated instantaneous network frequencies during the falling phase of the drive, as seen e.g. in Fig 7Bii and 7Cii (top, gray markers): It reflects a single cycle of reduced period that occurs, with some phase jitter, in each of the 50 simulations pooled together in the plot. Inspection of the membrane potential distribution during this cycle revealed a significant reset-induced bimodality (S1 Fig). Such details of the transient dynamics cannot be captured by the theory, but are also not crucial to the IFA phenomenon.

Since our approximation is purely drift-based, it is not well behaved in the limit of low drive (called “pathological oscillation” in Fig 8). In that sense our approach is truly complementary to previous linear (or weakly nonlinear) analyses around the bifurcation, which break down at strong drive [34]. A theory that can describe the dynamics of recurrent spiking networks both in the drift- and diffusion-dominated regimes, as well as capture the transitions in between, would be desirable. In the context of escape noise models, various approaches have been put forward to approximate the combined effect of drift- and diffusion-mediated spiking [56–59, 63–65]. The rate expression in our Gaussian-drift approximation (Eq (8b)) is equivalent to the drift-based hazard function derived by [56, 58].

Fig 8. Dynamical states of the DDE system.

Numerical integration of the DDE system (without phenomenological reset) demonstrating 4 distinct dynamical regimes for increasing external drive IE (bottom axis). Blue: Stable fixed point with zero rate. Red: Pathological, fast oscillations. Yellow: Period-2 oscillations. Green: Regular period-1 oscillations. Top: Numerically integrated trajectories of population rate r and mean membrane potential μ over time. Note the changes in scale for the population rate. Bottom: Phase space showing the trajectory (μ(t), r(t)) and the fixed point (IE, 0), which is only stable in the first case (black circle) and unstable otherwise (empty circle). The horizontal colored bar shows the full range of (relevant) drives from 0 to the point of full synchrony. The boundaries between the four dynamical regimes were approximated numerically. Black triangles mark the levels of drive for which the above example dynamics are shown. Extra ticks indicate: the critical drive at which the spiking network undergoes a Hopf bifurcation (see section Linear stability analysis), and the lower bound introduced for our theory to ensure that we only consider non-pathological dynamics (see paragraph Range of applicability).

https://doi.org/10.1371/journal.pcbi.1011886.g008

Our approach for analyzing the population rate dynamics in response to strong voltage transients may also be relevant in other contexts, such as gamma oscillations [66, 67], or propagation of synchronous neural activity in synfire-chains [29, 68]. A hysteresis effect similar to IFA was recently observed in optogenetically evoked gamma oscillations in cat visual cortex [69]. It may be an interesting subject for future research to extend the present Gaussian-drift approximation to gamma rhythms in E-I networks.

The shape of the asymptotic network frequency curve as a function of the external drive

A hallmark of the feedback-based inhibition-first ripple model is its state of sparse synchrony, where, for a wide range of external drives, the collective population activity exhibits oscillations in the ripple range, while individual neurons fire sparsely and irregularly at lower firing rates (Fig 1). In this state the asymptotic network frequency is not necessarily proportional to the external, excitatory drive, as is the case in networks at full synchrony or in clustered states [67, 70, 71].

In our reduced model, the asymptotic network frequency decreases with increasing external drive (Fig 1B). In the vicinity of the Hopf bifurcation and in the limit of small synaptic delay such a frequency decrease was shown analytically in [34]. In biologically more realistic model networks with an absolute refractory period and/or conductance-based synaptic coupling, the shape of the asymptotic network frequency curve can be quite different (see Fig A in S2 Appendix, panels a1–d1). Qualitatively, the hysteresis mechanism of IFA that we have uncovered here is preserved in all network architectures (Fig A in S2 Appendix, panels 2–4).

The shape of IFA

Our theory predicts that instantaneous ripple frequencies for a changing drive are different from the asymptotic frequencies for constant drive: If the drive increases the instantaneous frequencies are higher than the asymptotic frequencies, and if the drive decreases the instantaneous frequencies are lower. For a symmetric time course of the drive, this asymmetry immediately implies IFA with a negative regression slope χIFA approximately equal to the regression slope over the deviations between instantaneous and asymptotic frequencies (see Methods, Eq (19)). For asymmetric drive, the IFA slope χIFA may be slightly stronger or weaker, depending on the covariance of the asymptotic reference frequencies with time (see Methods, Eq (19)). In general, the shape of the instantaneous ripple frequency curve over time thus depends both on the shape of the drive, and on the shape of the asymptotic frequency as a function of the external drive (Fig A in S2 Appendix). In our reduced model (Fig A in S2 Appendix, panel d) the asymptotic frequency decreases for increasing values of the constant drive. Thus it is possible for the instantaneous frequencies in the second half of the ripple event (when the external drive decreases) to lie below the asymptotic reference while actually increasing as a function of time (Figs 2D and 7Aii–7Cii, Fig A in S2 Appendix, panels d2, d3). Interestingly, such a switch from decreasing to increasing frequency at the end of the ripple event was also found in some experimental data [44]. Others have reported an almost monotonic decrease or even a small peak in the beginning of the event [25, 37, 43, 45]. In the reduced model, IFA gets stronger (more negative χIFA) if the peak of the drive is asymmetrically shifted to the right (Fig A in S2 Appendix, panel d4). If we incorporate into the model an absolute refractory period (Fig A in S2 Appendix, panel c) or conductance-based synaptic coupling (Fig A in S2 Appendix, panels a,b) the shape of the asymptotic frequency curve changes such that the strongest IFA is achieved with a left-shifted stimulus (Fig A in S2 Appendix, panels a3, b3). It is only for rare combinations of model architecture and stimulus profile that the described asymmetry between instantaneous and asymptotic network frequencies does not imply IFA in the sense of χIFA < 0: If a network with asymptotic frequencies that increase strictly monotonically with the drive is stimulated with a strongly right-shifted stimulus (Fig A in S2 Appendix, panel b4), the resulting ripple does not exhibit IFA. Our theory still correctly predicts this response: the instantaneous frequencies are first above, then below the asymptotic reference.

Modeling sharp wave-like drive to CA1 interneurons

Square pulse. One may argue that a square pulse is the simplest possible model for the sharp wave-associated momentary increase in drive to CA1 interneurons (SPW-like drive). Our analysis, however, showed that transients under constant drive due to perturbations of initial conditions are short-lived (∼one cycle in theory, Fig 5). This implies in particular that a square pulse cannot account for IFA (Fig A in S2 Appendix).

Note that there is a subtlety when comparing the simulation of the reduced model under a square pulse (panel d5 of Fig A in S2 Appendix) with the theoretical analysis for piecewise constant drive in Fig 5: in Fig 5 we analyze the network response to stepwise changes in the drive that occur when the network is already in an oscillatory state (i.e. the mean membrane potential is sufficiently subthreshold; otherwise the drift-based theory does not apply). In that case, an upwards step in the drive always corresponds to the red regime of Fig 5B, where the instantaneous frequency is higher than the asymptotic frequency. In Fig A in S2 Appendix, on the other hand, a square pulse is delivered to the network in the non-oscillatory steady state, in which the stationary mean membrane potential hovers just below the threshold. If we apply the Gaussian-drift approximation nevertheless, this corresponds to a starting distribution of membrane potentials with a significant portion already above threshold. In that case a square pulse triggers a first population spike that starts with a high, non-zero rate and silences itself as soon as the inhibitory feedback sets in (toff = Δ). Due to the strong inhibitory feedback, this cycle can end with a mean membrane potential below the asymptotic value associated with the amplitude of the square pulse. The next cycle thus starts in the blue regime of Fig 5B and is longer than the asymptotic period, before the system settles into the asymptotic dynamics. This explains the slight tendency for positive IFA slopes χIFA (Fig A in S2 Appendix, panel d5).

Most importantly, however, the network settles into the asymptotic dynamics very quickly (|χIFA| ∼ 0), so stimulation with a square pulse cannot account for realistic IFA in the feedback-based inhibition-first model.

In experiments, an optogenetic square pulse stimulation of CA1 pyramidal cells has been found to elicit IFA in the local field potential [9, 43]. An extension of the present model, including pyramidal cells and a proxy for the LFP, would be needed to determine the compatibility of the feedback-based inhibition-first model with these findings. It is plausible that a square pulse stimulation of pyramidal cells yields a non-square, double ramp-like drive to locally connected interneurons (see the ramp-like evolution of the evoked firing rates of pyramidal cells in Fig 2E of [9]). As our analysis emphasizes, any such fast-changing double ramp drive to CA1 interneurons is likely to induce IFA. For a direct square-pulse stimulation of interneurons, on the other hand, the present reduced model predicts no IFA in the inhibitory population activity. This prediction could be tested in future experiments.

Double ramp. For most of the manuscript we used a double ramp drive as the simplest model of SPW-like drive that can account for IFA. We chose a symmetric double ramp to highlight that the asymmetric phenomenon of IFA does not depend on asymmetry in the input. Our derivation emphasizes that the only necessary requirement for IFA is an external drive that changes sufficiently fast and first rises, then decays. Thus we predict that IFA in the feedback-based model will occur for a wide range of such SPW-like external drives, even when their time courses are weakly asymmetric (Fig A in S2 Appendix, panels 2–4).

IFA in other ripple models

In the introduction we motivated IFA as a means of dissociating the present feedback-based model from the perturbation-based, inhibition-first model. Indeed, the perturbation-based inhibitory model [41] makes different predictions with respect to IFA: In the perturbation-based model, ripples can emerge only transiently, when a sudden increase in drive synchronizes the interneurons; furthermore, there is no sparse synchrony, but all neurons fire in all ripple cycles until they desynchronize. The network frequency during this transient ripple event therefore reflects the momentary unit firing rate and is set directly by the strength of the external drive; a symmetric “up-down” drive thus creates a symmetric instantaneous frequency response that first rises and then decays (no IFA). Since the oscillation power in this perturbation-based model decays monotonically over the course of the ripple event, the initial phase of rising frequency even dominates over the subsequent decay (“anti-IFA”). The perturbation-based model thus predicts that IFA only occurs in response to asymmetric drive (e.g. a sudden step up, followed by a ramp down). Excitatory currents measured during spontaneous SPW-Rs can exhibit some asymmetry due to synaptic filtering [26], but generally have a non-zero rise time (CA3 PV+ BC: [21, 72]; CA1 pyramidal cell: [26, 37]). Alternatively, the perturbation-based model could account for IFA under a step-current input by assuming strong spike frequency adaptation. However, experimental evidence suggests that PV+ interneurons—likely candidates for inhibition-first ripple generation in CA1 [73–78]—do not exhibit strong spike frequency adaptation [62]. This speaks more in favor of the feedback-based inhibitory model, which can account for IFA occurring independently of the exact (a)symmetry of the SPW-associated drive (Fig A in S2 Appendix), and without assuming high interneuron firing rates with strong spike frequency adaptation.

In both excitation-first ripple models, the ripple frequency depends on the average spike propagation delay among pyramidal cells—either orthodromically via supralinear dendrites [29, 79] or antidromically via axo-axonal gap junctions [27, 28]. These models may be able to account for IFA by assuming an increase in the spike propagation delay over the course of a ripple event. Such an increase in latency might occur due to increasing somatic depolarization which has been shown to decrease the action potential amplitude [80, 81]. Future work should investigate in more depth whether and under which conditions excitation-first models can generate IFA.

A note on mixed excitatory-inhibitory models for ripple generation

Computational models for ripple generation that combine excitatory and inhibitory populations (mixed, excitatory-inhibitory ‘E-I’ network models) are sometimes listed as a third class of ripple models in the literature [36, 38, 43]. Because CA1 contains both excitatory and inhibitory neurons, any “complete” in silico ripple model would need to incorporate both. In the present work, however, we use the term model to distinguish different generation mechanisms of ripple oscillations. To the best of our knowledge, there is no evidence that an E-I network model can produce oscillations in the ripple range in the absence of recurrent inhibitory coupling if biologically realistic parameters of delays and time constants are used. On the contrary, both theoretical [35] and numerical [37, 39] analyses suggest that the presence of a disynaptic E-I loop promotes network frequencies that are well below the ripple range, which is why E-I network models are typically used in the context of gamma oscillations [66, 82, 83]. We thus posit that, mechanistically, all currently known ripple models can be categorized into either inhibition-first or excitation-first (a potential special case is the model by [84], relying on an excitatory effect of GABAergic synaptic transmission). The respective other cell type can have a modulatory influence, and such modulations have been studied (interneurons in excitation-first models: [28, 29]; pyramidal cells in inhibition-first models: [35–41, 43]): Changes in the recruitment of CA1 interneurons (feedforward via CA3, or local via CA1) or in the balance of feedforward drive to CA1 interneurons and pyramidal cells have been suggested to mediate transitions between ripple and gamma network states [37, 39]. Furthermore, there is evidence that ripple phase-locked action potentials of pyramidal cells may increase the ripple power in the local field potential [24].

Measurement and quantification of ripple oscillations: Population activity or local field potential?

Understanding ripple oscillations requires not only understanding their generating mechanism but also the origin of the signal that is used to detect them. Typically, ripple oscillations are quantified using the local field potential (LFP). The LFP is traditionally believed to reflect synaptic currents [23, 85], but modeling studies have suggested that action potentials of pyramidal cells can also contribute when occurring locked to a ripple rhythm [24, 38]. In fact, the participation of pyramidal cells in the ripple rhythm may be crucial for the detectability of ripple power in the LFP [43].

In the present work, we used the population activity of interneurons as a proxy for the LFP signal. The relation between interneuron population activity and LFP in the feedback-based inhibition-first model was partially explored by [37] using numerical simulations: They found that for bandpass-filtered (120–300 Hz) average inhibitory currents as a simple model of the CA1 LFP, IFA in the inhibitory population activity translates into IFA in the LFP. Simulations of the feedback-based inhibition-first model including pyramidal cells [36–38] have shown that the inhibitory network can pace pyramidal cell spikes to occur phase-locked to the ripple rhythm. Furthermore, IFA can occur in the spiking activity of both excitatory and inhibitory populations [37]. It may therefore be reasonable to assume that even in a more complex model of the LFP, taking into account both inhibitory synaptic currents and excitatory action potentials, IFA in the inhibitory population activity may translate into IFA in the LFP.

Note that all proposed ripple models focus on explaining ripple generation in the spiking activity of the respective pacemaker population (excitatory or inhibitory). A detailed model of the CA1 LFP and additional numerical simulations will be needed to confirm, for each of the models, whether they yield a ripple signature in the LFP that is consistent with experimental data.

Conclusion

In conclusion: The feedback-based inhibition-first ripple model can account naturally for IFA without adding further parameter constraints. A deepened understanding of the transient ripple dynamics in each of the proposed models together with extensive experimental testing of the various predictions will hopefully advance our understanding of the generating mechanism of ripple oscillations and enable us to study their potential role in memory consolidation.

Methods

The following section on the Methods of this paper is written to be self-contained. Thus, some of the equations shown in the Results will be reprinted here in the Methods for better readability.

Spiking neural network simulations

Network architecture.

To model ripples in a spiking neural network, we consider a fully-connected inhibitory network of noisy leaky integrate-and-fire (LIF) neurons. Each neuron’s membrane potential vi (for i = 1, …, N) is given by the following stochastic differential equation (SDE): (13) (see also Results, Eq (1)). Whenever the membrane potential reaches a threshold of Vthr = −52 mV, a spike is recorded and the membrane potential is reset to Vreset = −65 mV. For simplicity there is no absolute refractory period. We choose a membrane time constant τm = 10 ms, resting membrane potential Eleak = −65 mV, and membrane capacitance C = 100 pF as used by [37]. All default parameter values are summarized in Table 2.

In our model, Eq (13), each unit receives a common excitatory drive Iext(t) and a common inhibitory synaptic input in form of a sum of presynaptic Dirac-delta spikes from all neurons in the network. Spikes occur at times tj,k, where tj,k denotes the k-th spike time of neuron j. They are transmitted to all neurons in the network and arrive at the postsynaptic neurons after a synaptic delay of Δ = 1.2 ms. The amplitude of the inhibitory postsynaptic potential elicited by one input spike is determined by the synaptic strength J = 65 mV. Our default network has N = 10,000 units.

To account for noisy background activity, every unit receives independent Gaussian white noise ξi(t) with 〈ξi(t)〉 = 0, 〈ξi(t)ξj(t′)〉 = δi,j δ(t − t′). The strength of the noise is determined by the parameter σV = 2.62 mV, which can be interpreted as the long-time standard deviation of the membrane potential in the absence of a threshold.
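For concreteness, the network described above can be set up in a few lines of Brian2 code. The sketch below is an illustration, not the original simulation script: the per-spike IPSP amplitude J/N (i.e. full connectivity with the synaptic strength scaled by network size) and the constant drive value I_ext are assumptions made for this example.

```python
# Minimal Brian2 sketch of the fully connected inhibitory LIF network (cf. Eq (13)).
# Assumptions (not taken from the text): per-spike IPSP amplitude J/N; placeholder drive I_ext.
from brian2 import (NeuronGroup, Synapses, PopulationRateMonitor, run, defaultclock,
                    ms, mV, pF, pA, second)

N = 10000                       # reduce for a quick test (full connectivity -> N^2 synapses)
tau_m = 10*ms; C = 100*pF
E_leak = -65*mV; V_thr = -52*mV; V_reset = -65*mV
J = 65*mV; delay = 1.2*ms; sigma_V = 2.62*mV
I_ext = 300*pA                  # hypothetical constant drive

defaultclock.dt = 0.01*ms
eqs = 'dv/dt = (E_leak - v)/tau_m + I_ext/C + sigma_V*sqrt(2/tau_m)*xi : volt'
G = NeuronGroup(N, eqs, threshold='v > V_thr', reset='v = V_reset', method='euler')
G.v = 'V_reset + rand()*(V_thr - V_reset)'        # uniform initial membrane potentials

S = Synapses(G, G, on_pre='v_post -= J/N', delay=delay)
S.connect()                                        # all-to-all coupling

rate_mon = PopulationRateMonitor(G)
run(1*second)
```

The population rate monitor then provides the spike counts from which the empirical population activity defined below can be computed.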

In simulations of the spiking network with a finite temporal resolution (Δt = 0.01 ms), we define the empirical population activity as (14) (see also Results, Eq (2)), where (15) denotes the total number of spikes that were emitted from the population in the time interval [t, t + Δt]. The empirical population activity has units of spikes per second and can also be interpreted as the instantaneous firing rate of single neurons in our homogeneous network [86].

For plotting purposes, the empirical population activity is smoothed with a Gaussian window g of standard deviation σt = 0.3 ms (Eq (16)). To facilitate notation, we omit the superscript “plot” in all figures.
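A direct NumPy implementation of the binned and smoothed population activity (Eqs (14)–(16)) could look as follows; truncating the Gaussian window at ±4σt is an implementation choice made for this sketch, not a value specified above.

```python
import numpy as np

def population_activity(spike_times, N, t_max, dt=0.01e-3, sigma_t=0.3e-3):
    """Empirical population activity (Eq (14)), smoothed with a Gaussian window (Eq (16))."""
    edges = np.arange(0.0, t_max + dt, dt)
    n_spikes, _ = np.histogram(spike_times, bins=edges)      # spike count per time bin
    rate = n_spikes / (N * dt)                                # spikes per second per neuron
    half_width = int(4 * sigma_t / dt)                        # truncate kernel at +-4 sigma
    tk = np.arange(-half_width, half_width + 1) * dt
    kernel = np.exp(-tk**2 / (2 * sigma_t**2))
    kernel /= kernel.sum()                                    # normalized Gaussian window
    return edges[:-1], np.convolve(rate, kernel, mode='same')
```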

For constant drive, we simulate 5.05 s (time step Δt = 0.01 ms). The initial membrane potentials are drawn randomly from a uniform distribution between the reset and threshold potentials. The initial 50 ms are excluded from analysis, because we are not interested in the initial transient but in the asymptotic network dynamics. The remaining 5 s are sufficient for a basic spectral analysis.

For time-dependent drive, we first simulate the network for 200 ms with a constant baseline drive (see section Linear stability analysis), followed by the time-dependent, sharp wave-like stimulus, which we model as a piecewise linear double ramp of slope ±m with a plateau phase of arbitrary length in between (here 20 ms; Eq (17)). During the plateau phase, the drive is at the approximate point of full synchrony for the respective spiking network, as determined by a range of constant-drive simulations.
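A possible implementation of the double ramp stimulus (Eq (17)) is sketched below; the baseline level I_base and plateau level I_peak are left as free parameters, since their numerical values depend on the respective network.

```python
import numpy as np

def double_ramp_drive(t, I_base, I_peak, m, t_onset, plateau=20e-3):
    """Piecewise linear, SPW-like double ramp (Eq (17)): up with slope m, plateau, down with slope -m.
    I_base and I_peak are free parameters (baseline and plateau levels)."""
    t = np.asarray(t, dtype=float)
    t_ramp = (I_peak - I_base) / m                 # duration of each ramp
    up = I_base + m * (t - t_onset)
    down = I_peak - m * (t - t_onset - t_ramp - plateau)
    drive = np.full_like(t, I_base)
    drive = np.where((t >= t_onset) & (t < t_onset + t_ramp), up, drive)
    drive = np.where((t >= t_onset + t_ramp) & (t < t_onset + t_ramp + plateau), I_peak, drive)
    drive = np.where((t >= t_onset + t_ramp + plateau) & (t < t_onset + 2*t_ramp + plateau), down, drive)
    return drive
```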

Simulations of the LIF network model have been performed using the Brian2 simulator [87]. Data storage and parallelization of simulations for large parameter explorations were done using the Python toolkit pypet [88].

Frequency analysis.

The network frequency at constant drive is defined as the location of the dominant peak in the power spectral density of the population activity rN(t). The saturation (average fraction of neurons firing in one cycle of the population rhythm) is computed by dividing the mean unit firing rate by the network frequency.
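As a sketch of how these two quantities could be computed from a constant-drive simulation (the Welch segment length and the 30 Hz cutoff used to skip the slow, DC-like part of the spectrum are implementation choices, not values taken from the text):

```python
import numpy as np
from scipy.signal import welch

def network_frequency(rate, dt=0.01e-3, f_min=30.0):
    """Dominant peak of the power spectral density of the population activity."""
    f, psd = welch(rate - rate.mean(), fs=1.0/dt, nperseg=2**14)
    band = f > f_min                                  # ignore the very-low-frequency part
    return f[band][np.argmax(psd[band])]

def saturation_from_rates(mean_unit_rate, f_net):
    """Average fraction of neurons firing per cycle of the population rhythm."""
    return mean_unit_rate / f_net
```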

To define the instantaneous network frequency in response to time-dependent drive, we use frequency estimates both in continuous time and for a discrete set of time points. The continuous estimate is derived from the wavelet spectrogram (windowed Fourier transform) of the population activity, which indicates instantaneous power in the frequency band from 0 to 350 Hz over time; the instantaneous frequency at each point in time is defined as the frequency above 70 Hz that has maximal instantaneous power. The lower limit is introduced to exclude the low-frequency contribution due to the sharp wave. The instantaneous frequency is considered significant whenever the corresponding instantaneous power exceeds a power threshold. The power threshold is chosen as the average instantaneous power at 0 Hz during the initial 200 ms baseline window, plus 4 standard deviations.

An alternative estimate of the instantaneous frequency is given by the inverse of the peak-to-peak distances in the (smoothed) population rate. We consider only peaks that are more than 4 standard deviations above the average rate during baseline stimulation. This procedure delivers a discrete set of frequency estimates associated with a discrete set of time points. In this paper we mostly rely on this “discrete” estimate of the instantaneous frequency, since its parameter dependencies (minimal height of oscillation peaks) are more transparent than those of the continuous-time estimate (size of time window for windowed Fourier transform, power threshold). Furthermore, it is better suited for comparison with the theory, which also describes the instantaneous frequency as a discrete measure per cycle (see Eq (65)).

To quantify the network’s instantaneous frequency response to a SPW-like drive (Eq (17)), we perform 50 independent simulations of the network model with the same drive but different noise realizations. Linear regression over the discrete instantaneous frequency estimates from discrete oscillation cycles, pooled together from all 50 simulations, yields the average change of the instantaneous frequency over time: (18)

A negative slope χIFA < 0 indicates IFA.

Note that for our symmetric model of SPW-like drive (Eq (17)), the regression slope χIFA is approximately equal to the slope resulting from linear regression over the deviations of the instantaneous network frequencies from the symmetric, asymptotic reference frequencies (except for small effects of asymmetric sampling over time): (19) For asymmetric drive, the covariance of the asymptotic reference frequencies with time needs to be taken into account when inferring the IFA slope χIFA from our analysis of the asymmetry between instantaneous and asymptotic frequencies.
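The discrete frequency estimate and the regression slope χIFA (Eq (18)) could be implemented as follows; assigning each per-cycle estimate to the midpoint between the two peaks that define it is an assumption made for this sketch.

```python
import numpy as np
from scipy.signal import find_peaks

def instantaneous_frequency(t, rate, baseline):
    """Discrete instantaneous frequency from peak-to-peak distances of the smoothed rate.
    Only peaks exceeding the baseline mean + 4 SD are used (as described above)."""
    height = baseline.mean() + 4 * baseline.std()
    peaks, _ = find_peaks(rate, height=height)
    t_peaks = t[peaks]
    f_inst = 1.0 / np.diff(t_peaks)                    # one estimate per oscillation cycle
    t_cycle = 0.5 * (t_peaks[:-1] + t_peaks[1:])       # assign each estimate to the cycle midpoint
    return t_cycle, f_inst

def ifa_slope(t_cycle, f_inst):
    """Regression slope chi_IFA (Eq (18)), pooled over simulations; chi_IFA < 0 indicates IFA."""
    chi, _intercept = np.polyfit(t_cycle, f_inst, 1)
    return chi
```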

The point of full synchrony.

We estimate the point of full synchrony in the spiking network by interpolating the simulated saturation curve and estimating the level of drive for which it becomes 1.
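A minimal sketch of this interpolation step, assuming the saturation curve has been measured on a grid of constant drives:

```python
import numpy as np

def drive_at_full_synchrony(drives, saturations):
    """Interpolate the simulated saturation curve and return the drive at which it reaches 1."""
    fine_drives = np.linspace(drives.min(), drives.max(), 10001)
    fine_sat = np.interp(fine_drives, drives, saturations)
    above = np.nonzero(fine_sat >= 1.0)[0]
    return fine_drives[above[0]] if above.size else np.nan   # NaN if full synchrony is not reached
```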

Extracting the average oscillation cycle.

For constant drive, we split the spiking network simulation into individual cycles based on the Hilbert transform of the mean membrane potential. We take a sufficient number of equally spaced samples from each cycle (here 21) and average them across cycles to derive the average trajectory of the population rate and the membrane potential histogram over the course of a ripple cycle. For each of the 21 sample times we can calculate the average membrane potential, which will be used for comparison with the theory.
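One possible implementation of this cycle-averaging procedure is sketched below; detecting cycle boundaries at the wraps of the Hilbert phase is an implementation choice made here.

```python
import numpy as np
from scipy.signal import hilbert

def average_cycle(mu, rate, n_samples=21):
    """Cut a constant-drive simulation into cycles using the Hilbert phase of the mean
    membrane potential and average n_samples equally spaced points per cycle."""
    phase = np.angle(hilbert(mu - mu.mean()))
    cycle_starts = np.where(np.diff(phase) < -np.pi)[0] + 1    # phase wraps mark cycle boundaries
    rate_cycles, mu_cycles = [], []
    for a, b in zip(cycle_starts[:-1], cycle_starts[1:]):
        idx = np.linspace(a, b - 1, n_samples).astype(int)     # equally spaced samples per cycle
        rate_cycles.append(rate[idx])
        mu_cycles.append(mu[idx])
    return np.mean(rate_cycles, axis=0), np.mean(mu_cycles, axis=0)
```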

Mean-field approximation

Dimensionless equations.

To facilitate notation in the theoretical part of this paper, we shift and rescale all voltages, such that the spiking threshold becomes VT = 1 and the resting potential is EL = 0. This corresponds to measuring voltage in units of the distance from the resting potential to the spike threshold. The single unit SDE then reads (20) (21) (see also Results Eq (3)), where now (22) are all dimensionless quantities.

Fokker-Planck equation.

In the mean-field limit of an infinitely large interneuron population (N → ∞) the evolution of the membrane potential density p(V, t) is described by the following Fokker-Planck equation (FPE) (see e.g. [34, 51]): (23a) (23b) (see also Results, Eqs (4)–(5)). The FPE can also be written as a continuity equation (23c) with a probability current (23d) Since units are reset instantaneously as soon as they reach the spiking threshold, the FPE has an absorbing boundary condition at threshold: (23e) The population rate r is given by the probability current through the threshold: (23f) (see also Results, Eq (6)). The fire-and-reset mechanism introduces a derivative discontinuity at the reset potential VR: (23g) (see also [51–55]). The membrane potential density must decay to zero fast enough in the limit of V → −∞: (23h) As a probability density, p(V, t) obeys the normalization condition: (23i)

Derivation of the Gaussian-drift approximation

Without the absorbing boundary at threshold and the source term due to the fire-and-reset rule, the FPE has only natural boundary conditions and can be solved analytically. Its solution pδ for an initial Dirac delta distribution pδ(V, 0) = δ(V − μ0) is a Gaussian density with time-dependent mean μ(t) and variance σ2(t) [89]: (24) The mean membrane potential μ evolves according to the single unit ODE (Eq (20)) without the noise term: (25) (see also Results, Eq (8a)). The variance of the membrane potential distribution, σ(t)2, approaches D with a time constant of τm/2: (26) The solution for an arbitrary initial condition can be found by convolution of the initial condition with pδ. Hence, in the long time limit (t → ∞) all solutions of the FPE with natural boundary conditions tend towards a Gaussian with variance D—independent of the initial condition (which has to satisfy the boundary conditions Eq (23)).

In our ripple-generating network, with strong drive and strong coupling, the bulk of the membrane potential distribution is strongly subthreshold in between population spikes (i.e. unaffected by the non-natural boundary conditions), and thus tends towards a Gaussian density. When the next population spike begins, we can hence approximate the membrane potential density as a Gaussian with fixed variance D: (27) (see also Results, Eq (7)). This Gaussian approximation allows the derivation of a simple expression for the population rate under strong drive [56–58]: (28) The rate is given by the value of the Gaussian density at threshold, scaled by the speed at which the mean membrane potential approaches the threshold. Since only upwards threshold crossings should contribute to the rate, we add a sign-dependence: (29) (see also Results, Eq (8b)). The rate is clipped to 0 whenever the mean membrane potential decays [56–58].

Numerical analysis of oscillation dynamics.

In its derivation above, we formulated the Gaussian-drift approximation in several equations, describing the membrane potential density p (Eq (27)), mean membrane potential μ (Eq (25)) and population rate r (Eq (29)) separately. Note however that the mean membrane potential μ is the only independent variable and thus the Gaussian-drift approximation can be rephrased as a single delay differential equation (DDE): (30) (31)

We will nevertheless illustrate the solutions as two-dimensional trajectories of both μ(t) and r(t) (Fig 8), since the population rate is our main variable of interest. The DDE can be integrated numerically using a simple forward Euler method. We numerically explored the dynamics for initial conditions μ(t ≤ 0) ≪ VT. Effectively, we assume that the Gaussian membrane potential distribution is so far subthreshold that no spiking has occurred before time 0 (r(t) ∼ p(VT, t) ≈ 0 ∀t ≤ 0). In that case, the initialization of the remaining factor in the rate expression Eq (29) is irrelevant as long as it is finite. For constant drive IE, we then find a range of possible dynamics (see Fig 8).
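The following sketch illustrates such a forward Euler integration. It is a sketch under stated assumptions, not the exact equations of the paper: it assumes the DDE in the form τm dμ/dt = −μ + IE − K·r(t − Δ) with r(t) = [dμ/dt]+ · p(VT, t) and a Gaussian density p of variance D (dimensionless units, VT = 1); the exact prefactor of the feedback term in Eq (23b) may differ, and the coupling K must be supplied by the user.

```python
import numpy as np

def integrate_dde(I_E, K, D, tau_m=10e-3, delta=1.2e-3, mu0=-1.0, T=0.1, dt=1e-5):
    """Forward-Euler sketch of the Gaussian-drift DDE (cf. Eqs (30)-(31)), without the
    phenomenological reset. Assumed form (dimensionless voltage, V_T = 1):
        tau_m * dmu/dt = -mu + I_E - K * r(t - delta)
        r(t) = [dmu/dt]_+ * N(V_T; mu(t), D)
    """
    V_T = 1.0
    n_steps = int(T / dt)
    n_delay = int(delta / dt)
    mu = np.full(n_steps, mu0)          # history mu(t <= 0) = mu0 << V_T ...
    r = np.zeros(n_steps)               # ... hence r(t <= 0) = 0
    for k in range(n_steps - 1):
        r_delayed = r[k - n_delay] if k >= n_delay else 0.0
        dmu = (-mu[k] + I_E - K * r_delayed) / tau_m
        p_thr = np.exp(-(V_T - mu[k])**2 / (2 * D)) / np.sqrt(2 * np.pi * D)
        r[k] = max(dmu, 0.0) * p_thr    # rate clipped to zero when mu decays
        mu[k + 1] = mu[k] + dt * dmu
    return mu, r
```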

There is a large regime of sufficiently strong drive in which the solution μ exhibits persistent period-1 oscillations (Fig 8, green). This is the regime that we are interested in and the dynamics of which we will approximate in the following.

At lower levels of drive, there are three additional dynamical regimes that we will exclude from analysis: At very low drive, the system has a stable fixed point at (μ(t), r(t)) ≡ (IE, 0) (Eq (30), Fig 8, blue). The bifurcation at which the fixed point loses stability can be determined numerically (see Eq (38) for an analytical approximation). Immediately after the bifurcation, the DDE solution exhibits very fast oscillations at (2Δ)−1 ∼ 417 Hz (Fig 8, red). The oscillation amplitude of the mean membrane potential is very small, and a large portion of the Gaussian potential density is suprathreshold at all times. We refer to this oscillation as “pathological” since it is a direct result of the artificial clipping of the rate to 0 whenever the mean membrane potential decays (Eq (29)). Increasing the drive further brings the system into a state of period-2 oscillations where the Gaussian density gets pushed below threshold only every other cycle (Fig 8, yellow).

These regimes of pathological high-frequency or period-2 oscillations exist due to our simplifying assumptions capturing only the mean-driven aspects of the network dynamics. Both regimes occur either shortly before or shortly after the level of drive at which the original spiking network undergoes a supercritical Hopf bifurcation (see tick mark in Fig 8). In the vicinity of that bifurcation, the spiking network dynamics are fluctuation-driven, with either no or only small-amplitude oscillations in the mean membrane potential and population rate. These cannot be captured without taking into account the absorbing boundary at threshold and the non-Gaussian shape of the density of membrane potentials. We will exclude these pathological dynamics from analysis by introducing a lower bound on the drive for the theoretical approximations that we develop in the following (see tick mark in Fig 8).

Analytical approximation of oscillation dynamics for constant drive

For strong enough drive, the mean membrane potential μ under the Gaussian-drift approximation oscillates periodically between two local extrema μmin and μmax (Fig 8, green regime, see also Fig 9A). The population rate r(t) oscillates at the same frequency and is positive when the mean membrane potential increases, and 0 otherwise, cf. Eq (29). The time when the mean membrane potential reaches its local maximum μmax marks the end of the population spike and will be denoted as toff. The inhibitory feedback (Eq (23b)) thus ends at time toff + Δ and we define μmin ≡ μ(toff + Δ) as the end of a cycle. This allows us to approximate the overall period as T = toff(μmin, μmax) + Δ. Here, we wrote toff(μmin, μmax) to emphasize that toff is the rise time of the mean membrane potential from μmin to μmax (upstroke). Furthermore, Δ is the duration of the subsequent downstroke back to μmin. In the following, we will approximate μmax (Step 1) and μmin (Step 2) and thus derive the network frequency as the inverse of the period: (32) (see also Results, Eq (9)).

Fig 9. Illustration of analytical approximation of DDE dynamics.

(A) One cycle of the oscillatory solution of the DDE for IE = 3.6. Dotted lines for rate (top) and mean membrane potential (bottom) are the result of a numerical integration of the DDE (Eq (31)). All other lines illustrate our analytical considerations. Top: population rate r(t) (black). Bottom: mean membrane potential μ(t) (full grey line: Eq (A2) during upstroke, Eq (39) during downstroke; gray area: width of the Gaussian density p(V, t)). Constant external drive IE (green line), total input I(t) = IE − II(t) (dashed line, Eq (42)). Local extrema of the mean membrane potential occur at the intersections of μ and I: Cyan: local maximum μmax (Eq (38)). Orange: approximate local minimum μmin (Eq (43)). Vertical dotted lines mark the end of the population spike toff as well as the intervals toff + kΔ, k ∈ {−2, −1, 1}. Arrows illustrate simplifying assumption (A1). The beginning of the cycle (ton = 0) is determined by μmin. Horizontal gray bars mark the length of one cycle (here T = toff + Δ = 3.44 ms (Eq (47)), corresponding to a network frequency of fnet = 290.7 Hz (Eq (32))). Inset: magnification highlighting the differences between numerical solution (dotted) and analytical approximation (full line). Due to assumption (A2), μmax is slightly overestimated. Note that the second intersection of μ and I occurs shortly before time toff + Δ. Hence μmin is slightly larger than the true local minimum. (B) Same as A, but with an account for the reset on the population level. At the end of the population spike, μ is reset instantaneously from μmax to μreset (Eq (49)) (yellow marker). This leads to a lower μmin (Eq (50)) and hence a slightly longer period (T = 4.24 ms), i.e. a lower network frequency (fnet = 235.8 Hz). (C) Illustration of the phenomenological reset. Cyan: density of membrane potentials p(V, toff) at the end of the population spike, centered at μmax (before reset). Cyan hatched area: fraction of active units (saturation, Eq (48)). Grey hatched area: resetting the active portion of p. Yellow: p(V, toff) after reset, centered at μreset. The reset value μreset is calculated as the average of the density that results from summing the grey-dashed (active units) and cyan-non-hatched (silent units) density portions (Eq (49)). Default parameters (see Table 2).

https://doi.org/10.1371/journal.pcbi.1011886.g009

Step 1.

To find the local maximum μmax = μ(toff), we have to solve (33) Since the dynamics of the mean membrane potential are given by a delay differential equation, the term on the right-hand side is recurrent: From the yet unknown time toff we have to look back in time (in windows with length of the delay Δ) at the history of the population rate r(t), which in turn depends on μ(t) and its derivative: (34) We can resolve the recurrence since there is only a finite number of past time windows during which the population rate, and thus the delayed feedback inhibition II(t) ∝ r(t − Δ), is significantly above 0. In the first time window [toff − Δ, toff], right before the end of the upstroke, we have to take inhibition into account, since this is what stops the upstroke. In the second time window [toff − 2Δ, toff − Δ], further into the past, we will assume that the inhibitory feedback is negligible. This assumption implies that the population rate was negligible in the previous time window (A1) (Eq (23b), Fig 9A). Note that t here refers to time since the beginning of the cycle (t = 0). Since the population spike ends at time toff, (A1) is equivalent to the assumption that the population spike lasts at most 2Δ. Adding the subsequent downstroke time of Δ, this amounts to an upper bound for the oscillation period of around 3Δ, plus any additional upstroke time with r ≈ 0, which is a reasonable assumption for a feedback loop with delay Δ, as argued already by [34].

Under this assumption, we set r(toff − 2Δ) = 0 in Eq (34), and only a finite number of terms remains: (35) Note that the rectification [⋅]+ has been dropped in this equation because the mean membrane potential can never be larger than the external drive. Furthermore, we assume that we can approximate the past μ(toff − Δ) during the upstroke by only considering the excitatory drive, I(t) ≈ IE. Under this assumption, Eq (25) yields the exponential relaxation (36) Looking backwards from time toff, we can reformulate this assumption as: (A2) The resulting error is small since the exponential relaxation of the mean membrane potential towards the total drive I(t) is governed by the membrane time constant (τm = 10 ms). The population spike on the other hand is quite synchronized (we assumed that it lasts less than 2Δ ≪ τm, (A1)). The time window right before toff, during which the units receive inhibitory feedback and the total drive deviates from IE, is thus small compared to the membrane time constant and alters the trajectory of μ only slightly (Fig 9A, inset).

Eq (35) thus simplifies to (37) The simplified equation can be readily solved for μmax: (38) To ensure that the argument of the logarithm in Eq (38) is larger than 1, the coupling must be sufficiently strong. For weaker coupling the inhibitory feedback is never strong enough to counteract the external drive, hence no local maximum μmax exists. The existence of a lower bound on synaptic strength for the emergence of oscillations is in line with previous analyses [34]. Furthermore, there is also a lower bound for the drive IE: if the drive is too weak, the term inside the square brackets in Eq (38) is negative and we find μmax > IE. But then μmax cannot be reached; instead, the mean membrane potential simply settles into its fixed point IE (Fig 8, blue dynamics). Eq (38) thus yields an analytical approximation of the bifurcation point where the DDE fixed point loses stability and oscillations emerge (numerical, Fig 8, blue to red: IE ∼ 0.61; analytical, Eq (38): IE ∼ 0.56, for default parameters, Table 2). For strong enough coupling and a strong enough external drive, a well-defined local maximum μmax < IE exists. Note that μmax does not depend on the initial voltage μmin. This makes sense intuitively for any μmin that is sufficiently smaller than μmax, since the “turning point” of the mean membrane potential depends only on the feedback inhibition resulting from the immediate history shortly before μmax is reached (t ∈ [toff − 2Δ, toff]).

Step 2.

We will now approximate the trajectory of the mean membrane potential during its inhibition-induced downstroke and infer μmin = μ(toff + Δ), which corresponds to both the end and the starting value of each cycle of a periodic solution.

Note that while μmin is close to the periodic local minimum of the mean membrane potential, it is a slight overestimation: The local extrema of the mean membrane potential μ occur at its intersections with the total drive I(t) (see Eq (33), Fig 9A). At time toff the mean membrane potential reaches its local maximum and becomes larger than the total drive I(t) (Fig 9A). Since the population spike ends at time toff, the delayed inhibitory feedback II(t) will stop at time toff + Δ: The total drive at this point will equal the external drive (I(toff + Δ) = IE); note that the mean membrane potential μ can never be larger than the external drive IE. Hence, if μ becomes larger than I at time toff and is smaller than I at time toff + Δ, μ must intersect with I(t) and reach its local minimum slightly before time toff + Δ (see Fig 9A, inset). What we define as the initial/final membrane potential of a cycle (μminμ(toff + Δ)) is thus close to but slightly larger than the periodic local minimum. This does not affect our estimate of the period.

The definition of a fixed downstroke duration Δ allows us to find μmin directly by integrating the mean membrane potential ODE Eq (25) up to time toff + Δ, starting from the initial value μ(toff) = μmax: (39) It follows for μmin = μ(toff + Δ) with initial condition μ(toff) = μmax: (40) Again, the total current I(toff + τ) with τ ∈ [0, Δ] has a recurrent dependency on the past dynamics, which can be seen by repeated application of Eqs (23b), (29), and (25). As before, we assume that the current I at time toff − (2Δ − τ) ≤ toff − Δ is given exclusively by the excitatory drive (A1), which truncates the infinite recurrent expression to: (41) We are left with a finite number of terms that depend on the trajectory of μ(t) during the upstroke. Again, we approximate μ(toff − x), x ≥ 0, assuming exponential relaxation towards only the external drive, i.e. ignoring inhibition (A2). This abolishes all dependencies on the yet unknown time toff of the end of the population spike: (42) Inserting this approximation (Eq (42)) into the expression for μmin (Eq (40)) yields an integral expression, whose integrals can be solved analytically if we approximate the past trajectory of μ linearly within the Gaussian expressions p (A3). Using this approximation, we obtain (43) where the auxiliary terms are given by (44) (45) (46) Although lengthy, this expression can be easily evaluated numerically.

The local “minimum” μmin is not only the final but also the initial mean membrane potential of each cycle. The length of the upstroke can thus be approximated as the time it takes for the mean membrane potential to rise from μmin to μmax, based on exponential relaxation towards only the excitatory drive IE (A2): (47) Together with the assumed downstroke duration of Δ we arrive at an analytical estimate for the oscillation period T and hence the network frequency fnet (see Eq (32)). Overall we have derived analytical expressions for μmax, μmin, toff, fnet as functions of the external drive IE, which characterize the oscillatory dynamics. To evaluate the accuracy of our analytical approximation we integrate the DDE numerically (Eq (31), Fig 8, green regime of period-1 oscillations) and determine μmax, μmin and fnet. We find that the errors introduced by our simplifying assumptions (A1)–(A3) are small (Fig 10, dashed lines vs square markers).

Fig 10. Analytical vs numerical evaluation of oscillatory solutions in the Gaussian-drift approximation.

Network frequency (top) and dynamics of the mean membrane potential (bottom) quantified in terms of its periodic local minimum μmin (orange) and local maximum μmax (cyan) for different levels of external drive IE. The analytical approximations (solid lines: with reset, dashed lines: without reset) are very close to the results of numerical integration of the DDE Eq (31) (round markers: with reset, square markers: without reset). Including the reset does not affect μmax but leads to a decrease in μmin (Eqs (50) vs (43)) and thus a decrease in network frequency. Results are shown in the relevant range of external drives (vertical dotted lines). For parameters see Table 2.

https://doi.org/10.1371/journal.pcbi.1011886.g010

Accounting for the reset mechanism.

The quantitative accuracy of our Gaussian-drift approximation with respect to the original spiking network can be increased by adding a phenomenological account for the reset on the population level. During the population spike (t < toff) the single unit reset has little influence on the population rate dynamics, since units spike at most once per cycle. At time toff the population spike ends and the integral over the suprathreshold portion of the Gaussian density of membrane potentials corresponds to the fraction of units that have spiked (the saturation): (48) (see Fig 9C, cyan hatched area). Taking into account the reset mechanism at this point would mean shifting the suprathreshold portion of p(V, toff) downwards by an amount VT − VR (Fig 9C, gray hatched area), essentially splitting the voltage distribution into two pieces, corresponding to silent units (Fig 9C, non-hatched area under cyan Gauss) and units that have spiked and been reset (Fig 9C, gray hatched area). To preserve our simplified framework of a unimodal, Gaussian voltage distribution, we will instead assume that the Gaussian voltage distribution is reset as a whole, to a new mean membrane potential μreset given by the average of the two distribution pieces (“silent” and “spike+reset”): (49) This phenomenological account for the reset contains the implicit assumption that in between population spikes the membrane potential distribution “spends enough time” subthreshold that the bimodality created by the reset mechanism vanishes due to diffusion and the distribution becomes roughly Gaussian again. This assumption is satisfied for a relatively large portion of the parameter space spanned by noise intensity, coupling strength, and reset potential (see S1 Appendix). The introduction of the population reset requires an adjustment of the definition of μmin (Eq (40)): Instead of using μmax as the initial value when integrating the feedback inhibition during the downstroke, we will now use the reset potential μreset. Therefore, the membrane potential μmin = μ(toff + Δ) at the end of the cycle is given by (50) Except for the initial condition term, all other terms remain unchanged (cf. Eq (43)). Since μreset < μmax (Eq (49)), the introduction of the reset decreases our estimate of the local minimum μmin. This leads to an increase of the upstroke time toff required for the mean membrane potential to rise from μmin to μmax (Eq (47)), and hence to a decrease in the network frequency (Eq (32), see Fig 10).
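In dimensionless units (VT = 1 and VR = 0, since the reset potential equals the resting potential), the saturation (Eq (48)) and the population-level reset (Eq (49)) can be written compactly: shifting the suprathreshold mass down by VT − VR moves the mean of the resulting mixture to μmax − s·(VT − VR), which is the form used in the sketch below (illustrative function names, not from the paper).

```python
import numpy as np
from scipy.stats import norm

def suprathreshold_fraction(mu_max, D, V_T=1.0):
    """Saturation s (Eq (48)): Gaussian mass above threshold at the end of the population spike."""
    return norm.sf((V_T - mu_max) / np.sqrt(D))

def mu_reset(mu_max, D, V_T=1.0, V_R=0.0):
    """Phenomenological population reset (Eq (49)): mean of the mixture in which the
    suprathreshold portion has been shifted down by V_T - V_R."""
    s = suprathreshold_fraction(mu_max, D, V_T)
    return mu_max - s * (V_T - V_R)
```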

Note that when the reset is incorporated in the numerical integration of the DDE (31) (e.g. Fig 9B, dotted lines), the population rate after time toff needs an additional artificial clipping to zero, since the slope of μ will briefly become positive again after the reset.

The point of full synchrony.

Our analytical ansatz allows for a straightforward prediction of the point of full synchrony and its parameter dependencies. As mentioned before, the integral over the suprathreshold portion of the membrane potential density at the end of the population spike corresponds to the fraction s of active units (saturation, Eq (48)). In the strict sense, full synchrony can never be reached, since the Gaussian probability density p approaches zero only in the limit v → ±∞. We can, however, define approximate full synchrony as the state where only the 0.13th percentile of the distribution remains subthreshold. Using our mapping from external drive to μmax (Eq (38)), we can derive a closed-form expression for the external drive that is required to achieve full synchrony: (51)
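As a worked example of the percentile criterion (not of the closed-form drive expression in Eq (51), whose inversion of Eq (38) is omitted here): requiring that only the 0.13th percentile stays subthreshold means that μmax must exceed the threshold by three standard deviations of the Gaussian. The value of D used below is inferred from the default σV = 2.62 mV and the 13 mV distance between rest and threshold; it is an assumption for this sketch, not a number quoted in the text.

```python
import numpy as np

# Approximate full synchrony: only the 0.13th percentile of the Gaussian density
# remains subthreshold, i.e. mu_max must exceed threshold by three standard deviations.
D = (2.62 / 13.0) ** 2               # inferred dimensionless noise variance (assumption)
mu_max_full = 1.0 + 3.0 * np.sqrt(D)
print(f"mu_max at full synchrony: {mu_max_full:.3f}")
# suprathreshold_fraction(mu_max_full, D) from the previous sketch evaluates to ~0.9987;
# the corresponding drive would follow by inverting the drive-to-mu_max mapping of Eq (38).
```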

Range of applicability of the Gaussian-drift approximation.

There are two main constraints on the applicability of our theory:

(a) Since we assume that units spike at most once per cycle, the theory is only valid up to the point of full synchrony where network frequency and mean unit firing rate coincide.

(b) The assumption of a unimodal distribution of membrane potentials is only valid if, in between population spikes, the bulk of the membrane potential distribution is pushed sufficiently below threshold such that it can diffuse back to an approximately Gaussian shape. We will thus require that at its lowest point the Gaussian density is almost entirely subthreshold: (52) Criteria (a) and (b) yield a finite range of external drives for which the theory applies. Since for most parameter settings μmin is an almost monotonically decaying function of the drive (see Fig 10 and Results), constraint (b) is usually only relevant for the lower boundary of the drive, while the upper boundary is determined by constraint (a) (Fig 10). For extreme combinations of high noise and weak coupling, however, μmin rises again for high drive, so constraint (b) also sets the upper boundary (see Fig Ad in S1 Appendix, bottom right panel, corresponding to the three crossed-out parameter settings in Fig B in S1 Appendix: no valid theoretical estimate of the network frequency at full synchrony).

Quantifying performance.

We quantify the performance of our Gaussian-drift approximation across the range of drives for which

  (a) the theory applies, and
  (b) the spiking network has not crossed the point of full synchrony.

The size of this regime varies for different parameter settings. To ensure comparability, we interpolate the results of all spiking network simulations to the same fine resolution. We then compute the average relative error of the estimated network frequencies for each parameter setting: (53) We introduce a second score to quantify what portion of the relevant range of spiking network dynamics (from the Hopf bifurcation to the point of full synchrony) is covered by the theory: (54) We define an overall performance index as (55) The larger the performance index χp, the better the Gaussian-drift approximation captures the spiking network dynamics. A systematic evaluation of the performance of the Gaussian-drift approximation for different network parameters and levels of drive can be found in S1 Appendix.

Analytical approximation of oscillation dynamics for linear drive

We want to characterize the transient dynamics of any cycle i with a given initial mean membrane potential and linearly changing drive by deriving the functions (56) The mean membrane potential reached at the end of cycle i potentially serves as the initial condition for the next cycle.

Step 1.

The local maximum of the mean membrane potential can be found with the same ansatz that was used for constant drive: Again we truncate the recurrent expression for the total current two Δ-time windows before the end of the population spike (A1): (57) Again, we approximate the trajectory of μ during the upstroke based on relaxation towards only the excitatory drive, which is now a linear function of time (cf. (A2)). Using the beginning of cycle i as the time origin t = 0, we obtain (58) Under this approximation, the trajectory of μ during the upstroke can be written as (59) and the trajectory at times x ≥ 0 before the end of the population spike can be approximated as: (60) Inserting the above expression into the local maximum condition (Eq (57)) yields: (61) For constant drive (m = 0) we recover Eq (37), and thus the asymptotic solution. For m ≠ 0, we can solve Eq (61) for small m. To this end, we insert a perturbation series and keep only the terms linear in m. Using Eq (37), we obtain, to first order in m, the correction given in (62) Since the drive is strongly superthreshold, the first-order deviation of the local maximum from its asymptotic value is generally very small. For biologically plausible parameters (e.g. a synaptic delay Δ that is not too small), the second term of Eq (62) is slightly larger than the first. Thus, the local maximum is slightly smaller than its asymptotic value for linearly increasing drive, and slightly larger otherwise (see Fig 6C, cyan dots vs line).

Since even the zeroth-order approximation is already close to the numerical solution of Eq (61), and the reset mechanism remains unchanged, it follows that is also close to .

The duration of the upstroke can be obtained from Eq (59), taking into account that : (63) where W is the Lambert W function, which yields real solutions only for arguments greater than −exp(−1). For positive slope m > 0 and low drive (), this introduces a constraint on the initial value: (64) (cf. Figs 6B and 7Ai).
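The role of the Lambert W function in Eqs (63)–(64) can be illustrated with a minimal first-passage calculation: for a mean membrane potential relaxing towards a linearly increasing drive (as in the sketch above), the time to reach a threshold θ has a closed form in terms of W, and the threshold is never reached if the W argument falls below −exp(−1). The threshold convention and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.special import lambertw

def rise_time(mu0, I0, m, theta=1.0, tau_m=10.0):
    """First time at which mu(t) = I0 + m*(t - tau_m) + C*exp(-t/tau_m),
    with C = mu0 - I0 + m*tau_m, reaches the threshold theta.
    Threshold and parameter values are illustrative assumptions."""
    A = m * tau_m                    # contribution of the linear drift
    C = mu0 - I0 + m * tau_m         # amplitude of the exponential transient
    D = theta - I0 + m * tau_m
    arg = -(C / A) * np.exp(-D / A)
    if arg < -np.exp(-1.0):          # cf. the constraint leading to Eq (64)
        raise ValueError("threshold is never reached for this initial value")
    # principal branch, assumed here to select the first crossing
    return tau_m * (D / A + lambertw(arg, k=0).real)

print(rise_time(mu0=-2.0, I0=3.0, m=0.4))  # ~6.3 (ms) for these toy values
```

Note that the principal branch of W is used here for simplicity; the branch relevant in the paper's formulation may differ.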

The instantaneous frequency of the cycle follows as (65)

Note that for linearly changing drive, the asymptotic initial value no longer implies the asymptotic frequency . If the drive increases linearly with a positive slope m > 0, the correct initial value leading to (Fig 6B, left, white line) is larger than (Fig 6B, left, black line). Conversely, if the drive decreases linearly (m < 0), the correct initial value leading to is smaller than (Fig 6B, right, white vs. black line). The correction of the initial condition can be understood as follows: if, for example, the slope is positive, m > 0, the driving current IE(t) is smaller than during the upstroke of μ(t), which lasts until time (cf. Eq (12) and Fig 6A, left). A smaller driving current results in a slower increase of μ(t) (cf. Eq (8a)), and hence a longer rise time from to . To match the asymptotic frequency, the initial value must therefore be chosen larger than to compensate for the slower increase of μ(t) during the rising phase. A similar argument holds when the driving current decreases (m < 0).

Step 2.

We now compute the mean membrane potential that is reached at the end of a cycle i. Note that this step is technically not necessary to understand the instantaneous frequency of an isolated cycle i, which we already derived above as a function of the reference drive , the slope m, and an arbitrary initial condition (Eq (65)). It is, however, of interest to demonstrate that the mean membrane potential at the end of the cycle is close to the asymptotic reference , since it serves as the initial condition of the next cycle. It is this property, together with the dependence of on the initial condition , that implies the emergence of IFA under a piecewise linear drive that first increases and then decreases over multiple consecutive ripple cycles (cf. Results, Fig 6C).

We find by integrating the total current during the downstroke: (66) The total current I can be split into two parts: Istat, which is approximately equal to the feedback current for constant drive (Eq (41), except for slight deviations in ), and an additive new term Im caused by the linear change in the external drive: For constant drive (m = 0), we recover the asymptotic solution (Eq (50)). For m ≠ 0, we integrate Eq (66) numerically and find that is indeed close to the asymptotic solution . One can infer intuitively that the sign of the (small) deviation of from equals the sign of the slope m: if the drive increases linearly, the drive during the upstroke of the mean membrane potential () is below the reference drive (Eq (12)). Thus μ rises towards a slightly smaller maximum (Eq (62)) and with slightly reduced speed, which decreases the resulting population rate and thus the inhibitory feedback (Eqs (29) and (23b)). Furthermore, the excitatory drive during the feedback-induced downstroke of μ () is stronger than the reference drive (Eq (12)). It follows that the downstroke of the mean membrane potential is reduced, and thus . The opposite argument holds for linearly decreasing drive.

At this point it is clear that a piecewise linear drive that first increases and then decreases over multiple ripple cycles (Eq (17)) will inevitably induce IFA. It is also clear that this IFA asymmetry vanishes in the limit of infinitely slowly changing drive (|m| → 0).

Showing a concrete example of IFA under linear drive (Fig 6C) requires a forward integration of the network response over multiple cycles. Note that our analytical ansatz centered around the reference point (Eq (56)) focuses on isolated cycles. For any individual cycle i—characterized by the drive , the slope m, and the initial condition —we can (semi-)analytically calculate the frequency difference (Eqs (65) and (32)). The color plots in Fig 6B show Δfi at a fine resolution on two hyperplanes in this three-dimensional parameter space (drive with slope m = ±0.4/ms). The drive at the beginning and end of any individual cycle can then easily be inferred as and . Thus, to approximate the ripple dynamics over multiple consecutive cycles under linear drive (Fig 6B and 6C, circular markers), we search the previously explored parameter space and match cycles self-consistently such that the drive at the end of one cycle equals the drive at the beginning of the next: . For the double-ramp examples shown in Figs 6 and 7, we further restricted the search during the upward ramp to trajectories in which the drive of the last cycle ends at the plateau level .
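The cycle-matching logic can be summarized by the following sketch. Here single_cycle is a hypothetical stand-in for the (semi-)analytical single-cycle solution derived above; in the paper the matching is performed on the precomputed grid shown in Fig 6B rather than by direct forward iteration.

```python
def chain_cycles(single_cycle, drive_start, drive_stop, m, mu_init):
    """Chain consecutive cycles under a linearly changing drive such that the
    drive at the end of one cycle equals the drive at the beginning of the next.

    `single_cycle(drive, m, mu)` is a hypothetical stand-in that returns the
    cycle duration and the mean membrane potential at the end of the cycle."""
    cycles = []
    drive, mu = drive_start, mu_init
    while (m > 0 and drive < drive_stop) or (m < 0 and drive > drive_stop):
        T_cycle, mu_end = single_cycle(drive, m, mu)
        cycles.append({"drive": drive, "freq": 1.0 / T_cycle})
        drive += m * T_cycle   # drive at the end of this cycle = start of the next
        mu = mu_end            # end-of-cycle potential = initial condition of the next
    return cycles
```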

Comparing theory and simulation for piecewise linear drive.

Since the theory provides a discrete, cycle-wise estimate of the instantaneous network frequency, we compare it to a discrete estimate of the instantaneous frequency in the simulations, computed as the inverse of the intervals between consecutive peaks of the oscillatory population rate. In the spiking network, SPW-like drive is modeled as a piecewise linear double ramp with an intermediate plateau phase of 20 ms (Eq (17)). The Gaussian-drift approximation is used to estimate the instantaneous network frequencies separately for the rising and falling phases of the drive (linear increase or decrease with slope ±m). The plateau phase is ignored, since the network frequencies rapidly converge to the asymptotic frequency associated with the drive during the plateau. In both simulation and theory, IFA is quantified by the slope of a linear regression over the instantaneous frequencies. The theoretically estimated instantaneous frequencies are shifted in time to account for a hypothetical plateau phase of 20 ms between up- and downstroke, to allow full comparability with the simulation results.
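A minimal sketch of the simulation-side analysis, assuming a population-rate trace sampled at resolution dt; the peak-detection threshold is an illustrative parameter, not the value used in the paper.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.stats import linregress

def instantaneous_frequencies(rate, dt, min_height=50.0):
    """Discrete instantaneous network frequency from a population-rate trace:
    the inverse of the intervals between consecutive peaks of the oscillation.
    `min_height` is an illustrative peak-detection threshold (in Hz)."""
    peaks, _ = find_peaks(rate, height=min_height)
    t_peaks = peaks * dt                           # peak times (s)
    f_inst = 1.0 / np.diff(t_peaks)                # Hz
    t_inst = 0.5 * (t_peaks[1:] + t_peaks[:-1])    # assign to cycle midpoints
    return t_inst, f_inst

def ifa_slope(t_inst, f_inst):
    """Quantify IFA as the slope of a linear regression of the instantaneous
    frequencies against time (a negative slope indicates IFA)."""
    return linregress(t_inst, f_inst).slope
```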

For every theoretical estimate of the instantaneous frequency, an error is calculated relative to the average of the instantaneous frequencies observed in the spiking simulation around the same time point (ti ± 1.5 ms): The average relative error of the theoretical estimates, compared to the simulations, is then computed as (67) (see Table 1).
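A sketch of this error measure, assuming times in seconds and hypothetical array names:

```python
import numpy as np

def mean_relative_error(t_theory, f_theory, t_sim, f_sim, window=1.5e-3):
    """Average relative error (cf. Eq 67): each theoretical estimate at time t_i
    is compared to the mean simulated instantaneous frequency within t_i +/- 1.5 ms."""
    errors = []
    for t_i, f_i in zip(t_theory, f_theory):
        nearby = np.abs(np.asarray(t_sim) - t_i) <= window
        if np.any(nearby):
            f_ref = np.mean(np.asarray(f_sim)[nearby])
            errors.append(np.abs(f_i - f_ref) / f_ref)
    return np.mean(errors)
```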

Stationary solution and linear stability analysis of the Fokker-Planck Equation

In what follows, we briefly summarize how we approximated, in the mean-field limit, the spiking network's bifurcation point as well as the network frequency and the mean unit firing rate at the bifurcation (red triangle and circular marker in Fig 1B). This involves finding the stationary solution of the Fokker-Planck Equation (Eq (23)) and analyzing its linear stability with respect to the external drive. These are standard procedures described in the literature [34, 51, 53, 90]. Here, for easier reference, we reproduce these results using the mathematical symbols used throughout the manuscript.

Stationary solution.

The stationary solution p0(V) of the FPE has been derived by [34] (see also [51]). The constant population rate r0 and resulting total drive I0 in the stationary state can be inferred self-consistently by solving: where is the firing rate of an uncoupled LIF neuron receiving constant drive I and Gaussian white noise of intensity D (f-I curve, [91]).
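The self-consistent solution can be obtained with a simple root search. The sketch below uses a common parameterization of the LIF f-I curve under white noise (threshold VT, reset VR, membrane time constant τm, effective noise amplitude σ, no refractory period) and a hypothetical total inhibitory coupling J; the paper's dimensionless convention may differ.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq
from scipy.special import erf

# Illustrative parameters; the paper's dimensionless convention may differ
tau_m, V_T, V_R, sigma = 10e-3, 1.0, 0.0, 0.5   # s, threshold, reset, noise amplitude
J = 0.01                                        # hypothetical total inhibitory coupling

def phi(I):
    """Stationary f-I curve of an LIF neuron driven by Gaussian white noise
    (diffusion approximation, cf. [34, 91]); no refractory period assumed."""
    lower, upper = (V_R - I) / sigma, (V_T - I) / sigma
    integral, _ = quad(lambda x: np.exp(x**2) * (1.0 + erf(x)), lower, upper)
    return 1.0 / (tau_m * np.sqrt(np.pi) * integral)

def stationary_rate(I_E):
    """Self-consistent stationary population rate r0 = phi(I_E - J * r0):
    the inhibitory feedback reduces the total drive to I0 = I_E - J * r0."""
    return brentq(lambda r: r - phi(I_E - J * r), 0.0, phi(I_E))

print(stationary_rate(I_E=2.0))   # stationary rate r0 (Hz) for these toy values
```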

Linear stability analysis.

For a given external drive IE, one assumes a weak, periodic perturbation of the population rate around its stationary value: (68) The bifurcation at which the stationary state loses stability corresponds to λ = 0. In the recurrently coupled network, this perturbation of the rate translates into a perturbation of the input current: (69) The linear response of the LIF units to this weakly modulated drive I(t) under Gaussian white noise is given by convolution with the linear response function G: This output rate must match the weakly periodically modulated rate r(t) assumed in the beginning (Eq (68)): At the bifurcation (r1(t) = e^{iωt}, Eq (68)), this self-consistency condition is equivalent to (70) where denotes the Fourier transform of the linear response function (the susceptibility). We use the exact expression for the susceptibility of an LIF unit under Gaussian white noise (more specifically, the complex conjugate of the expression derived in [53]): where are parabolic cylinder functions and (for an alternative expression in terms of confluent hypergeometric functions, see [92]). We solve the amplitude and phase condition (Eq (70)) numerically to find the critical drive (at which the stationary state loses stability) and the corresponding frequency ω of the emerging oscillation. The network frequency and mean unit firing rate at the bifurcation are thus given by and . This approach is equivalent to the derivation of [34] via linear expansion of the FPE solution (see also [71]).
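Structurally, the numerical solution of Eq (70) amounts to a two-dimensional root search in the drive and the oscillation frequency. The sketch below illustrates this procedure with a deliberately simplified stand-in susceptibility (a low-pass filter with drive-dependent gain) and an assumed loop condition 1 + J χ(ω) e^{−iωΔ} = 0 for delayed inhibitory coupling; the paper instead uses the exact LIF susceptibility of [53].

```python
import numpy as np
from scipy.optimize import root

tau_m, Delta, J = 10e-3, 1.5e-3, 5.0   # illustrative: membrane time constant, delay, coupling

def chi(omega, I_E):
    """Stand-in susceptibility of the LIF units: a low-pass filter with
    drive-dependent gain. The paper instead uses the exact white-noise
    expression of [53] (parabolic cylinder functions)."""
    return I_E / (1.0 + 1j * omega * tau_m)

def loop_condition(x):
    """Real and imaginary parts of the assumed self-consistency condition at
    the bifurcation: 1 + J * chi(omega, I_E) * exp(-i*omega*Delta) = 0."""
    I_E, omega = x
    residual = 1.0 + J * chi(omega, I_E) * np.exp(-1j * omega * Delta)
    return [residual.real, residual.imag]

# solve simultaneously for the critical drive and the oscillation frequency
sol = root(loop_condition, x0=[2.0, 2 * np.pi * 180.0])
I_crit, omega_crit = sol.x
print(I_crit, omega_crit / (2 * np.pi))   # critical drive and network frequency (Hz)
```

Solving the real and imaginary parts jointly is equivalent to solving the amplitude and phase conditions separately; with the exact susceptibility, only the definition of chi changes.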

Supporting information

S1 Appendix. Performance evaluation of the Gaussian-drift approximation for constant drive.

The Gaussian-drift approximation under constant drive is compared to spiking network simulations for a range of noise intensities D and inhibitory coupling strengths K (Fig A). The parameter dependencies of the point of full synchrony are captured well by the theory (Fig B).

https://doi.org/10.1371/journal.pcbi.1011886.s001

(PDF)

S2 Appendix. The influence of network architecture and the shape of the external drive on the asymptotic and instantaneous ripple oscillation dynamics.

A covariation of network architecture and stimulus profile demonstrates that IFA is modulated by, but occurs largely independently of, the shape of the asymptotic network frequency as a function of the external drive (Fig A). Furthermore, we illustrate that a simple square pulse cannot elicit IFA in the feedback-based inhibition-first model.

https://doi.org/10.1371/journal.pcbi.1011886.s002

(PDF)

S1 Fig. Transient bimodality in the membrane potential distribution can affect instantaneous ripple frequency dynamics in the spiking network.

(A) Same layout as in Fig 2A: spiking network response to an isolated downward ramp stimulus with the same slope as in Fig 2D, middle, after time t > 10 ms (N = 10,000). Note that units that participate in the third population spike tend not to spike in the fourth population spike (red lines in raster plot), which indicates a residual bimodality of the membrane potential distribution from one cycle to the next (only faintly visible in the voltage plot, see red square). (B) Same layout as in Fig 2B: instantaneous network frequencies pooled from 50 such ramp-down-only simulations. What appears as a continuous non-monotonic “wiggle” in the instantaneous frequencies of Fig 2D, middle (gray dots), is now clearly identifiable as a single outlier cycle (marked in red).

https://doi.org/10.1371/journal.pcbi.1011886.s003

(TIF)

Acknowledgments

We thank José R. Donoso, Nikolaus Maier, Naomi Auer, and Gaspar Cano for valuable discussions and comments on the manuscript.

References

  1. 1. Todorova R, Zugaro M. Hippocampal ripples as a mode of communication with cortical and subcortical areas. Hippocampus. 2020;30(1):39–49. pmid:30069976
  2. 2. Oyanedel CN, Durán E, Niethard N, Inostroza M, Born J. Temporal associations between sleep slow oscillations, spindles and ripples. Eur J Neurosci. 2020;52(12):4762–4778. pmid:32654249
  3. 3. Lee AK, Wilson MA. Memory of sequential experience in the hippocampus during slow wave sleep. Neuron. 2002;36(6):1183–1194. pmid:12495631
  4. 4. Diba K, Buzsáki G. Forward and reverse hippocampal place-cell sequences during ripples. Nat Neurosci. 2007;10(10):1241–1242. pmid:17828259
  5. 5. Vaz AP, Wittig JH, Inati SK, Zaghloul KA. Replay of cortical spiking sequences during human memory retrieval. Science. 2020;367(6482):1131–1134. pmid:32139543
  6. 6. Buzsáki G. Two-stage model of memory trace formation: A role for “noisy” brain states. Neuroscience. 1989;31(3):551–570. pmid:2687720
  7. 7. Girardeau G, Benchenane K, Wiener SI, Buzsáki G, Zugaro MB. Selective suppression of hippocampal ripples impairs spatial memory. Nat Neurosci. 2009;12(10):1222–1223. pmid:19749750
  8. 8. Ego-Stengel V, Wilson MA. Disruption of ripple-associated hippocampal activity during rest impairs spatial learning in the rat. Hippocampus. 2010;20(1):1–10. pmid:19816984
  9. 9. Fernández-Ruiz A, Oliva A, de Oliveira EF, Rocha-Almeida F, Tingley D, Buzsáki G. Long-duration hippocampal sharp wave ripples improve memory. Science. 2019;364(6445):1082–1086. pmid:31197012
  10. 10. Klinzing JG, Niethard N, Born J. Mechanisms of systems memory consolidation during sleep. Nat Neurosci. 2019;22:1598–1610. pmid:31451802
  11. 11. Staresina BP, Niediek J, Borger V, Surges R, Mormann F. How coupled slow oscillations, spindles and ripples control neuronal processing and communication during human sleep. Nat Neurosci. 2023;26(8):1429–1437. pmid:37429914
  12. 12. Carr MF, Jadhav SP, Frank LM. Hippocampal replay in the awake state: A potential substrate for memory consolidation and retrieval. Nat Neurosci. 2011;14(2):147–153. pmid:21270783
  13. 13. Jadhav SP, Kemere C, German PW, Frank LM. Awake hippocampal sharp-wave ripples support spatial memory. Science. 2012;336(6087):1454–1458. pmid:22555434
  14. 14. Pfeiffer BE, Foster DJ. Hippocampal place-cell sequences depict future paths to remembered goals. Nature. 2013;497(7447):74–79. pmid:23594744
  15. 15. O’Keefe J. Place units in the hippocampus of the freely moving rat. Exp Neurol. 1976;51(1):78–109. pmid:1261644
  16. 16. Buzsáki G, Horvath Z, Urioste R, Hetke J, Wise K. High-frequency network oscillation in the hippocampus. Science. 1992;256(5059):1025–1027. pmid:1589772
  17. 17. Buzsáki G. Hippocampal sharp wave-ripple: A cognitive biomarker for episodic memory and planning. Hippocampus. 2015;25(10):1073–1188. pmid:26135716
  18. 18. Wu C, Shen H, Luk WP, Zhang L. A fundamental oscillatory state of isolated rodent hippocampus. J Physiol. 2002;540(2):509–527. pmid:11956340
  19. 19. Maier N, Nimmrich V, Draguhn A. Cellular and network mechanisms underlying spontaneous sharp wave–ripple complexes in mouse hippocampal slices. J Physiol. 2003;550(3):873–887. pmid:12807984
  20. 20. Maier N, Kempter R. Hippocampal Sharp Wave/Ripple Complexes—Physiology and Mechanisms. In: Axmacher N, Rasch B, editors. Cogn. Neurosci. Mem. Consol. Springer International Publishing Switzerland; 2017. p. 227–249.
  21. 21. Schlingloff D, Kali S, Freund TF, Hajos N, Gulyas AI. Mechanisms of sharp wave initiation and ripple generation. J Neurosci. 2014;34(34):11385–11398. pmid:25143618
  22. 22. Bazelot M, Teleńczuk MT, Miles R. Single CA3 pyramidal cells trigger sharp waves in vitro by exciting interneurones. J Physiol. 2016;594(10):2565–2577. pmid:26728572
  23. 23. Mitzdorf U. Current source-density method and application in cat cerebral cortex: Investigation of evoked potentials and EEG phenomena. Physiol Rev. 1985;65(1):37–100. pmid:3880898
  24. 24. Schomburg EW, Anastassiou CA, Buzsáki G, Koch C. The spiking component of oscillatory extracellular potentials in the rat hippocampus. J Neurosci. 2012;32(34):11798–11811. pmid:22915121
  25. 25. Sullivan D, Csicsvari J, Mizuseki K, Montgomery S, Diba K, Buzsáki G. Relationships between hippocampal sharp waves, ripples, and fast gamma oscillation: Influence of dentate and entorhinal cortical activity. J Neurosci. 2011;31(23):8605–8616. pmid:21653864
  26. 26. Maier N, Tejero-Cantero Á, Dorrn AL, Winterer J, Beed PS, Morris G, et al. Coherent phasic excitation during hippocampal ripples. Neuron. 2011;72(1):137–152. pmid:21982375
  27. 27. Traub RD, Schmitz D, Jefferys JGR, Draguhn A. High-frequency population oscillations are predicted to occur in hippocampal pyramidal neuronal networks interconnected by axoaxonal gap junctions. Neuroscience. 1999;92(2):407–426. pmid:10408594
  28. 28. Traub RD, Bibbig A. A model of high-frequency ripples in the hippocampus based on synaptic coupling plus axon-axon gap junctions between pyramidal neurons. J Neurosci. 2000;20(6):2086–2093. pmid:10704482
  29. 29. Memmesheimer RM. Quantitative prediction of intermittent high-frequency oscillations in neural networks with supralinear dendritic interactions. Proc Natl Acad Sci USA. 2010;107(24):11092–11097. pmid:20511534
  30. 30. Evangelista R, Cano G, Cooper C, Schmitz D, Maier N, Kempter R. Generation of sharp wave-ripple events by disinhibition. J Neurosci. 2020;40(41):7811–7836. pmid:32913107
  31. 31. Levenstein D, Buzsáki G, Rinzel J. NREM sleep in the rodent neocortex and hippocampus reflects excitable dynamics. Nat Commun. 2019;10(1):1–12. pmid:31171779
  32. 32. Ecker A, Bagi B, Vértes E, Steinbach-Németh O, Karlócai MR, Papp OI, et al. Hippocampal sharp wave-ripples and the associated sequence replay emerge from structured synaptic interactions in a network model of area CA3. Elife. 2022;11:e71850. pmid:35040779
  33. 33. Ylinen A, Bragin A, Nádasdy Z, Jandó G, Szabó I, Sik A, et al. Sharp wave-associated high-frequency oscillation (200 Hz) in the intact hippocampus: network and intracellular mechanisms. J Neurosci. 1995;15(1):30–46. pmid:7823136
  34. 34. Brunel N, Hakim V. Fast Global Oscillations in Networks of Integrate-and-Fire Neurons with Low Firing Rates. Neural Comput. 1999;11(7):1621–1671. pmid:10490941
  35. 35. Brunel N, Wang XJ. What determines the frequency of fast network oscillations with irregular neural discharges? I. Synaptic dynamics and excitation-inhibition balance. J Neurophysiol. 2003;90(1):415–430. pmid:12611969
  36. 36. Taxidis J, Coombes S, Mason R, Owen MR. Modeling sharp wave-ripple complexes through a CA3-CA1 network model with chemical synapses. Hippocampus. 2012;22:995–1017. pmid:21452258
  37. 37. Donoso JR, Schmitz D, Maier N, Kempter R. Hippocampal ripple oscillations and inhibition-first network models: Frequency dynamics and response to GABA modulators. J Neurosci. 2018;38(12):3124–3146. pmid:29453207
  38. 38. Ramirez-Villegas JF, Willeke KF, Logothetis NK, Besserve M. Dissecting the synapse- and frequency-dependent network mechanisms of in vivo hippocampal sharp wave-ripples. Neuron. 2018;100(5):1224–1240.e13. pmid:30482688
  39. 39. Melonakos ED, White JA, Fernandez FR. A model of cholinergic suppression of hippocampal ripples through disruption of balanced excitation/inhibition. Hippocampus. 2018;29(9):1–14.
  40. 40. Braun W, Memmesheimer RM. High-frequency oscillations and sequence generation in two-population models of hippocampal region CA1. PLoS Comput Biol. 2022;18(2):e1009891. pmid:35176028
  41. 41. Malerba P, Krishnan GP, Fellous JM, Bazhenov M. Hippocampal CA1 ripples as inhibitory transients. PLOS Comput Biol. 2016;12(4):e1004880. pmid:27093059
  42. 42. Liu AA, Henin S, Abbaspoor S, Bragin A, Buffalo EA, Farrell JS, et al. A consensus statement on detection of hippocampal sharp wave ripples and differentiation from other fast oscillations. Nat Commun. 2022;13 (6000). pmid:36224194
  43. 43. Stark E, Roux L, Eichler R, Senzai Y, Royer S, Buzsáki G. Pyramidal cell-interneuron interactions underlie hippocampal ripple oscillations. Neuron. 2014;83(2):467–480. pmid:25033186
  44. 44. Ponomarenko AA, Korotkova TM, Sergeeva OA, Haas HL. Multiple GABAA receptor subtypes regulate hippocampal ripple oscillations. Eur J Neurosci. 2004;20(8):2141–2148. pmid:15450093
  45. 45. Nguyen DP, Kloosterman F, Barbieri R, Brown EN, Wilson MA. Characterizing the dynamic frequency structure of fast oscillations in the rodent hippocampus. Front Integr Neurosci. 2009;3. pmid:19562084
  46. 46. Hulse BK, Moreaux LC, Lubenov EV, Siapas AG. Membrane potential dynamics of CA1 pyramidal neurons during hippocampal ripples in awake mice. Neuron. 2016;89(4):800–813. pmid:26889811
  47. 47. Brunel N, Hakim V. Sparsely synchronized neuronal oscillations. Chaos. 2008;18(1):015113. pmid:18377094
  48. 48. Csicsvari J, Hirase H, Czurkó A, Mamiya A, Buzsáki G. Oscillatory coupling of hippocampal pyramidal cells and interneurons in the behaving rat. J Neurosci. 1999;19(1):274–287. pmid:9870957
  49. 49. Risken H. The Fokker-Planck Equation: Methods of solution and applications. 2nd ed. Springer; 1989.
  50. 50. Gardiner CW. Handbook of Stochastic Methods for Physics, Chemistry and the Natural Sciences. 2nd ed. Springer; 1985.
  51. 51. Abbott LF, van Vreeswijk C. Asynchronous states in networks of pulse-coupled oscillators. Phys Rev E. 1993;48(2):1483–1490. pmid:9960738
  52. 52. Brunel N. Dynamics of sparsely connected networks of excitatory and inhibitory neurons. J Comput Neurosci. 2000;8:183–208. pmid:10809012
  53. 53. Lindner B, Schimansky-Geier L. Transmission of noise coded versus additive signals through a neuronal ensemble. Phys Rev Lett. 2001;86(14):2934–2937. pmid:11290076
  54. 54. Fourcaud N, Brunel N. Dynamics of the Firing Probability of Noisy Integrate-and-Fire Neurons. Neural Comput. 2002;14(9):2057–2110. pmid:12184844
  55. 55. Brunel N, Hakim V, Richardson MJE. Firing-rate resonance in a generalized integrate-and-fire neuron with subthreshold resonance. Phys Rev E. 2003;67(5):051916.
  56. 56. Goedeke S, Diesmann M. The mechanism of synchronization in feed-forward neuronal networks. New J Phys. 2008;10(1):015007.
  57. 57. Plesser HE, Gerstner W. Noise in Integrate-and-Fire Neurons: From Stochastic Input to Escape Rates. Neural Comput. 2000;12(2):367–384. pmid:10636947
  58. 58. Chizhov AV, Graham LJ. Population model of hippocampal pyramidal neurons, linking a refractory density approach to conductance-based neurons. Phys Rev E. 2007;75(1):011924. pmid:17358201
  59. 59. Schwalger T. Mapping input noise to escape noise in integrate-and-fire neurons: a level-crossing approach. Biol Cybern. 2021;115(5):539–562. pmid:34668051
  60. 60. Schieferstein N. Hippocampal ripple oscillations in inhibitory network models [PhD Thesis]. Humboldt University Berlin; 2023.
  61. 61. Kohus Z, Káli S, Rovira-Esteban L, Schlingloff D, Papp O, Freund TF, et al. Properties and dynamics of inhibitory synaptic communication within the CA3 microcircuits of pyramidal cells and interneurons expressing parvalbumin or cholecystokinin. J Physiol. 2016;594(13):3745–3774. pmid:27038232
  62. 62. Pawelzik H, Hughes DI, Thomson AM. Physiological and morphological diversity of immunocytochemically defined parvalbumin- and cholecystokinin-positive interneurones in CA1 of the adult rat hippocampus. J Comp Neurol. 2002;443(4):346–367. pmid:11807843
  63. 63. Chizhov AV, Graham LJ. Efficient evaluation of neuron populations receiving colored-noise current based on a refractory density method. Phys Rev E. 2008;77(1):011910. pmid:18351879
  64. 64. Tchumatchenko T, Malyshev A, Geisel T, Volgushev M, Wolf F. Correlations and synchrony in threshold neuron models. Phys Rev Lett. 2010;104(5):058102. pmid:20366796
  65. 65. Badel L. Firing statistics and correlations in spiking neurons: A level-crossing approach. Phys Rev E. 2011;84(4):041919. pmid:22181187
  66. 66. Börgers C. The PING Model of Gamma Rhythms. In: An Introd. to Model. Neuronal Dyn. Springer; 2017. p. 249–261.
  67. 67. Wang XJ, Buzsáki G. Gamma oscillation by synaptic inhibition in a hippocampal interneuronal network model. J Neurosci. 1996;16(20):6402–6413. pmid:8815919
  68. 68. Diesmann M, Gewaltig MO, Aertsen A. Stable propagation of synchronous spiking in cortical neural networks. Nature. 1999;402(December):529–533. pmid:10591212
  69. 69. Lewis CM, Ni J, Wunderle T, Jendritza P, Lazar A, Diester I, et al. Cortical gamma-band resonance preferentially transmits coherent input. Cell Rep. 2021;35(5):109083. pmid:33951439
  70. 70. Bartos M, Vida I, Jonas P. Synaptic mechanisms of synchronized gamma oscillations in inhibitory interneuron networks. Nat Rev Neurosci. 2007;8(1):45–56. pmid:17180162
  71. 71. Brunel N, Hansel D. How Noise Affects the Synchronization Properties of Recurrent Networks of Inhibitory Neurons. Neural Comput. 2006;18(5):1066–1110. pmid:16595058
  72. 72. Hajos N, Karlocai MR, Nemeth B, Ulbert I, Monyer H, Szabo G, et al. Input-output features of anatomically identified CA3 neurons during hippocampal sharp wave/ripple oscillation in vitro. J Neurosci. 2013;33(28):11677–11691. pmid:23843535
  73. 73. Csicsvari J, Hirase H, Czurkó A, Mamiya A, Buzsáki G. Fast network oscillations in the hippocampal CA1 region of the behaving rat. J Neurosci. 1999;19(16):RC20. pmid:10436076
  74. 74. Klausberger T, Magill PJ, Márton LF, Roberts JDB, Cobden PM, Buzsáki G, et al. Brain-state- and cell-type- specific firing of hippocampal interneurons in vivo. Nature. 2003;421:844–848. pmid:12594513
  75. 75. Klausberger T, Somogyi P. Neuronal diversity and temporal dynamics: The unity of hippocampal circuit operations. Science. 2008;321(5885):53–57. pmid:18599766
  76. 76. Varga C, Golshani P, Soltesz I. Frequency-invariant temporal ordering of interneuronal discharges during hippocampal oscillations in awake mice. Proc Natl Acad Sci USA. 2012;109(40):E2726–E2734. pmid:23010933
  77. 77. Lapray D, Lasztoczi B, Lagler M, Viney TJ, Katona L, Valenti O, et al. Behavior-dependent specialization of identified hippocampal interneurons. Nat Neurosci. 2012;15(9):1265–1271. pmid:22864613
  78. 78. Katona L, Lapray D, Viney TJ, Oulhaj A, Borhegyi Z, Micklem BR, et al. Sleep and movement differentiates actions of two types of somatostatin-expressing GABAergic interneuron in rat hippocampus. Neuron. 2014;82(4):872–886. pmid:24794095
  79. 79. Jahnke S, Timme M, Memmesheimer RM. A unified dynamic model for learning, replay, and sharp-wave/ripples. J Neurosci. 2015;35(49):16236–16258. pmid:26658873
  80. 80. Shu Y, Hasenstaub A, Duque A, Yu Y, McCormick DA. Modulation of intracortical synaptic potentials by presynaptic somatic membrane potential. Nature. 2006;441(7094):761–765. pmid:16625207
  81. 81. Shu Y, Yu G, Yang J, McCormick DA. Selective control of cortical axonal spikes by a slowly inactivating K+ current. Proc Natl Acad Sci USA. 2007;104(27):11453–11458. pmid:17581873
  82. 82. Traub RD, Whittington MA, Buhl EH, Jefferys JGR, Faulkner HJ. On the mechanism of the γ→β frequency shift in neuronal oscillations induced in rat hippocampal slices by tetanic stimulation. J Neurosci. 1999;19(3):1088–1105. pmid:9920671
  83. 83. Buzsáki G, Wang XJ. Mechanisms of gamma oscillations. Annu Rev Neurosci. 2012;35:203–225. pmid:22443509
  84. 84. Perumal MB, Latimer B, Xu L, Stratton P, Nair S, Sah P. Microcircuit mechanisms for the generation of sharp-wave ripples in the basolateral amygdala: A role for chandelier interneurons. Cell Rep. 2021;35(6):109106. pmid:33979609
  85. 85. Buzsáki G, Anastassiou CA, Koch C. The origin of extracellular fields and currents—EEG, ECoG, LFP and spikes. Nat Rev Neurosci. 2016;13(6):407–420.
  86. 86. Gerstner W, Kistler WM. Spiking Neuron Models. Cambridge University Press; 2002.
  87. 87. Goodman DFM, Brette R. The brian simulator. Front Neurosci. 2009;3(2):192–197. pmid:20011141
  88. 88. Meyer R, Obermayer K. Pypet: A python toolkit for data management of parameter explorations. Front Neuroinform. 2016;10:38. pmid:27610080
  89. 89. Uhlenbeck GE, Ornstein LS. On the Theory of the Brownian Motion. Phys Rev. 1930;36(5):823–841.
  90. 90. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal Dynamics: From single neurons to networks and models of cognition and beyond. Cambridge University Press; 2014.
  91. 91. Holden AV. Models of the Stochastic Activity of Neurones. Berlin: Springer-Verlag; 1976.
  92. 92. Brunel N, Chance FS, Fourcaud N, Abbott LF. Effects of synaptic noise and filtering on the frequency response of spiking neurons. Phys Rev Lett. 2001;86(10):2186–2189. pmid:11289886