LIF Dynamics
The Leaky Integrate-and-Fire (LIF) neuron is the workhorse model of computational neuroscience. It captures the essential dynamics of a biological neuron — passive membrane decay, synaptic integration, and threshold-triggered spiking — while remaining analytically tractable and computationally efficient. This page develops the classical LIF model, extends it to fractional order, derives the Grunwald-Letnikov discretization used in SPIRES, and analyzes the effects of the fractional order on neural dynamics.
The Classical LIF Neuron
Biophysical Form
A biological neuron’s membrane can be modeled as a parallel RC circuit. The membrane capacitance $C_m$ charges in response to input current $I(t)$, while the membrane resistance $R_m$ causes passive leakage toward the resting potential $V_{rest}$. The resulting equation is:

$$C_m \frac{dV}{dt} = -\frac{V(t) - V_{rest}}{R_m} + I(t) \tag{1}$$

When the membrane potential reaches the threshold $V_{th}$, the neuron emits a spike and $V$ is reset to $V_{reset}$. The membrane time constant is $\tau_m = R_m C_m$.
Normalized Form
Dividing Equation (1) by $C_m$ and using $\tau_m = R_m C_m$:

$$\frac{dV}{dt} = -\frac{V(t) - V_{rest}}{\tau_m} + \frac{I(t) + I_{bias}}{C_m} \tag{2}$$

where $I_{bias}$ is an optional bias current. This is the standard form used in most reservoir computing implementations. The dynamics are straightforward:
- Leak: The term $-(V - V_{rest})/\tau_m$ drives $V$ toward the resting potential with time constant $\tau_m$.
- Integration: The input term $(I(t) + I_{bias})/C_m$ charges the membrane.
- Spike-and-reset: When $V \geq V_{th}$, emit a spike and set $V \leftarrow V_{reset}$.
The solution in the absence of spiking and for constant input $I$ is an exponential approach to equilibrium:

$$V(t) = V_{\infty} + (V_0 - V_{\infty})\, e^{-t/\tau_m}, \qquad V_{\infty} = V_{rest} + \frac{\tau_m (I + I_{bias})}{C_m}$$
The exponential decay means the classical LIF neuron has a single characteristic timescale. Memory of past inputs fades exponentially — fast and uniform.
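The exponential approach and the spike-and-reset rule can be checked with a few lines of forward-Euler simulation. The sketch below is illustrative only — the parameter values and function name are assumptions for this example, not SPIRES defaults:

```python
def simulate_lif(i_ext, t_end=0.1, dt=1e-4, tau_m=0.02, c_m=1e-9,
                 v_rest=-0.070, v_th=-0.050, v_reset=-0.070):
    """Forward-Euler integration of the normalized LIF equation (2)."""
    v, spikes = v_rest, []
    for n in range(int(t_end / dt)):
        # leak toward v_rest plus input charging
        v += dt * (-(v - v_rest) / tau_m + i_ext / c_m)
        if v >= v_th:
            spikes.append(n * dt)   # record spike time
            v = v_reset             # hard reset
    return v, spikes
```

With a subthreshold current the voltage settles at $V_\infty = V_{rest} + \tau_m I/C_m$ and no spikes occur; a suprathreshold current produces regular spiking.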
The Fractional LIF (FLIF) Neuron
Motivation
The exponential decay of the classical LIF is at odds with biological observations. Cortical neurons exhibit power-law adaptation, ion channels display non-Markovian kinetics, and dendritic processing involves anomalous subdiffusion. All of these phenomena are naturally described by fractional-order dynamics.
The fractional LIF (FLIF) model replaces the integer-order time derivative with a Caputo fractional derivative of order $\alpha \in (0, 1]$:

$$ {}^{C}\!D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(s)}{(t - s)^{\alpha}}\, ds $$

Biophysical Form

$$ C_m\, {}^{C}\!D_t^{\alpha} V(t) = -\frac{V(t) - V_{rest}}{R_m} + I(t) \tag{3} $$

Normalized Form

$$ {}^{C}\!D_t^{\alpha} V(t) = -\frac{V(t) - V_{rest}}{\tau_m} + \frac{I(t) + I_{bias}}{C_m} \tag{4} $$

When $\alpha = 1$, the Caputo derivative reduces to the ordinary derivative and Equation (4) reduces to the classical LIF. When $\alpha < 1$, the neuron acquires a power-law memory kernel: the present voltage depends on the entire history of inputs and states, weighted by $(t - s)^{-\alpha}/\Gamma(1-\alpha)$.
Free Response and Mittag-Leffler Decay
For the unforced FLIF ($I(t) = 0$, $I_{bias} = 0$) with initial condition $V(0) = V_0$, the solution is:

$$V(t) = V_{rest} + (V_0 - V_{rest})\, E_{\alpha}\!\left(-(t/\tau_m)^{\alpha}\right)$$

where $E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(\alpha k + 1)}$ is the Mittag-Leffler function. This function interpolates between two extremes:
- For $\alpha = 1$: $E_1(-t/\tau_m) = e^{-t/\tau_m}$ (exponential decay)
- For small $t$: $E_{\alpha}\!\left(-(t/\tau_m)^{\alpha}\right) \approx \exp\!\left(-\frac{(t/\tau_m)^{\alpha}}{\Gamma(1+\alpha)}\right)$ (stretched exponential)
- For large $t$: $E_{\alpha}\!\left(-(t/\tau_m)^{\alpha}\right) \sim \frac{(t/\tau_m)^{-\alpha}}{\Gamma(1-\alpha)}$ (power-law tail)
The power-law tail is the key feature. While the classical LIF forgets its initial condition exponentially fast, the FLIF retains a memory that decays only algebraically. This slow forgetting is precisely the behavior needed to capture long-range temporal dependencies.
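For moderate arguments, $E_\alpha(z)$ can be evaluated directly from its power series. The sketch below is a minimal illustration, not a production routine (large $|z|$ requires a dedicated algorithm such as Garrappa's):

```python
import math

def mittag_leffler(alpha, z, n_terms=80):
    """E_alpha(z) = sum_k z**k / Gamma(alpha*k + 1), truncated series.

    Adequate for moderate |z|; the series suffers cancellation for
    large negative arguments.
    """
    return sum(z**k / math.gamma(alpha * k + 1) for k in range(n_terms))
```

For $\alpha = 1$ this recovers $e^z$, and for $\alpha = 1/2$ it satisfies the known identity $E_{1/2}(-x) = e^{x^2}\,\mathrm{erfc}(x)$; comparing the two at the same time point shows the fractional decay sitting well above the exponential.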
Grunwald-Letnikov Discretization
Derivation
To simulate the FLIF numerically, we use the Grunwald-Letnikov (GL) discretization. Starting from the GL definition of the fractional derivative and setting $t_n = n\,\Delta t$:

$$D_t^{\alpha} V(t_n) \approx \frac{1}{\Delta t^{\alpha}} \sum_{k=0}^{n} c_k^{(\alpha)} V_{n-k} \tag{5}$$

where $c_k^{(\alpha)} = (-1)^k \binom{\alpha}{k}$ are the GL coefficients. Substituting into the FLIF equation (4), with the right-hand side evaluated explicitly at step $n-1$:

$$\frac{1}{\Delta t^{\alpha}} \sum_{k=0}^{n} c_k^{(\alpha)} V_{n-k} = -\frac{V_{n-1} - V_{rest}}{\tau_m} + \frac{I_{n-1} + I_{bias}}{C_m}$$

Rearranging to isolate $V_n$ (using $c_0^{(\alpha)} = 1$):

$$V_n = \Delta t^{\alpha} \left[-\frac{V_{n-1} - V_{rest}}{\tau_m} + \frac{I_{n-1} + I_{bias}}{C_m}\right] - \sum_{k=1}^{n} c_k^{(\alpha)} V_{n-k} \tag{6}$$
This is the FLIF-GL update rule implemented in SPIRES. At each time step, the new voltage is computed from two contributions:
- Instantaneous dynamics: The first term captures the leak and input at the current time step, scaled by $\Delta t^{\alpha}$.
- History correction: The summation over past voltages, weighted by the GL coefficients, encodes the fractional memory.
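A minimal sketch of this update rule in Python follows; the names, defaults, and calling convention are illustrative assumptions, not the SPIRES API, and the GL coefficients are supplied precomputed:

```python
def flif_gl_step(v_hist, i_in, coeffs, alpha, dt,
                 tau_m=20.0, c_m=1.0, v_rest=0.0, i_bias=0.0):
    """One FLIF-GL step.

    v_hist: past voltages, most recent first (V_{n-1}, V_{n-2}, ...).
    coeffs: GL coefficients c_0, c_1, c_2, ... with c_0 = 1.
    """
    v_prev = v_hist[0]
    # instantaneous dynamics at step n-1, scaled by dt**alpha
    drive = dt**alpha * (-(v_prev - v_rest) / tau_m + (i_in + i_bias) / c_m)
    # history correction: sum_{k>=1} c_k * V_{n-k}
    memory = sum(c * v for c, v in zip(coeffs[1:], v_hist))
    return drive - memory
```

With $\alpha = 1$ (coefficients $1, -1, 0, \ldots$) this collapses to the forward-Euler step $V_n = V_{n-1} + \Delta t\, f(V_{n-1}, I_{n-1})$.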
GL Coefficient Computation
The GL coefficients are defined by the generalized binomial coefficients:

$$c_k^{(\alpha)} = (-1)^k \binom{\alpha}{k} = (-1)^k \frac{\Gamma(\alpha + 1)}{\Gamma(k + 1)\, \Gamma(\alpha - k + 1)}$$

These can be computed efficiently via the recurrence:

$$c_0^{(\alpha)} = 1, \qquad c_k^{(\alpha)} = \left(1 - \frac{\alpha + 1}{k}\right) c_{k-1}^{(\alpha)}$$
The first several coefficients for representative values of $\alpha$:

| $k$ | $\alpha = 1.0$ | $\alpha = 0.75$ | $\alpha = 0.5$ | $\alpha = 0.25$ |
|---|---|---|---|---|
| 0 | 1.000 | 1.000 | 1.000 | 1.000 |
| 1 | $-1.000$ | $-0.750$ | $-0.500$ | $-0.250$ |
| 2 | 0.000 | $-0.094$ | $-0.125$ | $-0.094$ |
| 3 | 0.000 | $-0.039$ | $-0.063$ | $-0.055$ |
| 4 | 0.000 | $-0.022$ | $-0.039$ | $-0.038$ |
Note that for $\alpha = 1$, only $c_0$ and $c_1$ are nonzero, and the update rule reduces to the classical Euler forward step. For $\alpha < 1$, the coefficients are nonzero for all $k$, encoding the infinite memory of the fractional operator.
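The recurrence is a one-liner per coefficient; a minimal sketch that reproduces the coefficient values above:

```python
def gl_coeffs(alpha, n):
    """First n GL coefficients via c_k = (1 - (alpha + 1)/k) * c_{k-1}."""
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c
```

For example, `gl_coeffs(0.5, 5)` yields `[1.0, -0.5, -0.125, -0.0625, -0.0390625]`, while for $\alpha = 1$ every coefficient beyond $c_1$ is exactly zero.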
History Length
In principle, the GL sum extends over the entire history ($k = 0$ to $n$). In practice, the sum is truncated at a finite history length $L$. The truncation error decreases as:

$$\epsilon(L) \sim \mathcal{O}\!\left(L^{-\alpha}\right)$$

The slower the power-law decay (smaller $\alpha$), the more history must be retained for a given accuracy. In SPIRES, $L$ is a configurable parameter. Typical values range from 50 to 500, depending on the task’s temporal scale and the chosen $\alpha$.
The computational cost of the history correction is $\mathcal{O}(NL)$ per time step for a reservoir of $N$ neurons, making $L$ the primary knob for the memory-computation trade-off.
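Truncation is naturally implemented with a fixed-length buffer that silently drops history older than $L$ steps. The class below is a hedged sketch under assumed names, not the SPIRES implementation:

```python
from collections import deque

class TruncatedGLMemory:
    """Truncated fractional memory: keeps only the last L voltages."""

    def __init__(self, alpha, history_len):
        # GL coefficients c_1 .. c_L via the recurrence
        c, coeffs = 1.0, []
        for k in range(1, history_len + 1):
            c *= 1.0 - (alpha + 1.0) / k
            coeffs.append(c)
        self.coeffs = coeffs
        self.hist = deque(maxlen=history_len)  # most recent first

    def push(self, v):
        self.hist.appendleft(v)  # oldest entry falls off the right end

    def correction(self):
        # truncated sum_{k=1}^{L} c_k * V_{n-k}
        return sum(c * v for c, v in zip(self.coeffs, self.hist))
```

The `deque(maxlen=L)` makes the $\mathcal{O}(L)$ cost per neuron explicit: each step touches at most $L$ stored voltages, regardless of how long the simulation runs.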
Effects of $\alpha$ on Neural Dynamics
The fractional order $\alpha$ profoundly shapes the behavior of the FLIF neuron across multiple dimensions.
Membrane Potential Decay
- $\alpha = 1$: Exponential decay with time constant $\tau_m$. The neuron “forgets” its state on a single timescale.
- $\alpha < 1$: Mittag-Leffler decay — initially stretched-exponential, asymptotically power-law. The neuron retains a fading trace of its entire history.
Effective Rheobase Shift
The rheobase is the minimum sustained current required to bring the neuron to threshold. For the classical LIF, the rheobase is:

$$I_{rh} = \frac{V_{th} - V_{rest}}{R_m}$$

For the FLIF, the fractional derivative introduces an effective increase in the rheobase. Intuitively, the power-law memory kernel causes the neuron to “remember” its subthreshold state more persistently, which opposes the charging process. Lower values of $\alpha$ produce a higher effective rheobase, meaning the neuron requires stronger input to fire.
This rheobase shift has an important consequence for reservoir computing: lower $\alpha$ makes the reservoir more selective, responding only to sufficiently strong or persistent input patterns.
Input Sensitivity vs. Memory Retention
The fractional order $\alpha$ controls a fundamental trade-off between two desirable properties:
- Input sensitivity (high $\alpha$): The neuron responds rapidly to new inputs, making it an effective sensor of recent stimuli. However, it quickly forgets past events.
- Memory retention (low $\alpha$): The neuron maintains long traces of past inputs, enabling it to detect slow-varying patterns and long-range dependencies. However, it is less responsive to sudden changes.
This trade-off is not merely qualitative. It can be quantified precisely using information-theoretic measures, as discussed in Memory and Information Theory.
Frequency Response
The Laplace-domain transfer function of the FLIF neuron (subthreshold regime, taking $V_{rest} = 0$) is:

$$H(s) = \frac{V(s)}{I(s)} = \frac{1/C_m}{s^{\alpha} + 1/\tau_m}$$

This is a fractional-order low-pass filter. The roll-off rate is $-20\alpha$ dB/decade, compared to $-20$ dB/decade for the classical LIF. Lower $\alpha$ produces a shallower roll-off, meaning the FLIF neuron passes a broader range of frequencies — it is more sensitive to slow temporal components of the input.
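The roll-off rate can be checked by evaluating $|H(j\omega)|$ at two frequencies a decade apart, well above the corner frequency. The sketch below uses illustrative $\tau_m$ and $C_m$ values, not SPIRES defaults:

```python
import math

def gain_db(alpha, omega, tau_m=20.0, c_m=1.0):
    """Magnitude of H(jw) = (1/C_m) / ((jw)**alpha + 1/tau_m), in dB."""
    h = (1.0 / c_m) / ((1j * omega) ** alpha + 1.0 / tau_m)
    return 20.0 * math.log10(abs(h))
```

At high frequencies the gain falls by roughly $20\alpha$ dB per decade: about $-20$ dB for $\alpha = 1$ and about $-10$ dB for $\alpha = 0.5$.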
Spike-and-Reset Mechanism
The spike-and-reset rule is the same for classical and fractional LIF neurons:
- If $V(t) \geq V_{th}$: record a spike at time $t$.
- Set $V \leftarrow V_{reset}$.
- Optionally, enforce a refractory period $t_{ref}$ during which $V$ is held at $V_{reset}$.
An important subtlety arises with the GL discretization: when $V$ is reset, the history buffer retains the pre-reset voltage values. This means the fractional memory “remembers” the approach to threshold even after the reset, creating a form of spike aftereffect that influences subsequent dynamics. This is actually biologically realistic, as real neurons exhibit post-spike membrane potential trajectories that depend on the preceding interspike interval.
Summary
| Property | Classical LIF ($\alpha = 1$) | Fractional LIF ($\alpha < 1$) |
|---|---|---|
| Decay law | Exponential | Mittag-Leffler |
| Asymptotic tail | Exponential | Power-law |
| Memory of past | Single timescale | Infinite hierarchy of timescales |
| Rheobase | $(V_{th} - V_{rest})/R_m$ | Higher than classical |
| Frequency roll-off | $-20$ dB/decade | $-20\alpha$ dB/decade |
| Update cost per neuron | $\mathcal{O}(1)$ | $\mathcal{O}(L)$ |
| Parameters | $\tau_m$, $V_{th}$, $V_{reset}$ | Same + $\alpha$, $L$ |
References
- Teka, W. W., Marinov, T. M., & Bhatt, S. J. (2014). Fractional-order leaky integrate-and-fire model with long-term memory and power law dynamics. Computational and Mathematical Methods in Medicine, 2014.
- Teka, W. W., Upadhyay, R. K., & Mondal, A. (2017). Fractional-order leaky integrate-and-fire model: frequency adaptation and coincidence detection. Biosystems, 155, 32–42.
- Lundstrom, B. N., Higgs, M. H., Spain, W. J., & Fairhall, A. L. (2008). Fractional differentiation by neocortical pyramidal neurons. Nature Neuroscience, 11(11), 1335–1342.
- Podlubny, I. (1999). Fractional Differential Equations. Academic Press.
- Gorenflo, R., Kilbas, A. A., Mainardi, F., & Rogosin, S. V. (2014). Mittag-Leffler Functions, Related Topics and Applications. Springer.