
Fractional Calculus

Fractional calculus extends the familiar operations of differentiation and integration to non-integer orders. Where classical calculus asks “what is the first derivative?”, fractional calculus asks “what is the $\alpha$-th derivative, for $\alpha = 0.7$?” This generalization, far from being a mathematical curiosity, turns out to be the natural language for describing systems with memory — systems whose present state depends not only on the immediate past but on their entire history.

This page introduces the three principal definitions of fractional derivatives used in the literature, explains their properties and relationships, and motivates their application to neural dynamics.

Biological Motivation

Three lines of experimental evidence point to fractional-order dynamics in biological neural systems.

Power-law spike-frequency adaptation. Cortical neurons adapt their firing rates according to power laws rather than exponentials. Lundstrom et al. (2008) demonstrated that pyramidal neurons in rat somatosensory cortex exhibit spike-frequency adaptation that follows $t^{-\alpha}$ over multiple timescales, precisely the signature of a fractional-order process. An integer-order model would require an implausible cascade of exponential time constants to reproduce this behavior.

Anomalous subdiffusion of ions. Ion transport in dendritic spines does not follow Fick’s second law (classical diffusion). Instead, ions exhibit anomalous subdiffusion, where the mean squared displacement grows as $\langle x^2 \rangle \propto t^\alpha$ with $\alpha < 1$. Henry et al. (2008) showed that this subdiffusive regime is well modeled by fractional-order cable equations, with the fractional exponent capturing the tortuous geometry of dendritic arbors.

Natural stimuli and 1/f spectra. Natural sensory signals — speech, visual scenes, and environmental sounds — exhibit power spectra that decay as $1/f^\beta$. A fractional derivative of order $\alpha$ acts as a high-pass filter with transfer function proportional to $(j\omega)^\alpha$, effectively whitening a $1/f^\alpha$ spectrum. This suggests that fractional differentiation may be a computational primitive of sensory processing, tuned to the statistical structure of the environment.

The Three Definitions

Riemann-Liouville Fractional Derivative

The Riemann-Liouville (RL) definition generalizes the Cauchy formula for repeated integration. For $0 < \alpha < 1$:

$$
{}_{\text{RL}} D_t^\alpha \, x(t) \;=\; \frac{1}{\Gamma(1 - \alpha)} \, \frac{d}{dt} \int_0^t \frac{x(\tau)}{(t - \tau)^\alpha} \, d\tau \tag{1}
$$

The key idea is to first compute a fractional integral of order $(1-\alpha)$ and then take an ordinary first derivative. The Gamma function $\Gamma(\cdot)$ generalizes the factorial to non-integer arguments.

Properties:

  • The RL derivative of a constant is not zero: ${}_{\text{RL}} D_t^\alpha \, C = C \, t^{-\alpha} / \Gamma(1 - \alpha)$. This is counterintuitive and creates difficulties when specifying initial conditions.
  • The RL definition is the most general and subsumes classical integer-order derivatives when $\alpha \in \mathbb{N}$.
  • Initial conditions for RL fractional differential equations involve fractional integrals, which lack clear physical interpretation.

Caputo Fractional Derivative

The Caputo definition reverses the order of operations: first differentiate, then fractionally integrate. For $0 < \alpha < 1$:

$$
{}_C D_t^\alpha \, x(t) \;=\; \frac{1}{\Gamma(1 - \alpha)} \int_0^t \frac{\dot{x}(\tau)}{(t - \tau)^\alpha} \, d\tau \tag{2}
$$

Here $\dot{x}(\tau) = dx/d\tau$ is the ordinary first derivative of $x$.

Properties:

  • The Caputo derivative of a constant is zero, matching physical intuition.
  • Initial conditions for Caputo fractional differential equations are specified in terms of integer-order derivatives (e.g., $x(0) = x_0$), which have clear physical meaning as initial voltage, position, concentration, etc.
  • Requires $x(t)$ to be differentiable, whereas RL does not. This is a mild restriction for smooth physical signals but matters for discontinuous inputs.
  • When $x(0) = 0$, the Caputo and RL definitions coincide.

The Caputo form is preferred for modeling physical systems because it admits standard initial conditions.

Grunwald-Letnikov Fractional Derivative

The Grunwald-Letnikov (GL) definition generalizes the limit definition of the classical derivative. For any $\alpha > 0$:

$$
{}_{\text{GL}} D_t^\alpha \, x(t) \;=\; \lim_{h \to 0} \frac{1}{h^\alpha} \sum_{k=0}^{\lfloor t/h \rfloor} (-1)^k \binom{\alpha}{k} x(t - kh) \tag{3}
$$

where the generalized binomial coefficients are:

$$
\binom{\alpha}{k} = \frac{\alpha (\alpha - 1)(\alpha - 2) \cdots (\alpha - k + 1)}{k!} \tag{4}
$$

Properties:

  • The GL definition is equivalent to the RL definition under mild smoothness conditions.
  • It is inherently a discrete approximation — the sum over past values at intervals of $h$ directly yields a numerical scheme when $h$ is set to the simulation time step $\Delta t$.
  • Each term in the sum weights a past state $x(t - kh)$ by a coefficient that decays as a power law, encoding the non-local memory of the fractional derivative.
  • The GL form is the preferred basis for numerical simulation, since it translates directly into an update rule without requiring quadrature of singular integrals.

The Memory Kernel

All three definitions share a fundamental property: non-locality. The fractional derivative at time $t$ depends on the entire history of $x(\tau)$ for $\tau \in [0, t]$, weighted by a power-law kernel $(t - \tau)^{-\alpha}$.

This is in stark contrast to integer-order derivatives, which are local operations. The first derivative $dx/dt$ depends only on the infinitesimal neighborhood of $t$. The fractional derivative $D_t^\alpha x(t)$ depends on the entire past.

The weighting kernel has the form:

$$
K(t - \tau) = \frac{(t - \tau)^{-\alpha}}{\Gamma(1 - \alpha)} \tag{5}
$$

This kernel is:

  • Singular at $\tau = t$ (recent history is weighted most heavily)
  • Heavy-tailed as $\tau \to 0$ (distant history is never fully forgotten)
  • Tunable via $\alpha$: as $\alpha \to 1$, the kernel becomes increasingly concentrated near $\tau = t$, recovering the local behavior of an ordinary derivative

The power-law nature of this kernel means that fractional-order systems have memory that fades algebraically, not exponentially. This is precisely the behavior observed in biological neurons (power-law adaptation) and in anomalous diffusion.
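The contrast between algebraic and exponential fading can be made concrete by evaluating the kernel of Equation (5) at a few lags. This is a minimal plain-Python sketch; the helper name `kernel` is illustrative, not part of any library:

```python
import math

alpha = 0.5  # example fractional order

def kernel(lag):
    """Power-law memory kernel K(lag) = lag^(-alpha) / Gamma(1 - alpha)."""
    return lag ** (-alpha) / math.gamma(1.0 - alpha)

# Algebraic fading: a tenfold increase in lag shrinks the weight only by
# a factor of 10**alpha (about 3.16 here). An exponential kernel matched
# at lag 1 would shrink by a factor of e**9 (about 8100) over the same span.
print(kernel(1.0) / kernel(10.0))
print(math.exp(-1.0) / math.exp(-10.0))
```

The power-law kernel thus retains non-negligible weight on events far in the past, which is what lets a single fractional operator reproduce multi-timescale adaptation.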

Properties of Fractional Derivatives

Several properties of fractional derivatives are essential for working with fractional-order neural models.

Linearity. Fractional derivatives are linear operators:

$$
D_t^\alpha \bigl[a \, x(t) + b \, y(t)\bigr] = a \, D_t^\alpha x(t) + b \, D_t^\alpha y(t)
$$

Composition (semigroup property). For Riemann-Liouville derivatives under appropriate conditions:

$$
D_t^\alpha \, D_t^\beta \, x(t) = D_t^{\alpha + \beta} \, x(t)
$$

This property does not hold in general for the Caputo definition due to the handling of initial conditions.

Laplace transform. The Laplace transform of the Caputo derivative is:

$$
\mathcal{L}\bigl\{{}_C D_t^\alpha x(t)\bigr\} = s^\alpha X(s) - s^{\alpha - 1} x(0)
$$

This is the fractional analog of $\mathcal{L}\{dx/dt\} = sX(s) - x(0)$ and is the key tool for analyzing fractional-order systems in the frequency domain. The factor $s^\alpha$ in the transfer function produces the high-pass filtering behavior that whitens $1/f^\alpha$ signals.
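A quick numerical check of this filtering behavior: evaluating the factor $s^\alpha$ on the imaginary axis $s = j\omega$ gives a power-law gain $\omega^\alpha$ and a constant phase lead of $\alpha\pi/2$. The sketch below uses plain Python complex arithmetic; the choice $\alpha = 0.7$ is arbitrary:

```python
import cmath
import math

alpha = 0.7  # example order; any 0 < alpha < 1 behaves the same way

# The transfer factor s^alpha evaluated at s = j*omega has magnitude
# omega**alpha (power-law high-pass) and constant phase alpha*pi/2.
for omega in (0.5, 1.0, 10.0):
    h = (1j * omega) ** alpha
    print(omega, abs(h), omega ** alpha, cmath.phase(h), alpha * math.pi / 2)
```

The gain column grows as $\omega^{0.7}$, which is exactly the boost needed to flatten a spectrum decaying as $1/f^{0.7}$.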

Fractional derivative of a power function. For $\beta > 0$ and $0 < \alpha < 1$:

$$
{}_C D_t^\alpha \, t^\beta = \frac{\Gamma(\beta + 1)}{\Gamma(\beta - \alpha + 1)} \, t^{\beta - \alpha}
$$

For the excluded case $\beta = 0$ (a constant), the Caputo derivative is zero rather than what the formula would give; more generally, any polynomial of degree less than $\lceil \alpha \rceil$ is effectively a “constant” for the Caputo operator and differentiates to zero.
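The power rule is easy to verify numerically. The sketch below (the helper name `caputo_power` is illustrative) evaluates the formula for the classic half-derivative of $t$, which equals $2\sqrt{t/\pi}$:

```python
import math

def caputo_power(beta, alpha, t):
    """Right-hand side of the power rule: Caputo D^alpha applied to t**beta."""
    return math.gamma(beta + 1.0) / math.gamma(beta - alpha + 1.0) * t ** (beta - alpha)

# Classic special case: the half-derivative of t is 2*sqrt(t/pi).
t = 4.0
print(caputo_power(1.0, 0.5, t))     # Gamma(2)/Gamma(1.5) * sqrt(t)
print(2.0 * math.sqrt(t / math.pi))  # same value
```

Setting $\alpha = 1$ in the same helper recovers the ordinary derivative, e.g. $D_t^1 t^2 = 2t$.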

Why Grunwald-Letnikov for Simulation

The GL definition is the natural choice for numerical implementation of fractional-order neural dynamics for three reasons.

1. Direct discretization. Setting $h = \Delta t$ in Equation (3) immediately yields a discrete-time update rule. No numerical quadrature or special function evaluation is required.

2. Recursive coefficient computation. The GL coefficients $c_k(\alpha) = (-1)^k \binom{\alpha}{k}$ can be computed recursively:

$$
c_0(\alpha) = 1, \qquad c_k(\alpha) = \left(1 - \frac{\alpha + 1}{k}\right) c_{k-1}(\alpha) \tag{6}
$$

This avoids computing factorials or Gamma functions and is numerically stable.
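The recursion can be checked against the product form of the binomial coefficient in Equation (4). A minimal sketch, with illustrative helper names:

```python
def c_recursive(alpha, n):
    """First n GL coefficients c_k(alpha) via the recursion in Equation (6)."""
    cs = [1.0]
    for k in range(1, n):
        cs.append((1.0 - (alpha + 1.0) / k) * cs[-1])
    return cs

def c_direct(alpha, k):
    """Same coefficient as (-1)^k * binom(alpha, k), using the product form."""
    c = 1.0
    for j in range(k):
        c *= (j - alpha) / (j + 1)  # telescoped sign and factorial
    return c

alpha = 0.7
rec = c_recursive(alpha, 8)
print(all(abs(rec[k] - c_direct(alpha, k)) < 1e-12 for k in range(8)))
```

Both routes give the same alternating, power-law-decaying sequence; the recursive form needs one multiply and one divide per coefficient.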

3. Finite history truncation. In practice, the sum in Equation (3), whose number of terms grows without bound as $t$ increases, is truncated to a finite history length $L$:

$$
{}_{\text{GL}} D_t^\alpha \, x(t) \;\approx\; \frac{1}{\Delta t^\alpha} \sum_{k=0}^{L} (-1)^k \binom{\alpha}{k} x(t - k\Delta t)
$$

Because the coefficients decay as $|c_k(\alpha)| \sim k^{-\alpha-1}$ for large $k$, the truncation error decreases as a power law with $L$. In SPIRES, the history length $L$ is a configurable parameter that controls the trade-off between memory fidelity and computational cost.
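Putting Equations (3) and (6) together gives the truncated update rule. The following is a plain-Python illustration, not the SPIRES implementation; it checks the estimate against the exact half-derivative of $x(t) = t$ from the power rule:

```python
import math

def gl_frac_deriv(history, alpha, dt):
    """Truncated GL estimate of D^alpha x at the most recent sample.

    `history` lists x(t), x(t - dt), ..., x(t - L*dt), newest first.
    """
    total, c = 0.0, 1.0                       # c_0 = 1
    for k, xk in enumerate(history):
        total += c * xk
        c *= 1.0 - (alpha + 1.0) / (k + 1.0)  # recursion of Equation (6)
    return total / dt ** alpha

# Check against the power rule: D^{1/2} t = sqrt(t) / Gamma(1.5) at t = 1.
alpha, dt, t = 0.5, 1e-3, 1.0
history = [t - k * dt for k in range(int(t / dt) + 1)]  # x(s) = s, full history
numeric = gl_frac_deriv(history, alpha, dt)
exact = math.sqrt(t) / math.gamma(1.5)
print(numeric, exact)  # agree to roughly the step size
```

In a neuron simulation, `history` would be the stored membrane-state buffer of length $L + 1$, updated once per time step.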

Interpolation Between Integer Orders

The fractional order $\alpha$ provides a continuous interpolation between qualitatively different dynamical behaviors:

| $\alpha$ | Behavior | Memory Kernel | Analog |
| --- | --- | --- | --- |
| $\alpha \to 0^+$ | Pure integration | Uniform weighting | Infinite memory |
| $\alpha = 0.5$ | Half-derivative | $t^{-1/2}$ decay | Diffusion-like |
| $\alpha \to 1^-$ | Classical derivative | Delta-like | Markovian |

This tunability is the central advantage of fractional calculus for neural modeling. Rather than choosing between a neuron with exponential memory (integer-order) and an ad hoc multi-timescale architecture, a single parameter $\alpha$ smoothly controls the memory profile.
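The interpolation is visible directly in the GL coefficient magnitudes: smaller $\alpha$ leaves a heavier tail (longer memory), while $\alpha$ near 1 concentrates the weight on the most recent samples. A small illustrative sketch (the helper name `gl_weights` is not part of any library):

```python
def gl_weights(alpha, n):
    """Magnitudes |c_k| of the first n GL coefficients for order alpha."""
    out, c = [], 1.0
    for k in range(n):
        out.append(abs(c))
        c *= 1.0 - (alpha + 1.0) / (k + 1.0)  # recursion of Equation (6)
    return out

# Smaller alpha -> slower tail decay (longer memory); alpha near 1 ->
# weights vanish quickly past the first taps (nearly local, Markovian).
for a in (0.1, 0.5, 0.9):
    print(a, [round(w, 4) for w in gl_weights(a, 6)])
```

Sweeping $\alpha$ therefore tunes a neuron's effective memory horizon without changing the structure of the update rule.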

References

  1. Lundstrom, B. N., Higgs, M. H., Spain, W. J., & Fairhall, A. L. (2008). Fractional differentiation by neocortical pyramidal neurons. Nature Neuroscience, 11(11), 1335–1342.
  2. Henry, B. I., Langlands, T. A. M., & Wearne, S. L. (2008). Fractional cable models for spiny neuronal dendrites. Physical Review Letters, 100(12), 128103.
  3. Teka, W. W., Marinov, T. M., & Bhatt, S. J. (2014). Fractional-order leaky integrate-and-fire model with long-term memory and power law dynamics. Computational and Mathematical Methods in Medicine, 2014.
  4. Teka, W. W., Upadhyay, R. K., & Mondal, A. (2017). Fractional-order leaky integrate-and-fire model: frequency adaptation and coincidence detection. Biosystems, 155, 32–42.
  5. Podlubny, I. (1999). Fractional Differential Equations. Academic Press.
  6. Oldham, K. B., & Spanier, J. (1974). The Fractional Calculus. Academic Press.

