Introduction by Example
We briefly introduce the fundamental concepts of spikeDE through a simple example: training an SNN on the MNIST dataset using fractional-order dynamics. This tutorial assumes no prior knowledge of SNNs or differential equation solvers—everything you need will be explained along the way.
Recommended Reading
For an introduction to SNNs, we refer the interested reader to Training Spiking Neural Networks Using Lessons From Deep Learning.
What is spikeDE?
spikeDE is a PyTorch-based library designed to implement the Fractional-Order Spiking Neural Network (f-SNN) framework. Unlike traditional SNN libraries that rely on first-order Ordinary Differential Equations (ODEs) with Markovian properties—where the current state depends only on the immediate past—spikeDE governs neuron dynamics using Fractional-Order Differential Equations (FDEs). This approach is grounded in the observation that biological neurons often exhibit non-Markovian behaviors, such as power-law relaxation and long-range temporal correlations, which cannot be captured by integer-order models.
Crucially, spikeDE serves as a generalized framework that strictly encompasses traditional integer-order SNNs. By setting the fractional order \(\alpha = 1\), the library naturally recovers standard Leaky Integrate-and-Fire (LIF) and Integrate-and-Fire (IF) models, making it a superset of existing approaches rather than an alternative solver. When \(0 < \alpha < 1\), the Caputo fractional derivative introduces a power-law memory kernel, allowing the membrane potential to depend on its entire history. This capability enables the modeling of complex phenomena like persistent memory, fractal dendritic structures, and enhanced robustness to input perturbations, offering a more biologically plausible and mathematically rich foundation for spiking networks.
At its core, spikeDE provides:
- Fractional Neuron Models: Implementations of f-LIF and f-IF neurons that naturally encode long-term dependencies via fractional calculus.
- Generalized Wrapper (SNNWrapper): A flexible interface that converts any standard PyTorch network into an f-SNN, supporting both single-term and multi-term fractional dynamics.
- Advanced Numerical Solvers: Efficient discretization methods (e.g., fractional Adams–Bashforth–Moulton, Grünwald–Letnikov) tailored for non-local fractional operators.
- Trainable Fractional Orders: Options to learn the fractional order \(\alpha\) and memory coefficients end-to-end, allowing the network to adapt its temporal memory span automatically.
This allows researchers to move beyond simple recurrence and explore how non-Markovian dynamics, history-dependent evolution, and fractional temporal scaling enhance learning in spiking networks across vision, graph, and sequence tasks.
Step-by-Step Walkthrough: MNIST Classification with spikeDE
Below, we walk through the key components of the provided example script. You can run this code after installing spikeDE and PyTorch.
Note
The full script is designed as a standalone example. Only classes/functions imported from spikeDE (e.g., SNN, SNNWrapper, LIFNeuron) are part of the package. Everything else (data loading, model definitions like CNNExample, utility functions like spike_converter) is user-defined helper code written specifically for this demo.
Importing Required Modules
Here, only the last two lines involve spikeDE. The rest are standard PyTorch utilities for data handling and training loops.
Defining Your Base Network
Before wrapping a network with spikeDE, you define a standard PyTorch model using regular layers—but insert spiking neurons at activation points.
Key Insight
- This looks like a normal CNN, but instead of ReLU we use LIFNeuron.
- Each LIFNeuron maintains an internal membrane potential and emits spikes based on dynamics defined by tau (time constant), threshold, and a surrogate gradient for backpropagation.
- The actual spiking behavior is not computed here directly; it is handled later by SNNWrapper during time integration.
MLP-Based Network
You could similarly define an MLP:
Converting Static Inputs to Spike Trains
SNNs process temporal spike sequences, not static images. So we must convert each MNIST image into a series of spikes over time.
In the training loop, inputs are scaled (data = 10 * data) to increase spike rates—this is a common heuristic.
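The spike_converter helper is user-defined (see the note above); one common implementation is Bernoulli rate coding, where each pixel intensity becomes a per-step firing probability. A minimal sketch:

```python
import torch

def spike_converter(data, T=20):
    """Rate-code a static batch into a spike train of shape [T, B, ...].

    Each value (clamped into [0, 1]) is treated as the probability of
    emitting a spike at every one of the T time steps.
    """
    rates = data.clamp(0.0, 1.0)  # firing probability per time step
    expanded = rates.unsqueeze(0).expand(T, *rates.shape).contiguous()
    return torch.bernoulli(expanded)

images = torch.rand(8, 1, 28, 28)      # stand-in for an MNIST batch
spikes = spike_converter(10 * images)  # scaling boosts spike rates before clamping
print(spikes.shape)                    # torch.Size([20, 8, 1, 28, 28])
```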
Wrapping Your Model with SNNWrapper
This is where the fractional framework is applied. The SNNWrapper transforms your static network into a dynamical system driven by FDEs.
Key Parameters
- integrator: Chooses the solver type: 'odeint'/'odeint_adjoint' for classical ODEs (integer-order); 'fdeint'/'fdeint_adjoint' for FDEs.
- alpha: The fractional order (e.g., 0.5 for a single alpha, [0.3, 0.4, 0.5] for multi-alpha).
- multi_coefficient: Weights for each term (required if alpha has multiple values).
- learn_coefficient: If True, the coefficient(s) become trainable parameters.
- learn_alpha: If True, the \(\alpha\) value(s) become trainable parameters.
Training Loop: Time Integration Over Spikes
During training, static inputs are first encoded into temporal spike trains with shape [T, B, ...]. These sequences are then passed to the model alongside a time grid that defines the evolution interval for the fractional solver:
Key arguments explained
- data_time: Specifies the discrete time points \(t_0, t_1, \dots, t_T\) over which the differential equation is solved.
- method: Selects the numerical integration scheme (e.g., 'gl' for the Grünwald–Letnikov formula, suitable for capturing long-range memory).
- options: Configures solver-specific parameters. For instance, memory=-1 instructs the solver to utilize the full history of the state, which is essential for accurate fractional-order simulation.
The model returns a sequence of outputs corresponding to each time step. To obtain a single prediction for classification, we typically aggregate these temporal responses (e.g., via averaging or summing).
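The aggregation step can be sketched with a stand-in output tensor (the commented line shows where the actual f-SNN forward call would go; its exact signature is an assumption):

```python
import torch
import torch.nn.functional as F

T, B, C = 20, 8, 10
data_time = torch.linspace(0.0, 1.0, T)  # discrete time grid passed to the solver

# Stand-in for the f-SNN output: one logit tensor per time step, shape [T, B, C].
# With a real wrapped model this would be:
# outputs = model(spikes, data_time, method='gl', options={'memory': -1})
outputs = torch.randn(T, B, C)

logits = outputs.mean(dim=0)             # average the temporal responses
targets = torch.randint(0, C, (B,))
loss = F.cross_entropy(logits, targets)  # standard classification loss
print(logits.shape)                      # torch.Size([8, 10])
```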
Full Training Pipeline
The following block combines data loading, model instantiation, and the training loop. It demonstrates how to pass the time grid to the solver and handle the temporal outputs of the f-SNN.
(Standalone Code: the complete 142-line script is omitted from this excerpt.)
Next Steps
- Different Neuron Types: Try different neuron types (e.g., IFNeuron).
- Experiment with \(\alpha\): Try setting alpha=1.0 to compare against standard LIF, or alpha=0.6 for stronger memory effects.
- Learnable Orders: Enable learn_alpha=True to let the network discover the optimal memory depth per layer.
- Multi-term Dynamics: Explore multi_coefficient to simulate complex biological relaxation processes.
- Visualization: Plot the membrane potential over time to observe the power-law decay characteristic of fractional systems.
spikeDE opens the door to physics-informed spiking networks—where neural dynamics obey principled mathematical laws beyond simple recurrence. Happy spiking!