Erdos-Renyi Random Graphs
The Erdos-Renyi random graph is the simplest network topology available in SPIRES. Each possible edge between neurons exists independently with a fixed probability $p$, producing a homogeneous random network with well-understood statistical properties.
Mathematical Definition
The Erdos-Renyi model $G(n, p)$ generates a graph on $n$ vertices where each of the $n(n-1)/2$ possible edges is included independently with probability $p$. For a directed graph (as used in SPIRES), each of the $n(n-1)$ possible directed edges is included independently with probability $p$.
In SPIRES, $n$ is the number of neurons (num_neurons) and $p$ is the connectivity parameter (connectivity). A connectivity of 0.1 means each directed edge exists with 10% probability.
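As a rough sketch of the generation step (illustrative only, not the SPIRES implementation; the function name, the `rand`-based sampling, and the dense row-major adjacency layout are all assumptions):

```c
#include <stdlib.h>

/* Fill an n x n adjacency matrix with a directed Erdos-Renyi graph:
 * each off-diagonal entry adj[i*n + j] is 1 with probability p.
 * Self-loops (i == j) are excluded, matching the n*(n-1) possible edges. */
void er_directed(unsigned char *adj, int n, double p, unsigned int seed)
{
    srand(seed);
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            double u = (double)rand() / ((double)RAND_MAX + 1.0);
            adj[i * n + j] = (i != j && u < p) ? 1 : 0;
        }
    }
}
```

Because every edge is an independent coin flip, the expected edge count is $p \cdot n(n-1)$, and a fixed seed reproduces the same graph.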
Degree Distribution
In an Erdos-Renyi graph, the degree of each vertex follows a binomial distribution:

$$P(k) = \binom{n-1}{k} p^k (1-p)^{n-1-k}$$

For large $n$ with $\lambda = np$ held constant, this converges to a Poisson distribution:

$$P(k) = \frac{\lambda^k e^{-\lambda}}{k!}$$

where $\lambda = np$ is the expected degree. The distribution is sharply peaked around its mean, meaning nearly all neurons have similar numbers of connections. There are no hubs (highly connected neurons) and very little degree heterogeneity.
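The binomial in-degree can be checked empirically with a small sketch (illustrative helpers, not SPIRES API; each in-degree is sampled as $n-1$ independent coin flips):

```c
#include <stdlib.h>

/* Sample one in-degree from Binomial(n-1, p) by direct coin flips,
 * mirroring how each potential incoming edge exists independently. */
int sample_in_degree(int n, double p)
{
    int k = 0;
    for (int j = 0; j < n - 1; j++)
        if ((double)rand() / ((double)RAND_MAX + 1.0) < p)
            k++;
    return k;
}

/* Mean in-degree over `trials` samples; converges to (n-1)*p ~ np. */
double mean_in_degree(int n, double p, int trials, unsigned int seed)
{
    srand(seed);
    long sum = 0;
    for (int t = 0; t < trials; t++)
        sum += sample_in_degree(n, p);
    return (double)sum / trials;
}
```

For $n = 500$ and $p = 0.1$ the empirical mean clusters tightly around $(n-1)p \approx 50$, illustrating the sharp peak of the distribution.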
Properties Relevant to Reservoir Computing
Short Path Lengths
The average shortest path length in an Erdos-Renyi graph scales logarithmically with network size:

$$\langle \ell \rangle \approx \frac{\ln n}{\ln np}$$

This means information can propagate from any neuron to any other in a small number of synaptic steps. For a reservoir with $n = 500$ and $p = 0.1$, the expected degree is 50 and the average path length is approximately 2.
Low Clustering
The clustering coefficient (the probability that two neighbors of a vertex are themselves connected) equals the edge probability:

$$C = p$$

This is typically low for sparse networks. For $p = 0.1$, only 10% of potential triangles are closed. The lack of local structure means there are no tightly connected subgroups or functional modules.
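This can be verified empirically with a sketch that generates an undirected $G(n, p)$ and measures its transitivity (illustrative code, not part of SPIRES):

```c
#include <stdlib.h>

/* Empirical global clustering coefficient (transitivity) of an undirected
 * Erdos-Renyi graph G(n, p): (3 * triangles) / (connected triples).
 * For Erdos-Renyi graphs this concentrates around p. */
double er_clustering(int n, double p, unsigned int seed)
{
    unsigned char *adj = calloc((size_t)n * n, 1);
    srand(seed);
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if ((double)rand() / ((double)RAND_MAX + 1.0) < p)
                adj[i * n + j] = adj[j * n + i] = 1;

    long triangles3 = 0;  /* each triangle counted once per center vertex */
    long triples = 0;     /* length-2 paths, counted by center vertex */
    for (int i = 0; i < n; i++) {
        long deg = 0;
        for (int j = 0; j < n; j++) deg += adj[i * n + j];
        triples += deg * (deg - 1) / 2;
        for (int j = 0; j < n; j++)
            for (int k = j + 1; k < n; k++)
                if (adj[i * n + j] && adj[i * n + k] && adj[j * n + k])
                    triangles3++;
    }
    free(adj);
    return triples ? (double)triangles3 / (double)triples : 0.0;
}
```

With $n = 200$ and $p = 0.1$ the measured transitivity lands close to 0.1, matching $C = p$.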
Giant Component Threshold
A phase transition occurs at $np = 1$, i.e. $p = 1/n$. Below this threshold, the graph consists of small disconnected components. Above it, a single giant component emerges that contains a fraction of all vertices approaching 1 as $np$ grows. For reservoir computing, the connectivity parameter should always be well above $1/n$ to ensure the reservoir is a connected network.
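The threshold can be observed with a small union-find sketch (illustrative, not SPIRES code) that measures the largest connected component of an undirected $G(n, p)$ on either side of $np = 1$:

```c
#include <stdlib.h>

static int uf_find(int *parent, int x)
{
    while (parent[x] != x) {
        parent[x] = parent[parent[x]];  /* path halving */
        x = parent[x];
    }
    return x;
}

/* Size of the largest connected component of an undirected G(n, p),
 * built edge-by-edge with union-by-size. */
int er_largest_component(int n, double p, unsigned int seed)
{
    int *parent = malloc(n * sizeof(int));
    int *size = malloc(n * sizeof(int));
    for (int i = 0; i < n; i++) { parent[i] = i; size[i] = 1; }
    srand(seed);
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if ((double)rand() / ((double)RAND_MAX + 1.0) < p) {
                int a = uf_find(parent, i), b = uf_find(parent, j);
                if (a != b) {
                    if (size[a] < size[b]) { int t = a; a = b; b = t; }
                    parent[b] = a;
                    size[a] += size[b];
                }
            }
    int best = 0;
    for (int i = 0; i < n; i++)
        if (parent[i] == i && size[i] > best) best = size[i];
    free(parent);
    free(size);
    return best;
}
```

For $n = 500$, a subcritical density ($np = 0.25$) leaves only small fragments, while a supercritical one ($np = 5$) yields a giant component containing nearly every vertex.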
Implications for Reservoir Computing
The Erdos-Renyi topology provides a baseline for reservoir performance. Its key characteristics are:
- Uniform mixing: All neurons contribute roughly equally to the dynamics. There is no hierarchical structure or preferred pathways for information flow.
- Predictable spectral properties: The eigenvalue distribution of the adjacency matrix follows the circular law (for random matrices) and is well-characterized analytically.
- Reproducibility: Given the same random seed, the network is fully determined by $n$ and $p$.
- No spatial structure: There is no notion of locality or neighborhood, which may be a disadvantage for tasks where spatial or modular organization of information is beneficial.
SPIRES API
The Erdos-Renyi topology is selected with:
```c
cfg.connectivity_type = SPIRES_CONN_RANDOM;
cfg.connectivity = 0.1; /* edge probability p */
```

Example Configuration
```c
spires_reservoir_config cfg = {
    .num_neurons = 500,
    .num_inputs = 1,
    .num_outputs = 1,
    .spectral_radius = 0.95,
    .ei_ratio = 0.8,
    .input_strength = 0.1,
    .connectivity = 0.1,
    .dt = 1.0,
    .connectivity_type = SPIRES_CONN_RANDOM,
    .neuron_type = SPIRES_NEURON_LIF_DISCRETE,
    .neuron_params = NULL,
};
```

Spectral Radius and Rescaling
After generating the random weight matrix $W$ according to the Erdos-Renyi topology, SPIRES computes its spectral radius $\rho(W)$ (the largest absolute eigenvalue) and rescales:

$$W \leftarrow \frac{\rho_{\text{target}}}{\rho(W)} \, W$$

where $\rho_{\text{target}}$ is the user-specified spectral_radius. This ensures that the echo state property and the dynamical regime are controlled independently of the topology and connectivity.
Excitatory-Inhibitory Balance
The ei_ratio parameter determines the fraction of neurons that are excitatory. For ei_ratio = 0.8, 80% of neurons produce positive (excitatory) outgoing weights and 20% produce negative (inhibitory) outgoing weights. This balance is applied after the random topology is generated and before spectral radius rescaling.
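A sketch of this sign-assignment step (an assumption about the mechanism: that signs are applied per outgoing row and that the excitatory count is truncated from ei_ratio * n is illustrative, not confirmed SPIRES behavior):

```c
#include <stdlib.h>

/* Apply excitatory/inhibitory signs to an n x n weight matrix (row-major,
 * row i = outgoing weights of neuron i): the first n_exc = ei_ratio * n
 * neurons get non-negative outgoing weights, the rest non-positive. */
void apply_ei_signs(double *W, int n, double ei_ratio)
{
    int n_exc = (int)(ei_ratio * n);
    for (int i = 0; i < n; i++) {
        double sign = (i < n_exc) ? 1.0 : -1.0;
        for (int j = 0; j < n; j++) {
            double mag = W[i * n + j] < 0 ? -W[i * n + j] : W[i * n + j];
            W[i * n + j] = sign * mag;
        }
    }
}
```

Assigning the sign per row (rather than per synapse) is what makes each neuron purely excitatory or purely inhibitory, as described above.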
The E/I balance affects the dynamics significantly: balanced networks (e.g., ei_ratio = 0.8) tend to produce richer, more variable activity patterns than purely excitatory networks, which is beneficial for the separation property of the reservoir.
Choosing the Connectivity Parameter
The connectivity parameter controls the density of the network:
| Connectivity $p$ | Expected Degree ($np$, for $n = 500$) | Character |
|---|---|---|
| 0.01 | 5 | Very sparse; risk of disconnection for small $n$ |
| 0.05 | 25 | Sparse; efficient computation |
| 0.10 | 50 | Moderate; good default |
| 0.20 | 100 | Dense; higher memory and computational cost |
| 0.50 | 250 | Very dense; diminishing returns |
For reservoir computing, sparse connectivity ($p \approx 0.05$ to $0.1$) generally performs as well as dense connectivity while being significantly more computationally efficient. The random mixing property of Erdos-Renyi graphs ensures good information propagation even at low densities.
When to Use Erdos-Renyi
Choose the Erdos-Renyi topology when:
- Simplicity is desired: It is the easiest topology to reason about and has the fewest parameters (just $p$).
- Establishing a baseline: Compare against small-world and scale-free to determine if structured connectivity improves performance on your task.
- Tasks lack spatial structure: For problems without inherent modularity or locality, the uniform connectivity of Erdos-Renyi is a natural choice.
- Theory-guided experiments: When you want to relate reservoir behavior to analytical results from random matrix theory.
For tasks that benefit from local clustering (e.g., spatiotemporal pattern recognition), consider Watts-Strogatz. For tasks that benefit from hub neurons and heterogeneous degree distributions, consider Barabasi-Albert.