
spires_train_online

Perform a single online learning step, updating the readout weights toward a target vector using the current reservoir state.


Signature

spires_status spires_train_online(spires_reservoir *r, const double *target_vec, double lr);

Parameters

  • r (spires_reservoir *): Handle to the reservoir. Must not be NULL. The reservoir must have been stepped at least once so that its internal state is populated.
  • target_vec (const double *): Target output vector of length num_outputs. The weight update minimizes the error between the current output and this target. Must not be NULL.
  • lr (double): Learning rate. Controls the step size of the weight update. Must be positive. Typical values range from 1e-5 to 1e-2.

Returns

spires_status: SPIRES_OK on success. On failure:

  • SPIRES_ERR_INVALID_ARG: r or target_vec is NULL, or lr is non-positive.

Example

double lr = 1e-3;
for (size_t t = 0; t < series_length; t++) {
    /* Step the reservoir */
    spires_step(r, &input[t * num_inputs]);

    /* Update weights toward the target for this timestep */
    spires_status s = spires_train_online(r, &target[t * num_outputs], lr);
    if (s != SPIRES_OK) {
        fprintf(stderr, "online training failed at t=%zu\n", t);
        break;
    }
}

Notes

  • Update rule. This function implements a delta rule (Widrow-Hoff) update: W_out += lr * (target - y) * x^T, where x is the current reservoir state and y = W_out * x is the current output. This is a gradient descent step on the squared error.
  • No allocation. This function performs no heap allocation. It operates entirely on the reservoir’s pre-allocated internal buffers.
  • Comparison with ridge. Online training is useful for adaptive or non-stationary tasks where the target distribution changes over time. For fixed datasets, spires_train_ridge typically produces better results because it solves the linear regression problem in closed form.
  • Weight initialization. Before the first online update, W_out is zero-initialized (as set by spires_reservoir_create). If you have previously called spires_train_ridge, those weights are used as the starting point for online fine-tuning.
  • Thread safety. Do not call this function on a reservoir that is concurrently in use by another thread.

See Also

spires_train_ridge · spires_step · spires_reservoir_create