We consider a neural network that evolves in discrete time. At each timestep t, a neuron i either fires (fi(t) = 1) with probability σi(t), or does not fire (fi(t) = 0) with probability 1 - σi(t). The neurons are connected through plastic synapses with efficacies wij(t), where i is the index of the postsynaptic neuron. The efficacies wij can be either positive or negative (corresponding to excitatory and inhibitory synapses, respectively). We will consider here networks of stochastic leaky integrate-and-fire neurons, which evolve in discrete time according to

V_i(t+1) = V_i(t)\left(1 - \frac{1}{\tau_i}\right) + \sum_j w_{ij}(t)\, f_j(t)    (4)

where τi is the leakage time constant, and the sum on the right represents the growth of the potential caused by the current injected during a timestep by the firing of presynaptic neurons. The neuron fires stochastically with probability σ(Vi(t)). If the neuron fires (fi(t) = 1), the potential is reset to a base value (the reset potential), Vi(t) = Vr.

The input neurons fired Poisson spike trains, with a firing rate proportional to the activation, between 0 and 50 Hz. The spikes of motor neurons were converted to effector activations by integrating them with a leaky accumulator of time constant τe = 2 s. This is equivalent to performing a weighted estimate of the firing rate using an exponential kernel with the same time constant. The motor activations a evolved according to
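As a concrete illustration, the following Python/NumPy sketch simulates a small network of this kind: discrete-time stochastic leaky integration as in Eq. (4), Poisson input spikes with rates between 0 and 50 Hz, and a leaky accumulator with τe = 2 s that converts motor-neuron spikes into effector activations. Only the 0-50 Hz input range and τe = 2 s come from the text; the timestep dt, the layer sizes, the sigmoid form assumed for σ(V), the weight initialization, the reset potential Vr = 0, and the choice of which neurons act as motor neurons are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative parameters (assumptions unless noted) ---
dt = 0.001                             # timestep, seconds (assumption)
n_input, n_rec, n_motor = 20, 100, 2   # layer sizes (assumptions)
tau = 0.02                             # membrane leakage time constant tau_i, s (assumption)
tau_e = 2.0                            # effector accumulator time constant, s (from the text)
V_r = 0.0                              # reset potential (assumption)
max_rate = 50.0                        # maximum input firing rate, Hz (from the text)

# plastic efficacies w_ij: row index i = postsynaptic neuron, column j = presynaptic;
# entries may be positive (excitatory) or negative (inhibitory)
w_in = rng.normal(0.0, 0.1, size=(n_rec, n_input))
w_rec = rng.normal(0.0, 0.1, size=(n_rec, n_rec))

def sigma(V):
    """Assumed sigmoidal firing probability sigma(V); the text does not specify its form."""
    return 1.0 / (1.0 + np.exp(-V))

def simulate(inputs):
    """inputs: array of shape (T, n_input) with activations in [0, 1]."""
    T = inputs.shape[0]
    V = np.zeros(n_rec)            # membrane potentials V_i
    f = np.zeros(n_rec)            # spikes f_i from the previous timestep
    a = np.zeros(n_motor)          # effector activations (rate estimates, Hz)
    a_trace = np.zeros((T, n_motor))
    for t in range(T):
        # Poisson input spikes: firing rate proportional to the activation, 0-50 Hz
        p_in = np.clip(inputs[t] * max_rate * dt, 0.0, 1.0)
        f_in = (rng.random(n_input) < p_in).astype(float)

        # Eq. (4): leaky integration of presynaptic spikes from the previous step
        V = V * (1.0 - dt / tau) + w_rec @ f + w_in @ f_in

        # stochastic firing with probability sigma(V_i), then reset of fired neurons
        f = (rng.random(n_rec) < sigma(V)).astype(float)
        V = np.where(f == 1.0, V_r, V)

        # leaky accumulator over motor-neuron spikes (here: the first n_motor neurons),
        # i.e. an exponential-kernel estimate of their firing rate with time constant tau_e
        a = a * (1.0 - dt / tau_e) + f[:n_motor] / tau_e
        a_trace[t] = a
    return a_trace

# example: 1 s of simulation with constant random input activations
activations = rng.random(n_input)
a_trace = simulate(np.tile(activations, (1000, 1)))
```

In this sketch each motor spike contributes 1/τe to the accumulator, which then decays with time constant τe; for a steady firing rate r the activation therefore converges to r, matching the exponential-kernel rate estimate described in the text.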