On the Complexity of Computing and Learning with Networks of Spiking Neurons
In a network of spiking neurons, a new set of parameters becomes relevant
that has no counterpart in traditional neural network models: the time
that a pulse needs to travel through a connection between two neurons (henceforth
called the ``delay'' of a connection). It is known that these delays are tuned
in biological neural systems through a variety of mechanisms. We investigate
the VC-dimension of networks of spiking neurons in which the delays are viewed
as ``programmable parameters'', and we prove tight bounds for this VC-dimension.
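To make the role of delays as programmable parameters concrete, the following minimal
sketch (in Python; not the formal model analyzed in the paper) treats each connection
delay $d_i$ as a parameter: an input spike at time $t_i$ reaches the soma at time
$t_i + d_i$ and contributes a rectangular pulse of unit height, and the neuron fires
if and only if the summed potential reaches the threshold. The function \texttt{fires},
the rectangular response shape, and the unit weights are illustrative assumptions.
\begin{verbatim}
def fires(spike_times, delays, threshold=2, width=1.0):
    """Return True iff the neuron emits a spike.

    spike_times[i]: firing time of input i (None = input i stays silent).
    delays[i]:      programmable delay of the i-th connection.
    """
    # A spike at time t travels through its connection and arrives at
    # the soma at time t + d, where it contributes a unit-height pulse
    # lasting `width` time units.
    arrivals = sorted(t + d for t, d in zip(spike_times, delays)
                      if t is not None)
    # The summed potential attains its local maxima at arrival times,
    # so it suffices to count overlapping pulses at each arrival.
    for a in arrivals:
        if sum(1 for b in arrivals if b <= a < b + width) >= threshold:
            return True
    return False

# With zero delays the neuron acts as a coincidence detector; the
# setting (0, 3) instead detects "input 0 spikes 3 time units after
# input 1" -- a different function computed by the same architecture.
print(fires([0.0, 0.0], [0.0, 0.0]))  # True:  pulses overlap at t = 0
print(fires([0.0, 0.0], [0.0, 3.0]))  # False: arrivals are 3 apart
print(fires([3.0, 0.0], [0.0, 3.0]))  # True:  both arrive at t = 3
\end{verbatim}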
Thus we get quantitative estimates for the diversity of functions that
a network with fixed architecture can compute with different settings of
its delays. It turns out that a network of spiking neurons with $k$ adjustable
delays is able to compute a much richer class of Boolean functions than
a threshold circuit with $k$ adjustable weights. The results also yield
bounds for the number of training examples that an algorithm needs for
tuning the delays of a network of spiking neurons. We also give results
on the computational complexity of such learning algorithms.
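For context on how VC-dimension bounds translate into sample-size bounds, recall
the generic estimate from VC theory (not necessarily the exact statement proved
in the paper): a hypothesis class of VC-dimension $d$ can be learned to accuracy
$\varepsilon$ with confidence $1 - \delta$ from
\[
  m \;=\; O\!\left(\frac{1}{\varepsilon}\left(d \log\frac{1}{\varepsilon}
  + \log\frac{1}{\delta}\right)\right)
\]
training examples; hence tight bounds on the VC-dimension of a network with
programmable delays yield corresponding bounds on the number of examples
needed to tune those delays.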