We establish versions of Descartes' rule of signs for radial basis
function (RBF) neural networks. These RBF rules of signs provide
tight bounds for the number of zeros of univariate networks with
certain parameter restrictions. Moreover, they can be used to
derive tight bounds for the Vapnik-Chervonenkis (VC) dimension and
pseudo-dimension of these networks. In particular, we show that
these dimensions are no more than linear. This result contrasts
with previous work showing that RBF neural networks with two or
more input nodes have superlinear VC dimension. The rules also give
rise to lower bounds for network sizes, thus demonstrating the
relevance of network parameters for the complexity of computing with
RBF neural networks.
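
As a point of reference for the RBF generalizations above, the classical Descartes' rule of signs states that the number of positive real roots of a polynomial is at most the number of sign changes in its coefficient sequence (and differs from that bound by an even number). A minimal sketch of the sign-change count, with an illustrative polynomial chosen here as an example:

```python
def sign_changes(coeffs):
    """Count sign changes in a coefficient sequence, ignoring zero entries."""
    signs = [c > 0 for c in coeffs if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

# Example: p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3),
# which has exactly three positive roots.
coeffs = [1, -6, 11, -6]
print(sign_changes(coeffs))  # -> 3, so at most three positive roots
```

Here the bound is attained: three sign changes, three positive roots. The paper's RBF rules of signs play the analogous role for univariate RBF networks, bounding their number of zeros in terms of the network parameters.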