We establish versions of Descartes' rule of signs for radial basis
function (RBF) neural networks. The RBF rules of signs provide tight
bounds for the number of zeros of univariate networks with certain
parameter restrictions. Moreover, they can be used to infer that
the Vapnik-Chervonenkis (VC) dimension and pseudo-dimension of these
networks are no more than linear. This contrasts with previous work
showing that RBF neural networks with two or more input nodes have
superlinear VC dimension. The rules also give rise to lower bounds
on network sizes, demonstrating the relevance of network
parameters to the complexity of computing with RBF neural networks.