We establish superlinear lower bounds on the Vapnik-Chervonenkis
(VC) dimension of neural networks with one hidden layer and local
receptive field neurons. As the main result, we show that every
reasonably sized standard network of radial basis function (RBF)
neurons has VC dimension $\Omega(W\log k)$, where $W$ is the number
of parameters and $k$ is the number of nodes. This significantly
improves the previously known linear bound. We also derive
superlinear lower bounds for networks of discrete and continuous
variants of center-surround neurons. The constants in all bounds are
larger than those obtained thus far for sigmoidal neural networks
with constant depth.
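For concreteness, a standard RBF network with one hidden layer of $k$ Gaussian nodes on inputs $x \in \mathbb{R}^n$ is commonly parameterized as (a sketch of the usual form; the symbols $c_i$, $\sigma_i$, $w_i$ are illustrative and the precise parameterization assumed in the paper may differ)
\[
  f(x) \;=\; w_0 + \sum_{i=1}^{k} w_i \exp\!\left(-\frac{\|x - c_i\|^2}{\sigma_i^2}\right),
\]
with centers $c_i$, widths $\sigma_i$, and output weights $w_0,\dots,w_k$, so that under this parameterization $W = k(n+2) + 1$.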
The results have several implications with regard to the
computational power and learning capabilities of neural networks
with local receptive fields. In particular, they imply that the
pseudo-dimension and the fat-shattering dimension of these networks
are superlinear as well, and they yield lower bounds even when the
input dimension is fixed. The methods developed here appear
suitable for obtaining similar results for other kernel-based
function classes.