We show that the function class consisting of $k$-sparse polynomials
in $n$ variables has Vapnik-Chervonenkis (VC) dimension at least
$nk+1$. This result improves on the previously best known lower bound,
obtained by Littlestone (1988) via $k$-term monotone disjunctive
normal form (DNF) formulas. Moreover, it implies that the
VC dimension for $k$-sparse polynomials is strictly larger than the
VC dimension for $k$-term monotone DNF. The new bound is achieved
by introducing a new approach that employs Gaussian radial basis
function (RBF) neural networks to obtain classifications of points in
terms of sparse polynomials.