Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks
We calculate lower bounds on the size of sigmoidal neural networks
that approximate continuous functions. In particular, we show that for
the approximation of polynomials the network size has to grow as
$\Omega((\log k)^{1/4})$, where $k$ is the degree of the
polynomials. This bound holds for every input dimension,
i.e., independently of the number of variables. The result is
obtained by introducing a new method that employs upper bounds on the
Vapnik-Chervonenkis dimension to prove lower bounds on the size of
networks approximating continuous functions.
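To make the origin of the exponent $1/4$ explicit, the following is a
hedged sketch of the counting argument behind such bounds; the shattering
step and the appeal to the $O(W^4)$ VC-dimension upper bound for sigmoidal
networks (in the spirit of Karpinski and Macintyre) are assumptions about
how the proof proceeds, not statements taken from the abstract itself.

% Sketch (assumed reasoning, not quoted from the paper):
% 1. A sigmoidal network with W weights that approximates every
%    polynomial of degree <= k sufficiently well can be used to
%    shatter m = Omega(log k) suitably chosen input points.
% 2. Known upper bounds for sigmoidal networks give
%    VCdim = O(W^4).
% Combining the two inequalities:
\[
  \Omega(\log k) \;\le\; \mathrm{VCdim} \;\le\; O(W^4)
  \quad\Longrightarrow\quad
  W \;=\; \Omega\bigl((\log k)^{1/4}\bigr).
\]
Under these assumptions, any improvement of the VC-dimension upper bound
$O(W^4)$ would directly improve the exponent in the size lower bound.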