Lower Bounds on Identification Criteria for Perceptron-like Learning Rules
We investigate the computational complexity of identifying neural weights
using Perceptron-like learning rules. These are understood as instructions
to change weights by a fixed amount whenever an error occurs. We consider
worst-case bounds on the number of correction steps when the
training examples are taken from Boolean functions computable by McCulloch-Pitts
neurons. In order to avoid known exponential lower bounds for exact identification
we define and analyze three criteria that do not require the learning
process to generate weights that precisely represent the target function:
PAC identification, order identification, and sign identification. Our
results show that Perceptron-like learning rules cannot satisfy any of
these criteria when the number of correction steps is to be bounded by
a polynomial. This indicates that even a considerable lowering of one's demands
cannot render Perceptron-like learning rules computationally feasible.
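Purely as an illustration of the setting, and not as part of the paper's analysis, the following Python sketch shows one fixed-increment, error-driven update rule of the kind described above: weights are shifted by a fixed amount eta whenever the current hypothesis misclassifies a training example. All names, the choice of eta, and the {+1, -1} labeling are illustrative assumptions.

```python
def perceptron_like_updates(examples, n, eta=1):
    """Run a fixed-increment error-correction rule on Boolean examples.

    examples: list of (x, label) pairs, where x is a tuple of n values in {0, 1}
              and label in {+1, -1} is the target McCulloch-Pitts neuron's output.
    Returns the learned weights, the threshold, and the number of correction steps.
    """
    w = [0] * n          # integer weights, initialized to zero
    theta = 0            # threshold
    corrections = 0
    changed = True
    while changed:       # repeat passes until no example is misclassified
        changed = False
        for x, label in examples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) - theta
            predicted = 1 if activation >= 0 else -1
            if predicted != label:
                # Perceptron-like rule: change each weight by a fixed amount
                # in the direction that reduces the error.
                w = [wi + eta * label * xi for wi, xi in zip(w, x)]
                theta -= eta * label
                corrections += 1
                changed = True
    return w, theta, corrections


# Example: learning the AND of two Boolean inputs (a linearly separable target,
# so the loop is guaranteed to terminate by the Perceptron convergence theorem).
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, theta, steps = perceptron_like_updates(data, n=2)
```

The worst-case quantity studied in the paper corresponds to the counter `corrections` in this sketch, taken over adversarial targets and example sequences rather than a single fixed training set as shown here.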