Identification Criteria and Lower Bounds for Perceptron-Like Learning Rules
Perceptron-like learning rules are known to require exponentially many
correction steps to identify Boolean threshold functions exactly.
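For concreteness, here is a minimal sketch of the classical Perceptron
correction rule run to exact identification on a finite sample; the +/-1
label encoding and the function name are illustrative choices of ours, not
notation from the paper. Examples are assumed to be 0/1 NumPy vectors.

    import numpy as np

    def perceptron_corrections(target_w, target_t, examples):
        """Cycle through the examples, applying a local additive correction
        on every mistake, until the hypothesis agrees with the target
        threshold function (w . x >= t) on the whole sample; return the
        number of correction steps. Termination is guaranteed by the
        Perceptron convergence theorem, since the sample is separable."""
        w = np.zeros(len(target_w))
        t = 0.0
        steps = 0
        while True:
            mistakes = 0
            for x in examples:
                y = 1 if np.dot(target_w, x) >= target_t else -1  # target label
                y_hat = 1 if np.dot(w, x) >= t else -1            # hypothesis label
                if y != y_hat:
                    w += y * x   # additive weight correction
                    t -= y       # threshold shifts opposite to the label
                    steps += 1
                    mistakes += 1
            if mistakes == 0:
                return steps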
We introduce criteria that are weaker than exact identification and investigate
whether learning becomes significantly faster if exact identification is
replaced by one of these criteria: PAC identification, order identification,
and sign identification. PAC identification is based on the learning paradigm
introduced by Valiant and is known to be easier than exact identification.
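In Valiant's paradigm the learner need only be probably approximately
correct: in one standard formulation (with the usual accuracy and confidence
parameters, which this abstract does not itself fix), a hypothesis h learned
from random examples of the target f drawn from a distribution D must satisfy

\[ \Pr\Bigl[\, \Pr_{x \sim D}\bigl[h(x) \neq f(x)\bigr] \le \varepsilon \,\Bigr] \ge 1 - \delta, \]

where the outer probability is over the random sample.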
Order identification uses the fact that each threshold function induces
an ordering relation on the input variables, and this ordering can be
represented by weights of linear size. Sign identification is based on a
property of threshold functions known as unateness and requires only
weights of constant size.
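To illustrate the latter two criteria, the sketch below reads an ordering of
the variables off the weight magnitudes, and their signs off the weight
signs, of a representation with f(x) = 1 iff w . x >= t; treating these as
the defining data of order and sign identification is our simplifying
assumption, made precise in the paper itself.

    import numpy as np

    def induced_order(w):
        """Variables ordered by decreasing weight magnitude, i.e. by how
        strongly they influence the threshold function."""
        return sorted(range(len(w)), key=lambda i: -abs(w[i]))

    def induced_signs(w):
        """Per-variable signs in {-1, 0, +1}: by unateness, each variable
        of a threshold function acts consistently positively or negatively,
        so weights of constant size suffice for this information."""
        return [int(np.sign(wi)) for wi in w]

    # Example: f(x) = 1 iff 3*x1 - x2 + 0*x3 >= 2
    w = np.array([3.0, -1.0, 0.0])
    print(induced_order(w))   # [0, 1, 2]
    print(induced_signs(w))   # [1, -1, 0]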
We show that Perceptron-like learning rules cannot satisfy these criteria
if the number of correction steps is required to be polynomially bounded. We
also present an exponential lower bound for order identification with the
learning rules introduced by Littlestone. Our results show that efficiency
imposes severe restrictions on what can be learned with local learning
rules.
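For comparison with the additive Perceptron rule, here is a minimal sketch
of a Winnow-style rule from the family introduced by Littlestone, which
corrects weights multiplicatively; the promotion factor alpha = 2 and the
threshold theta = n are textbook defaults, not parameters taken from this
paper.

    import numpy as np

    def winnow_corrections(labeled_examples, n, alpha=2.0):
        """One pass of Winnow over labeled Boolean examples (x a 0/1 NumPy
        vector of length n, y in {0, 1}); weights change multiplicatively
        on mistakes, and the number of correction steps is returned."""
        w = np.ones(n)
        theta = float(n)
        steps = 0
        for x, y in labeled_examples:
            y_hat = 1 if np.dot(w, x) >= theta else 0
            if y_hat != y:
                if y == 1:
                    w[x == 1] *= alpha   # promotion on a missed positive
                else:
                    w[x == 1] /= alpha   # demotion on a false positive
                steps += 1
        return steps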