A product unit is a formal neuron that multiplies its input values
instead of summing them; moreover, its weights act as exponents
rather than as multiplicative factors. We investigate the complexity of
learning for networks containing product units. We establish bounds on
the Vapnik-Chervonenkis (VC) dimension that can be used to assess the
generalization capabilities of these networks. In particular, we show
that the VC dimension of these networks is no larger than the best
known bound for sigmoidal networks. For higher-order networks we
derive upper bounds that are independent of the degree of these
networks. We also contrast these results with lower bounds.