We investigate the sample complexity of learning with higher-order
neurons. We calculate upper and lower bounds on the
Vapnik-Chervonenkis dimension and the pseudo dimension for
higher-order neurons that allow unrestricted interactions among the
input variables. In particular, we show that the degree of interaction
is irrelevant for the VC dimension and that the individual degree of
the variables plays only a minor role. Further, our results reveal
that the crucial parameters that affect the VC dimension of
higher-order neurons are the input dimension and the maximum number of
occurrences of each variable. The lower bounds that we establish are
asymptotically almost tight. In particular, they show that the VC
dimension is super-linear in the input dimension. Bounds for
higher-order neurons with sigmoidal activation functions are also
derived.