A neural tree is a feedforward neural network in which each node has at most
one outgoing edge. We investigate the number of examples that a learning algorithm
needs when using neural trees as the hypothesis class. We give bounds for this
sample complexity in terms of the VC dimension. We consider trees consisting
of threshold, sigmoidal and linear gates. In particular, we show that the
class of threshold trees and the class of sigmoidal trees on $n$ inputs
both have VC dimension $\Omega(n\log n)$. This bound is asymptotically
tight for the class of threshold trees. We also present an upper bound
for this class in which the constants involved are considerably smaller than
those in a previous calculation. Finally, we argue that the VC dimension of threshold
or sigmoidal trees cannot be increased by allowing the nodes to compute
linear functions. This sheds some light on a recent result that exhibited
neural networks with quadratic VC dimension.
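To make the architecture concrete, the following is a minimal sketch (our own illustration, not taken from the paper) of a threshold tree: since every node has at most one outgoing edge, the gates form a tree whose leaves read input coordinates and whose root produces the output. The names \texttt{Leaf}, \texttt{ThresholdGate} and \texttt{evaluate} are hypothetical.

\begin{verbatim}
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Leaf:
    index: int            # which input coordinate this leaf reads

@dataclass
class ThresholdGate:
    children: List["Node"]  # subtrees feeding this gate
    weights: List[float]    # one weight per child
    threshold: float        # gate outputs 1 iff weighted sum >= threshold

Node = Union[Leaf, ThresholdGate]

def evaluate(node: Node, x: List[float]) -> float:
    """Recursively evaluate the tree on the input vector x."""
    if isinstance(node, Leaf):
        return x[node.index]
    s = sum(w * evaluate(c, x)
            for w, c in zip(node.weights, node.children))
    return 1.0 if s >= node.threshold else 0.0

# A threshold tree on n = 4 inputs with two gates.
tree = ThresholdGate(
    children=[ThresholdGate(children=[Leaf(0), Leaf(1)],
                            weights=[1.0, 1.0], threshold=1.5),
              Leaf(2), Leaf(3)],
    weights=[2.0, -1.0, 1.0],
    threshold=0.5,
)
print(evaluate(tree, [1.0, 1.0, 0.0, 1.0]))   # prints 1.0
\end{verbatim}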
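The link between sample complexity and the VC dimension $d$ of the hypothesis class is the standard PAC-learning bound; the generic textbook form, with accuracy parameter $\varepsilon$ and confidence parameter $\delta$, is shown below (these are the well-known general bounds, not necessarily the exact constants established in the paper):
\[
  m(\varepsilon,\delta) \;=\; O\!\left(\frac{1}{\varepsilon}\Bigl(d\log\frac{1}{\varepsilon}+\log\frac{1}{\delta}\Bigr)\right),
  \qquad
  m(\varepsilon,\delta) \;=\; \Omega\!\left(\frac{d}{\varepsilon}+\frac{1}{\varepsilon}\log\frac{1}{\delta}\right).
\]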