Abstract:
VC (Vapnik-Chervonenkis) dimension is a useful tool for measuring the capacity of a neural network or other types of classifiers. In the field of learning theory, the VC dimension represents the generalization power of a neural network. Since the mid-20th century, researchers have studied this problem and have provided a wide range of upper and lower bounds on the VC dimension of a neural network. Most of the published work assumes a feedforward neural network with no skip connections when establishing upper and lower bounds on the VC dimension. In this work we establish that the upper bound on the VC dimension of neural networks with piecewise-polynomial activation functions can be tightened. In addition, we propose other methods for calculating an upper bound on the VC dimension of RVFLNs (random vector functional link networks, i.e., neural networks with skip connections).
Most of the relevant work on VC dimension upper bounds for neural networks with sigmoidal activation functions is based either on a model-theoretic approach or on counting the number of operations of a basic computing model. Later in this work we give a different approach to calculating an upper bound on the VC dimension of neural networks with sigmoidal activation functions. Moreover, we give an idea of how the theoretical and the empirically observed test error rates depend on the number of layers and the number of parameters of a feedforward neural network.