Expressive Numbers of Two or More Hidden Layer ReLU Neural Networks
Abstract
One of the reasons why neural networks are used in machine learning is their high expressive power, that is, their ability to express a wide class of functions. The expressive power of a neural network depends on its structure and is measured by various indices. In this paper, we focus on one of these measures, called the "expressive number", which is based on the number of data points a network can express. Expressive numbers enable us to judge, before training, whether the size of a neural network is suitable for the given training data. However, existing work on expressive numbers mainly targets single hidden layer neural networks, and little is known about networks with two or more hidden layers. In this paper, we give a lower bound on the maximum expressive number of two hidden layer neural networks and an upper bound on that of multilayer neural networks with the ReLU activation function. This result shows that the expressive number of a two hidden layer neural network is in O(a_1 a_2), where a_1 and a_2 are the numbers of neurons in the first and second hidden layers, respectively.
Keywords
Neural Network; Expressive Power; Expressive Number; ReLU Activation Function
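As a concrete illustration of the notion of expressive number (this sketch is not from the paper), the following Python code trains a two hidden layer ReLU network with a_1 = 4 and a_2 = 5 hidden neurons to interpolate n = a_1 * a_2 = 20 one-dimensional data points. The network sizes, the random data, and the plain NumPy gradient-descent training loop are assumptions chosen for illustration only: the paper's lower bound is an existence result about parameters that express the data exactly, whereas gradient descent only approximates such parameters.

```python
# Illustrative sketch (assumptions, not the paper's construction):
# fit n = a1 * a2 random 1-D points with a two hidden layer ReLU network,
# matching the O(a1 * a2) scaling of the expressive number.
import numpy as np

rng = np.random.default_rng(0)
a1, a2 = 4, 5
n = a1 * a2                                  # number of training points
x = np.sort(rng.uniform(-1, 1, (n, 1)), axis=0)
y = rng.uniform(-1, 1, (n, 1))

# Two hidden layer ReLU network: R -> R^{a1} -> R^{a2} -> R
W1 = rng.normal(0, 1, (1, a1));  b1 = np.zeros(a1)
W2 = rng.normal(0, 1, (a1, a2)); b2 = np.zeros(a2)
W3 = rng.normal(0, 1, (a2, 1));  b3 = np.zeros(1)

relu = lambda z: np.maximum(z, 0)

lr = 1e-2
for step in range(20000):
    # forward pass
    h1 = relu(x @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    out = h2 @ W3 + b3
    err = out - y
    # backward pass for mean squared error
    g_out = 2 * err / n
    gW3 = h2.T @ g_out; gb3 = g_out.sum(0)
    g_h2 = (g_out @ W3.T) * (h2 > 0)
    gW2 = h1.T @ g_h2;  gb2 = g_h2.sum(0)
    g_h1 = (g_h2 @ W2.T) * (h1 > 0)
    gW1 = x.T @ g_h1;   gb1 = g_h1.sum(0)
    # gradient-descent update (in place)
    for p, g in [(W1, gW1), (b1, gb1), (W2, gW2),
                 (b2, gb2), (W3, gW3), (b3, gb3)]:
        p -= lr * g

# how close the trained network comes to expressing all n points exactly
h1 = relu(x @ W1 + b1); h2 = relu(h1 @ W2 + b2); out = h2 @ W3 + b3
print("max |f(x_i) - y_i| after training:", np.abs(out - y).max())
```

A small residual error here is heuristic evidence, not proof, that the n points are expressible; the paper's bounds characterize expressibility exactly in terms of a_1 and a_2.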