Multilayer perceptron classifier
Multilayer perceptron classifier (MLPC) is a classifier based on the feedforward artificial neural network. MLPC consists of multiple layers of nodes. Each layer is fully connected to the next layer in the network. Nodes in the input layer represent the input data. All other nodes map inputs to outputs by taking a linear combination of the inputs with the node's weights w and bias b and applying an activation function. This can be written in matrix form for MLPC with K+1 layers as follows:

\[
\mathrm{y}(\mathbf{x}) = \mathrm{f}_K(\ldots \mathrm{f}_2(\mathbf{w}_2^T \, \mathrm{f}_1(\mathbf{w}_1^T \mathbf{x} + b_1) + b_2) \ldots + b_K)
\]
Nodes in intermediate layers use the sigmoid (logistic) function:

\[
\mathrm{f}(z_i) = \frac{1}{1 + e^{-z_i}}
\]

Nodes in the output layer use the softmax function:

\[
\mathrm{f}(z_i) = \frac{e^{z_i}}{\sum_{k=1}^{N} e^{z_k}}
\]
The number of nodes N in the output layer corresponds to the number of classes.
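As an illustration of the formulas above, here is a minimal NumPy sketch of the forward pass: sigmoid activations in the intermediate layers and softmax in the output layer. The function names, layer sizes, and random weights are purely illustrative and are not part of Spark's API.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation used in the intermediate layers
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Softmax activation used in the output layer
    e = np.exp(z - np.max(z))
    return e / e.sum()

def mlp_forward(x, weights, biases):
    # All layers but the last apply the sigmoid; the last applies softmax.
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = sigmoid(W @ h + b)
    return softmax(weights[-1] @ h + biases[-1])

# Example: 4 input features, one hidden layer of 5 nodes, 3 classes
rng = np.random.default_rng(0)
weights = [rng.normal(size=(5, 4)), rng.normal(size=(3, 5))]
biases = [np.zeros(5), np.zeros(3)]
print(mlp_forward(rng.normal(size=4), weights, biases))  # probabilities sum to 1
```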
MLPC employs backpropagation for learning the model. We use the logistic loss function for optimization and L-BFGS as an optimization routine.
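A typical training run in PySpark looks like the sketch below. The data path points to the sample multiclass LIBSVM file shipped with the Spark distribution (4 features, 3 classes); both the path and the layer sizes are assumptions you would replace for your own data.

```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import MultilayerPerceptronClassifier
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

spark = SparkSession.builder.appName("mlpc-example").getOrCreate()

# LIBSVM-formatted data; this path assumes the sample file shipped with Spark
# (4 features, 3 classes). Replace it with your own dataset.
data = spark.read.format("libsvm") \
    .load("data/mllib/sample_multiclass_classification_data.txt")

train, test = data.randomSplit([0.6, 0.4], seed=1234)

# layers: 4 input nodes (features), two hidden layers, 3 output nodes (classes)
layers = [4, 5, 4, 3]

trainer = MultilayerPerceptronClassifier(
    layers=layers,
    solver="l-bfgs",   # optimization routine described above
    maxIter=100,
    blockSize=128,
    seed=1234,
)

model = trainer.fit(train)
predictions = model.transform(test)

evaluator = MulticlassClassificationEvaluator(metricName="accuracy")
print("Test set accuracy =", evaluator.evaluate(predictions))
```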
Trick:
The number of neurons in the first layer must equal the number of features in the input data.
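Following this trick, the first and last entries of the layers parameter can be derived from the training data rather than hard-coded. The snippet below continues the PySpark example above and assumes the train DataFrame with the usual "features" and "label" columns.

```python
# Derive the input and output layer sizes from the training data
# (continues the example above; `train` is assumed to exist).
num_features = train.first()["features"].size            # neurons in the first layer
num_classes = train.select("label").distinct().count()   # neurons in the output layer
layers = [num_features, 5, 4, num_classes]                # hidden layer sizes are a free choice
```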