Also known as feedforward neural
networks, or multilayer
perceptrons (MLPs)
As its name suggests, it is a
perceptron with multiple
layers
They are the quintessential deep learning
models
PERCEPTRON
The perceptron is an algorithm for supervised
learning of binary classifiers (functions that
can decide whether an input, represented by a
vector of numbers, belongs to some specific
class or not). It is a type of linear classifier.
It was invented in 1957 at the Cornell
Aeronautical Laboratory by Frank
Rosenblatt
It was one of the first artificial
neural networks to be produced
Perceptrons could not be trained
to recognise many classes of
patterns; because of this, researchers
invented the deep feedforward (DFF) NN
It can be trained by a simple
learning algorithm that is
usually called the delta rule
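The delta rule above can be sketched in a few lines: when the perceptron misclassifies an example, its weights are nudged in proportion to the error. This is a minimal illustration (the AND function, learning rate, and epoch count are illustrative choices, not from the source):

```python
import numpy as np

# Training data: logical AND, which is linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate (illustrative)

for _ in range(20):  # epochs (illustrative)
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # Delta rule: move weights in proportion to the error
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print([1 if xi @ w + b > 0 else 0 for xi in X])  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop eventually stops making mistakes.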
Some features
Connections don't form a cycle, unlike in recurrent networks
You can build one simply by combining
many layers of single perceptrons
It was the first and simplest type of
artificial neural network devised.
It can approximate
almost any function
The information moves in only one
direction, forward, from the input nodes,
through the hidden nodes (if any) and to the
output nodes.
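The one-directional flow described above can be sketched as a single forward pass through a small two-layer network (the layer sizes and random weights here are purely illustrative assumptions):

```python
import numpy as np

# Illustrative weights for a 3 -> 4 -> 2 network
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input nodes -> hidden nodes
W2 = rng.normal(size=(4, 2))   # hidden nodes -> output nodes

def forward(x):
    # Information moves only forward: input -> hidden -> output
    h = np.tanh(x @ W1)        # hidden layer activations
    return h @ W2              # output layer

x = np.array([1.0, 0.5, -0.3])
print(forward(x).shape)  # → (2,)
```

Note there is no path back from a later layer to an earlier one, which is exactly what distinguishes a feedforward network from a recurrent one.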
It can solve nonlinear problems, unlike a single perceptron
The overall number of layers gives the
DEPTH of the model; the name deep
learning arose from this terminology
The final layer is
the output
layer
The training examples
specify directly what the
output layer must do at
each input point
The behavior of the other layers is not directly
specified by the training data; these layers are called
HIDDEN LAYERS
Backpropagation
is the most widely
used algorithm for
learning in these
networks
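A minimal sketch of backpropagation on the XOR problem, a nonlinear task a single perceptron cannot solve. The error gradient is propagated backwards through the layers, the reverse of the forward flow. The architecture (2-4-1), activations, learning rate, and iteration count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])   # XOR targets

# Illustrative 2 -> 4 -> 1 network
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
lr = 0.5  # learning rate (illustrative)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error gradient layer by layer
    d_out = out - y                        # gradient at the output pre-activation
    d_h = (d_out @ W2.T) * (1 - h ** 2)    # chain rule through tanh
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)

print((out > 0.5).astype(int).ravel())  # predicted classes for the four inputs
```

The hidden layer is what makes the nonlinear decision boundary possible; with no hidden layer this reduces to a single perceptron and XOR cannot be learned.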
They form the basis of many important
commercial applications.
The convolutional networks used for object
recognition from photos are a specialized kind
of feedforward network
Some examples
Airline Marketing
Tactician
Backgammon
Data Compression
Driving – ALVINN
ECG Noise Filtering
Financial Prediction
Speech Recognition
Sonar Target Recognition
Disadvantages
They sometimes need a
lot of training time
They are bad at
extrapolating
The existence of local minima in
the error function makes training
difficult