To perform regression using MLPs, non-linear activation functions are generally required in the hidden layers, but the output layer typically needs no non-linear activation (i.e., it uses a linear/identity output).
If we want to predict a person's age, we can use the ReLU activation function in the output layer, since age is always non-negative.
For a regression problem in which the output value always lies within a bounded range, we could use a sigmoid or tanh activation in the output layer and rescale its output so the prediction stays within that range.
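The three output-layer choices above can be sketched as simple activation functions applied to the final layer's pre-activation. This is a minimal NumPy sketch (the function names and the example age range [0, 100] are illustrative assumptions, not from the original notes):

```python
import numpy as np

def identity(z):
    # Linear output: unbounded regression targets (the usual default)
    return z

def relu(z):
    # Non-negative targets, e.g. a person's age
    return np.maximum(0.0, z)

def scaled_sigmoid(z, lo, hi):
    # Targets bounded in [lo, hi]: squash with sigmoid, then rescale
    return lo + (hi - lo) / (1.0 + np.exp(-z))

# Pre-activations produced by the last linear layer of an MLP
z = np.array([-2.0, 0.0, 3.0])
print(identity(z))                     # unchanged values
print(relu(z))                         # negatives clipped to 0
print(scaled_sigmoid(z, 0.0, 100.0))   # values mapped into [0, 100]
```

Each function is applied element-wise, so the same choice works for single-output or multi-output regression heads.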
Fig: 1