Activation Functions
The main purpose of activation functions is to convert the input signal of a node in an ANN to an output signal by applying a transformation. That output signal is then used as an input to the next layer in the stack.
| Function | Description |
| --- | --- |
| Linear Activation Function | Inefficient, used in regression |
| Sigmoid Function | Good for the output layer in binary classification problems |
| Tanh Function | Better than sigmoid |
| ReLU Function | Default choice for hidden layers |
| Leaky ReLU Function | A little better than ReLU, but ReLU is more popular |
Linear Activation Function
Formula: $a = z$
Graph: a straight line through the origin with slope 1
It can be used in the output layer for regression problems
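As a quick illustration (not from the original notes), here is a minimal NumPy sketch of the linear (identity) activation; the function name `linear` is just an illustrative choice:

```python
import numpy as np

def linear(z):
    """Linear (identity) activation: a = z. Typically only useful in a regression output layer."""
    return np.asarray(z, dtype=float)

# Example: the raw output of a regression layer passes through unchanged
print(linear([-2.0, 0.0, 3.5]))  # [-2.   0.   3.5]
```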
Sigmoid Function
Formula: $a = \sigma(z) = \dfrac{1}{1 + e^{-z}}$
Graph: an S-shaped curve that squashes $z$ into the range $(0, 1)$
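A minimal NumPy sketch of the sigmoid function (an illustrative example, not from the original notes):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes z into (0, 1), so the output can be read as a probability."""
    return 1.0 / (1.0 + np.exp(-np.asarray(z, dtype=float)))

# Large negative z -> close to 0, large positive z -> close to 1
print(sigmoid([-10.0, 0.0, 10.0]))  # [~0.000045, 0.5, ~0.999955]
```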
Tanh Function
Almost always strictly superior to the sigmoid function
Formula: $a = \tanh(z) = \dfrac{e^{z} - e^{-z}}{e^{z} + e^{-z}}$
A shifted and rescaled version of the sigmoid function
Graph: an S-shaped curve that squashes $z$ into the range $(-1, 1)$, centered at 0
Activation functions can be different for different layers, for example, we may use tanh for a hidden layer and sigmoid for the output layer
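A minimal sketch of a forward pass that mixes activations in this way (tanh in the hidden layer, sigmoid in the output layer); the layer sizes and random weights are made-up illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network: 3 inputs -> 4 hidden units (tanh) -> 1 output unit (sigmoid)
W1, b1 = rng.normal(size=(4, 3)), np.zeros((4, 1))
W2, b2 = rng.normal(size=(1, 4)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=(3, 1))   # one input example
a1 = np.tanh(W1 @ x + b1)     # hidden layer uses tanh
a2 = sigmoid(W2 @ a1 + b2)    # output layer uses sigmoid
print(a2)                     # a single value in (0, 1)
```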
If $z$ is very large or very small, then the derivative (or the slope) of these functions becomes very small (ends up being close to 0), and this can slow down gradient descent
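A small numeric check of this point (an added sketch, using the known derivatives $\sigma'(z) = \sigma(z)(1 - \sigma(z))$ and $\tanh'(z) = 1 - \tanh^2(z)$):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
sigmoid_grad = sigmoid(z) * (1.0 - sigmoid(z))   # sigma'(z) = sigma(z) * (1 - sigma(z))
tanh_grad = 1.0 - np.tanh(z) ** 2                # tanh'(z) = 1 - tanh(z)^2

print(sigmoid_grad)  # peaks at 0.25 for z = 0, nearly 0 for |z| = 10
print(tanh_grad)     # peaks at 1.0 for z = 0, nearly 0 already for |z| = 5
```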
ReLU Function
Another very popular choice
Formula: $a = \max(0, z)$
Graph: 0 for negative $z$, then the identity line for positive $z$
So the derivative is 1 when $z$ is positive and 0 when $z$ is negative (at exactly $z = 0$ it is undefined, but this does not matter in practice)
Disadvantage: the derivative is 0 when $z$ is negative
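A minimal NumPy sketch of ReLU and its derivative (illustrative, not from the original notes):

```python
import numpy as np

def relu(z):
    """ReLU activation: max(0, z)."""
    return np.maximum(0.0, np.asarray(z, dtype=float))

def relu_grad(z):
    """Derivative of ReLU: 1 for z > 0, 0 for z < 0 (the value at exactly 0 is a convention)."""
    return (np.asarray(z) > 0).astype(float)

z = np.array([-3.0, -0.5, 0.5, 3.0])
print(relu(z))       # [0.  0.  0.5 3. ]
print(relu_grad(z))  # [0. 0. 1. 1.]
```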
Leaky ReLU Function
Formula: $a = \max(0.01z, z)$
Graph: like ReLU for positive $z$, but with a small slope (e.g. 0.01) instead of 0 for negative $z$
Or, written piecewise: $a = z$ if $z \geq 0$, and $a = 0.01z$ otherwise
For a lot of the space of $z$, the derivative of the activation function is very different from 0
So the NN will learn much faster than when using tanh or sigmoid
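And a matching sketch of Leaky ReLU, with the commonly used slope of 0.01 for negative $z$ (the parameter name `alpha` is an illustrative choice):

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    """Leaky ReLU: z for z >= 0, alpha * z for z < 0."""
    z = np.asarray(z, dtype=float)
    return np.where(z >= 0, z, alpha * z)

def leaky_relu_grad(z, alpha=0.01):
    """Derivative: 1 for z >= 0, alpha for z < 0 -- never exactly 0, unlike plain ReLU."""
    return np.where(np.asarray(z) >= 0, 1.0, alpha)

z = np.array([-3.0, -0.5, 0.5, 3.0])
print(leaky_relu(z))       # [-0.03  -0.005  0.5    3.   ]
print(leaky_relu_grad(z))  # [0.01 0.01 1.   1.  ]
```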
Why do we need non-linear activation functions? Well, if we use a linear function, then the NN is just outputting a linear function of the input, so no matter how many layers our NN has, all it is doing is computing a linear function
Remember that the composition of two linear functions is itself a linear function
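A quick numeric illustration of why stacking linear layers collapses (a sketch with made-up weights):

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=(4, 1))
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=(2, 1))
x = rng.normal(size=(3, 1))

# Two layers with linear (identity) activations...
two_layers = W2 @ (W1 @ x + b1) + b2

# ...are equivalent to a single linear layer with W = W2 W1 and b = W2 b1 + b2
one_layer = (W2 @ W1) @ x + (W2 @ b1 + b2)

print(np.allclose(two_layers, one_layer))  # True
```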
If the output is 0 or 1 (binary classification) → sigmoid is good for the output layer
For all other units → ReLU
We can say that ReLU is the default choice of activation function
Note:
If you are not sure which of these functions works best, try them all, evaluate each on a validation set, see which one works better, and go with that (a minimal sketch of this follows below)
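A minimal sketch of this "try them all" idea, assuming a scikit-learn setup (the dataset, model, and layer size here are illustrative choices, not from the original notes):

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Train the same small network with each activation and compare validation accuracy
for activation in ["logistic", "tanh", "relu"]:  # sigmoid is called "logistic" here
    model = MLPClassifier(hidden_layer_sizes=(16,), activation=activation,
                          max_iter=2000, random_state=0)
    model.fit(X_train, y_train)
    print(activation, model.score(X_val, y_val))  # go with the best-scoring one
```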