State-of-the-art neural networks are capable of learning amazing things. ReLU is one of the activation functions used to train neural networks, and it has several advantages over other activation functions.
Source: https://deeplearninguniversity.com/
ReLU produces an output that is the maximum of 0 and x, i.e. ReLU(x) = max(0, x). So when x is negative the output is 0, and when x is positive the output is x.
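To make the definition concrete, here is a minimal NumPy sketch of ReLU (the function name and sample values are just for illustration):

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```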
The sigmoid activation function, also known as the logistic function, is another activation function used in neural networks. It is defined as sigmoid(x) = 1 / (1 + e^(-x)) and squashes any input into the range (0, 1).
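A minimal NumPy sketch of sigmoid, assuming the definition above (names and sample inputs are illustrative):

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x)), output always in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-5.0, 0.0, 5.0])
print(sigmoid(x))  # approximately [0.0067 0.5    0.9933]
```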
Deep Learning is a subfield of Machine Learning in which a layered architecture learns representations of the data, with each successive layer working with a more meaningful representation.
Activation functions are among the most important parts of a neural network. The ReLU activation function has some serious advantages over other activation functions such as the hyperbolic tangent, sigmoid, leaky ReLU, parametric ReLU (PReLU), ELU, and SELU.
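For reference, here are minimal NumPy sketches of two of the variants named above, leaky ReLU and ELU; the slope and alpha values shown are common defaults, not fixed constants:

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    # like ReLU, but keeps a small slope for negative inputs
    return np.where(x > 0, x, slope * x)

def elu(x, alpha=1.0):
    # smooth for negative inputs: alpha * (e^x - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.0])
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.   ]
print(elu(x))         # approximately [-0.8647 -0.3935  0.  1.]
```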
Disadvantages of the sigmoid activation function
The sigmoid activation function has some serious disadvantages compared to ReLU. These include, but are not limited to:
Its output range is (0, 1), and the function saturates for inputs of large magnitude.
It cannot be used as an output neuron when the regression target is unbounded.
It suffers from the vanishing (dying) gradient problem, since its derivative never exceeds 0.25 (see the sketch after this list).
Its outputs are not zero-centered: their mean is around 0.5 rather than 0.
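The saturation and vanishing gradient points are easy to see numerically. Here is a small sketch (not from the original post) that evaluates the sigmoid derivative, which peaks at 0.25 and shrinks toward 0 for large inputs:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # derivative of sigmoid: sigmoid(x) * (1 - sigmoid(x)),
    # which peaks at 0.25 when x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

x = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
print(sigmoid_grad(x))
# gradients never exceed 0.25 and shrink toward 0 as |x| grows,
# so stacking many sigmoid layers multiplies these small factors together
```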
Disadvantages of ReLU:
It suffers from the dying ReLU problem.
The function is continuous over R, but it is not differentiable at 0.
There is no gradient to work with when the input is less than 0, as the sketch after this list shows.
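A small sketch of the ReLU gradient makes the last point concrete (using 0 as the gradient at exactly x = 0 is an assumption here, though it is the common convention):

```python
import numpy as np

def relu_grad(x):
    # gradient of ReLU: 1 where x > 0, 0 where x < 0
    # (undefined at exactly 0; using 0 there is a common convention)
    return (x > 0).astype(float)

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
# a unit whose pre-activation stays negative receives zero gradient
# and stops updating: the "dying ReLU" problem
```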
Advantages of ReLU:
It is easy to implement: just a maximum with 0.
Its gradient does not vanish for positive inputs, so it does not saturate the way sigmoid and tanh do.
Its derivative is at most 1, so the activation itself does not amplify gradients into an explosion.
It is fast to compute, which speeds up training in practice.
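To show ReLU in context, here is a minimal sketch of a small feed-forward network using PyTorch's nn.ReLU; the layer sizes and batch size are arbitrary choices for this example:

```python
import torch
import torch.nn as nn

# a minimal sketch of a feed-forward network with ReLU between layers
# (layer sizes and batch size are arbitrary, chosen only for illustration)
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),           # cheap to compute, no saturation for positive inputs
    nn.Linear(128, 10),
)

x = torch.randn(32, 784)  # a batch of 32 random input vectors
logits = model(x)
print(logits.shape)       # torch.Size([32, 10])
```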
Finally, I hope you had a great time reading this blog post and enjoyed it as much as I did while writing it.
Thank you