Activation Functions | Deep Learning Tutorial 8 (Tensorflow Tutorial, Keras & Python) - codebasics - Deep Learning Open Course - Cupoy
Activation functions (step, sigmoid, tanh, ReLU, Leaky ReLU) are very important in building a non-linear model for a given problem. In this video we will cover the different activation functions used while building a neural network, and we will discuss each with its pros and cons:
1) Step
2) Sigmoid
3) tanh
4) ReLU (rectified linear unit)
5) Leaky ReLU
We will also write Python code to implement these functions and see how they behave for sample inputs.
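The full code is in the GitHub repo linked below; as a minimal sketch, the five functions listed above can be implemented in plain Python like this (the 0.1 slope used for Leaky ReLU is an assumed default, not taken from the video):

```python
import math

def step(x):
    # Step: 1 for non-negative inputs, 0 otherwise (not differentiable at 0)
    return 1 if x >= 0 else 0

def sigmoid(x):
    # Sigmoid: squashes any real input into the range (0, 1)
    return 1 / (1 + math.exp(-x))

def tanh(x):
    # tanh: squashes any real input into the range (-1, 1), zero-centered
    return math.tanh(x)

def relu(x):
    # ReLU: passes positive inputs through, zeroes out negatives
    return max(0, x)

def leaky_relu(x, alpha=0.1):
    # Leaky ReLU: small slope alpha for negative inputs avoids "dead" neurons
    return x if x > 0 else alpha * x

# Try each function on a few sample inputs
for x in (-2, 0, 3):
    print(x, step(x), round(sigmoid(x), 4), round(tanh(x), 4),
          relu(x), leaky_relu(x))
```

Note how ReLU outputs exactly 0 for every negative input, while Leaky ReLU keeps a small negative value, which is the key difference between the last two functions.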
Github link for the code in this tutorial: https://github.com/codebasics/deep-le...
Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses.
🔖 Hashtags 🔖
#activationfunction #activationfunctionneuralnetwork #neuralnetwork #deeplearning