5 Activation Functions to Know

5 activation functions you should know! 🧵
Activation functions are just like any other mathematical function. Each one has three elements/steps:
- Input (X-axis)
- Calculation
- Output (Y-axis)
Different activation functions do different math. Let's discuss 5 🔽
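A minimal sketch of that three-step pattern in NumPy (the function name and sample values are illustrative, and ReLU is used here only as a placeholder calculation):

```python
import numpy as np

# Every activation follows the same pattern:
# take an input, apply a calculation, return an output.
def activation(x):          # input  (X-axis)
    y = np.maximum(0.0, x)  # calculation (placeholder: ReLU)
    return y                # output (Y-axis)

print(activation(np.array([-2.0, 0.5, 3.0])))  # [0.  0.5 3. ]
```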
1️⃣ ReLU ReLU is widely used due to its simplicity and effectiveness. It returns the input value if it is positive and zero otherwise. ReLU is usually the default activation function for hidden layers.
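A minimal NumPy sketch of ReLU (names are illustrative, not from the thread):

```python
import numpy as np

def relu(x):
    # Positive inputs pass through unchanged; negative inputs become 0
    return np.maximum(0.0, x)

print(relu(np.array([-3.0, -0.5, 0.0, 2.0])))  # [0. 0. 0. 2.]
```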
2️⃣ Sigmoid The sigmoid is a smooth S-shaped curve that maps the input to a value between 0 and 1. Like other non-linear activations, it lets the network learn complex decision boundaries. It is mainly used in the output layer for binary classification.
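A minimal NumPy sketch of the sigmoid (names are illustrative):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # ~[0.018 0.5 0.982]
```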
3️⃣ Tanh Tanh is similar to the Sigmoid function but maps the input to a value between -1 and 1.
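A minimal sketch of tanh; in practice you would just call np.tanh, but the formula is written out here to show the math:

```python
import numpy as np

def tanh(x):
    # (e^x - e^-x) / (e^x + e^-x): outputs fall between -1 and 1
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(np.array([-2.0, 0.0, 2.0])))  # ~[-0.964 0. 0.964]
```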
4️⃣ Leaky ReLU Leaky ReLU is a variation of the ReLU function. It introduces a small slope for negative inputs, preventing neurons from becoming completely inactive (zero).
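A minimal NumPy sketch of Leaky ReLU; the slope value alpha=0.01 is a common choice but is an assumption here, not something from the thread:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # alpha is the small slope applied to negative inputs
    return np.where(x > 0, x, alpha * x)

print(leaky_relu(np.array([-10.0, -1.0, 3.0])))  # [-0.1  -0.01  3.]
```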
5️⃣ Softmax Softmax is primarily used in the output layer for multi-class classification problems. It transforms the network's raw outputs (logits) into a vector of probabilities. Softmax ensures that the output probabilities sum to 1.
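A minimal NumPy sketch of softmax (names are illustrative); subtracting the maximum logit before exponentiating is a standard trick for numerical stability and does not change the result:

```python
import numpy as np

def softmax(logits):
    # Shift by the max logit to avoid overflow in exp()
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / exps.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs, probs.sum())  # ~[0.659 0.242 0.099] 1.0
```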
If you liked this thread, you should also join our newsletter, DSBoost. We share:
• Interviews
• Podcast notes
• Learning resources
• Interesting collections of content
dsboost.dev
Levi

@levikul09

I explain Data Science on Grandma's level. Writing datagroundup.com