In the last lesson on representing neural networks, we became familiar with computational graphs and how neural network models are represented. We also became familiar with neural network terminology such as forward pass, input neurons, and output neurons.

In this lesson, we'll dive deeper into the role nonlinear activation functions play in deep neural networks. The three most commonly used activation functions in neural networks are the sigmoid function, the ReLU function, and the tanh function. Because we learned about the sigmoid function in the last lesson, we'll focus on the ReLU and tanh functions here.
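Before diving in, it may help to see the three functions side by side. The sketch below is a minimal illustration (using NumPy, which is an assumption about your environment, not part of this course's setup) of how each function transforms the same inputs:

```python
import numpy as np

def sigmoid(x):
    # Squashes each input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negative inputs; passes positive inputs through unchanged
    return np.maximum(0.0, x)

def tanh(x):
    # Squashes each input into the range (-1, 1); zero-centered
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values strictly between 0 and 1
print(relu(x))     # [0. 0. 2.]
print(tanh(x))     # values strictly between -1 and 1, with tanh(0) = 0
```

Note how ReLU is the only one of the three that is unbounded for positive inputs, while sigmoid and tanh saturate at their extremes.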

As you work through each concept, you'll get to apply what you've learned directly in your browser, so there's no need to use your own machine for the exercises. The Python environment built into this course includes answer checking, so you can make sure you've fully mastered each concept before moving on to the next.


Lesson Objectives

  • Learn about the different types of activation functions.
  • Learn how nonlinear activation functions improve neural network models.

Lesson Outline

1. Introduction To Activation Functions
2. ReLU Activation Function
3. Trigonometric Functions
4. Reflecting On The Tangent Function
5. Hyperbolic Tangent Function
6. Reflecting On The Hyperbolic Tangent Function
7. Next Steps
8. Takeaways