MISSION 242

Nonlinear Activation Functions

In the last mission, on representing neural networks, we became familiar with computational graphs and how they represent neural network models, along with neural network terminology such as forward pass, input neurons, and output neurons.

In this mission, we'll dive deeper into the role nonlinear activation functions play in deep neural networks. The three most commonly used activation functions in neural networks are the sigmoid function, the ReLU function, and the tanh function. Because we learned about the sigmoid function in the last mission, we'll focus on the ReLU and tanh functions here.
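
For quick reference, here is a minimal NumPy sketch of all three functions. This is purely illustrative and not part of the mission's exercises, which run in the course's own environment:

    import numpy as np

    def sigmoid(x):
        # Covered in the last mission: squashes inputs into (0, 1).
        return 1 / (1 + np.exp(-x))

    def relu(x):
        # Rectified Linear Unit: keeps positive values, zeroes out negatives.
        return np.maximum(0, x)

    def tanh(x):
        # Hyperbolic tangent: squashes inputs into (-1, 1), centered at zero.
        return np.tanh(x)

    x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
    print(sigmoid(x))  # values between 0 and 1
    print(relu(x))     # [0. 0. 0. 1. 2.]
    print(tanh(x))     # values between -1 and 1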

As you work through each concept, you'll apply what you've learned right in your browser, so there's no need to set anything up on your own machine. The Python environment built into this course includes answer checking, so you can make sure you've fully mastered each concept before moving on to the next.

Objectives

  • Learn about the different types of activation functions.
  • Learn how nonlinear activation functions improve neural network models (see the sketch after this list).
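
As a preview of that second objective, consider what happens without a nonlinearity: stacking linear layers collapses into a single linear layer, since W2(W1 x) = (W2 W1) x. The NumPy sketch below is illustrative only, with arbitrary layer sizes:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=3)          # a single 3-dimensional input
    W1 = rng.normal(size=(4, 3))    # first linear layer
    W2 = rng.normal(size=(2, 4))    # second linear layer

    # Two stacked linear layers...
    two_layers = W2 @ (W1 @ x)
    # ...equal one linear layer with weights W2 @ W1.
    one_layer = (W2 @ W1) @ x
    print(np.allclose(two_layers, one_layer))  # True

    # Inserting a nonlinearity such as ReLU breaks this collapse,
    # letting the network represent functions no single linear layer can.
    with_relu = W2 @ np.maximum(0, W1 @ x)
    print(np.allclose(with_relu, one_layer))   # False (in general)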

Mission Outline

1. Introduction To Activation Functions
2. ReLU Activation Function
3. Trigonometric Functions
4. Reflecting On The Tangent Function
5. Hyperbolic Tangent Function
6. Reflecting On The Hyperbolic Tangent Function
7. Next Steps
8. Takeaways


Course Info:

Advanced

The median completion time for this course is 6.2 hours.

This course requires a premium subscription. It includes one free mission, two paid missions, and one guided project, and it is the 23rd course in the Data Scientist in Python path.
