COURSE

Decision Trees

Dive deeper into the world of machine learning by learning how to construct and interpret decision trees in our Decision Trees course. You'll build a decision tree implementation from the ground up.

You'll learn about concepts such as entropy, information gain, an evaluation metric known as Area Under the Curve (AUC), and the ID3 algorithm. You'll also get an introduction to random forests, learn how they reduce overfitting, and see how ensembling decision trees improves prediction quality.
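To make entropy and information gain concrete, here is a minimal sketch of how the two quantities might be computed with NumPy. The helper functions and the tiny example data are illustrative assumptions, not code taken from the course itself.

```python
import numpy as np

def entropy(labels):
    # Shannon entropy of a label array: -sum(p * log2(p)) over class probabilities.
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

def information_gain(labels, split_mask):
    # Reduction in entropy after splitting the labels with a boolean mask.
    left, right = labels[split_mask], labels[~split_mask]
    weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
    return entropy(labels) - weighted

# Tiny illustrative example: how much does splitting on "age < 30" tell us about the label?
labels = np.array([0, 0, 1, 1, 1, 0])
ages = np.array([22, 25, 47, 52, 46, 56])
print(information_gain(labels, ages < 30))
```

ID3 repeatedly applies this idea: at each node it chooses the split with the highest information gain and recurses on the resulting subsets.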

In this course, we'll use scikit-learn, a machine learning library for Python that makes it easy to train models quickly and to construct and tune both decision trees and random forests for better performance and accuracy.
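As an illustration of that scikit-learn workflow, the sketch below fits a single decision tree and a random forest on one of scikit-learn's built-in datasets. The dataset and the specific hyperparameter values are stand-ins, not the ones used in the course.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# A built-in dataset stands in for the course data here.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# A single decision tree; max_depth and min_samples_split are typical knobs for limiting overfitting.
tree = DecisionTreeClassifier(max_depth=4, min_samples_split=5, random_state=1)
tree.fit(X_train, y_train)

# A random forest averages many randomized trees, which usually reduces overfitting further.
forest = RandomForestClassifier(n_estimators=100, min_samples_leaf=2, random_state=1)
forest.fit(X_train, y_train)

print("tree accuracy:  ", accuracy_score(y_test, tree.predict(X_test)))
print("forest accuracy:", accuracy_score(y_test, forest.predict(X_test)))
```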

At the end of the course, you'll complete a project in which you'll apply different machine learning models to predict the number of bike rentals using data from a communal bike sharing system in Washington, D.C. The project is a chance to combine the skills you learned in this course and practice the machine learning workflow by applying decision trees and random forests to real data. It also serves as a portfolio piece you can showcase to future employers to demonstrate your machine learning skills.
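The sketch below shows the general shape of such a model: a random forest regressor trained to predict rental counts. It uses synthetic data as a stand-in for the real bike sharing dataset, so the feature names and numbers are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the bike sharing data: rental counts driven by hour of day and temperature.
rng = np.random.default_rng(1)
hour = rng.integers(0, 24, size=2000)
temp = rng.uniform(0, 35, size=2000)
rentals = 20 + 3 * temp + 40 * ((hour >= 7) & (hour <= 9)) + rng.normal(0, 10, size=2000)

X = np.column_stack([hour, temp])
X_train, X_test, y_train, y_test = train_test_split(X, rentals, random_state=1)

# A random forest regressor, evaluated with mean squared error on held-out data.
model = RandomForestRegressor(n_estimators=200, min_samples_leaf=3, random_state=1)
model.fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```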

By the end of this course, you'll be able to:

  • Understand the types of relationships in the data that decision trees can represent.
  • Implement the random forest machine learning model.
  • Build a decision tree implementation from the ground up.


Learn about Decision Trees

Introduction to Decision Trees

Learn about the building blocks of decision trees, including entropy and information gain.

Building a Decision Tree

Learn how to create a decision tree using the ID3 algorithm.

Applying Decision Trees

Learn how to apply and tweak decision trees.

Introduction to Random Forests

Learn how to construct and apply random forests.

Predicting Bike Rentals

Apply decision trees and random forests to predict the number of bike rentals.