Introduction to Random Forests

Learn how to construct and apply random forests.


  • Learn how to ensemble decision trees to improve prediction quality.
  • Learn how to introduce variation with bagging.
  • Learn how to reduce overfitting with random forests.
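The ensembling-with-bagging idea named in the objectives above can be sketched in plain Python. This is a toy illustration only, not the course's code: the dataset, the `train_stump` helper, and `bagged_predict` are all made up for this example, with a one-feature "decision stump" standing in for a full decision tree.

```python
import random

# Toy binary-classification data: label is 1 when the feature exceeds 5.
# (Hypothetical example data, invented for this sketch.)
data = [(x, int(x > 5)) for x in range(11)]

def train_stump(sample):
    """Fit a one-split 'tree': pick the threshold that best fits the sample."""
    best_t, best_acc = 0, -1.0
    for t in range(11):
        acc = sum(int(x > t) == y for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bagged_predict(stumps, x):
    """Ensemble prediction: majority vote across all stumps."""
    votes = [int(x > t) for t in stumps]
    return int(sum(votes) * 2 >= len(votes))

random.seed(0)
# Bagging: each stump trains on a bootstrap sample, i.e. a sample of the
# training data drawn with replacement, which introduces variation.
stumps = [train_stump(random.choices(data, k=len(data)))
          for _ in range(25)]

print(bagged_predict(stumps, 2))  # a point well below the true split
print(bagged_predict(stumps, 9))  # a point well above the true split
```

Each stump sees a slightly different bootstrap sample, so the individual thresholds vary, but the majority vote smooths out their errors; random forests extend this by also restricting each tree to a random subset of features.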

Mission Outline

1. Introduction
2. Combining Model Predictions With Ensembles
3. Combining Our Predictions
4. Why Ensembling Works
5. Introducing Variation With Bagging
6. Selecting Random Features
7. Random Subsets in scikit-learn
8. Practice Putting it All Together
9. Tweaking Parameters to Increase Accuracy
10. Reducing Overfitting
11. When to Use Random Forests
12. Takeaways

Course Info:

Decision Trees


The average completion time for this course is 10 hours.

This course requires a premium subscription. It includes four paid missions and one guided project, and it is the 22nd course in the Data Scientist in Python path.

