MISSION 187

Model Selection and Tuning

Learn how to select the best model and tune hyperparameters in Kaggle competitions.

Objectives

  • Learn how the k-nearest neighbors and random forest algorithms work.
  • Learn about hyperparameters and how to select the hyperparameter values that give the best predictions (see the sketch after this list).
  • Learn how to compare different algorithms to improve the accuracy of your predictions.
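
As a preview of the hyperparameter-selection workflow this mission covers, here is a minimal sketch of a grid search over k-nearest neighbors using scikit-learn. It is not the mission's solution code: the synthetic `X` and `y` stand in for whatever prepared competition features the mission uses, and the candidate values for k are illustrative assumptions.

```python
# Minimal sketch: choosing k for k-nearest neighbors with grid search.
# X and y are synthetic stand-ins for the competition's prepared features/target.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, n_features=8, random_state=0)

# Candidate values for the n_neighbors hyperparameter (illustrative range).
param_grid = {"n_neighbors": list(range(1, 21, 2))}

grid = GridSearchCV(KNeighborsClassifier(), param_grid, cv=10)
grid.fit(X, y)

print(grid.best_params_)  # the k value that scored best in cross-validation
print(grid.best_score_)   # its mean cross-validated accuracy
```

The same pattern extends to any estimator: swap in a different model and parameter grid, and `GridSearchCV` handles the cross-validated comparison for you.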

Mission Outline

1. Introducing Model Selection
2. Training a Baseline Model
3. Training a Model using K-Nearest Neighbors
4. Exploring Different K Values
5. Automating Hyperparameter Optimization with Grid Search
6. Submitting K-Nearest Neighbors Predictions to Kaggle
7. Introducing Random Forests
8. Tuning our Random Forest Model with Grid Search (see the sketch after this outline)
9. Submitting Random Forest Predictions to Kaggle
10. Next Steps
11. Takeaways
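
For the random forest portion of the outline (steps 7 through 9), the workflow looks roughly like the sketch below: tune a random forest with grid search, then write a submission file for Kaggle. It is a hedged illustration under assumptions, not the mission's code: the column names follow the Titanic competition, the feature list and grid values are placeholders, and `train.csv`/`test.csv` are assumed to be the downloaded Kaggle files.

```python
# Sketch of steps 7-9: tune a random forest with grid search, then build a
# Kaggle submission file. Feature columns and grid values are illustrative.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

train = pd.read_csv("train.csv")    # assumed Kaggle training file
holdout = pd.read_csv("test.csv")   # assumed Kaggle test file

features = ["Pclass", "SibSp", "Parch", "Fare"]  # placeholder feature columns
train_X = train[features].fillna(train[features].mean())    # simple imputation
holdout_X = holdout[features].fillna(train[features].mean())  # so the sketch runs

param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, 10],
    "min_samples_leaf": [1, 3, 5],
}
grid = GridSearchCV(RandomForestClassifier(random_state=1), param_grid, cv=10)
grid.fit(train_X, train["Survived"])

predictions = grid.best_estimator_.predict(holdout_X)
submission = pd.DataFrame({"PassengerId": holdout["PassengerId"],
                           "Survived": predictions})
submission.to_csv("submission.csv", index=False)  # file to upload to Kaggle
```

The grid here tries 27 hyperparameter combinations; cross-validation picks the combination with the best mean accuracy before the final model is refit on all of the training data.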


Course Info:

Kaggle Fundamentals

Intermediate

The average completion time for this course is 10 hours.

This course requires a premium subscription and includes three missions and one guided project. It is the 28th course in the Data Scientist in Python path.

