In this course, you’ll dive into deep learning by building a deeper understanding of backpropagation and constructing a simple deep learning framework from scratch. Along the way, you’ll learn to optimize network parameters and apply regularization techniques to improve your models, giving you a strong foundation in core deep learning concepts.
The course will also explore the essential role of optimizers in adjusting neural network parameters. You’ll delve into gradient descent and learn about batch size, learning rate schedules, weight decay, and momentum. Additionally, you’ll discover the popular Adam optimizer, which combines momentum with adaptive per-parameter step sizes, and learn how to tune hyperparameters for neural networks while monitoring and comparing their performance.
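As a preview of how these optimizers work, here is a minimal sketch of SGD with momentum and of an Adam-style update, applied to a toy one-dimensional problem. The learning rates, decay factors, and step counts below are illustrative choices, not values from the course:

```python
import math

def sgd_momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    """One SGD-with-momentum update: velocity is a decaying
    accumulation of past gradients that smooths and speeds descent."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum (first moment m) plus a
    per-parameter step-size scale (second moment v)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)  # bias-correct the running averages
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

# Minimize f(w) = w**2 (gradient 2w) starting from w = 5.0.
w_m, vel = 5.0, 0.0
for _ in range(500):
    w_m, vel = sgd_momentum_step(w_m, 2 * w_m, vel)

w_a, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    w_a, m, v = adam_step(w_a, 2 * w_a, m, v, t, lr=0.05)
```

Both optimizers drive `w` toward the minimum at zero; momentum does so by accumulating velocity, while Adam additionally rescales each step by the gradient’s recent magnitude.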
By the end of this course, you’ll have a comprehensive understanding of fundamental deep learning concepts and be well-equipped to continue your deep learning journey.
- Build a 2-layer deep learning framework from scratch
- Optimize network parameters
- Prevent overfitting through regularization
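The outcomes above can be sketched in miniature: a 2-layer network trained with hand-written backpropagation and an L2 (weight decay) penalty on a toy XOR task. Every size and hyperparameter here is an illustrative assumption, not the course’s actual framework:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, which a single linear layer cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2-layer network: input -> hidden (sigmoid) -> output (sigmoid).
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))

sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr, weight_decay = 0.5, 1e-4

def mse():
    return float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))

initial_loss = mse()
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: chain rule, layer by layer
    # (squared-error loss; constant factors folded into lr).
    d_out = (out - y) * out * (1 - out)
    dW2 = h.T @ d_out + weight_decay * W2   # L2 penalty shrinks weights
    db2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h + weight_decay * W1
    db1 = d_h.sum(axis=0, keepdims=True)

    # Gradient descent step on every parameter.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

final_loss = mse()
```

The weight-decay term pulls each weight toward zero on every step, which is one simple way to discourage overfitting; the course covers this and other regularization techniques in more depth.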
Optimizing Network Parameters [2 lessons]
The Dataquest guarantee
Dataquest has helped thousands of people start new careers in data. If you put in the work and follow our path, you’ll master data skills and grow your career.
We believe so strongly in our paths that we offer a full satisfaction guarantee. If you complete a career path on Dataquest and aren’t satisfied with your outcome, we’ll give you a refund.