In this linear algebra for data science and machine learning course, we've explored two different ways to solve a matrix equation when the right-hand side isn't the zero vector. The first was Gaussian elimination: using row operations to transform the augmented representation of a linear system to echelon form and then to reduced row echelon form. The second was computing the inverse of the coefficient matrix and left-multiplying both sides of the equation to find the unknown vector.
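As a quick refresher, both approaches can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the course's checked exercises; the example system is made up for demonstration:

```python
import numpy as np

# Coefficient matrix and right-hand side for a small 2x2 system:
#   2x + 1y = 5
#   1x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Approach 1: left-multiply both sides by the inverse of A.
x_inv = np.linalg.inv(A) @ b

# Approach 2: np.linalg.solve, which relies on an LU factorization
# (a form of Gaussian elimination) rather than computing the inverse.
x_solve = np.linalg.solve(A, b)

print(x_inv)    # [1. 3.]
print(x_solve)  # [1. 3.]
```

In practice, `np.linalg.solve` is preferred over explicitly inverting the matrix because it is faster and numerically more stable.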

While we can use these techniques to solve most of the linear systems we'll encounter, we need to learn what to do when:

  • The solution set for a linear system doesn't exist.
  • The solution set for a linear system isn't just a single vector.
  • The right-hand side is equal to the zero vector.
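The first two cases arise when the coefficient matrix is singular. A minimal sketch of how this shows up in NumPy, using a made-up singular system for illustration:

```python
import numpy as np

# A singular coefficient matrix: the second row is twice the first,
# so the rows are linearly dependent and no inverse exists.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# rank(A) is less than the number of columns, signaling singularity.
print(np.linalg.matrix_rank(A))  # 1

# This b makes the system inconsistent: the augmented matrix has a
# higher rank than A, so no solution exists.
b = np.array([3.0, 7.0])
augmented = np.column_stack([A, b])
print(np.linalg.matrix_rank(augmented))  # 2

# np.linalg.solve refuses to solve a singular system.
try:
    np.linalg.solve(A, b)
except np.linalg.LinAlgError:
    print("singular matrix: no unique solution")
```

Comparing the rank of the coefficient matrix with the rank of the augmented matrix is exactly the test for consistency we'll make precise in this lesson.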

In this lesson, you will explore more techniques for solving linear systems in the above scenarios.

At the end of this lesson, you will have a solid foundation in calculus and linear algebra that we can build on to understand most of the more advanced machine learning techniques.

As you work through each concept, you’ll get to apply what you’ve learned by writing code in your browser. The Python environment inside of this course includes answer checking so you can ensure that you've fully mastered each concept before learning the next one.


Objectives

  • The different solution sets to linear systems.
  • The difference between homogeneous and nonhomogeneous systems.

Lesson Outline

1. Introduction
2. Inconsistent Systems
3. Singular Matrix
4. Possible Solutions for Nonhomogeneous Systems
5. Homogeneous Systems
6. Summary of Linear Systems
7. Next Steps
8. Takeaways