Chapter 5. From Simple Linear Regression to Multiple Linear Regression
In Chapter 2, Simple Linear Regression, we used simple linear regression to model the relationship between a single explanatory variable and a continuous response variable; we used the diameter of a pizza to predict its price. In Chapter 3, Classification and Regression with K-Nearest Neighbors, we introduced KNN and trained classifiers and regressors that used more than one explanatory variable to make predictions. In this chapter, we will discuss multiple linear regression, a generalization of simple linear regression that regresses a continuous response variable onto multiple features. We will first solve analytically for the values of the parameters that minimize the RSS cost function. We will then introduce gradient descent, a powerful learning algorithm that can estimate the values of parameters that minimize a variety of cost functions. We will discuss polynomial regression, another special case of multiple...
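As a preview of the analytic solution mentioned above, the parameters that minimize the RSS can be found by solving a least-squares problem in NumPy. The pizza data below is hypothetical (illustrative diameters, topping counts, and prices, not from the text); the technique of fitting via the normal equations is what matters.

```python
import numpy as np

# Hypothetical pizza data: each row is [1, diameter (inches), number of toppings].
# The leading column of 1s lets the model learn an intercept term.
X = np.array([[1, 6, 2],
              [1, 8, 1],
              [1, 10, 0],
              [1, 14, 2],
              [1, 18, 0]], dtype=float)
y = np.array([7.0, 9.0, 13.0, 17.5, 18.0])  # prices in dollars

# The RSS-minimizing parameters satisfy the normal equations
# (X^T X) beta = X^T y.  np.linalg.lstsq solves this least-squares
# problem more stably than forming the explicit inverse (X^T X)^{-1}.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

predictions = X @ beta
rss = np.sum((y - predictions) ** 2)
```

`beta` holds the intercept followed by one coefficient per feature; the fitted model's RSS can be no larger than that of any other parameter choice, including the constant model that always predicts the mean price.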