Using Lasso or ElasticNet in scikit-learn
Let's adapt the preceding example to use Lasso (the ElasticNet case works the same way). Using scikit-learn, it is very easy to swap in the Lasso regressor for the least squares one that we had before:
from sklearn.linear_model import Lasso
las = Lasso(alpha=0.5)
Now we use las, whereas earlier we used lr. This is the only change that is needed. The results are exactly what we would expect: when using Lasso, the R2 on the training data decreases to 0.71 (it was 0.74 before), but the cross-validation fit is now 0.59 (as opposed to 0.56 with linear regression). We trade a larger error on the training data in order to gain better generalization.
Visualizing the Lasso path
Using scikit-learn, we can easily visualize what happens as the value of the regularization parameter (alphas) changes. We will again use the Boston data, but now we will use the Lasso regression object:
las = Lasso()
alphas = np.logspace(-5, 2, 1000)
alphas, coefs, _ = las.path(x, y, alphas=alphas)
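From here, plotting each coefficient against alpha on a log scale gives the Lasso path. A minimal self-contained sketch follows; the make_regression data, the Agg backend, and the output filename are assumptions for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, render to file
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

# Generic regression data standing in for the Boston dataset
x, y = make_regression(n_samples=100, n_features=10, noise=10.0, random_state=0)

las = Lasso()
alphas = np.logspace(-5, 2, 1000)
alphas, coefs, _ = las.path(x, y, alphas=alphas)
# coefs has shape (n_features, n_alphas): one row per coefficient

fig, ax = plt.subplots()
ax.plot(alphas, coefs.T)  # one curve per coefficient
ax.set_xscale("log")
ax.set_xlabel("alpha")
ax.set_ylabel("coefficient value")
fig.savefig("lasso_path.png")
```

For large values of alpha the coefficients are driven toward zero; as alpha shrinks, they move toward the ordinary least squares solution.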