
___

Copyright by Pierian Data Inc. For more information, visit us at www.pieriandata.com

Linear Regression Project Exercise - Solutions

Now that we have covered feature engineering, cross validation, and grid search, let's test your new skills with a machine learning project exercise. This exercise takes a more guided approach; later ML projects will be more open-ended. We'll start with the final version of the Ames Housing dataset that we built during the feature engineering section of the course. Your goal is to create a linear regression model, train it with the optimal parameters found via a grid search, and then evaluate the model's performance on a test set.




Complete the tasks in bold

TASK: Run the cells under the Imports and Data section to make sure you have imported the correct general libraries as well as the correct datasets. Later on you may need to run further imports from scikit-learn.

Imports

In [1]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

Data

In [2]:
df = pd.read_csv("../DATA/AMES_Final_DF.csv")
In [3]:
df.head()
Out[3]:
Lot Frontage Lot Area Overall Qual Overall Cond Year Built Year Remod/Add Mas Vnr Area BsmtFin SF 1 BsmtFin SF 2 Bsmt Unf SF ... Sale Type_ConLw Sale Type_New Sale Type_Oth Sale Type_VWD Sale Type_WD Sale Condition_AdjLand Sale Condition_Alloca Sale Condition_Family Sale Condition_Normal Sale Condition_Partial
0 141.0 31770 6 5 1960 1960 112.0 639.0 0.0 441.0 ... 0 0 0 0 1 0 0 0 1 0
1 80.0 11622 5 6 1961 1961 0.0 468.0 144.0 270.0 ... 0 0 0 0 1 0 0 0 1 0
2 81.0 14267 6 6 1958 1958 108.0 923.0 0.0 406.0 ... 0 0 0 0 1 0 0 0 1 0
3 93.0 11160 7 5 1968 1968 0.0 1065.0 0.0 1045.0 ... 0 0 0 0 1 0 0 0 1 0
4 74.0 13830 5 5 1997 1998 0.0 791.0 0.0 137.0 ... 0 0 0 0 1 0 0 0 1 0

5 rows × 274 columns

In [4]:
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 2925 entries, 0 to 2924
Columns: 274 entries, Lot Frontage to Sale Condition_Partial
dtypes: float64(11), int64(263)
memory usage: 6.1 MB

TASK: The label we are trying to predict is the SalePrice column. Separate the data into X features and y labels.

In [5]:
X = df.drop('SalePrice',axis=1)
y = df['SalePrice']

TASK: Use scikit-learn to split X and y into a training set and a test set. Since we will later be using a grid search strategy, set your test proportion to 10%. To get the same data split as the solutions notebook, specify random_state=101.

In [6]:
from sklearn.model_selection import train_test_split
In [7]:
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.10, random_state=101)

TASK: The dataset features have a variety of scales and units. For optimal regression performance, scale the X features. Take careful note of what to use for .fit() versus .transform(): fit the scaler on the training data only, then transform both the training and test sets, so that no information from the test set leaks into the scaling.

In [8]:
from sklearn.preprocessing import StandardScaler
In [9]:
scaler = StandardScaler()
In [10]:
scaled_X_train = scaler.fit_transform(X_train)
scaled_X_test = scaler.transform(X_test)

TASK: We will use an Elastic Net model. Create an instance of a default ElasticNet model with scikit-learn.

In [11]:
from sklearn.linear_model import ElasticNet
In [12]:
base_elastic_model = ElasticNet()

TASK: The Elastic Net model has two main parameters: alpha and the L1 ratio. Create a dictionary parameter grid of values for the ElasticNet. Feel free to play around with these values, but keep in mind that your choices may not match the solutions exactly.

In [13]:
param_grid = {'alpha':[0.1,1,5,10,50,100],
              'l1_ratio':[.1, .5, .7, .9, .95, .99, 1]}
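
For reference, alpha controls the overall regularization strength and l1_ratio (written $\rho$ below) blends the L1 and L2 penalties. scikit-learn's ElasticNet minimizes an objective of the form

$$\frac{1}{2\,n_{samples}} \lVert y - Xw \rVert_2^2 \;+\; \alpha \rho \lVert w \rVert_1 \;+\; \frac{\alpha (1 - \rho)}{2} \lVert w \rVert_2^2$$

so l1_ratio = 1 corresponds to a pure Lasso penalty and l1_ratio = 0 to a pure Ridge penalty.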

TASK: Using scikit-learn, create a GridSearchCV object and run a grid search for the best parameters for your model based on your scaled training data. Note that you may receive convergence warnings for certain parameter combinations; this is expected and does not stop the search.

In [14]:
from sklearn.model_selection import GridSearchCV
In [15]:
# verbose level is a personal preference; higher values print more progress detail
grid_model = GridSearchCV(estimator=base_elastic_model,
                          param_grid=param_grid,
                          scoring='neg_mean_squared_error',  # GridSearchCV maximizes scores, so MSE is negated
                          cv=5,                              # 5-fold cross validation
                          verbose=1)
In [16]:
grid_model.fit(scaled_X_train,y_train)
Fitting 5 folds for each of 42 candidates, totalling 210 fits
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
c:\users\marcial\anaconda3\envs\ml_master\lib\site-packages\sklearn\linear_model\_coordinate_descent.py:531: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 139422008253.771, tolerance: 1355206692.5276787
  positive)
... (the same ConvergenceWarning is repeated, with different duality gaps, for a number of other parameter combinations) ...
[Parallel(n_jobs=1)]: Done 210 out of 210 | elapsed:   19.3s finished
Out[16]:
GridSearchCV(cv=5, estimator=ElasticNet(),
             param_grid={'alpha': [0.1, 1, 5, 10, 50, 100],
                         'l1_ratio': [0.1, 0.5, 0.7, 0.9, 0.95, 0.99, 1]},
             scoring='neg_mean_squared_error', verbose=1)
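
The convergence warnings above come from the coordinate-descent solver hitting its iteration limit for some parameter combinations; they do not invalidate the search. If you want fewer warnings, one option (a sketch only, not run in this notebook, and max_iter=10000 is just an illustrative value) is to allow more iterations when creating the base model:

# Not executed here: a higher iteration limit reduces ConvergenceWarnings
# at the cost of a longer grid search.
base_elastic_model = ElasticNet(max_iter=10000)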

TASK: Display the best combination of parameters for your model

In [17]:
grid_model.best_params_
Out[17]:
{'alpha': 100, 'l1_ratio': 1}
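
Beyond the winning parameter combination, GridSearchCV also exposes the estimator refit on the full training set and the cross-validation scores, if you want to dig deeper (the attributes below are standard scikit-learn attributes):

# ElasticNet refit on all of the scaled training data with the best parameters
best_model = grid_model.best_estimator_
print(best_model.coef_[:5])    # first few learned coefficients
print(grid_model.best_score_)  # best mean CV score (negative MSE under this scoring)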

TASK: Evaluate your model's performance on the unseen 10% scaled test set. In the solutions notebook we achieved an MAE of about $14,195 and an RMSE of about $20,559.

In [18]:
y_pred = grid_model.predict(scaled_X_test)
In [19]:
from sklearn.metrics import mean_absolute_error,mean_squared_error
In [20]:
mean_absolute_error(y_test,y_pred)
Out[20]:
14195.35490056217
In [21]:
np.sqrt(mean_squared_error(y_test,y_pred))
Out[21]:
20558.508566893164
In [22]:
# Average sale price, for context on the size of the errors above
np.mean(df['SalePrice'])
Out[22]:
180815.53743589742
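
As a quick sanity check (not part of the original tasks), you can compare the error metrics against the average sale price above:

# MAE and RMSE as fractions of the mean sale price (roughly 0.078 and 0.114 here)
mae = mean_absolute_error(y_test, y_pred)
rmse = np.sqrt(mean_squared_error(y_test, y_pred))
print(mae / df['SalePrice'].mean())
print(rmse / df['SalePrice'].mean())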

Great work!

