Applied Predictive Modeling: Modeling
1. XGBoost
# OrdinalEncoder is assumed to come from category_encoders here
# (it encodes DataFrame string columns directly, so it drops into a pipeline)
from category_encoders import OrdinalEncoder
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error
from xgboost import XGBRegressor

# Baseline: ordinal-encode the categoricals, then fit XGBoost with defaults
pipe = make_pipeline(
    OrdinalEncoder(),
    XGBRegressor()
)
pipe.fit(X_train, y_train)

print('Train R^2: ', pipe.score(X_train, y_train))
print('Test R^2: ', pipe.score(X_test, y_test))
print('\nTrain MAE: ', mean_absolute_error(y_train, pipe.predict(X_train)))
print('Test MAE: ', mean_absolute_error(y_test, pipe.predict(X_test)))
# Encode outside the pipeline so eval_set can use the transformed test data
encoder = OrdinalEncoder()
X_train_encoded = encoder.fit_transform(X_train)
X_test_encoded = encoder.transform(X_test)

model = XGBRegressor(
    n_estimators=3000,   # upper bound; early stopping decides the real count
    max_depth=9,
    learning_rate=0.2,
)
eval_set = [(X_train_encoded, y_train),
            (X_test_encoded, y_test)]
model.fit(X_train_encoded, y_train,
          eval_set=eval_set,
          early_stopping_rounds=50)  # stop after 50 rounds without improvement
# (in xgboost>=2.0, pass early_stopping_rounds to XGBRegressor instead of fit)
2. LightGBM
from lightgbm import LGBMRegressor

# Same baseline pipeline, with LightGBM swapped in for XGBoost
pipe = make_pipeline(
    OrdinalEncoder(),
    LGBMRegressor()
)
pipe.fit(X_train, y_train)

print('Train R^2: ', pipe.score(X_train, y_train))
print('Test R^2: ', pipe.score(X_test, y_test))
print('\nTrain MAE: ', mean_absolute_error(y_train, pipe.predict(X_train)))
print('Test MAE: ', mean_absolute_error(y_test, pipe.predict(X_test)))
encoder = OrdinalEncoder()
X_train_encoded = encoder.fit_transform(X_train)
X_test_encoded = encoder.transform(X_test)

model = LGBMRegressor(
    n_estimators=3000,
    max_depth=9,
    learning_rate=0.2,
)
eval_set = [(X_train_encoded, y_train),
            (X_test_encoded, y_test)]
model.fit(X_train_encoded, y_train,
          eval_set=eval_set,
          early_stopping_rounds=50)
# (lightgbm>=4.0 removed early_stopping_rounds from fit();
#  use callbacks=[lightgbm.early_stopping(50)] there)