data
- Linear regression, Random Forest, GBMs
- Best WMAE of $1,352 with Random Forest
- Feature engineering and hyperparameter tuning
- Scored Top 14% on Kaggle leaderboard
Nicolas
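The Random Forest result above is scored with WMAE (weighted mean absolute error). A minimal sketch of that evaluation, using synthetic data and placeholder weights (the real features and weighting scheme are not shown in the source):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for the real data: 200 rows, 5 numeric features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([3.0, -2.0, 1.0, 0.0, 0.5]) + rng.normal(scale=0.1, size=200)

# Train on the first 150 rows, evaluate on the last 50.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:150], y[:150])
pred = model.predict(X[150:])

# WMAE = sum(w_i * |y_i - yhat_i|) / sum(w_i).
# Uniform weights here; a competition might, e.g., up-weight holiday weeks.
weights = np.ones(len(pred))
wmae = np.sum(weights * np.abs(y[150:] - pred)) / np.sum(weights)
print(f"WMAE: {wmae:.3f}")
```

With non-uniform weights, errors on the up-weighted rows dominate the score, which changes what the model should prioritize.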
columns advertisement data
- Feature selection, imputing, scaling, encoding
- Linear regression, Random Forest, GBMs
- Feature engineering and hyperparameter tuning
- Best RMSE of 0.24857 with Random Forest
Deepa
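The imputing, scaling, and encoding steps listed above can be chained into one scikit-learn pipeline. A sketch with invented column names (the actual advertisement columns are not shown in the source):

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.linear_model import LinearRegression

# Toy frame standing in for the advertisement data; column names are hypothetical.
df = pd.DataFrame({
    "spend":   [100.0, None, 250.0, 80.0],
    "channel": ["tv", "web", "web", "radio"],
    "clicks":  [10.0, 25.0, None, 7.0],
})
y = [1.2, 2.5, 3.1, 0.9]

numeric = ["spend", "clicks"]
categorical = ["channel"]

# Numeric columns: median-impute missing values, then standardize.
# Categorical columns: mode-impute, then one-hot encode.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("encode", OneHotEncoder(handle_unknown="ignore"))]), categorical),
])

model = Pipeline([("prep", preprocess), ("reg", LinearRegression())])
model.fit(df, y)
print(model.predict(df).shape)  # (4,)
```

Wrapping preprocessing and model in one `Pipeline` keeps the imputation and scaling statistics inside each cross-validation fold, avoiding leakage during tuning.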
credit behaviour data
- Logistic regression, Random Forest, GBMs
- 93.89% after 20-hour hyperparameter tuning
- Scored Top 21% on Kaggle leaderboard
David
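The hyperparameter tuning above can be illustrated with a cross-validated grid search. This sketch uses synthetic classification data and a deliberately tiny grid; the actual 20-hour search presumably covered far more parameter combinations:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Stand-in data; the real credit-behaviour features are not shown in the source.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# Exhaustive search over a small GBM grid, scored by 3-fold CV accuracy.
grid = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 3]},
    cv=3,
    scoring="accuracy",
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 4))
```

For larger grids, `RandomizedSearchCV` samples a fixed number of combinations and is the usual way to keep multi-hour searches tractable.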