Mon 5 Nov 2018 11:37 - 12:00 at Rock Lake - Session 2: Defect prediction

Hyperparameter tuning is the black art of automatically finding a good combination of control parameters for a data miner. While widely applied in empirical Software Engineering, there has not been much discussion on which hyperparameter tuner is best for software analytics. To address this gap in the literature, this paper applied a range of hyperparameter optimizers (grid search, random search, differential evolution, and Bayesian optimization) to a defect prediction problem. Surprisingly, no hyperparameter optimizer was observed to be "best" and, for one of the two evaluation measures studied here (F-measure), hyperparameter optimization was no better than using default configurations in 50% of cases.

We conclude that hyperparameter optimization is more nuanced than previously believed. While such optimization can certainly lead to large improvements in the performance of classifiers used in software analytics, it remains to be seen which specific optimizers should be applied to a new dataset.
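The default-versus-tuned comparison described in the abstract can be illustrated with a minimal sketch. This is not the paper's experiment: it uses a synthetic "defect" dataset, a toy one-parameter classifier (predict defective when a code metric exceeds a threshold), and plain random search, purely to show how a tuned configuration is selected on training data and then scored against a default on held-out data using the F-measure.

```python
import random

random.seed(42)

# Synthetic dataset of (metric value, is_defective) pairs; the metric
# stands in for something like lines of code. Labels are noisy on purpose.
data = [(random.gauss(100, 30), None) for _ in range(200)]
data = [(m, m > 110 and random.random() < 0.8) for m, _ in data]

def f_measure(threshold, rows):
    """F1 of the rule 'predict defective if metric > threshold'."""
    tp = sum(1 for m, y in rows if m > threshold and y)
    fp = sum(1 for m, y in rows if m > threshold and not y)
    fn = sum(1 for m, y in rows if m <= threshold and y)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

train, test = data[:150], data[150:]

default_threshold = 100.0  # the "default configuration"

# Random search: sample 30 candidate thresholds, keep the best on training data.
candidates = [random.uniform(50, 200) for _ in range(30)]
tuned_threshold = max(candidates, key=lambda t: f_measure(t, train))

print("default F1:", round(f_measure(default_threshold, test), 3))
print("tuned   F1:", round(f_measure(tuned_threshold, test), 3))
```

Grid search, differential evolution, and Bayesian optimization differ only in how the candidate configurations are generated; the select-on-train, score-on-test structure above is the same, and (as the abstract notes) the tuned configuration is not guaranteed to beat the default on held-out data.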

Mon 5 Nov

Displayed time zone: Guadalajara, Mexico City, Monterrey

11:15 - 12:00
Session 2: Defect prediction (SWAN) at Rock Lake
11:15 (22m) Talk: Facilitating Feasibility Analysis: The Pilot Defects Prediction Dataset Maker (SWAN)
Davide Falessi (California Polytechnic State University, USA), Max Jason Moede (California Polytechnic State University, USA)
11:37 (22m) Talk: Is One Hyperparameter Optimizer Enough? (SWAN)
Huy Tu (North Carolina State University, USA), Vivek Nair