Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges

Abstract

Most machine learning algorithms are configured by a set of hyperparameters whose values must be carefully chosen and which often considerably impact performance. To avoid a time‐consuming and irreproducible manual process of trial‐and‐error to find well‐performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods—for example, based on resampling error estimation for supervised machine learning—can be employed. After introducing HPO from a general perspective, this paper reviews important HPO methods, from simple techniques such as grid or random search to more advanced methods like evolution strategies, Bayesian optimization, Hyperband, and racing. This work gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, how to combine HPO with machine learning pipelines, runtime improvements, and parallelization.
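
As a concrete illustration of the simplest automatic approach the abstract mentions, the sketch below runs a random search with cross-validated resampling error estimation using scikit-learn's RandomizedSearchCV. The estimator, search space, and evaluation budget are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of random-search HPO with cross-validated error
# estimation. Estimator, search space, and budget are assumptions
# chosen for illustration, not prescribed by the paper.
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Tuning the whole pipeline (scaler + SVM) keeps preprocessing inside
# each CV fold, one of the pitfalls HPO best practices warn about.
pipe = make_pipeline(StandardScaler(), SVC())

# Log-uniform priors suit scale-sensitive hyperparameters such as C and gamma.
param_distributions = {
    "svc__C": loguniform(1e-3, 1e3),
    "svc__gamma": loguniform(1e-4, 1e1),
}

search = RandomizedSearchCV(
    pipe,
    param_distributions,
    n_iter=50,           # evaluation budget: 50 sampled configurations
    cv=5,                # 5-fold CV as the resampling error estimate
    scoring="accuracy",
    n_jobs=-1,           # evaluate configurations/folds in parallel
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
print(f"CV accuracy: {search.best_score_:.3f}")
```

Note that `n_jobs=-1` evaluates the sampled configurations and folds in parallel, echoing the abstract's point that parallelization is one of the practical levers when conducting HPO.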

Authors (12)

B. Bischl
Martin Binder
Michel Lang
Tobias Pielok
J. Richter
Stefan Coors
Janek Thomas
Theresa Ullmann
Marc Becker
A. Boulesteix
Difan Deng
M. Lindauer

Citation Format

Bischl, B., Binder, M., Lang, M., Pielok, T., Richter, J., Coors, S., et al. (2021). Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges. WIREs Data Mining and Knowledge Discovery. https://doi.org/10.1002/widm.1484

Quick Access

View at source: doi.org/10.1002/widm.1484

Journal Information

Publication Year: 2021
Language: English
Total Citations: 856×
Source Database: Semantic Scholar
DOI: 10.1002/widm.1484
Access: Open Access ✓