This open access book provides a wealth of hands-on examples that illustrate how hyperparameter
tuning can be applied in practice and gives deep insights into the working mechanisms of
machine learning (ML) and deep learning (DL) methods. The aim of the book is to equip readers
with the ability to achieve better results with significantly less time, cost, effort, and
resources by using the methods described here. The case studies presented in this book can be run
on a regular desktop or notebook computer. No high-performance computing facilities are
required. The idea for the book originated in a study conducted by Bartz & Bartz GmbH for the
Federal Statistical Office of Germany (Destatis). Building on that study, the book is addressed
to practitioners in industry as well as researchers, teachers, and students in academia. The
content focuses on the hyperparameter tuning of ML and DL algorithms and is divided into two
main parts: theory (Part I) and application (Part II). Essential topics covered include: a
survey of important model parameters; four parameter tuning studies and one extensive global
parameter tuning study; statistical analysis of the performance of ML and DL methods based on
severity; and a new, consensus-ranking-based way to aggregate and analyze results from multiple
algorithms. The book presents analyses of more than 30 hyperparameters from six relevant ML and
DL methods and provides source code so that users can reproduce the results. Accordingly, it
serves as a handbook and a textbook alike.