Fast Hyperparameter Tuning to Improve Model Performance

TurinTech AI
3 min read · Dec 8, 2021


Image by Daniel J. TOTH

Hyperparameter tuning plays an important role in training an optimal machine learning model. During training, the performance of the target model is evaluated by monitoring metrics such as the loss value or the accuracy score on the test/validation set; based on these metrics, the hyperparameters can be fine-tuned to improve model performance.

Figure 1. Hyperparameter Tuning process of XGBoost classifier with NSGAII Algorithm

Grid Search vs Random Search

Among the many hyperparameter tuning techniques, two of the most basic and widely used are grid search and random search. In grid search, also known as brute-force search, a grid of hyperparameter values is set up for evaluation, and every combination of hyperparameters is enumerated. The disadvantage of this approach is that the grid grows exponentially with the number of hyperparameters.
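
For instance, scikit-learn's GridSearchCV enumerates exactly such a grid. The sketch below (our illustration, not from the original article) evaluates a 4 x 3 grid of SVM hyperparameters with 5-fold cross-validation:

# A minimal grid search sketch with scikit-learn
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# 4 x 3 = 12 combinations, each scored with 5-fold cross-validation
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": [0.01, 0.1, 1],
}

search = GridSearchCV(SVC(), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, search.best_score_)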

Figure 2. Grid Search vs Random Search.
Image adapted from Bergstra and Bengio 2012 by Sydney F

Unlike grid search, random search evaluates random combinations of hyperparameters. Over the same domain, it can find models that are competitive with those found by grid search within a small fraction of the computation time, because it searches a larger, high-dimensional configuration space more effectively. It has also been shown to be efficient enough for training neural networks on several datasets.
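
For comparison, scikit-learn's RandomizedSearchCV (again our illustration, not from the original article) samples the same domain from continuous distributions, using only as many evaluations as the budget allows:

# A minimal random search sketch with scikit-learn
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Continuous distributions instead of a fixed grid
param_distributions = {
    "C": loguniform(1e-1, 1e2),
    "gamma": loguniform(1e-2, 1e0),
}

# Only 12 random samples, matching the 12-point grid above
search = RandomizedSearchCV(SVC(), param_distributions, n_iter=12,
                            cv=5, scoring="accuracy", random_state=0)
search.fit(X, y)
print(search.best_params_, search.best_score_)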

Bayesian Optimisation

To improve the efficiency of hyperparameter tuning, Sequential Model-Based Optimisation (SMBO) has been used in many applications where evaluating the fitness function is expensive. The most typical and widely used variant is Bayesian optimisation: it looks for the most promising hyperparameters according to a surrogate function, which is much cheaper and easier to optimise, and then evaluates them with the actual objective function.

Here is an example of defining a search space over the hyperparameters of classification models using HyperOpt:
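
A minimal sketch of such a search space, assuming two candidate classifiers (random forest and XGBoost) with standard HyperOpt distributions:

# A HyperOpt search space over two candidate classifiers
from hyperopt import fmin, hp, tpe

search_space = hp.choice("classifier", [
    {
        "model": "random_forest",
        "n_estimators": hp.quniform("rf_n_estimators", 50, 500, 25),
        "max_depth": hp.quniform("rf_max_depth", 2, 20, 1),
    },
    {
        "model": "xgboost",
        "learning_rate": hp.loguniform("xgb_learning_rate", -5, 0),  # e^-5 to 1
        "max_depth": hp.quniform("xgb_max_depth", 2, 20, 1),
    },
])

def objective(params):
    # Train the selected model with `params` and return a validation loss;
    # a constant placeholder keeps this sketch self-contained.
    return 0.0

# Bayesian-style optimisation with the Tree-structured Parzen Estimator
best = fmin(fn=objective, space=search_space, algo=tpe.suggest, max_evals=50)
print(best)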

Evolutionary Algorithm

Evolutionary algorithms are also recognised as a promising optimisation approach for hyperparameter tuning, especially surrogate-model-assisted evolutionary algorithms. Surrogate models, also called metamodels, are inexpensive to run and can approximate complex objective functions, making it possible to reproduce experiments, or run many repeats of them, without relying on tremendous computational resources.

Figure 3. Evolutionary Algorithm. Image by Pablormier
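
As a rough illustration of the surrogate idea (a sketch of the general pattern, not EvoML's implementation), one can fit a cheap regressor on the points already scored by the expensive objective and use it to pre-screen new candidates:

# Pre-screening candidates with a cheap surrogate model
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_objective(x):
    # Stand-in for a costly model-training run
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X_seen = rng.uniform(0, 3, size=(8, 1))      # points evaluated so far
y_seen = expensive_objective(X_seen).ravel()

# The surrogate approximates the expensive objective from a few samples
surrogate = GaussianProcessRegressor().fit(X_seen, y_seen)

# Screen 1,000 candidates cheaply ...
candidates = rng.uniform(0, 3, size=(1000, 1))
predicted = surrogate.predict(candidates)

# ... and spend the expensive evaluation only on the most promising one
best = candidates[np.argmax(predicted)]
print("candidate:", best, "true value:", expensive_objective(best))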

Here is example code that uses Platypus NSGAII to optimise the hyperparameters of an XGBoost classification model and maximise the accuracy score:
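
A minimal sketch, assuming the scikit-learn breast-cancer dataset and two tuned hyperparameters (max_depth and learning_rate); Platypus decodes Integer variables before each evaluation:

# Tuning XGBoost with Platypus NSGA-II to maximise cross-validated accuracy
from platypus import NSGAII, Problem, Real, Integer
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)

def evaluate(vars):
    max_depth, learning_rate = vars  # decoded by Platypus before this call
    model = XGBClassifier(max_depth=max_depth,
                          learning_rate=learning_rate,
                          n_estimators=100)
    accuracy = cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()
    return [accuracy]

problem = Problem(2, 1)                      # 2 decision variables, 1 objective
problem.types[:] = [Integer(1, 10),          # max_depth
                    Real(0.01, 0.3)]         # learning_rate
problem.directions[:] = [Problem.MAXIMIZE]   # maximise accuracy
problem.function = evaluate

algorithm = NSGAII(problem, population_size=20)
algorithm.run(200)                           # 200 fitness evaluations

best = max(algorithm.result, key=lambda s: s.objectives[0])
print("best accuracy:", best.objectives[0])
print("max_depth:", problem.types[0].decode(best.variables[0]))
print("learning_rate:", best.variables[1])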

Better and Faster Optimisation with EvoML

Our Evolutionary AI Optimisation platform, EvoML, enables you to optimise hyperparameters better and faster. Apart from Bayesian optimisation, evolutionary algorithms, and random search, EvoML also introduces a novel approach, the Intelligence Evolutionary Algorithm, which integrates a surrogate model into the evolutionary algorithm to further shorten hyperparameter tuning time and speed up convergence. This efficient process allows you to achieve optimal model performance faster and with fewer compute resources.

If you want to learn more about improving model performance, check out our blog How Can Complex Models Run Fast.

About the Author

Yuxi Huan | TurinTech Research & Engineering

Passionate about data science and engineering, interested in optimisation research. Ballet enthusiast, big fan of classical music, travelling, baking, swimming, always looking to try out new things!

About TurinTech

TurinTech is the leader in Artificial Intelligence Optimisation. TurinTech empowers businesses to build efficient and scalable AI by automating the whole data science lifecycle with multi-objective optimisation. TurinTech enables organisations to drive AI transformation with minimum human effort, at scale and at speed.

TurinTech — AI. Optimised.

Learn more about TurinTech
Follow us on social media: LinkedIn and Twitter
