Tuners

Tuners drive the hyperparameter search. You can create custom Tuners by subclassing kerastuner.engine.tuner.Tuner.

BayesianOptimization class:

kerastuner.tuners.bayesian.BayesianOptimization(hypermodel, objective, max_trials, num_initial_points=2, seed=None, hyperparameters=None, tune_new_entries=True, allow_new_entries=True, **kwargs)

Bayesian optimization tuner using an underlying Gaussian process.

Arguments:

hypermodel: Instance of HyperModel class (or callable that takes hyperparameters and returns a Model instance).
objective: String or kerastuner.Objective. Name of the model metric to minimize or maximize, e.g. 'val_accuracy'.
max_trials: Int. Total number of trials (model configurations) to test at most. Note that the oracle may interrupt the search before max_trials models have been tested if the search space has been exhausted.
num_initial_points: Int. The number of randomly generated samples as initial training data for Bayesian optimization.
seed: Int. Random seed.
hyperparameters: HyperParameters class instance. Can be used to override (or register in advance) hyperparameters in the search space.
tune_new_entries: Whether hyperparameter entries that are requested by the hypermodel but that were not specified in hyperparameters should be added to the search space, or not. If not, then the default value for these parameters will be used.
allow_new_entries: Whether the hypermodel is allowed to request hyperparameter entries not listed in hyperparameters.
**kwargs: Keyword arguments relevant to all Tuner subclasses. Please see the docstring for Tuner.

Hyperband class:

kerastuner.tuners.hyperband.Hyperband(hypermodel, objective, max_epochs, factor=3, hyperband_iterations=1, seed=None, hyperparameters=None, tune_new_entries=True, allow_new_entries=True, **kwargs)

Variation of the HyperBand algorithm.

Reference: Li, Lisha, et al. "Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization." Journal of Machine Learning Research 18 (2018): 1-52.

Arguments:

hypermodel: Instance of HyperModel class (or callable that takes hyperparameters and returns a Model instance).
objective: String or kerastuner.Objective. Name of the model metric to minimize or maximize, e.g. 'val_accuracy'.
max_epochs: Int. The maximum number of epochs to train one model. It is recommended to set this to a value slightly higher than the expected epochs to convergence for your largest Model, and to use early stopping during training (for example, via tf.keras.callbacks.EarlyStopping).
factor: Int. Reduction factor for the number of epochs and number of models for each bracket.
hyperband_iterations: Int >= 1. The number of times to iterate over the full Hyperband algorithm. One iteration will run approximately max_epochs * (math.log(max_epochs, factor) ** 2) cumulative epochs across all trials. It is recommended to set this to as high a value as is within your resource budget.
seed: Int. Random seed.
hyperparameters: HyperParameters class instance. Can be used to override (or register in advance) hyperparameters in the search space.
tune_new_entries: Whether hyperparameter entries that are requested by the hypermodel but that were not specified in hyperparameters should be added to the search space, or not. If not, then the default value for these parameters will be used.
allow_new_entries: Whether the hypermodel is allowed to request hyperparameter entries not listed in hyperparameters.
**kwargs: Keyword arguments relevant to all Tuner subclasses. Please see the docstring for Tuner.

RandomSearch class:

kerastuner.tuners.randomsearch.RandomSearch(hypermodel, objective, max_trials, seed=None, hyperparameters=None, tune_new_entries=True, allow_new_entries=True, **kwargs)

Random search tuner.

Arguments:

hypermodel: Instance of HyperModel class (or callable that takes hyperparameters and returns a Model instance).
objective: String or kerastuner.Objective. Name of the model metric to minimize or maximize, e.g. 'val_accuracy'.
max_trials: Int. Total number of trials (model configurations) to test at most. Note that the oracle may interrupt the search before max_trials models have been tested if the search space has been exhausted.
seed: Int. Random seed.
hyperparameters: HyperParameters class instance. Can be used to override (or register in advance) hyperparameters in the search space.
tune_new_entries: Whether hyperparameter entries that are requested by the hypermodel but that were not specified in hyperparameters should be added to the search space, or not. If not, then the default value for these parameters will be used.
allow_new_entries: Whether the hypermodel is allowed to request hyperparameter entries not listed in hyperparameters.
**kwargs: Keyword arguments relevant to all Tuner subclasses. Please see the docstring for Tuner.

Sklearn class:

kerastuner.tuners.sklearn.Sklearn(oracle, hypermodel, scoring=None, metrics=None, cv=KFold(n_splits=5, random_state=1, shuffle=True), **kwargs)

Tuner for Scikit-learn Models.

Performs cross-validated hyperparameter search for Scikit-learn models.

Arguments:

oracle: An instance of the kerastuner.Oracle class. Note that for this Tuner, the objective for the Oracle should always be set to Objective('score', direction='max'). Also, Oracles that exploit Neural-Network-specific training (e.g. Hyperband) should not be used with this Tuner.
hypermodel: Instance of HyperModel class (or callable that takes hyperparameters and returns a Model instance).
scoring: An sklearn scoring function. For more information, see sklearn.metrics.make_scorer. If not provided, the Model's default scoring will be used via model.score. Note that if you are searching across different Model families, the default scoring for these Models will often be different. In this case you should supply scoring here in order to make sure your Models are being scored on the same metric.
metrics: Additional sklearn.metrics functions to monitor during search. Note that these metrics do not affect the search process.
cv: An sklearn.model_selection Splitter class. Used to determine how samples are split up into groups for cross-validation.
**kwargs: Keyword arguments relevant to all Tuner subclasses. Please see the docstring for Tuner.

Example:

import kerastuner as kt
from sklearn import ensemble
from sklearn import datasets
from sklearn import linear_model
from sklearn import metrics
from sklearn import model_selection

def build_model(hp):
  model_type = hp.Choice('model_type', ['random_forest', 'ridge'])
  if model_type == 'random_forest':
    model = ensemble.RandomForestClassifier(
        n_estimators=hp.Int('n_estimators', 10, 50, step=10),
        max_depth=hp.Int('max_depth', 3, 10))
  else:
    model = linear_model.RidgeClassifier(
        alpha=hp.Float('alpha', 1e-3, 1, sampling='log'))
  return model

tuner = kt.tuners.Sklearn(
    oracle=kt.oracles.BayesianOptimization(
        objective=kt.Objective('score', 'max'),
        max_trials=10),
    hypermodel=build_model,
    scoring=metrics.make_scorer(metrics.accuracy_score),
    cv=model_selection.StratifiedKFold(5),
    directory='.',
    project_name='my_project')

X, y = datasets.load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = model_selection.train_test_split(
    X, y, test_size=0.2)

tuner.search(X_train, y_train)

best_model = tuner.get_best_models(num_models=1)[0]

Tuner class:

kerastuner.engine.tuner.Tuner(oracle, hypermodel, max_model_size=None, optimizer=None, loss=None, metrics=None, distribution_strategy=None, directory=None, project_name=None, logger=None, tuner_id=None, overwrite=False)

Tuner class for Keras models.

May be subclassed to create new tuners.

Arguments:

oracle: Instance of Oracle class.
hypermodel: Instance of HyperModel class (or callable that takes hyperparameters and returns a Model instance).
max_model_size: Int. Maximum size of weights (in floating point coefficients) for a valid model. Models larger than this are rejected.
optimizer: Optional. Optimizer instance. May be used to override the optimizer argument in the compile step for the models. If the hypermodel does not compile the models it generates, then this argument must be specified.
loss: Optional. May be used to override the loss argument in the compile step for the models. If the hypermodel does not compile the models it generates, then this argument must be specified.
metrics: Optional. May be used to override the metrics argument in the compile step for the models. If the hypermodel does not compile the models it generates, then this argument must be specified.
distribution_strategy: Optional. Instance of tf.distribute.Strategy. If specified, each trial will run under this scope. For example, tf.distribute.MirroredStrategy(['/gpu:0', '/gpu:1']) will run each trial on two GPUs. Currently only single-worker strategies are supported.
directory: String. Path to the working directory (relative).
project_name: Name to use as prefix for files saved by this Tuner.
logger: Optional. Instance of Logger class, used for streaming data to Cloud Service for monitoring.
tuner_id: Optional. String. ID of the Tuner, used only with multi-worker DistributionStrategies.
overwrite: Bool, default False. If False, reloads an existing project of the same name if one is found. Otherwise, overwrites the project.

get_best_models method:

Tuner.get_best_models(num_models=1)

Returns the best model(s), as determined by the tuner's objective.

The models are loaded with the weights corresponding to their best checkpoint (at the end of the best epoch of best trial).

This method is only a convenience shortcut. For best performance, it is recommended to retrain your Model on the full dataset using the best hyperparameters found during search.

Arguments:

num_models (int, optional): Number of best models to return. Models will be returned in sorted order. Defaults to 1.

Returns:

List of trained model instances.


get_state method:

BaseTuner.get_state()

Returns the current state of this object.

This method is called during save.


load_model method:

Tuner.load_model(trial)

Loads a Model from a given trial.

Arguments:

trial: A Trial instance, the Trial corresponding to the model to load.

on_epoch_begin method:

Tuner.on_epoch_begin(trial, model, epoch, logs=None)

A hook called at the start of every epoch.

Arguments:

trial: A Trial instance.
model: A Keras Model.
epoch: The current epoch number.
logs: Additional metrics.

on_batch_begin method:

Tuner.on_batch_begin(trial, model, batch, logs)

A hook called at the start of every batch.

Arguments:

trial: A Trial instance.
model: A Keras Model.
batch: The current batch number within the current epoch.
logs: Additional metrics.

on_batch_end method:

Tuner.on_batch_end(trial, model, batch, logs=None)

A hook called at the end of every batch.

Arguments:

trial: A Trial instance.
model: A Keras Model.
batch: The current batch number within the current epoch.
logs: Additional metrics.

on_epoch_end method:

Tuner.on_epoch_end(trial, model, epoch, logs=None)

A hook called at the end of every epoch.

Arguments:

trial: A Trial instance.
model: A Keras Model.
epoch: The current epoch number.
logs: Dict. Metrics for this epoch. This should include the value of the objective for this epoch.

run_trial method:

Tuner.run_trial(trial, *fit_args, **fit_kwargs)

Evaluates a set of hyperparameter values.

This method is called during search to evaluate a set of hyperparameters.

Arguments:

trial: A Trial instance that contains the information needed to run this trial. Hyperparameters can be accessed via trial.hyperparameters.
*fit_args: Positional arguments passed by search.
**fit_kwargs: Keyword arguments passed by search.

save_model method:

Tuner.save_model(trial_id, model, step=0)

Saves a Model for a given trial.

Arguments:

trial_id: The ID of the Trial that corresponds to this Model.
model: The trained model.
step: For models that report intermediate results to the Oracle, the step that this saved file should correspond to. For example, for Keras models this is the number of epochs trained.

search method:

BaseTuner.search(*fit_args, **fit_kwargs)

Performs a search for the best hyperparameter configurations.

Arguments:

*fit_args: Positional arguments that should be passed to run_trial, for example the training and validation data.
**fit_kwargs: Keyword arguments that should be passed to run_trial, for example the training and validation data.

set_state method:

BaseTuner.set_state(state)

Sets the current state of this object.

This method is called during reload.

Arguments:

state: Dict. The state to restore for this object.


BaseTuner class:

kerastuner.engine.base_tuner.BaseTuner(oracle, hypermodel, directory=None, project_name=None, logger=None, overwrite=False)

Tuner base class.

May be subclassed to create new tuners, including for non-Keras models.

Arguments:

oracle: Instance of Oracle class.
hypermodel: Instance of HyperModel class (or callable that takes hyperparameters and returns a Model instance).
directory: String. Path to the working directory (relative).
project_name: Name to use as prefix for files saved by this Tuner.
logger: Optional. Instance of Logger class, used for streaming data to Cloud Service for monitoring.
overwrite: Bool, default False. If False, reloads an existing project of the same name if one is found. Otherwise, overwrites the project.

get_best_hyperparameters method:

BaseTuner.get_best_hyperparameters(num_trials=1)

Returns the best hyperparameters, as determined by the objective.

This method can be used to reinstantiate the (untrained) best model found during the search process.

Example:

best_hp = tuner.get_best_hyperparameters()[0]
model = tuner.hypermodel.build(best_hp)

Arguments:

num_trials: (int, optional). Number of HyperParameters objects to return. HyperParameters will be returned in sorted order based on trial performance.

Returns:

List of HyperParameters objects.


get_best_models method:

BaseTuner.get_best_models(num_models=1)

Returns the best model(s), as determined by the objective.

This method is only a convenience shortcut. For best performance, it is recommended to retrain your Model on the full dataset using the best hyperparameters found during search.

Arguments:

num_models (int, optional): Number of best models to return. Models will be returned in sorted order. Defaults to 1.

Returns:

List of trained model instances.


get_state method:

BaseTuner.get_state()

Returns the current state of this object.

This method is called during save.


load_model method:

BaseTuner.load_model(trial)

Loads a Model from a given trial.

Arguments:

trial: A Trial instance, the Trial corresponding to the model to load.

run_trial method:

BaseTuner.run_trial(trial, *fit_args, **fit_kwargs)

Evaluates a set of hyperparameter values.

This method is called during search to evaluate a set of hyperparameters.

For subclass implementers: This method is responsible for reporting metrics related to the Trial to the Oracle via self.oracle.update_trial.

Simplest example:

def run_trial(self, trial, x, y, val_x, val_y):
    # Build a model with this trial's hyperparameter values.
    model = self.hypermodel.build(trial.hyperparameters)
    model.fit(x, y)
    loss = model.evaluate(val_x, val_y)
    # Report the objective value for this trial back to the Oracle.
    self.oracle.update_trial(trial.trial_id, {'loss': loss})
    self.save_model(trial.trial_id, model)

Arguments:

trial: A Trial instance that contains the information needed to run this trial. Hyperparameters can be accessed via trial.hyperparameters.
*fit_args: Positional arguments passed by search.
**fit_kwargs: Keyword arguments passed by search.

save_model method:

BaseTuner.save_model(trial_id, model, step=0)

Saves a Model for a given trial.

Arguments:

trial_id: The ID of the Trial that corresponds to this Model.
model: The trained model.
step: For models that report intermediate results to the Oracle, the step that this saved file should correspond to. For example, for Keras models this is the number of epochs trained.

search method:

BaseTuner.search(*fit_args, **fit_kwargs)

Performs a search for the best hyperparameter configurations.

Arguments:

*fit_args: Positional arguments that should be passed to run_trial, for example the training and validation data.
**fit_kwargs: Keyword arguments that should be passed to run_trial, for example the training and validation data.

set_state method:

BaseTuner.set_state(state)

Sets the current state of this object.

This method is called during reload.

Arguments:

state: Dict. The state to restore for this object.