# Hyperparameter Tuning with Keras Tuner: A Beginner’s Guide

Hyperparameter tuning is crucial for optimizing machine learning models. In this tutorial, we’ll use Keras Tuner to automatically find the best hyperparameters for a TensorFlow model using the MNIST dataset.

## Prerequisites

- Basic understanding of Python and TensorFlow
- TensorFlow installed in your environment

## Step 1: Install Keras Tuner

First, let’s install the Keras Tuner library:

```shell
pip install -q -U keras-tuner
```

## Step 2: Import Required Libraries

Now, let’s import the necessary packages:

```python
import tensorflow as tf
from tensorflow import keras
import keras_tuner as kt
```

## Step 3: Prepare the MNIST Dataset

Load and preprocess the MNIST dataset:

```python
# Load MNIST and scale pixel values from [0, 255] to [0, 1]
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.astype('float32') / 255.0
x_test = x_test.astype('float32') / 255.0
```

## Step 4: Define the Model Building Function

Create a function that builds and compiles the model:

```python
def model_builder(hp):
    model = keras.Sequential()
    model.add(keras.layers.Flatten(input_shape=(28, 28)))

    # Tune the number of units in the first Dense layer
    hp_units = hp.Int('units', min_value=32, max_value=512, step=32)
    model.add(keras.layers.Dense(units=hp_units, activation='relu'))
    model.add(keras.layers.Dense(10))

    # Tune the learning rate for the optimizer
    hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])

    model.compile(optimizer=keras.optimizers.Adam(learning_rate=hp_learning_rate),
                  loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['accuracy'])
    return model
```

This function defines a model with tunable hyperparameters:

- `hp_units`: the number of units in the first Dense layer (32 to 512, in steps of 32)
- `hp_learning_rate`: the learning rate for the Adam optimizer (0.01, 0.001, or 0.0001)
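To get a feel for the size of this search space, we can count the combinations in plain Python (a quick back-of-the-envelope check, independent of Keras Tuner):

```python
# hp.Int('units', min_value=32, max_value=512, step=32) yields these values
units_options = list(range(32, 513, 32))
# hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
lr_options = [1e-2, 1e-3, 1e-4]

print(len(units_options))                    # 16 possible unit counts
print(len(units_options) * len(lr_options))  # 48 total combinations
```

Even this small search space has 48 configurations, which is why exhaustive grid search quickly becomes impractical and smarter strategies like Hyperband pay off.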

## Step 5: Create the Hyperparameter Tuner

Initialize the Hyperband tuner:

```python
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt')
```

We’re using the Hyperband algorithm, which speeds up the search by training many configurations for only a few epochs and advancing only the best-performing ones to longer training runs.
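To see why this is efficient, here is a rough, self-contained sketch of Hyperband’s bracket schedule (a simplified version of the published algorithm; the exact counts Keras Tuner uses internally may differ):

```python
import math

def hyperband_schedule(max_epochs=10, factor=3):
    """Rough sketch of Hyperband's bracket schedule: each bracket starts
    n configurations at a small epoch budget, then repeatedly keeps the
    top 1/factor of them and multiplies the budget by factor."""
    s_max = int(math.log(max_epochs) / math.log(factor))
    brackets = []
    for s in range(s_max, -1, -1):
        n = math.ceil((s_max + 1) * factor**s / (s + 1))  # initial configs
        r = max_epochs / factor**s                        # initial epochs each
        rounds = [(n // factor**i, round(r * factor**i)) for i in range(s + 1)]
        brackets.append(rounds)
    return brackets

# With max_epochs=10 and factor=3 (our tuner's settings):
for rounds in hyperband_schedule(10, 3):
    print(rounds)
```

Each tuple is (surviving models, epochs each is trained for). Most configurations are eliminated after only one or a few epochs, which is what makes Hyperband far cheaper than training every combination to completion.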

## Step 6: Implement Early Stopping

To prevent overfitting during tuning, let’s add an early stopping callback:

```python
early_stop = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=5)
```
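The rule `EarlyStopping` applies is simple enough to sketch in plain Python: stop once the monitored metric has gone `patience` consecutive epochs without improving on its best value (a simplified model of the callback’s behavior; options such as `min_delta` are ignored here):

```python
def should_stop(val_losses, patience=5):
    """Return True once the most recent `patience` epochs have all
    failed to improve on the best (lowest) loss seen so far."""
    if len(val_losses) <= patience:
        return False
    best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__)
    return len(val_losses) - 1 - best_epoch >= patience

# Loss keeps improving: keep training
print(should_stop([1.0, 0.8, 0.6, 0.5, 0.45, 0.4]))       # False
# No improvement since epoch 1: stop after 5 stagnant epochs
print(should_stop([1.0, 0.4, 0.5, 0.5, 0.6, 0.55, 0.5]))  # True
```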

## Step 7: Perform Hyperparameter Search

Now, let’s start the hyperparameter search:

```python
tuner.search(x_train, y_train,
             epochs=30,
             validation_split=0.2,
             callbacks=[early_stop])
```

This process may take some time as it explores different hyperparameter combinations.

## Step 8: Get Best Hyperparameters and Train Final Model

After the search is complete, we can get the best hyperparameters and train our final model:

```python
# Get the optimal hyperparameters
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]

# Build the model with the optimal hyperparameters and train it
model = tuner.hypermodel.build(best_hps)
history = model.fit(x_train, y_train, epochs=50, validation_split=0.2)
```
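Training for a fixed 50 epochs may overshoot; a common follow-up step is to find the epoch with the best validation accuracy and retrain to exactly that point. Here a hypothetical accuracy curve stands in for `history.history['val_accuracy']`:

```python
# Stand-in for history.history['val_accuracy'] from model.fit(...)
val_accuracy = [0.91, 0.95, 0.97, 0.965, 0.968]

best_epoch = val_accuracy.index(max(val_accuracy)) + 1  # epochs are 1-based
print(best_epoch)  # 3

# Then rebuild and retrain for exactly that many epochs:
# model = tuner.hypermodel.build(best_hps)
# model.fit(x_train, y_train, epochs=best_epoch, validation_split=0.2)
```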

## Step 9: Evaluate the Final Model

Finally, let’s evaluate our tuned model on the test set:

```python
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=2)
print(f'\nTest accuracy: {test_acc}')
```

## Conclusion

Congratulations! You’ve successfully used Keras Tuner to perform hyperparameter tuning on a TensorFlow model. Here’s what we covered:

- Setting up Keras Tuner
- Defining a model with tunable hyperparameters
- Using the Hyperband algorithm for efficient hyperparameter search
- Implementing early stopping to prevent overfitting
- Training the final model with the best hyperparameters

This tuned model should perform better than a model with arbitrary hyperparameters. As you become more comfortable with these concepts, you can explore more complex models and additional hyperparameters to tune.

Remember, while automated tuning is powerful, it’s also important to understand the intuition behind different hyperparameters and how they affect your model. Happy tuning!