In part 3 of this discussion, I introduced the concept of tuning the hyperparameters which control the training of the model, looking at learning rates and choices of optimizers. In this part I am going to introduce another of the built-in tuners in the Keras Tuner library and apply it to avoiding overfitting during training. Feel free to follow along here and in an example notebook on GitHub.

Baseline

As in previous discussions, I ran a baseline model for comparison against the tuned models. Note that I used 16, 32, and 64 filters in my Conv2D layers and 64 units in the first Dense layer:

```
Model: "functional_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
inp (InputLayer)             [(None, 32, 32, 3)]       0
_________________________________________________________________
conv2d (Conv2D)              (None, 30, 30, 16)        448
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 15, 15, 16)        0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 13, 13, 32)        4640
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 6, 6, 32)          0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 4, 4, 64)          18496
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 2, 2, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 256)                0
_________________________________________________________________
dense (Dense)                (None, 64)                16448
_________________________________________________________________
dense_1 (Dense)              (None, 10)                650
=================================================================
Total params: 40,682
Trainable params: 40,682
Non-trainable params: 0
```

I trained it with the Adam optimizer and the default learning rate:

```python
tf.keras.optimizers.Adam(learning_rate=1e-3)
```

After training, this baseline reached an accuracy of 66.5%.

I am tuning the number of filters in the Conv2D layers and the number of hidden units in the first Dense layer. To tune these hyperparameters I am using the Hyperband Keras tuner, which uses a bandit-based approach to optimization:

```python
tuner = kt.Hyperband(
    hypermodel=build_hypermodel,
    objective='val_loss',
    max_epochs=100,
    factor=3,
    hyperband_iterations=1,
    directory='test_dir',
    project_name='a',
    seed=seed,
    overwrite=True,
)
```
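As a sanity check on the baseline summary, the `Param #` column can be reproduced by hand. This is a small sketch, not part of the original notebook; it assumes the standard Keras conventions of 3×3 kernels with one bias per filter, and an input shape of 32×32×3 (which follows from the 30×30×16 output of the first Conv2D layer and its 448 parameters):

```python
def conv2d_params(filters, kernel, in_channels):
    """Conv2D: one kernel*kernel*in_channels weight tensor plus one bias per filter."""
    return filters * (kernel * kernel * in_channels + 1)

def dense_params(units, inputs):
    """Dense: a weight per input plus one bias, for each unit."""
    return units * (inputs + 1)

counts = [
    conv2d_params(16, 3, 3),       # conv2d:   448  (32x32x3 input)
    conv2d_params(32, 3, 16),      # conv2d_1: 4640
    conv2d_params(64, 3, 32),      # conv2d_2: 18496
    dense_params(64, 2 * 2 * 64),  # dense:    16448 (flattened 2x2x64 = 256 inputs)
    dense_params(10, 64),          # dense_1:  650
]
print(counts, sum(counts))  # totals 40,682, matching the summary
```

The pooling, flatten, and input layers contribute nothing, so these five layers account for all 40,682 trainable parameters.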
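To see what "bandit-based" means concretely, the bracket schedule that Hyperband derives from `max_epochs=100` and `factor=3` can be sketched in plain Python. This follows the formulation in the Hyperband paper rather than Keras Tuner's internal implementation, whose rounding details may differ slightly; the function names here are my own:

```python
import math

def hyperband_schedule(max_epochs=100, factor=3):
    """Enumerate Hyperband brackets. Each bracket runs successive halving:
    start many configurations on a small epoch budget, then promote roughly
    the best 1/factor of them to a factor-times-larger budget."""
    s_max = int(math.log(max_epochs, factor))   # most aggressive bracket index
    budget = (s_max + 1) * max_epochs           # per-bracket epoch budget
    brackets = []
    for s in range(s_max, -1, -1):
        n = math.ceil(budget / max_epochs * factor**s / (s + 1))  # starting configs
        rounds = []
        for i in range(s + 1):
            n_i = n // factor**i                                   # configs kept
            r_i = max(1, round(max_epochs / factor**(s - i)))      # epochs each
            rounds.append((n_i, r_i))
        brackets.append(rounds)
    return brackets

for rounds in hyperband_schedule(100, 3):
    print(rounds)
```

With these settings the most aggressive bracket starts 81 configurations at 1 epoch each and whittles them down to a single configuration trained for the full 100 epochs, while the most conservative bracket simply trains a few configurations to completion. This is why Hyperband can explore many more hyperparameter combinations than grid search for the same total training budget.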