Change the Learning Rate of the Adam Optimizer on a Keras Network

Instructor: Chris Achard


We can specify several options on a network optimizer, such as the learning rate and decay, so we'll investigate what effect those settings have on training time and accuracy. Each data set may respond differently, so it's important to try different optimizer settings to find the one that best trades off training time against accuracy for your data.
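As a minimal sketch of the idea, the snippet below compiles a small stand-in model (the lesson's actual architecture and data are not shown here) with an explicit `Adam` instance so its learning rate can be set, rather than passing the string `'adam'` with its defaults:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.optimizers import Adam

# Hypothetical data as a stand-in for the lesson's dataset
X = np.random.rand(500, 4)
y = np.random.rand(500, 1)

# A small stand-in model; the lesson's real network may differ
model = Sequential([
    Input(shape=(4,)),
    Dense(32, activation='relu'),
    Dense(1),
])

# Pass an Adam instance instead of the string 'adam' so the
# learning rate can be set explicitly (the default is 0.001).
# Note: older Keras versions use the argument name `lr` instead of
# `learning_rate`, and accepted a `decay=` argument for time-based decay;
# newer versions handle decay through learning-rate schedules instead.
optimizer = Adam(learning_rate=0.0005)

model.compile(optimizer=optimizer, loss='mean_squared_error')
model.fit(X, y, epochs=10, batch_size=32, verbose=0)
```

Trying a few values (for example 0.01, 0.001, 0.0005) and comparing how quickly the loss drops is the kind of experiment the lesson walks through.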