Module keras.optimizer_v2
Sub-modules
keras.optimizer_v2.adadelta - Adadelta optimizer implementation.
keras.optimizer_v2.adagrad - Adagrad optimizer implementation.
keras.optimizer_v2.adam - Adam optimizer implementation.
keras.optimizer_v2.adamax - Adamax optimizer implementation.
keras.optimizer_v2.ftrl - Ftrl-proximal optimizer implementation.
keras.optimizer_v2.gradient_descent - SGD optimizer implementation.
keras.optimizer_v2.learning_rate_schedule - Various learning rate decay functions.
keras.optimizer_v2.legacy_learning_rate_decay - Various learning rate decay functions (legacy).
keras.optimizer_v2.nadam - Nadam optimizer implementation.
keras.optimizer_v2.optimizer_v2 - Version 2 of the base Optimizer class.
keras.optimizer_v2.rmsprop - RMSprop optimizer implementation.
keras.optimizer_v2.utils - Optimizer utilities.
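Each of these sub-modules implements one parameter-update rule. As a rough illustration of the kind of step that keras.optimizer_v2.adam performs internally, here is a minimal pure-Python sketch of a single Adam update for a scalar parameter, using the standard Adam formulas; this is an illustrative sketch, not Keras's actual implementation, and the function name adam_step is hypothetical:

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001,
              beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    """One Adam update for a scalar parameter (illustrative sketch,
    not the Keras implementation)."""
    m = beta_1 * m + (1 - beta_1) * grad          # first-moment (mean) estimate
    v = beta_2 * v + (1 - beta_2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta_1 ** t)                 # bias correction for step t
    v_hat = v / (1 - beta_2 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + epsilon)
    return theta, m, v

# One step on the loss theta**2 (gradient is 2 * theta):
theta, m, v = 1.0, 0.0, 0.0
theta, m, v = adam_step(theta, grad=2 * theta, m=m, v=v, t=1)
```

Note that on the first step the bias-corrected update magnitude is approximately `lr` regardless of the gradient's scale, which is one reason Adam is less sensitive to initial gradient magnitudes than plain SGD.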