
Commit f0fb8b4

Fix adaptive epsilon for Adam and indexed gradient updates.
PiperOrigin-RevId: 590078069
1 parent: 21767f1

File tree: 1 file changed (+1, -1 lines)


tf_keras/optimizers/adam.py

Lines changed: 1 addition & 1 deletion
@@ -201,7 +201,7 @@ def update_step(self, gradient, variable):
                 v_hat = self._velocity_hats[self._index_dict[var_key]]
                 v_hat.assign(tf.maximum(v_hat, v))
                 v = v_hat
-            variable.assign_sub((m * alpha) / (tf.sqrt(v) + self.epsilon))
+            variable.assign_sub((m * alpha) / (tf.sqrt(v) + epsilon_hat))
         else:
             # Dense gradients.
             m.assign_add((gradient - m) * (1 - self.beta_1))
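The one-line change swaps `self.epsilon` for `epsilon_hat` in the AMSGrad denominator of the sparse (indexed-gradient) branch. One plausible reading of the fix, assuming `epsilon_hat` is the epsilon rescaled consistently with the folded step size: when the learning rate is folded as alpha = lr * sqrt(1 - beta_2^t) / (1 - beta_1^t), the denominator epsilon must be scaled by the same sqrt(1 - beta_2^t) factor to remain equivalent to the textbook update lr * m_hat / (sqrt(v_hat) + epsilon). A minimal NumPy sketch under that assumption (the function name and `epsilon_hat` scaling are illustrative, not tf_keras API):

```python
import numpy as np

def adam_step(param, grad, m, v, v_hat, t,
              lr=1e-3, beta_1=0.9, beta_2=0.999,
              epsilon=1e-7, amsgrad=True):
    """One Adam/AMSGrad step with the step size folded into `alpha`,
    mirroring the shape of the update in the diff (names illustrative)."""
    m = m + (grad - m) * (1 - beta_1)      # first-moment estimate
    v = v + (grad**2 - v) * (1 - beta_2)   # second-moment estimate
    bc2 = 1.0 - beta_2**t                  # bias-correction terms
    alpha = lr * np.sqrt(bc2) / (1.0 - beta_1**t)
    # Assumption: epsilon_hat is epsilon rescaled by the same sqrt(bc2)
    # factor folded into alpha; using the raw epsilon here would shrink
    # the effective epsilon relative to the textbook update.
    epsilon_hat = epsilon * np.sqrt(bc2)
    if amsgrad:
        v_hat = np.maximum(v_hat, v)       # running max of second moment
        v = v_hat
    param = param - (m * alpha) / (np.sqrt(v) + epsilon_hat)
    return param, m, v, v_hat
```

With this scaling, the folded form is algebraically identical to the canonical lr * m_hat / (sqrt(v_hat) + epsilon) update; using the unscaled epsilon in the denominator would not be.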
