From 679f15d506d475d184827f0aee8c76391d0c491a Mon Sep 17 00:00:00 2001
From: Mikhail Orlov <56746057+Mike-Orlov@users.noreply.github.com>
Date: Sun, 5 Dec 2021 18:48:07 +0300
Subject: [PATCH] Update activation_functions.rst // misprint

Hi, I've just stumbled upon a little misprint in the ELU block's Python code. We should use "np.exp(z)" instead of "e^z" (or declare a variable "e").
---
 docs/activation_functions.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/activation_functions.rst b/docs/activation_functions.rst
index 8946f04..726fee7 100644
--- a/docs/activation_functions.rst
+++ b/docs/activation_functions.rst
@@ -57,7 +57,7 @@ ELU is very similiar to RELU except negative inputs. They are both in identity f
 +-------------------------------------------------------+------------------------------------------------------+
 | .. math::                                             | .. math::                                            |
 |       R(z) = \begin{Bmatrix} z & z > 0 \\             |       R'(z) = \begin{Bmatrix} 1 & z>0 \\             |
-|       α.( e^z – 1) & z <= 0 \end{Bmatrix}             |       α.e^z & z<0 \end{Bmatrix}                      |
+|       α.( np.exp(z) – 1) & z <= 0 \end{Bmatrix}       |       α.e^z & z<0 \end{Bmatrix}                      |
 +-------------------------------------------------------+------------------------------------------------------+
 | .. image:: images/elu.png                             | .. image:: images/elu_prime.png                      |
 |       :align: center                                  |       :align: center                                 |
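For reference, the ELU pair shown in the patched table can be sketched in NumPy as follows. This is a minimal illustration of the piecewise definitions, not code from the repository; the function names `elu` and `elu_prime` are chosen here for clarity.

```python
import numpy as np

def elu(z, alpha=1.0):
    # ELU: z for z > 0, alpha * (exp(z) - 1) for z <= 0
    return np.where(z > 0, z, alpha * (np.exp(z) - 1))

def elu_prime(z, alpha=1.0):
    # Derivative: 1 for z > 0, alpha * exp(z) for z <= 0
    return np.where(z > 0, 1.0, alpha * np.exp(z))
```

Both functions accept scalars or arrays; `np.where` applies the two branches elementwise, which is why `np.exp(z)` (rather than a bare `e^z`) is needed in the negative branch.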