2 files changed: +3 −3
tensorflow-framework/src/main/java/org/tensorflow/framework/activations

@@ -34,7 +34,7 @@
  * <p>where {@code alpha} and {@code scale} are pre-defined constants ({@code alpha=1.67326324} and
  * {@code scale=1.05070098}).
  *
- * <p>Basically, the SELU activation function multiplies {@code scale} (> 1) with the output of the
+ * <p>Basically, the SELU activation function multiplies {@code scale} (> 1) with the output of the
  * elu function to ensure a slope larger than one for positive inputs.
  *
  * <p>The values of {@code alpha} and {@code scale} are chosen so that the mean and variance of the
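For reference, the SELU behavior described in the Javadoc above can be sketched in plain Java. This is a standalone illustration with hypothetical class and method names, not the tensorflow-java API; the constants are the ones quoted in the doc.

```java
public class SeluDemo {
    // Pre-defined constants quoted in the Javadoc above.
    public static final double ALPHA = 1.67326324;
    public static final double SCALE = 1.05070098;

    // selu(x) = scale * x                     if x > 0
    // selu(x) = scale * alpha * (exp(x) - 1)  if x <= 0
    // i.e. scale (> 1) times the elu output, which gives a slope
    // larger than one for positive inputs.
    public static double selu(double x) {
        return x > 0 ? SCALE * x : SCALE * ALPHA * Math.expm1(x);
    }

    public static void main(String[] args) {
        System.out.println(selu(1.0));   // slope of SCALE for positive inputs
        System.out.println(selu(-10.0)); // saturates toward -SCALE * ALPHA
    }
}
```

Using `Math.expm1(x)` rather than `Math.exp(x) - 1.0` keeps the negative branch numerically accurate for inputs near zero.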
@@ -24,8 +24,8 @@
 /**
  * Sigmoid activation. {@code sigmoid(x) = 1 / (1 + exp(-x))}.
  *
- * <p>Applies the sigmoid activation function. For small values {@code (< -5)} , {@code sigmoid}
- * returns a value close to zero, and for large values (> 5) the result of the function gets close to
+ * <p>Applies the sigmoid activation function. For small values (< -5), {@code sigmoid}
+ * returns a value close to zero, and for large values (> 5) the result of the function gets close to
  * 1.
  *
  * <p>Sigmoid is equivalent to a 2-element Softmax, where the second element is assumed to be zero.
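The properties stated in that Javadoc (values near 0 below −5, near 1 above 5, and equivalence to a 2-element softmax whose second logit is fixed at zero) can be checked with a small standalone Java sketch. The class and method names here are hypothetical, not the tensorflow-java API.

```java
public class SigmoidDemo {
    // sigmoid(x) = 1 / (1 + exp(-x)), as in the Javadoc above.
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // First component of softmax([x, 0]): exp(x) / (exp(x) + exp(0)).
    // Multiplying numerator and denominator by exp(-x) shows this is
    // algebraically identical to sigmoid(x).
    public static double softmaxFirst(double x) {
        double ex = Math.exp(x);
        return ex / (ex + 1.0);
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0));                      // 0.5
        System.out.println(sigmoid(-5.0));                     // close to 0
        System.out.println(sigmoid(5.0));                      // close to 1
        System.out.println(sigmoid(2.0) - softmaxFirst(2.0));  // ~0
    }
}
```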