clj-djl.nn
batchnorm-block
(batchnorm-block)
(batchnorm-block {:keys [axis center epsilon momentum scale]})
cov2d-block
(cov2d-block {:keys [kernel-shape filters bias dilation groups padding stride]})
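A minimal usage sketch for these two constructors, assuming the namespace is aliased as nn and that the option map takes plain Clojure values (whether :kernel-shape accepts a vector or requires a DJL Shape is an assumption):

(require '[clj-djl.nn :as nn])

;; batch-normalization block over channel axis 1, using the documented options
(def bn (nn/batchnorm-block {:axis 1 :epsilon 1e-5 :momentum 0.9
                             :center true :scale true}))

;; 2-D convolution block: 32 filters with a 3x3 kernel (vector form assumed)
(def conv (nn/cov2d-block {:kernel-shape [3 3] :filters 32}))
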
elu
(elu data alpha)
Applies ELU (Exponential Linear Unit) activation on the input NDArray or NDList.
elu-block
(elu-block alpha)
Creates a LambdaBlock that applies the ELU activation function in its forward function:
ELU <- (if (> x 0) x (* alpha (- (pow e x) 1)))
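A hedged sketch of the two ELU forms; new-base-manager and create from clj-djl.ndarray are assumptions used only to build an input NDArray:

(require '[clj-djl.ndarray :as nd]
         '[clj-djl.nn :as nn])

(def manager (nd/new-base-manager))          ; assumed NDManager constructor
(def x (nd/create manager [-1.0 0.0 2.0]))   ; assumed NDArray constructor

;; apply ELU directly to the NDArray with alpha = 1.0
(nn/elu x 1.0)

;; or wrap the same activation in a LambdaBlock for use inside a network
(def act (nn/elu-block 1.0))
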
forward
(forward block inputs)
(forward block paramstore inputs labels-or-training? & [params])
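A sketch of the single-arity call on a parameter-free activation block; nd/ndlist is an assumption for wrapping the NDArray into an NDList, and no explicit initialization is shown because a LambdaBlock carries no parameters:

(require '[clj-djl.ndarray :as nd]
         '[clj-djl.nn :as nn])

(def manager (nd/new-base-manager))
(def inputs (nd/ndlist (nd/create manager [-1.0 0.0 2.0]))) ; assumed NDList helper

;; run the block's forward function over the inputs
(nn/forward (nn/elu-block 1.0) inputs)
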
gelu
(gelu data)
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray or NDList.
gelu-block
(gelu-block)
Creates a LambdaBlock that applies the GELU activation function in its forward function.
leaky-relu-block
(leaky-relu-block alpha)
Creates a LambdaBlock with LeakyReLU as its forward function:
LeakyReLU <- (if (>= x 0) x (* neg_slope x))
mish-block
(mish-block)
Creates a LambdaBlock that applies the Mish activation function in its forward function.
prelu-block
(prelu-block)
Creates a LambdaBlock that applies the PReLU activation function in its forward function; the neg_slope is learnt during training.
selu
(selu data)
Applies SELU (Scaled Exponential Linear Unit) activation on the input NDArray or NDList.
selu-block
(selu-block)
Creates a LambdaBlock that applies the SELU activation function in its forward function:
SELU <- (* lambda (if (> x 0) x (* alpha (- (pow e x) 1)))), where lambda is 1.0507009873554804934193349852946 and alpha is 1.6732632423543772848170429916717
swish-block
(swish-block beta)
Creates a LambdaBlock that applies the Swish activation function in its forward function.
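For the parameterized activation blocks above, the single argument fixes the activation's constant; a brief sketch with plausible (assumed) values:

(require '[clj-djl.nn :as nn])

(def leaky (nn/leaky-relu-block 0.1)) ; neg_slope of 0.1 (assumed value)
(def swish (nn/swish-block 1.0))      ; beta of 1.0 (assumed value)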