clj-djl.nn

add

(add net block)

batch-flatten

(batch-flatten array & more)

batch-flatten-block

(batch-flatten-block & more)

batchnorm-block

(batchnorm-block)
(batchnorm-block {:keys [axis center epsilon momentum scale]})

build

(build builder)

clear

(clear block)

cov2d-block

(cov2d-block {:keys [kernel-shape filters bias dilation groups padding stride]})

dropout

(dropout {:keys [rate]})
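Blocks configured with option maps, such as batchnorm-block and dropout above, are usually composed with sequential and add (documented below). A minimal sketch, assuming add returns the network so the calls can be threaded; the hyper-parameter values are illustrative:

(require '[clj-djl.nn :as nn])

;; a small fully connected stack with batch normalization and dropout
(def regularized-net
  (-> (nn/sequential)
      (nn/add (nn/linear {:units 128}))
      (nn/add (nn/batchnorm-block {:epsilon 1e-5 :momentum 0.9}))
      (nn/add (nn/relu-block))
      (nn/add (nn/dropout {:rate 0.5}))
      (nn/add (nn/linear {:units 10}))))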

elu

(elu data alpha)

Applies ELU (Exponential Linear Unit) activation on the input NDArray or NDList

elu-block

(elu-block alpha)

Creates a LambdaBlock that applies the ELU activation function in its forward function

ELU <- (if (> x 0) x (* alpha (- (pow e x) 1)))
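A minimal sketch of applying elu to an NDArray; new-base-manager and create are assumed to live in clj-djl.ndarray and may differ in your version. The other activation helpers in this namespace (gelu, mish, relu, selu, sigmoid, softplus, swish, tanh) are used the same way:

(require '[clj-djl.ndarray :as nd]
         '[clj-djl.nn :as nn])

(let [manager (nd/new-base-manager)                            ;; assumed helper
      data    (nd/create manager (float-array [-2 -1 0 1 2]))] ;; assumed helper
  ;; alpha = 1.0: positive values pass through, negatives approach -alpha
  (nn/elu data 1.0))

Inside a network, (elu-block 1.0) plugs the same activation into a sequential block via add.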

forward

(forward block inputs)
(forward block paramstore inputs labels-or-training? & [params])

gelu

(gelu data)

Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray or NDList

gelu-block

(gelu-block)

Creates a LambdaBlock that applies the GELU activation function in its forward function

get-parameters

(get-parameters block)

identity-block

(identity-block)

initialize

(initialize block manager datatype- & input-shapes)
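A sketch of initializing a block and running a forward pass (forward is documented above). The :float32 datatype keyword and the ndarray helpers new-base-manager, shape, ndlist and ones are assumptions about the companion clj-djl.ndarray namespace:

(require '[clj-djl.ndarray :as nd]
         '[clj-djl.nn :as nn])

(let [manager (nd/new-base-manager)   ;; assumed helper
      block   (nn/linear {:units 4})]
  ;; declare the input shape (batch of 2, 8 features) and allocate parameters
  (nn/initialize block manager :float32 (nd/shape [2 8]))
  ;; forward takes an NDList and returns an NDList holding a 2x4 NDArray
  (nn/forward block (nd/ndlist (nd/ones manager [2 8]))))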

leaky-relu

(leaky-relu data alpha)

leaky-relu-block

(leaky-relu-block alpha)

Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function:

LeakyReLU <- (if (>= x 0) x (* neg_slope x))

linear

(linear {:keys [bias units]})

linear-block

linear-builder

(linear-builder)
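The builder-style equivalent of (linear {:units 10 :bias true}), assuming the builder helpers return the builder so the calls can be threaded:

(require '[clj-djl.nn :as nn])

(def dense-10
  (-> (nn/linear-builder)  ;; fresh Linear builder
      (nn/set-units 10)    ;; output dimension
      (nn/opt-bias true)   ;; include a bias term
      (nn/build)))         ;; realize the block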

mish

(mish data)

Applies Mish activation on the input NDArray or NDList

mish-block

(mish-block)

Creates a LambdaBlock that applies the Mish activation function in its forward function

new-linear-builder

new-normal-initializer

normal-initializer

(normal-initializer)
(normal-initializer sigma)

opt-bias

(opt-bias builder bias)

prelu-block

(prelu-block)

Creates a LambdaBlock that applies the PReLU activation function in its forward function; the neg_slope is learned during training

relu

(relu data)

relu-block

(relu-block)

selu

(selu data)

Applies SELU (Scaled Exponential Linear Unit) activation on the input NDArray or NDList

selu-block

(selu-block)

Creates a LambdaBlock that applies the SELU activation function in its forward function

SELU <- (* lambda (if (> x 0) x (* alpha (- (pow e x) 1)))), where lambda is 1.0507009873554804934193349852946 and alpha is 1.6732632423543772848170429916717

sequential

(sequential)
(sequential {:keys [blocks initializer parameter]})
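A sketch of the one-map form, equivalent to threading add calls as in the earlier dropout example. The :weight value for :parameter is an assumption about which parameter type the initializer is applied to:

(require '[clj-djl.nn :as nn])

(def mlp
  (nn/sequential {:blocks      [(nn/batch-flatten-block)
                                (nn/linear {:units 256})
                                (nn/relu-block)
                                (nn/linear {:units 10})]
                  :initializer (nn/normal-initializer 0.01)
                  :parameter   :weight}))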

sequential-block

set-initializer

(set-initializer net initializer parameter)
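A sketch of attaching an initializer after the network is assembled; the :weight keyword for the parameter argument is an assumption (your version may expect a DJL Parameter.Type instead):

(require '[clj-djl.nn :as nn])

(def net (nn/sequential))
(nn/add net (nn/linear {:units 10}))

;; gaussian weights with standard deviation 0.01
(nn/set-initializer net (nn/normal-initializer 0.01) :weight)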

set-units

(set-units builder unit)

sigmoid

(sigmoid data)

sigmoid-block

(sigmoid-block)

softplus

(softplus data)

softplus-block

(softplus-block)

swish

(swish data beta)

Applies Swish activation on the input NDArray or NDList

swish-block

(swish-block beta)

Creates a LambdaBlock that applies the Swish activation function in its forward function

tanh

(tanh data)

tanh-block

(tanh-block)