Namespaces
clj-djl.ndarray
Public variables and functions:
- *
- *!
- **
- +
- +!
- -
- -!
- /
- <
- <=
- =
- >
- >=
- abs
- acos
- all-close
- arange
- argmax
- argmin
- argsort
- base-manager
- concat
- copy
- create
- create-csr
- create-row-sparse
- cumsum
- datatype
- div
- dot
- dup
- duplicate
- exp
- expand-dims
- eye
- flatten
- float-or-int
- full
- get
- get-device
- get-element
- get-gradient
- get-shape
- head
- index
- is-sparse
- linspace
- log
- log-softmax
- log10
- max
- mean
- min
- mul
- muli
- ndindex
- ndlist
- new-base-manager
- new-ndindex
- new-ndlist
- new-shape
- norm
- ones
- ones-like
- pow
- pp
- prod
- random-multinomial
- random-normal
- random-uniform
- reshape
- scalar?
- set
- set-requires-gradient
- set-scalar
- shape
- singleton-or-throw
- size
- sort
- sparse?
- split
- sqrt
- squeeze
- stack
- sum
- t
- to-array
- to-type
- to-vec
- trace
- transpose
- zeros
- zeros-like
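The listing above can be made concrete with a short sketch. This is a hypothetical usage example, assuming these functions thinly wrap DJL's `NDManager`/`NDArray` API; the exact argument shapes (e.g. whether `create` takes a vector or a typed array, and whether shapes are passed as vectors) are assumptions and may differ from the real signatures.

```clojure
;; Hedged sketch of clj-djl.ndarray usage; signatures are assumed,
;; not confirmed from this index.
(require '[clj-djl.ndarray :as nd])

;; All arrays are owned by a manager, mirroring DJL's NDManager.
(def ndm (nd/new-base-manager))

;; Create two 2x2 arrays and combine them elementwise.
(def a (nd/create ndm [1. 2. 3. 4.] (nd/new-shape [2 2])))
(def b (nd/ones ndm (nd/new-shape [2 2])))

(def c (nd/+ a b))    ;; elementwise addition (non-mutating)
(def d (nd/dot a b))  ;; matrix product

;; Inspect results on the JVM side.
(nd/get-shape d)
(nd/to-vec c)
```

Note the `!`-suffixed variants (`+!`, `-!`, `*!`) in the index, which by Clojure convention would mutate the first array in place, paralleling DJL's `addi`/`subi`/`muli` methods.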
clj-djl.nn
Public variables and functions:
- add
- batch-flatten
- batch-flatten-block
- batchnorm-block
- build
- clear
- cov2d-block
- dropout
- elu
- elu-block
- forward
- gelu
- gelu-block
- get-parameters
- identity-block
- initialize
- leaky-relu
- leaky-relu-block
- linear
- linear-block
- linear-builder
- mish
- mish-block
- new-linear-builder
- new-normal-initializer
- normal-initializer
- opt-bias
- prelu-block
- relu
- relu-block
- selu
- selu-block
- sequential
- sequential-block
- set-initializer
- set-units
- sigmoid
- sigmoid-block
- softplus
- softplus-block
- swish
- swish-block
- tanh
- tanh-block
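To illustrate how the block constructors above might compose, here is a hypothetical sketch of building a small feed-forward network, assuming `sequential-block`, `add`, `linear-block`, and `relu-block` mirror DJL's `SequentialBlock` and `Linear` builders; the option map passed to `linear-block` is an assumption.

```clojure
;; Hedged sketch of clj-djl.nn usage; builder arguments are assumed.
(require '[clj-djl.nn :as nn])

;; A two-layer MLP: 256 hidden units with ReLU, 10 output units.
(def net
  (-> (nn/sequential-block)
      (nn/add (nn/linear-block {:units 256}))
      (nn/add (nn/relu-block))
      (nn/add (nn/linear-block {:units 10}))))
```

The paired names in the index (`relu` vs. `relu-block`, `sigmoid` vs. `sigmoid-block`, and so on) suggest that the bare name applies the activation to an array directly, while the `-block` form returns a composable `Block` for use inside `add`, as in DJL's `Activation` class.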
clj-djl.training
Public variables and functions:
- accuracy
- add-accumulator
- add-evaluator
- add-training-listeners
- as-consumer
- backward
- binary-accuracy
- close
- config
- default-training-config
- fit
- forward
- get-accumulator
- get-devices
- get-evaluators
- get-gradient
- get-loss
- get-manager
- get-metrics
- get-model
- get-result
- get-training-result
- gradient-collector
- initialize
- iter-seq
- iterate-dataset
- metrics
- new-accuracy
- new-binary-accuracy
- new-default-training-config
- new-default-training-listeners
- new-gradient-collector
- new-progress-bar
- new-topk-accuracy
- new-trainer
- new-training-config
- notify-listeners
- opt-initializer
- opt-optimizer
- parameter-store
- progress-bar
- set-metrics
- set-requires-gradient
- softmax-cross-entropy-loss
- step
- stop-gradient
- topk-accuracy
- train-batch
- trainer
- training-config
- training-listeners
- update-accumulator
- validate-batch
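The training functions above suggest a DJL-style trainer loop. The following is a hypothetical sketch only: `model`, `train-dataset`, and `input-shape` are illustrative free names, and the exact shape of the config builder chain is an assumption based on DJL's `DefaultTrainingConfig`, not a confirmed clj-djl API.

```clojure
;; Hedged sketch of a clj-djl.training loop; names and arguments
;; are assumptions modeled on DJL's Trainer API.
(require '[clj-djl.training :as t])

;; Configure loss, an evaluator, and listeners.
(def config
  (-> (t/new-default-training-config (t/softmax-cross-entropy-loss))
      (t/add-evaluator (t/new-accuracy))
      (t/add-training-listeners (t/new-default-training-listeners))))

;; `model` and `input-shape` are assumed to come from elsewhere.
(def trainer (t/new-trainer model config))
(t/initialize trainer input-shape)

;; One optimization step per batch: forward/backward via
;; train-batch, then a parameter update via step.
(doseq [batch (t/iterate-dataset trainer train-dataset)]
  (t/train-batch trainer batch)
  (t/step trainer))

(t/get-training-result trainer)
```

`iter-seq` and `iterate-dataset` in the index presumably adapt DJL's Java iterators into Clojure seqs, which is what makes the `doseq` form above natural.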