The CUDA integration v2

By Youssef MENJOUR, Graiphic CTO

A first version of the CUDA integration is now available in HAIBAL, allowing you to get the best out of your NVIDIA graphics card. The next step will be to finish integrating all the functions (layers, optimizers, metrics). We will also continue to adopt the latest programming innovations from NVIDIA.
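
To give an idea of the kind of work CUDA moves onto the GPU, the sketch below implements an element-wise ReLU activation (one of the activation functions listed further down) as a CUDA kernel. It is a minimal illustrative example and not code taken from HAIBAL; the kernel name, block size and data size are assumptions made for the sketch.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Element-wise ReLU: out[i] = max(in[i], 0).
// Each GPU thread handles exactly one element of the array.
__global__ void relu_kernel(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] > 0.0f ? in[i] : 0.0f;
}

int main()
{
    const int n = 1 << 20;                  // 1,048,576 elements (arbitrary size)
    const size_t bytes = n * sizeof(float);

    // Host-side input/output buffers.
    float* h_in  = (float*)malloc(bytes);
    float* h_out = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i)
        h_in[i] = (i % 2 == 0) ? -1.0f : 1.0f;

    // Device-side buffers and host-to-device copy.
    float *d_in, *d_out;
    cudaMalloc(&d_in, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    // Launch enough blocks of 256 threads to cover all n elements.
    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    relu_kernel<<<blocks, threads>>>(d_in, d_out, n);

    // Copy the result back (waits for the kernel to finish on the default stream).
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %.1f, out[1] = %.1f\n", h_out[0], h_out[1]);

    cudaFree(d_in);  cudaFree(d_out);
    free(h_in);      free(h_out);
    return 0;
}
```

Because every element is processed by its own thread, this kind of element-wise and matrix arithmetic, which dominates neural network layers, is exactly where an NVIDIA GPU pays off.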

HAIBAL IN A FEW FIGURES

  • 16 activation functions (ELU, Exponential, GELU, HardSigmoid, LeakyReLU, Linear, PRELU, ReLU, SELU, Sigmoid, SoftMax, SoftPlus, SoftSign, Swish, TanH, ThresholdedReLU)
  • 84 functional layers (Dense, Conv, MaxPool, RNN, Dropout…)
  • 14 loss functions (BinaryCrossentropy, BinaryCrossentropyWithLogits, Crossentropy, CrossentropyWithLogits, Hinge, Huber, KLDivergence, LogCosH, MeanAbsoluteError, MeanAbsolutePercentage, MeanSquare, MeanSquareLog, Poisson, SquaredHinge)
  • 15 initialization functions (Constant, GlorotNormal, GlorotUniform, HeNormal, HeUniform, Identity, LeCunNormal, LeCunUniform, Ones, Orthogonal, RandomNormal, RandomUniform, TruncatedNormal, VarianceScaling, Zeros)
  • 7 optimizers (Adagrad, Adam, Inertia, Nadam, Nesterov, RMSProp, SGD)
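
As a reminder of what these building blocks compute, here are the standard textbook definitions of two of the items listed above, the MeanSquare loss and the plain SGD update; these are the usual formulas, not an extract from HAIBAL's documentation.

```latex
% Mean squared error over N samples, targets y_i and predictions \hat{y}_i
\mathcal{L}_{\mathrm{MSE}} = \frac{1}{N} \sum_{i=1}^{N} \bigl( y_i - \hat{y}_i \bigr)^2

% Plain SGD step on parameters \theta with learning rate \eta
\theta_{t+1} = \theta_t - \eta \, \nabla_{\theta} \mathcal{L}(\theta_t)
```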

HAIBAL RELEASES

You can consult the latest release notes on this page.