Rectified linear unit (ReLU)

[Google Scholar] [Wikipedia]

Notes: neural network, activation function
Papers:

An activation function used in neural networks, defined by

f(x) = max(0, x)
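A minimal sketch of this definition in Python (NumPy is used here for element-wise application; the function name is illustrative, not from the source):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: element-wise max(0, x)."""
    return np.maximum(0, x)

# Negative inputs are clamped to zero; non-negative inputs pass through.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```

Because the function is piecewise linear, its gradient is 0 for x < 0 and 1 for x > 0, which makes it cheap to compute during training.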

  • SIGMA: A sparse and irregular GEMM accelerator with flexible interconnects for DNN training [qin:hpca:2020]