An activation function used in neural networks, defined as

```
f(x) = max(0, x)
```
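
A minimal NumPy sketch of this function (the `relu` name and the example values here are illustrative, not from a specific library):

```python
import numpy as np

def relu(x):
    """Element-wise rectified linear unit: max(0, x)."""
    return np.maximum(0, x)

# Negative inputs are zeroed; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```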

## Notes related to Rectified linear unit (ReLU)

## Papers related to Rectified linear unit (ReLU)

- SIGMA: A sparse and irregular GEMM accelerator with flexible interconnects for DNN training [qin:hpca:2020]