

[Re] Numerical influence of ReLU'(0) on backpropagation

Tommaso Martorella · Hector Manuel Ramirez Contreras · Daniel Garcia

Great Hall & Hall B1+B2 (level 1) #1905
Thu 14 Dec 3 p.m. PST — 5 p.m. PST


Neural networks have become ubiquitous in machine learning, and new problems and trends arise as the trade-off between theory, computational tools, and real-world problems becomes narrower and more complex. We revisited the influence of ReLU'(0) on backpropagation, since it has become common to use lower floating-point precision on GPUs so that more tasks can run in parallel, making training and inference more efficient. Contrary to what theory suggests, the original authors showed that at 16- and 32-bit precision the value chosen for ReLU'(0) can influence the result. In this work we extended some of their experiments to examine how the training and test loss are affected in both simple and more complex models.
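The mechanism behind this claim can be sketched in a few lines: at lower precision, small pre-activations round or underflow to exactly 0.0 more often, so the convention chosen for ReLU'(0) is actually exercised during backpropagation. The snippet below is a minimal NumPy illustration of this effect, not the authors' experimental code; the helper `relu_grad` and its `relu_prime_at_0` parameter are hypothetical names introduced here for clarity.

```python
import numpy as np

def relu_grad(x, relu_prime_at_0):
    # Subgradient of ReLU: 1 where x > 0, 0 where x < 0,
    # and the chosen convention exactly at x == 0.
    g = (x > 0).astype(x.dtype)
    g[x == 0] = relu_prime_at_0
    return g

# A tiny pre-activation is representable in float32 (subnormal range)...
x32 = np.float32(1e-30)
# ...but underflows to exactly 0.0 in float16, so the value assigned
# to ReLU'(0) now directly changes the backpropagated gradient.
x16 = np.float16(1e-30)

g_conv0 = relu_grad(np.array([x16]), relu_prime_at_0=0.0)
g_conv1 = relu_grad(np.array([x16]), relu_prime_at_0=1.0)
```

With the convention ReLU'(0) = 0 the gradient through this unit vanishes, while with ReLU'(0) = 1 it passes through unchanged; in float32 the same input stays strictly positive and the two conventions agree.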
