Dying ReLU and Initialization: Theory and Numerical Examples

Communications in Computational Physics
Vol. 28 No. 5 (2020), pp. 1671-1706
Authors
Lu Lu (1), Yeonjong Shin (2), Yanhui Su (3), George Em Karniadakis (2)
1 Department of Mathematics, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
2 Division of Applied Mathematics, Brown University, Providence, RI 02912, USA
3 College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116, Fujian, China
Received: August 26, 2020
Accepted: October 14, 2020
Abstract

The dying ReLU problem refers to ReLU neurons becoming inactive and outputting 0 for every input. Many empirical and heuristic explanations of why ReLU neurons die have been offered, but little theoretical analysis is available. In this paper, we rigorously prove that a deep ReLU network will eventually die in probability as the depth goes to infinity. Several methods have been proposed to alleviate the dying ReLU; perhaps one of the simplest is to modify the initialization procedure. The common practice of initializing weights and biases from symmetric probability distributions suffers from the dying ReLU. We therefore propose a new initialization procedure, namely a randomized asymmetric initialization, and show that it effectively prevents the dying ReLU. All parameters required for the new initialization are designed theoretically. Numerical examples are provided to demonstrate the effectiveness of the new initialization procedure.
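The following is a minimal NumPy sketch, not the authors' code, illustrating the two phenomena the abstract describes: it estimates the probability that a randomly initialized deep ReLU network is "born dead" (constant output over probe inputs) under a symmetric He-style initialization, and compares it with a simplified asymmetric variant in which one randomly chosen weight-or-bias entry per neuron is drawn from a positive beta distribution. The widths, depths, distributions, and the constancy-based death test are illustrative assumptions, not the paper's exact randomized asymmetric initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_symmetric(fan_in, fan_out):
    """He-style symmetric initialization: zero-mean Gaussian weights, zero biases."""
    W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in))
    b = np.zeros(fan_out)
    return W, b

def init_asymmetric(fan_in, fan_out):
    """Illustrative asymmetric variant (an assumption, not the paper's exact rule):
    one randomly chosen entry per neuron, among its weights and bias, is drawn
    from a positive beta distribution so the neuron has an active direction."""
    W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in + 1))  # last column = bias
    idx = rng.integers(0, fan_in + 1, size=fan_out)
    W[np.arange(fan_out), idx] = rng.beta(2.0, 1.0, size=fan_out)  # positive entry
    return W[:, :fan_in], W[:, fan_in]

def is_born_dead(init, depth, width=2, n_probe=200):
    """A network counts as 'born dead' if its output is constant over the probe
    inputs, i.e. every input-dependent path was killed at initialization."""
    x = rng.uniform(-1.0, 1.0, size=(n_probe, 1))  # 1-D inputs in [-1, 1]
    h = x
    for _ in range(depth):
        W, b = init(h.shape[1], width)
        h = np.maximum(h @ W.T + b, 0.0)  # ReLU layer
    W, b = init(h.shape[1], 1)  # linear readout
    y = h @ W.T + b
    return np.ptp(y) < 1e-12  # constant output => dead network

trials = 500
for depth in (3, 10, 30):
    for name, init in (("symmetric", init_symmetric), ("asymmetric", init_asymmetric)):
        dead = sum(is_born_dead(init, depth) for _ in range(trials))
        print(f"depth={depth:2d}  {name:10s}  P(born dead) ~ {dead / trials:.2f}")
```

Under these assumptions, the fraction of born-dead networks under symmetric initialization should grow toward 1 as depth increases, consistent with the theorem stated in the abstract, while the asymmetric variant keeps that fraction near zero.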
