

Poster

Activation-Descent Regularization for Input Optimization of ReLU Networks

Hongzhan Yu · Sicun Gao


Abstract:

We present a new approach for input optimization of ReLU networks that explicitly accounts for the effect of changes in activation patterns. We analyze local optimization steps in both the input space and the space of activation patterns, and propose methods with superior local descent properties. To this end, we convert the discrete space of activation patterns into differentiable representations and introduce regularization terms that improve each descent step. Our experiments demonstrate the effectiveness of the proposed input-optimization methods in improving the state of the art in several areas, including adversarial learning, generative modeling, and reinforcement learning.
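The abstract does not give implementation details, but the core idea (relaxing the discrete activation pattern into a differentiable quantity and using it to regularize gradient descent on the input) can be sketched. The PyTorch snippet below is a minimal illustration under assumed choices: a sigmoid relaxation of the pre-activation signs with temperature `tau`, a penalty on proximity to activation boundaries with weight `lam`, and the hypothetical names `TinyReLUNet` and `input_opt_step`. It is not the authors' actual formulation.

```python
import torch
import torch.nn as nn

class TinyReLUNet(nn.Module):
    """Two-layer ReLU network that also exposes its hidden pre-activations."""
    def __init__(self, d_in=2, d_hidden=16):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, 1)

    def forward(self, x):
        z = self.fc1(x)  # pre-activations; sign(z) encodes the activation pattern
        return self.fc2(torch.relu(z)), z

def input_opt_step(model, x, objective, lam=0.1, tau=0.1, lr=0.05):
    """One regularized gradient step on the input x (model weights stay frozen)."""
    x = x.detach().requires_grad_(True)
    out, z = model(x)
    # Differentiable surrogate for the binary pattern 1[z > 0]
    # (a sigmoid relaxation; tau is an assumed temperature, not from the paper).
    soft = torch.sigmoid(z / tau)
    # Penalize inputs sitting near activation boundaries (z ~ 0), where a plain
    # input gradient poorly predicts the loss after an activation flip.
    reg = (soft * (1.0 - soft)).sum()
    loss = objective(out) + lam * reg
    (grad_x,) = torch.autograd.grad(loss, x)  # gradient w.r.t. the input only
    return (x - lr * grad_x).detach()

# Usage: descend on a batch of inputs to minimize the network output.
model = TinyReLUNet()
x = torch.randn(4, 2)
for _ in range(100):
    x = input_opt_step(model, x, objective=lambda out: out.sum())
```

One design point this sketch tries to capture: an unregularized input-gradient step treats the current activation pattern as fixed, so steps that cross an activation boundary can fail to descend; the added term steers the iterate away from (or smoothly through) those boundaries.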
