Oral
Escaping the Gravitational Pull of Softmax
Jincheng Mei · Chenjun Xiao · Bo Dai · Lihong Li · Csaba Szepesvari · Dale Schuurmans

Tue Dec 08 06:15 AM -- 06:30 AM (PST) @ Orals & Spotlights: Reinforcement Learning

The softmax is the standard transformation used in machine learning to map real-valued vectors to categorical distributions. Unfortunately, this transform poses serious drawbacks for gradient descent (ascent) optimization. We reveal this difficulty by establishing two negative results: (1) optimizing any expectation with respect to the softmax must exhibit sensitivity to parameter initialization ("softmax gravity well"), and (2) optimizing log-probabilities under the softmax must exhibit slow convergence ("softmax damping"). Both findings are based on an analysis of convergence rates using the non-uniform Łojasiewicz (NŁ) inequalities. To circumvent these shortcomings, we investigate an alternative transformation, the escort mapping, that demonstrates better optimization properties. The disadvantages of the softmax and the effectiveness of the escort transformation are further explained using the concept of the NŁ coefficient. In addition to proving bounds on convergence rates to firmly establish these results, we also provide experimental evidence for the superiority of the escort transformation.
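The abstract contrasts the softmax with an escort mapping but does not spell out either formula. Below is a minimal, illustrative sketch only, not the authors' implementation: the escort mapping is assumed here to take the form pi_i ∝ |theta_i|^p with an arbitrary choice of p = 2, and the two transforms are compared by running plain gradient ascent on a three-action expected reward from an initialization that favours a suboptimal action. The rewards, step size, and step count are made-up values chosen for illustration.

```python
import numpy as np

def softmax(theta):
    # Standard softmax: pi_i proportional to exp(theta_i).
    z = np.exp(theta - theta.max())
    return z / z.sum()

def escort(theta, p=2.0):
    # Assumed form of the escort mapping: pi_i proportional to |theta_i|^p.
    # The exponent p = 2 is a hypothetical choice; the abstract gives no formula.
    a = np.abs(theta) ** p
    return a / a.sum()

def expected_reward(theta, r, policy_fn):
    # Objective being ascended: E_pi[r] under the given parameterization.
    return policy_fn(theta) @ r

def numerical_grad(theta, r, policy_fn, eps=1e-6):
    # Central finite-difference gradient of the expected reward,
    # used here so the same ascent loop works for either transform.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        up, down = theta.copy(), theta.copy()
        up[i] += eps
        down[i] -= eps
        g[i] = (expected_reward(up, r, policy_fn)
                - expected_reward(down, r, policy_fn)) / (2 * eps)
    return g

def run(policy_fn, theta0, r, steps=2000, lr=0.5):
    # Plain gradient ascent for a fixed step budget.
    theta = theta0.copy()
    for _ in range(steps):
        theta = theta + lr * numerical_grad(theta, r, policy_fn)
    return expected_reward(theta, r, policy_fn)

r = np.array([1.0, 0.9, 0.1])        # action 0 is optimal
theta0 = np.array([1.0, 4.0, 1.0])   # initialization favouring suboptimal action 1

print("softmax, final expected reward:", run(softmax, theta0, r))
print("escort,  final expected reward:", run(escort, theta0, r))
```

The finite-difference gradient keeps the sketch transform-agnostic; the paper itself analyzes closed-form policy-gradient updates and their convergence rates rather than a numerical approximation.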

Author Information

Jincheng Mei (University of Alberta / Google Brain)
Chenjun Xiao (University of Alberta)
Bo Dai (Google Brain)
Lihong Li (Amazon)

Lihong Li is a Senior Principal Scientist at Amazon. He obtained a PhD in Computer Science from Rutgers University and has since held research positions at Yahoo!, Microsoft, and Google. His main research interests are in reinforcement learning, including contextual bandits, and related problems in AI. His work has found applications in recommendation, advertising, web search, and conversation systems, and has won best paper awards at ICML, AISTATS, and WSDM. He regularly serves as area chair or senior program committee member at major AI/ML conferences such as AAAI, AISTATS, ICLR, ICML, IJCAI, and NeurIPS.

Csaba Szepesvari (DeepMind / University of Alberta)
Dale Schuurmans (Google Brain & University of Alberta)
