Factorization machines (FMs) are a supervised learning approach that can use second-order feature combinations even when the data is very high-dimensional. Unfortunately, despite increasing interest in FMs, there exists to date no efficient training algorithm for higher-order FMs (HOFMs). In this paper, we present the first generic yet efficient algorithms for training arbitrary-order HOFMs. We also present new variants of HOFMs with shared parameters, which greatly reduce model size and prediction times while maintaining similar accuracy. We demonstrate the proposed approaches on four different link prediction tasks.
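As context for the abstract, a second-order FM predicts ŷ(x) = w₀ + ⟨w, x⟩ + Σ_{i<j} ⟨v_i, v_j⟩ x_i x_j, where each feature i gets a low-rank embedding v_i. A minimal sketch of this prediction (not the paper's HOFM training algorithm; function and variable names are illustrative) uses the standard O(dk) rewriting of the pairwise sum:

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine prediction.

    y(x) = w0 + <w, x> + sum_{i<j} <v_i, v_j> x_i x_j,
    computed in O(d*k) via the identity
    sum_{i<j} <v_i, v_j> x_i x_j
        = 0.5 * sum_f [ ((V^T x)_f)^2 - sum_i V_{if}^2 x_i^2 ].
    V has shape (d, k): one k-dimensional embedding per feature.
    """
    linear = w0 + w @ x
    s = V.T @ x                                  # shape (k,)
    pairwise = 0.5 * (np.sum(s ** 2) - np.sum((V ** 2).T @ (x ** 2)))
    return linear + pairwise
```

Because the pairwise term depends on x only through V^T x and the per-feature squares, sparse inputs cost O(nnz(x) · k), which is what makes FMs practical in very high dimensions.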
Mathieu Blondel (NTT)
Research scientist at NTT CS Labs.
Akinori Fujino (NTT)
Naonori Ueda (NTT Communication Science Laboratories / RIKEN AIP)
Masakazu Ishihata (Hokkaido University)