Random Feature Maps for the Itemset Kernel

Authors

  • Kyohei Atarashi, Hokkaido University
  • Subhransu Maji, University of Massachusetts, Amherst
  • Satoshi Oyama, Hokkaido University

DOI:

https://doi.org/10.1609/aaai.v33i01.33013199

Abstract

Although kernel methods efficiently use feature combinations without computing them directly, they do not scale well with the size of the training dataset. Factorization machines (FMs) and related models, on the other hand, use feature combinations efficiently, but their optimization generally requires solving a non-convex problem. We present random feature maps for the itemset kernel, which uses feature combinations and includes the ANOVA kernel, the all-subsets kernel, and the standard dot product as special cases. Linear models using one of our proposed maps can be used as an alternative to kernel methods and FMs, resulting in better scalability during both training and evaluation. We also present theoretical results for a proposed map, discuss the relationship between factorization machines and linear models using a proposed map for the ANOVA kernel, and relate the proposed feature maps to prior work. Furthermore, we show that the maps can be computed more efficiently by using a signed circulant matrix projection technique. Finally, we demonstrate the effectiveness of the proposed maps on real-world datasets.
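To give a flavor of the idea, the sketch below illustrates a random feature map for the all-subsets kernel, one of the kernels the abstract names as a special case of the itemset kernel. It uses the factorization K(x, y) = ∏ᵢ(1 + xᵢyᵢ) and a Rademacher (±1) projection matrix, so that the averaged inner product of the random features is an unbiased Monte Carlo estimate of the kernel. This is only an illustrative sketch of the general random-feature principle; the paper's actual constructions, variance bounds, and the signed circulant matrix speedup differ in detail, and the variable names here (`omega`, `phi_x`) are my own.

```python
import numpy as np

def all_subsets_kernel(x, y):
    # Sum over every subset S of prod_{i in S} x_i * y_i,
    # which factorizes as prod_i (1 + x_i * y_i).
    return np.prod(1.0 + x * y)

def random_features(x, omega):
    # phi_j(x) = prod_i (1 + omega_{j,i} * x_i). For Rademacher omega,
    # E[phi_j(x) * phi_j(y)] = prod_i (1 + x_i * y_i), so averaging over
    # j gives an unbiased estimate of the all-subsets kernel.
    return np.prod(1.0 + omega * x, axis=1)

rng = np.random.default_rng(0)
d, D = 5, 200_000                              # input dim, number of random features
omega = rng.choice([-1.0, 1.0], size=(D, d))   # Rademacher projection matrix

x = rng.uniform(-0.3, 0.3, size=d)
y = rng.uniform(-0.3, 0.3, size=d)

approx = random_features(x, omega) @ random_features(y, omega) / D
exact = all_subsets_kernel(x, y)
print(approx, exact)  # the two values should be close for large D
```

A linear model trained on `random_features(x, omega)` then approximates a kernel machine with this kernel, which is the scalability argument the abstract makes.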

Published

2019-07-17

How to Cite

Atarashi, K., Maji, S., & Oyama, S. (2019). Random Feature Maps for the Itemset Kernel. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 3199-3206. https://doi.org/10.1609/aaai.v33i01.33013199

Section

AAAI Technical Track: Machine Learning