EA-CG: An Approximate Second-Order Method for Training Fully-Connected Neural Networks

Authors

  • Sheng-Wei Chen HTC Research
  • Chun-Nan Chou HTC Research
  • Edward Y. Chang HTC Research

DOI:

https://doi.org/10.1609/aaai.v33i01.33013337

Abstract

For training fully-connected neural networks (FCNNs), we propose a practical approximate second-order method comprising: 1) an approximation of the Hessian matrix and 2) a conjugate gradient (CG) based method. Our proposed approximate Hessian matrix is memory-efficient and can be applied to any FCNN whose activation and criterion functions are twice differentiable. We devise a CG-based method incorporating one-rank approximation to derive Newton directions for training FCNNs, which significantly reduces both space and time complexity. This CG-based method can be employed to solve any linear equation whose coefficient matrix is Kronecker-factored, symmetric, and positive definite. Empirical studies show the efficacy and efficiency of our proposed method.
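The abstract's claim about Kronecker-factored coefficient matrices rests on a standard identity: a product with A ⊗ B never needs the Kronecker product to be formed explicitly. The sketch below (not the paper's EA-CG implementation; function names and the damping parameter `lam` are illustrative) shows a matrix-free conjugate gradient solver for a system (A ⊗ B + λI)x = b with symmetric positive definite factors A and B, assuming a row-major vec convention so that (A ⊗ B) vec(X) = vec(A X Bᵀ).

```python
import numpy as np

def cg_kron(A, B, b, lam=0.0, tol=1e-10, max_iter=500):
    """Conjugate gradient for (A kron B + lam*I) x = b without forming A kron B.

    Exploits (A kron B) vec(X) = vec(A X B^T) under row-major flattening,
    so each matrix-vector product costs O(m^2 n + m n^2) instead of O(m^2 n^2).
    Assumes A (m x m) and B (n x n) are symmetric positive definite.
    """
    m, n = A.shape[0], B.shape[0]

    def matvec(x):
        X = x.reshape(m, n)
        return (A @ X @ B.T).ravel() + lam * x

    x = np.zeros_like(b)
    r = b - matvec(x)          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)  # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # conjugate direction update
        rs = rs_new
    return x

# Small illustrative system with random SPD factors
rng = np.random.default_rng(0)
m, n = 4, 3
A0 = rng.standard_normal((m, m)); A = A0 @ A0.T + m * np.eye(m)
B0 = rng.standard_normal((n, n)); B = B0 @ B0.T + n * np.eye(n)
b = rng.standard_normal(m * n)
x = cg_kron(A, B, b, lam=0.1)
```

Since A ⊗ B inherits positive definiteness from its factors, CG converges on the damped system, and the solution can be verified against the dense matrix `np.kron(A, B) + 0.1 * np.eye(m * n)` for small sizes.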

Published

2019-07-17

How to Cite

Chen, S.-W., Chou, C.-N., & Chang, E. Y. (2019). EA-CG: An Approximate Second-Order Method for Training Fully-Connected Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 3337-3346. https://doi.org/10.1609/aaai.v33i01.33013337

Section

AAAI Technical Track: Machine Learning