Weighted Oblique Decision Trees

Authors

  • Bin-Bin Yang Nanjing University
  • Song-Qing Shen Nanjing University
  • Wei Gao Nanjing University

DOI:

https://doi.org/10.1609/aaai.v33i01.33015621

Abstract

Decision trees have attracted much attention over the past decades. Previous decision trees are either axis-parallel or oblique; both search for the best split at each node via exhaustive search or heuristic algorithms. Oblique decision trees generally yield simpler tree structures and better performance, but at higher computational cost, and they typically require initialization with the best axis-parallel split. This work presents the Weighted Oblique Decision Tree (WODT), based on continuous optimization with random initialization. At every internal node, each instance is assigned different weights for the child nodes, and a split is obtained by optimizing a continuous and differentiable objective: the weighted information entropy. Extensive experiments show the effectiveness of the proposed algorithm.
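The idea described in the abstract can be illustrated with a small sketch. This is an assumption-laden illustration, not the authors' implementation: it assumes the oblique split is parameterized by a hyperplane (w, b), that a sigmoid of the projection gives each instance a soft weight for the left child (and its complement for the right child), and that the resulting weighted entropy is minimized by a gradient-based optimizer from a random initialization. Function names (`weighted_entropy`, `fit_oblique_split`) are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize


def weighted_entropy(theta, X, y, n_classes):
    """Weighted information entropy of one soft oblique split.

    theta = [w_1..w_d, b] defines the hyperplane; a sigmoid of the
    projection X @ w + b gives each instance a weight p for the left
    child and 1 - p for the right child (assumed formulation).
    """
    w, b = theta[:-1], theta[-1]
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # soft weight for the left child
    total = 0.0
    for side in (p, 1.0 - p):
        mass = side.sum()
        if mass < 1e-12:
            continue
        # class distribution of this child under the instance weights
        probs = np.array([side[y == c].sum() for c in range(n_classes)]) / mass
        probs = probs[probs > 0]
        # child entropy, weighted by the child's share of total weight
        total += (mass / len(y)) * -(probs * np.log2(probs)).sum()
    return total


def fit_oblique_split(X, y, n_classes, seed=0):
    """Optimize one split from a random initialization, as in the abstract."""
    rng = np.random.default_rng(seed)
    theta0 = rng.normal(size=X.shape[1] + 1)
    res = minimize(weighted_entropy, theta0, args=(X, y, n_classes),
                   method="L-BFGS-B")
    return res.x
```

Because the objective is continuous and differentiable in (w, b), no exhaustive search over candidate splits is needed; on a toy 1-D problem such as `X = [[0],[1],[2],[3]]`, `y = [0,0,1,1]`, the optimizer drives the weighted entropy well below the initial value of roughly one bit.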

Published

2019-07-17

How to Cite

Yang, B.-B., Shen, S.-Q., & Gao, W. (2019). Weighted Oblique Decision Trees. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 5621-5627. https://doi.org/10.1609/aaai.v33i01.33015621

Section

AAAI Technical Track: Machine Learning