AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Latent Dependency Forest Models
Shanbo Chu, Yong Jiang, Kewei Tu

Last modified: 2017-02-12

Abstract


Probabilistic modeling is one of the foundations of modern machine learning and artificial intelligence. In this paper, we propose a novel type of probabilistic model named the latent dependency forest model (LDFM). An LDFM models the dependencies between random variables with a forest structure that can change dynamically based on the variable values; it is therefore capable of modeling context-specific independence. We parameterize an LDFM using a first-order non-projective dependency grammar. Learning LDFMs from data can be formulated purely as a parameter learning problem, so the difficult problem of model structure learning is circumvented. Our experimental results show that LDFMs are competitive with existing probabilistic models.
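To make the dependency-forest idea concrete, here is a minimal sketch of a *fixed* dependency forest over discrete variables, where each non-root variable is conditioned on its parent and the joint factorizes over trees. This is an illustrative toy, not the paper's parameterization: the variable names, tables, and structure below are invented, and unlike an LDFM the forest here does not change with the variable values.

```python
# Toy dependency forest over three binary variables: A -> B, with C a
# separate root (so the forest has two trees). All numbers are made up.
root_prob = {
    "A": {0: 0.6, 1: 0.4},  # P(A)
    "C": {0: 0.5, 1: 0.5},  # P(C)
}
edge_prob = {
    # P(B | A): edge_prob[child][parent_value][child_value]
    "B": {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}},
}
parent = {"B": "A"}  # forest structure as child -> parent pointers

def joint(assignment):
    """Joint probability of a full assignment: product of root
    marginals and parent-conditioned edge probabilities."""
    p = 1.0
    for var, val in assignment.items():
        if var in parent:
            p *= edge_prob[var][assignment[parent[var]]][val]
        else:
            p *= root_prob[var][val]
    return p

# Sanity check: the joint sums to 1 over all assignments.
total = sum(joint({"A": a, "B": b, "C": c})
            for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

An LDFM goes further by summing over all possible forest structures (weighted by a first-order non-projective dependency grammar) rather than fixing one, which is what lets the effective structure vary with the variable values.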

Keywords


probabilistic modeling; latent dependency forest model; non-projective dependency grammar
