Proceedings: Proceedings of the AAAI Conference on Artificial Intelligence, 17 (Book One)
Track: Machine Learning and Data Mining
Abstract:
We develop a theory for learning scenarios in which multiple learners co-exist but mutual compatibility constraints are imposed on their outcomes. This is natural in cognitive learning situations, where multiple learning tasks co-exist and their outcomes must be compatible in order to produce a valid sentence, image, or other domain representation. We suggest that work in this direction may help to resolve the contrast between the hardness of learning predicted by current theoretical models and the apparent ease with which cognitive systems seem to learn. We study a model of concept learning in which the target concept is required to cohere with other concepts of interest; the coherency is expressed via a (Boolean) constraint that the concepts have to satisfy. Under this model, learning a concept is shown to be easier (in terms of sample complexity and mistake bounds), and the learned concepts are shown to be more robust to noise in their input (attribute noise). These properties are established for half spaces, and the connection to large margin theory is discussed.
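The chain of consequences the abstract points to (a coherency constraint restricts the input distribution, the restriction induces an effective margin for the target half space, and the margin yields smaller mistake bounds and tolerance to attribute noise) can be illustrated with a small numerical sketch. The sketch below is not the paper's construction: it simply assumes that the coherency constraint has the effect of excluding inputs within distance gamma of the target half space's boundary, and then checks the two advertised consequences with a standard online Perceptron. The dimension, sample size, gamma, and noise level rho are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, gamma = 20, 3000, 0.2

# Target half space with a unit-norm normal vector w.
w = rng.normal(size=d)
w /= np.linalg.norm(w)

# Unit-norm inputs, so the Perceptron radius R equals 1.
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = np.sign(X @ w)

# Stand-in for the coherency constraint (an assumption of this sketch):
# only inputs at distance at least gamma from the decision boundary count as coherent.
coherent = np.abs(X @ w) >= gamma

def perceptron_mistakes(X, y, epochs=3):
    """Run the online Perceptron for a few passes and count its mistakes."""
    v, mistakes = np.zeros(X.shape[1]), 0
    for _ in range(epochs):
        for x, t in zip(X, y):
            if t * (v @ x) <= 0:      # margin violation (a prediction of 0 counts as a mistake)
                v += t * x
                mistakes += 1
    return mistakes

print("Perceptron mistakes, all inputs:     ", perceptron_mistakes(X, y))
print("Perceptron mistakes, coherent inputs:", perceptron_mistakes(X[coherent], y[coherent]),
      f"(mistake bound (R/gamma)^2 = {(1 / gamma) ** 2:.0f})")

# Attribute noise of norm rho < gamma cannot flip the target concept's prediction
# on coherent inputs, since |noise @ w| <= rho < gamma.
rho = 0.1
noise = rng.normal(size=(n, d))
noise = rho * noise / np.linalg.norm(noise, axis=1, keepdims=True)
X_noisy = X + noise
acc_all = np.mean(np.sign(X_noisy @ w) == y)
acc_coh = np.mean(np.sign(X_noisy[coherent] @ w) == y[coherent])
print(f"target-concept accuracy under attribute noise: "
      f"all inputs {acc_all:.3f}, coherent inputs {acc_coh:.3f}")
```

In a run of this toy setup, the Perceptron's total mistakes on the coherent inputs stay within the classical (R/gamma)^2 bound and the target concept classifies every noise-perturbed coherent input correctly (rho < gamma guarantees this), whereas on the unconstrained inputs the near-boundary examples typically drive up both the mistake count and the noise-induced error.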