In many of the abstract geometric models used to represent concepts and their relationships, regions possessing some cohesive property such as convexity or linearity play a significant role. When the implication or containment relationship is used as an ordering relationship in such models, it gives rise to logical operators for which the disjunction of two concepts is often larger than the set union obtained in Boolean models. This paper describes some characteristic properties of such broad non-distributive composition operations and their applications to learning algorithms and classification structures. As an example, we describe a quad tree representation we have used to provide a structure for indexing objects and composing regions in a spatial database; the quad tree combines logical, algebraic and geometric properties in a naturally non-distributive fashion. The lattice of subspaces of a vector space is presented as a special example, charting a middle way between 'non-inductive' Boolean logic and 'over-inductive' tree structures. This gives rise to composition operations that are already used as models in physics and cognitive science.
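To make the non-distributivity concrete, the following is a minimal sketch (not the paper's implementation) of the lattice of quad tree cells, with cells encoded as quadrant-digit paths. The join of two cells is the smallest cell containing both (their longest common path prefix), which is generally larger than their set union; the names `join` and `meet` and the path encoding are illustrative assumptions.

```python
# Quad tree cells as quadrant-digit paths: "" is the whole space (top),
# "0" its first quadrant, "02" the third sub-quadrant of "0", and so on.
# None stands for the empty region (bottom of the lattice).

def join(x, y):
    """Smallest quad tree cell containing both x and y:
    the longest common path prefix."""
    if x is None:
        return y
    if y is None:
        return x
    i = 0
    while i < min(len(x), len(y)) and x[i] == y[i]:
        i += 1
    return x[:i]

def meet(x, y):
    """Largest cell contained in both: the deeper cell when one
    contains the other, otherwise the empty region."""
    if x is None or y is None:
        return None
    if x.startswith(y):
        return x
    if y.startswith(x):
        return y
    return None

# Three sibling quadrants inside cell "0":
a, b, c = "00", "01", "02"
lhs = meet(a, join(b, c))           # b v c = "0", so a ^ "0" = "00"
rhs = join(meet(a, b), meet(a, c))  # bottom v bottom = bottom
print(lhs, rhs)                     # "00" versus None
```

Here `join(b, c)` absorbs material outside the set union of `b` and `c` (the whole parent cell "0"), so the distributive law fails: `a ^ (b v c)` recovers `a`, while `(a ^ b) v (a ^ c)` is empty.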