Previous studies using our PACT Geometry Tutor have shown that students learn mathematics more deeply in the same amount of time when they are asked not only to make correct problem-solving steps but also to explain those steps (Aleven, Koedinger, and Cross 1999). In the current tutor, students explain a step by entering the name of a geometry definition or theorem, but our goal is for them to state these explanations in their own words. We are building a natural language understanding system to analyze such free-form explanations. We have identified a number of challenges posed by real student explanations and present an evaluation of our system’s performance on a corpus of such explanations. Understanding in this tutoring domain requires not only figuring out the student’s intentions but also determining to what extent the student has stated those intentions with sufficient mathematical precision. We argue that this kind of analysis cannot be performed with statistical methods alone but requires a knowledge-based understanding system.