Computational modeling of sketch understanding is interesting both scientifically and for creating systems that interact with people more naturally. Scientifically, understanding sketches requires modeling aspects of visual processing, spatial representations, and conceptual knowledge in an integrated way. Practically, software that can understand sketches is starting to be used in classrooms, and it could have a revolutionary impact as the models and technologies mature. This paper examines one such effort, Sketch Worksheets, which have already been used in multiple classroom experiments with students ranging from elementary school to college. Sketch Worksheets are a software equivalent of the pencil-and-paper worksheets commonly found in classrooms, but they provide immediate feedback based on what students draw. They are built on the CogSketch platform, which provides qualitative visual and spatial representations and analogical processing based on computational models of human cognition.

This paper explores three issues. First, we examine how research from cognitive science and artificial intelligence, combined with the constraints of creating new kinds of educational software, led to the representations and processing used in CogSketch. Second, we examine how these capabilities have been used in Sketch Worksheets, drawing on experiments with fifth-grade students in biology and with college students in engineering design and in geoscience. Finally, we examine some open issues in sketch understanding that must be addressed to better model high-level aspects of vision and to enable sketch understanding systems to reach their full potential for supporting education.