In real-world applications of Machine Learning (ML) models, initial model development includes close analysis of the model's results and behavior by a data scientist. Once trained, however, models may need to be retrained with new data or updated to adhere to new rules or regulations. This presents two challenges: first, how to communicate how a model makes its decisions before and after retraining, and second, how to support editing the model to take new requirements into account. To address these challenges, we built AIMEE (AI Model Explorer and Editor), a tool that provides interactive methods to explain, visualize, and modify model decision boundaries using rules. Rules should benefit model builders by providing a layer of abstraction for understanding and manipulating the model and by reducing the need to modify individual rows of data directly. To evaluate whether this is the case, we conducted a pair of user studies, totaling 23 participants, on AIMEE's rules-based approach to model explainability and editing. We found that participants correctly interpreted rules, and we report on their perspectives on how rules are (and are not) beneficial, on ways that rules could support collaboration, and on a usability evaluation of the tool.
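The abstract does not specify how rules are represented or applied; the sketch below is a minimal, hypothetical illustration of rule-based model editing in general, assuming a rule is a conjunction of feature predicates mapped to a desired label and that an edit is realized by relabeling the region the rule covers and then retraining. The feature names, thresholds, and use of scikit-learn are illustrative assumptions, not AIMEE's actual implementation.

```python
# Illustrative sketch only: a hypothetical "rule" applied as a model edit.
# Feature names, thresholds, and the retraining strategy are assumptions,
# not how AIMEE itself is implemented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy tabular data: columns are [age, income]; labels are 0 = deny, 1 = approve.
rng = np.random.default_rng(0)
X = np.column_stack([rng.integers(18, 70, 200), rng.integers(20_000, 120_000, 200)])
y = (X[:, 1] > 60_000).astype(int)

model = DecisionTreeClassifier(max_depth=3).fit(X, y)

# A rule: a conjunction of feature predicates mapped to a desired outcome.
rule = {"predicates": [("age", "<", 25), ("income", ">", 80_000)], "label": 1}
FEATURES = {"age": 0, "income": 1}
OPS = {"<": np.less, ">": np.greater}

def rule_mask(X, rule):
    """Boolean mask of the rows covered by the rule's predicates."""
    mask = np.ones(len(X), dtype=bool)
    for name, op, threshold in rule["predicates"]:
        mask &= OPS[op](X[:, FEATURES[name]], threshold)
    return mask

# "Edit" the model by relabeling the covered region and retraining,
# instead of hand-editing individual rows of data.
covered = rule_mask(X, rule)
y_edited = y.copy()
y_edited[covered] = rule["label"]
edited_model = DecisionTreeClassifier(max_depth=3).fit(X, y_edited)

print(f"Rule covers {covered.sum()} rows; predictions changed on "
      f"{np.mean(model.predict(X) != edited_model.predict(X)):.1%} of the data")
```

The point of the abstraction is that the edit is expressed once, as a human-readable predicate over features, rather than as manual changes to individual rows of training data.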
Reid Priedhorsky, David Pitchford, et al.
CSCW 2012
Seyed Omid Sadjadi, Jason W. Pelecanos, et al.
INTERSPEECH 2014
Casey Dugan, Werner Geyer, et al.
CHI 2010
Vagner Figueredo De Santana, Ashwath Vaithinathan Aravindan, et al.
IHC 2025