Publication: STVR (Paper)
Design and industrial evaluation of a tool supporting semi-automated website testing
Abstract
Software testing is the most time- and resource-intensive aspect of software development. Can support for testing be improved? This case study describes the motivations and design decisions behind the development of the testing tool CoTester and its deployment to multiple development teams. CoTester outperforms available testing tools by representing tests in an easy-to-understand scripting language, which makes the tests easy to edit. The tool's design decisions were derived from a series of interviews with testers about their experiences with both manual and automated testing. CoTester was developed to support these users, who work in an environment of mixed manual and automated tests, progressing from manual to automated testing when circumstances warrant. A series of deployments to four development teams showed that CoTester worked very well for non-professional testers (i.e., those who do testing only part-time), and it was also found useful by some professional testers. © 2012 John Wiley & Sons, Ltd.