Published:
2019-10-21
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 7
Issue:
Vol. 7 (2019): Proceedings of the Seventh AAAI Conference on Human Computation and Crowdsourcing
Track:
Technical Papers
Abstract:
Software verification addresses the important societal problem of software correctness by using tools to mechanically prove that software is free of errors. Since the software verification problem is undecidable, automated tools have limited capabilities; hence, to verify non-trivial software, engineers use human-in-the-loop theorem provers that depend on human-provided insights such as loop invariants. The effective use of modern theorem provers requires significant expertise, and recent work has explored the possibility of creating human computation games that enable non-experts to find useful loop invariants. A common feature of these games is that they do not show the code to be verified. We present and evaluate a game which does show players code. Showing code poses a number of design challenges, such as avoiding cognitive overload, but, as our experimental evaluation confirms, it also provides an opportunity for richer human-computer interactions. These interactions lead to more effective human-in-the-loop systems that augment the ability of programmers who are not verification experts to find loop invariants.
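The loop invariants the abstract refers to can be illustrated with a minimal sketch (not drawn from the paper or its game, and using runtime assertions in place of a theorem prover): an invariant is a property that holds before every loop iteration and, combined with the loop's exit condition, implies the desired postcondition.

```python
def sum_below(n):
    """Return 0 + 1 + ... + (n - 1), checking a loop invariant at runtime."""
    total = 0
    i = 0
    while i < n:
        # Invariant: total equals the sum 0 + 1 + ... + (i - 1),
        # i.e. total == i * (i - 1) // 2. It holds on loop entry
        # (0 == 0) and is preserved by each iteration.
        assert total == i * (i - 1) // 2
        total += i
        i += 1
    # On exit i == n, so the invariant yields the postcondition:
    assert total == n * (n - 1) // 2
    return total
```

In a human-in-the-loop prover, supplying an invariant like `total == i * (i - 1) // 2` is exactly the kind of insight the automated tool cannot reliably discover on its own.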
DOI:
10.1609/hcomp.v7i1.5277
ISBN 978-1-57735-820-6