Trackable Reasoning and Analysis for Crowdsourcing and Evaluation (TRACE)

J. Stromer-Galley, B. McKernan
Syracuse University, New York, United States

Poster stand number: T109

Keywords: Analysis, Collaboration, Decision-Making

Intelligence analysts continually risk critical reasoning errors while conducting analysis, and the analytic products they write often do not give customers a clear sense of the reasoning and assumptions that led to a judgment. None of the currently available commercial or experimental applications intended to address these issues supports teams through the entire analytical process or helps them compose a thorough report, and research indicates that users often find these applications and techniques laborious, restrictive, and counter-intuitive. We developed the TRACE application to address these issues. TRACE offers a flexible reasoning approach that guides teams through the entire analytical process. It provides report templates, guidance, and several tools that help teams process and evaluate information, offload memory and reasoning work to the application, and share that reasoning effortlessly with teammates. These features help reduce cognitive errors in reasoning. TRACE also learns how analysts and teams work over time and provides smart nudges that draw attention to information they may have missed in source materials or in teammates’ work. Rigorous experimental testing indicates that TRACE significantly enhances users’ reasoning and that users find it highly usable.