A Case Study of Human-Authored versus Automatic Dashboard Summaries.
Published in Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '25), 2025
Recommended citation: Jane Hoffswell, Victor S. Bursztyn, Shunan Guo, and Eunyee Koh. 2025. A Case Study of Human-Authored versus Automatic Dashboard Summaries. In Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '25). Association for Computing Machinery, New York, NY, USA, Article 40, 1–7. https://doi.org/10.1145/3706599.3720155
Automatically generated insights can help people interpret key trends in their data; similarly, dashboard summaries can surface key insights from large, complex analytic dashboards that combine multiple datasets or visualizations. In this work, we conduct a case study with five industry professionals to understand how people prioritize insights and author concise summaries; to inform the design of improved automatic techniques, we compare their summaries to those of a fully automatic approach. We observed three notable characteristics of human-authored dashboard summaries compared to the automatic method: (1) incorporation of explanations or speculation, (2) greater structural consistency in the text, and (3) careful consideration of the precision of numeric values.
