The Best Paper Award of the 17th International Conference on Cross-Cultural Design
has been conferred on
Entong Gao (Beihang University and Peking University, P.R. China),
Hanyu Zhong, Ruiqing Yuan, Jialu Guo and Zhe Chen (Beihang University, P.R. China)
for the paper entitled
"“How Do You Understand? Your Eyes Show It”: Explainable Artificial Intelligence for Cross-Language Comprehension Prediction Through Eye Movement"

Entong Gao (presenter)

Best Paper Award for the 17th International Conference on Cross-Cultural Design, in the context of HCI International 2025, Gothenburg, Sweden, 22 - 27 June 2025

Paper Abstract
Eye movements have long been linked to comprehension performance, serving as a valuable window into cognitive processing in human-computer interaction (HCI). This research investigates the potential of explainable artificial intelligence (XAI) to predict comprehension from eye movements across native and nonnative language scenarios. Study 1 applies deep learning models to comprehension prediction; Study 2 uses SHapley Additive exPlanations (SHAP) for model interpretability; and Study 3 conducts experiments with AI agents to optimize interaction strategies based on predicted comprehension levels. The findings reveal: 1) the Transformer model outperforms other models in predicting comprehension, with intelligibility predictions being more accurate than comprehensibility predictions, particularly in native scenarios; 2) in native scenarios, comprehension is closely linked to early eye movement activity, particularly blink activity, whereas nonnative comprehension relies more on later-stage processing, reflecting the increased cognitive demands of processing a nonnative language; 3) in nonnative environments, Reward Factor strategies are more crucial for alleviating cognitive load and enhancing user engagement than in native contexts. The research provides a novel approach by integrating eye movement data with XAI and agent experiments, revealing key eye movement features that correspond to comprehension and exploring how AI agents can tailor interaction strategies to comprehension levels. This study highlights the potential for AI to improve user interaction by dynamically adjusting to comprehension levels, particularly in multilingual contexts, offering practical implications for personalized information systems and HCI.
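The SHAP analysis described in the abstract attributes a model's comprehension prediction to individual eye movement features. As a minimal, self-contained sketch of the underlying idea only, the snippet below computes exact Shapley values for a toy linear "comprehension score" over three hypothetical eye movement features. The feature names, weights, and baseline values are illustrative assumptions, not taken from the paper; the actual study applies the SHAP library to deep learning models.

```python
from itertools import combinations
from math import factorial

# Hypothetical eye movement features (illustrative, not from the paper)
FEATURES = ["fixation_ms", "blink_rate", "saccade_len"]
WEIGHTS = {"fixation_ms": 0.4, "blink_rate": -0.8, "saccade_len": 0.2}
BASELINE = {"fixation_ms": 220.0, "blink_rate": 15.0, "saccade_len": 4.0}

def model(x):
    # Toy linear "comprehension score" standing in for a trained model
    return sum(WEIGHTS[f] * x[f] for f in FEATURES)

def value(subset, x):
    # Features outside the coalition are replaced by their baseline value
    filled = {f: (x[f] if f in subset else BASELINE[f]) for f in FEATURES}
    return model(filled)

def shapley(feature, x):
    # Exact Shapley value: average marginal contribution of `feature`
    # over all coalitions of the remaining features
    n = len(FEATURES)
    others = [f for f in FEATURES if f != feature]
    total = 0.0
    for k in range(n):
        for subset in combinations(others, k):
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            total += weight * (value(set(subset) | {feature}, x)
                               - value(set(subset), x))
    return total

sample = {"fixation_ms": 300.0, "blink_rate": 10.0, "saccade_len": 5.0}
attributions = {f: shapley(f, sample) for f in FEATURES}
print(attributions)
```

For a linear model the Shapley value of each feature reduces to its weight times the deviation from baseline, and the attributions sum exactly to the difference between the sample's prediction and the baseline prediction, the property SHAP relies on to explain individual predictions.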
The full paper is available through SpringerLink, provided you have the appropriate access rights.