
How a Video Game Reveals AI's Explanation Problem

A mobile game's AI assistant shows how users can act effectively without truly understanding complex systems—offering a new model for explainable AI design.

AI Research
April 01, 2026
4 min read

As artificial intelligence becomes increasingly embedded in education, work, and daily life, users face a growing paradox: they can complete tasks efficiently with AI assistance while struggling to explain how those results were achieved. This phenomenon, highlighted by a high school student's observation that AI enables getting good grades without genuine learning, represents what researchers call an "efficient yet blind cognitive environment." The field of Explainable Artificial Intelligence (XAI) has emerged to address this by making AI decision-making processes more transparent and understandable. However, new research examining the popular mobile game Arknights suggests that traditional approaches to explanation may be missing a crucial element: how understanding emerges through interaction rather than being delivered as finished information.

The study reveals that Arknights' AI system, called PRTS (Preliminary Rhodesisland Terminal System), creates what researchers term "usable but unverifiable" explanations. These explanations provide enough information for players to take action but insufficient detail to establish stable causal understanding of how the system works. Players can deploy operators, manage resources, and complete combat missions through PRTS's interface, yet they repeatedly encounter situations where the system's guidance proves incomplete, delayed, or even misleading. This design creates a distinctive form of player agency centered not on direct control but on interpretive reasoning—what the paper conceptualizes as "explanatory agency."

The research methodology involved qualitative close reading and interface analysis of Arknights, with particular focus on how PRTS mediates player experience. The researcher conducted extended playthroughs of the main storyline, documenting gameplay through systematic screenshot capture, UI feedback observation, and written play notes. Rather than analyzing algorithmic transparency at the code level, the study examined what it calls the "phenomenological black box"—how opacity, interpretation, and explanation are configured through narrative framing, UI design, and interaction structure. The analysis focused on the "implied player," meaning the kind of player experience the system anticipates and shapes, particularly through missions in early chapters and the intensified confrontation in Chapter 15.

The findings demonstrate how PRTS systematically organizes player agency through mediation rather than direct control. As shown in Figure 1, players access the game world through a "neural connection" with PRTS, establishing it as the primary entry point for understanding and action. Figure 2 illustrates how the terminal interface visualizes the Doctor's agency as remote command and coordination rather than embodied presence in the game world. During combat, shown in Figure 3, players act as "behind-the-scenes orchestrators" who translate judgments into commands executed by the system through dragging and clicking operations that resemble coordination more than direct control. The system creates "interpretive gaps" through incomplete pre-combat intelligence, Auto-Deploy failures without specific explanations, and narrative disruptions where PRTS's guidance becomes actively misleading, as demonstrated in Figure 4's depiction of error messages and forced Auto-Deploy in Chapter 15.

These findings have significant implications for XAI design beyond gaming. The paper suggests that Arknights offers an alternative to traditional explanatory paradigms that seek to eliminate uncertainty through increased algorithmic transparency. Instead, PRTS demonstrates how explanation can function as a "playable process" in which strategic withholding, delayed feedback, and narrative disruptions of trust move users from passive information recipients to active interpreters. This shift from "result-oriented" to "process-oriented" explanation could help address what researchers identify as a key problem in XAI: many users cannot apply XAI explanations even in relatively simple AI tasks, because explanatory needs are often inferred subjectively by researchers rather than derived from users themselves.

The study acknowledges several limitations in its approach. The analysis specifically focuses on the phenomenological black box constructed through interface, narrative, and feedback rather than algorithmic logic in the software engineering sense. The paper explicitly states it does not claim any direct conceptual connection between Arknights and XAI functional models, nor does it suggest game developers were directly inspired by XAI research. Additionally, the researcher acknowledges that other cultural theoretical frameworks might offer alternative or better tools to address the research questions. The analysis is based on the implied player rather than attempting to generalize to all players' experiences, and it treats PRTS as a diegetic AI representation rather than examining its underlying computational implementation, which may rely on scripted mechanisms or Wizard-of-Oz-style design approaches.

Despite these limitations, the research provides valuable insights into how digital games can serve as explainable interfaces that mediate understanding through interaction. By examining how interpretive gaps are designed as playable structures in Arknights, the study reveals how systems can foster what it calls "explanatory agency"—the capacity to maintain effective decision-making while actively calibrating one's cognitive model of a system under conditions of opacity. This approach suggests that for complex AI systems, understanding might be better cultivated not through complete transparency but through carefully designed experiences that make interpretation an integral part of the interaction process.

Original Source

Read the complete research paper

View on arXiv

About the Author

Guilherme A.


Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.

Connect on LinkedIn