Research on algorithmic black boxes: A cognitive science perspective
Keywords:
Algorithmic black box, Cognitive perspective, Transparency and accountability, Gray box explanation, Algorithmic trust, Scenario-based algorithm governance

Abstract
Studying algorithmic black boxes from a cognitive dimension represents a third pathway beyond the existing technical and normative approaches. This perspective emphasizes understanding algorithmic black boxes through conceptual frameworks similar to those used for human cognition. Compared with the "black box" of the human mind, algorithmic black boxes face higher demands for transparency, which has given rise to the "algorithmic black box-transparency-accountability" framework. However, this focus often overshadows other important attributes of black boxes, such as their strangeness and legitimacy.

Contrary to cognitive common sense, algorithmic black boxes also possess positive cognitive functions: they unify transitions across organizational levels in cognition and facilitate the shift from causal relationships to mechanisms. Recognizing this challenges the myth of algorithmic explainability, avoids the illusion of explanatory depth, and points toward horizontal or nested explanations for opening the black box. A visual "gray box" ladder explanation method, positioned between "white box" and "black box" approaches, can be employed.

The analytical framework of information and control phases prompts a re-evaluation of scenario-based regulation methods. Trust in algorithms can be categorized into three types: inherent trust, learned trust, and situational trust, corresponding to human-related, algorithm-related, and environment-related factors, respectively. Reducing "algorithm aversion" depends on the perceived objectivity of algorithms, the autonomy of human processes, concerns about social judgment, and the degree to which algorithms intrude on human thought.
License
Copyright (c) 2025 Studies in Science of Science

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.


