In our new paper published in AI Enhanced Learning, my colleagues Nicole Oster, Lindsey McCaleb, and I argue that while educators debate classroom integration strategies, the most profound transformation is happening outside traditional learning environments.
We conceptualize generative AI not merely as a tool but as a “cultural technology” that fundamentally reshapes human cognition and social structures. We introduce the concept of “artificial intimacy” – deliberately designed chat interfaces that simulate human connection to maximize engagement and behavioral influence.
Most concerning is our finding that students are already forming emotional relationships with AI systems, treating them as confidants and therapists. Recent research confirms that GenAI’s most common uses are social and emotional rather than work-related.
We criticize current educational responses as critically insufficient, focusing narrowly on classroom applications while ignoring broader ecological shifts in culture and human relationships. Instead of comprehensive regulatory solutions, we advocate for “principled pockets” of resistance – small-scale humanistic interventions in design, research, and pedagogy.
We call for a new pedagogical approach centered on “digital emotional literacy” – teaching students to recognize and critically navigate the affective dimensions of digital systems. This framework positions current AI development as an unregulated global experiment in human behavior and emotion, demanding we reimagine how we prepare students for a world where artificial intimacy is increasingly normalized.
The citation, link to the article, and abstract are given below.
Mishra, P., McCaleb, L. & Oster, N. (2025). Beyond Classroom Walls: The New Psycho-Social Ecology of GenAI. AI Enhanced Learning, 1(1), 245-257. Association for the Advancement of Computing in Education (AACE). Retrieved September 25, 2025 from https://www.learntechlib.org/primary/p/226403/.
Abstract:
This paper examines the far-reaching psychological and social implications of Generative AI (GenAI) systems, extending beyond classroom applications, and argues that the most consequential educational transformation is occurring outside traditional learning environments. Students are already forming intimate relationships with AI systems, confiding in chatbots and seeking emotional support from artificial entities. Drawing on media ecology theory from McLuhan, Postman, and Ong, we conceptualize GenAI as a “cultural technology” that fundamentally reshapes human cognition, communication, and social structures. We explain how AI systems exploit innate human tendencies toward anthropomorphism and our cognitive preference for belief over skepticism, creating what we call “artificial intimacy”: deliberately designed chat interfaces that simulate human connection to maximize engagement and behavioral influence. Through recent cases, including Instagram’s unlicensed AI therapists, Meta’s sexually explicit chatbots, and OpenAI’s sycophantic behavior, we demonstrate how these systems function as “supernormal stimuli” that offer exaggerated, idealized versions of human interaction, prioritizing data collection and user retention over well-being. Recent research confirms that the most common uses of GenAI are social and emotional rather than work-related, with users turning to AI to alleviate loneliness and simulate friendships. We argue that the educational responses to AI have been critically insufficient, focusing on instrumental classroom applications while ignoring broader ecological shifts in culture, identity, and human relationships. This reality demands a new pedagogical approach centered on “digital emotional literacy,” i.e. the ability to recognize and critically navigate the affective dimensions of digital systems. Rather than proposing comprehensive regulatory solutions, we advocate for “principled pockets” of resistance – small-scale, humanistic interventions in design, research, and pedagogy that preserve human dignity and complexity. The paper concludes by positioning current AI development as an unregulated global experiment in human behavior and emotion, calling for educational frameworks that recognize AI’s role as a cultural mediator rather than a neutral tool and fundamentally reimagine how we prepare students for a world where artificial intimacy is increasingly normalized.