Social-Emotional Learning & AI at LERN2026

Thursday, February 05, 2026

Our team presented two papers at the 2nd Annual Learning Engineering Research Network (LERN) Convening, held February 3–4, 2026, at ASU’s Tempe campus. The convening, themed “From Insights to Implementation: Learning Engineering in Action,” brought together researchers, practitioners, and innovators working at the intersection of learning science, technology, and design. Both papers tackle what I believe is one of the most important and underexplored questions in the current wave of AI-in-education discourse.

These two papers reflect a growing thread in our research: the idea that our response to AI in education cannot be purely technical or procedural. It must be deeply human. If we are serious about preparing students (and educators) for an AI-infused world, we need to grapple with how these tools shape not just what people know, but how they feel, relate, and grow. I’m grateful to my collaborators—Rebekah Jongewaard, Lindsey McCaleb, Nicole Oster, and Emmanuel Adeloju—for their work on these projects.

Complete citations, abstracts, and links to the Proceedings can be found below:

Jongewaard, R., McCaleb, L., Oster, N., Adeloju, E., & Mishra, P. (2026). Social and emotional dimensions of generative AI use. In S. D. Craig & D. S. McNamara (Eds.), Proceedings of the Learning Engineering Research Network Convening (LERN 2026): From insights to implementation, learning engineering in action (pp. 305–306). https://doi.org/10.59668/2551.25165

Abstract: Reports of generative AI’s (GenAI’s) alarming influence on users’ social, psychological, and affective states proliferate. GenAI attunes to specific contexts and individual users’ emotional states, desires, and vulnerabilities (Nash, 2024), yet much extant research on GenAI in education treats it as a relationally neutral tool. As learning engineers, our aim is to design human-centered solutions to the challenges posed to education in an AI-saturated society.  

To better understand these challenges, we explored the following questions in one micro-cycle nested within the “Challenge” phase of our larger project (Craig et al., 2025):

  1. How do university students use GenAI?
  2. What range of affective experiences do students report in their interactions with GenAI, and how do these experiences impact their academic engagement, social interactions, and personal well-being?

Preliminary results from thematic analysis of an initial student interview include (a) anxiety related to uncertain or ignored academic expectations for AI use and (b) creative uses of AI for personal purposes. We present an agenda for continuing research on GenAI that centers the social and emotional well-being of students. By better understanding these increasingly urgent issues we can work toward uses of GenAI that prioritize human flourishing.

Adeloju, E., McCaleb, L., Jongewaard, R., Oster, N., & Mishra, P. (2026). Socio-emotional learning in AI K-12 guidance and policy documents: A gap analysis. In S. D. Craig & D. S. McNamara (Eds.), Proceedings of the Learning Engineering Research Network Convening (LERN 2026): From insights to implementation, learning engineering in action (pp. 307–314). https://doi.org/10.59668/2551.25384

Abstract: As generative AI becomes increasingly integrated into K–12 classrooms, it poses distinct socio-emotional risks that existing policies inadequately address. This study analyzed 36 institutional, state, and international AI guidance documents using the CASEL framework and five inductively generated SEL-informed categories: Anthropomorphization & Mental Model, Emotional Attachment & Parasocial Relationships, Engineered Manipulation & Supernormal Stimuli, Social Isolation & Relationship Displacement, and Developmental Vulnerability. Content analysis revealed uneven attention across categories: Developmental Vulnerability and Engineered Manipulation dominated, appearing in 88% of documents, while Emotional Attachment received minimal coverage (8%). The documents highlighted risks including diminished student agency, algorithmic bias, erosion of peer and teacher relationships, and the potential for parasocial engagement with AI. These findings reflect the need for policies that scaffold critical evaluation, maintain human oversight, promote relational and social-emotional development, and mitigate exploitation of developmental vulnerabilities.
