I recently joined Justin Hardman on the Education Vanguard podcast for a conversation that ranged from the evolution of TPACK (now over 20 years old) to what AI actually means for teachers and learners. Justin and I go back a long way: he hosted me for a keynote at one of his early 21CL conferences in Hong Kong, so it was a genuine pleasure to reconnect and dig into these questions together.
We talked about why contextual knowledge matters, why the hype around personalized AI tutoring misunderstands how social and deeply human learning really is, and why I think of generative AI as a smart, drunk, biased, sycophantic, overconfident intern. It is always available, always eager to please, and that eagerness is precisely the problem. It will validate your worst ideas with enthusiasm, and it carries the biases of everything it was trained on. Change a single word in a student essay and AI scoring shifts measurably in both grade and quality of feedback. That is not a quirk. That is a structural problem we need to take seriously.
One thread I keep returning to: AI amplifies expertise and magnifies gaps. Expert teachers look at an AI-generated lesson plan and see a curriculum-shaped object, something with the right vocabulary and formatting that is not actually usable in a real classroom, and they fix it. Novice teachers often cannot tell the difference. The same pattern shows up in creative work, writing, and design. The tool reveals your judgment as much as it supports it. I am cautiously optimistic about what this means inside classrooms. I am considerably more pessimistic about the broader cultural consequences: the manipulation, the relationships people are forming with these systems, and what it all means for how we think and learn together as a society. I remember being excited about social media in 2008. I do not want to be saying the same regretful things about AI a decade from now.
What I do believe is that educators need agency and permission to experiment rather than waiting for some definitive study that will never arrive, because the technology changes faster than the research cycle. Stay humble, stay curious, treat students as whole human beings. That is where I would start.
Listen on YouTube (embedded below), Spotify, or Apple Podcasts.
Justin asked me for recommendations at the end of the episode. First: Neil Postman’s Technopoly, written decades ago and more urgent than ever, about what happens when culture surrenders to technology and lets it define truth and meaning. Second: the Netflix film Humans in the Loop, about the data labelers in rural India whose invisible work powers the AI systems we all use every day. Both are worth your time.