As so many educators around the world end their school years and prepare for a long, well-deserved break, the most common question I’m asked right now is, “If I wanted to learn more about generative AI, what should I be doing?”
So, I put together the following playlist of resources and activities. I focused on pieces from reliable sources that are relatively short and accessible. Many of these resources are full of useful links themselves, so you can dive as deep as you like.
I hope you’ll give as much time to hands-on play and practice as you do to absorbing resources. The most effective way I have built my own AI capacity without being distracted by AI hype is to use the tools myself.
Establish a Foundation
I haven’t found a perfect introductory course for educators on AI, but I like “Teaching AI in the Classroom,” an open online course offered by the University of Adelaide. The course has versions for elementary and for secondary educators and offers a wealth of suggested classroom activities. Its explanations of the history, functionality, and implications of artificial intelligence will set you up well to explore generative AI more deeply. Importantly, it’s not affiliated with a company trying to sell an AI tool.
Consider the Big Picture
Beyond a basic understanding of the history of and technology behind AI, you should be familiar with some of the philosophical, ethical, and pedagogical issues that it raises in education and society.
“Educating Students in the Age of AI” This podcast episode, in which Ezra Klein interviews Rebecca Winthrop, is an essential conversation about generative AI’s role in school.
“How Are Students Using AI?” Annette Vee shares data on the use of generative AI by young people and what that means for educators.
“The Tech Fantasy that Powers AI is Running on Fumes” Tressie McMillan Cottom on generative AI as a “mid” technology that deserves our critique, not our trust.
“Toward a New Theory of Arrival Technologies” This paper by Justin Reich and Jesse Dukes argues that the arrival of ChatGPT signals a new era of “arrival technologies” that schools can only manage, not control. This should change how schools approach technology integration.
“Will the Humanities Survive Artificial Intelligence?” History professor D. Graham Burnett asked his students to use generative AI for a reflective assignment, and the results had a deep impact on how he thinks and feels about the future of his field.

Deepen Your AI Literacy
Schools will not develop AI-literate students without AI-literate adults. Whatever our personal opinions about generative AI, I think we have a professional responsibility to understand its capabilities and complexities. I’ve organized this section according to Stanford University’s four elements of AI literacy:
Functional literacy: How does AI work?
Ethical literacy: How do we navigate the ethical issues of AI?
Rhetorical literacy: How do we use natural and AI-generated language to achieve our goals?
Pedagogical literacy: How do we use AI to enhance teaching and learning?
Consider your strengths and areas for growth across these four elements, then design your own pathway through the materials based on your needs.
Note: when I use the term “general purpose chatbots,” I’m referring to tools like ChatGPT, Gemini, and Claude.
Functional AI Literacy
Resources
“AI Learns to Reason” Melanie Mitchell offers a helpful, concise explanation of reasoning models, the latest evolution in large language models (ChatGPT o3, Claude Sonnet 4, etc.).
“The ‘Privacy’ Discourse Policing AI in Schools” Leon Furze explains how to think about the privacy of our data and content when interacting with large language models, and he addresses the misconceptions that many people have about what it means to share content with these tools.
“Six AI Ideas We Need to Let Go Of” I wrote this piece about how our understanding of AI has to evolve as the technology evolves.
Activities
Practice Prompting. Prompting remains the most important skill a user can have. Explore AI for Education’s prompt library. Take note of how they design prompts: What elements do they share? Find prompts relevant to your context, edit them to suit your specific needs, and try them in a variety of general purpose chatbots.
Move Beyond Text Generation. Explore generative AI’s multimodal capabilities. Upload visuals (photos, diagrams, infographics, etc.) to a general purpose chatbot and ask the bot to interpret or analyze them. ChatGPT and Gemini can also create images based on your text prompts. Try AI audio tools like Eleven, voice mode in ChatGPT or Gemini, or music tools like Suno. Try video generators like HeyGen, Sora (requires ChatGPT+), or Veo 3 (requires Gemini Advanced).
Customize Generative AI. Explore the “mega-prompts” in Ethan Mollick and Lilach Mollick’s prompt library. These longer, more detailed prompts don’t just improve output; they get bots to behave in particular ways. Find one or two that interest you and edit them to fit your specific needs and context. Consider how bots behave differently when “mega-prompted” versus simply prompted. Mike Caulfield’s SIFT Toolbox is an impressive example of a mega-prompt that transforms a chatbot’s behavior.
Ethical AI Literacy
Resources
“9 Rules for New Technology” Ted Gioia takes inspiration from Wendell Berry in thinking through how to apply an ethical lens to evaluating emerging technologies like generative AI.
“The AI Cultural Hallucination Bias” Maha Bali lays out the implicit and explicit ways large language models’ Western, Anglo bias can affect teaching and learning.
“AI Skills that Matter, Part 4: Ethical Decision-Making” I wrote about the most important skill we and our students can develop to ensure we are responsible and effective users of generative AI.
“Engineered for Attachment: The Hidden Psychology of AI Companions” Punya Mishra and Melissa Warr remind us that schools’ focus on academic integrity distracts from the important concerns raised by AI companions.
“Power Hungry: AI and Our Energy Future” A recent report from MIT Technology Review on the environmental impact of scaling AI.
“Teaching AI Ethics 2025: Bias” and “Environment” Leon Furze is in the process of updating his popular 2023 series, “Teaching AI Ethics.” These are the first two posts. Subscribe to his blog to receive future updates.
Activities
Talk about AI Ethics. Engage a friend, family member, student, or colleague in conversation about Stanford’s Ethical Engine Cards. What does pondering these scenarios reveal about your stance on AI? What can you learn from others’ perspectives on these questions?
Test Bots for Bias. Work through “Anti-bias prompts and testing for bias,” a set of useful hands-on activities from Ravit Dotan for better understanding AI bias. Be sure to try the prompts in a variety of chatbots.
Rhetorical AI Literacy
Resources
“15 Times to Use AI, and 5 Not To” by Ethan Mollick
“AI Skills that Matter, Part 2: Lateral Reading” I wrote about an information literacy strategy that can help us and our students evaluate AI output.
“The Impact of GenAI on Human Learning: a Research Summary” Philippa Hardman summarizes a few studies on the impact of generative AI use on critical thinking and what educators can do about it.
Activities
Engage in Reflective Play. Ask a general purpose chatbot to create something you would normally make yourself: a letter, a report, a lesson plan or meeting agenda, etc. Take the time to coach the bot to create the best product possible. Then, take Amelia King’s AI Usage Self-Assessment in Poe. It’s an excellent set of questions we should all ask ourselves when reflecting on our use of AI.
Pit Bots Against Each Other. Open three general purpose chatbots in your browser. Give them all the same prompt. Compare results. Take the output from one and ask the others to evaluate and improve it. Use this professor’s method as a template to see if you can use multiple bots to create a high-quality research report.
Host a Cheat-a-Thon. I’m fascinated by this competition that Penn State hosted, where students and faculty tested how good generative AI is at a variety of coursework. This would be an interesting activity to replicate with colleagues and students. (No, I’m not the Eric Hudson who won the faculty division).

Pedagogical AI Literacy
Resources
“AI and the Teaching of Writing” I wrote about six core principles that should serve as the foundation for the future of the writing classroom.
“AI Skills that Matter, Part 1: Extending the Mind” I wrote about how we can integrate AI into teaching and learning as a mind extension tool rather than as a replacement for thinking.
“Catch Them Learning: A Pathway to Academic Integrity in the Age of AI” via Cult of Pedagogy
“Post-Apocalyptic Education” This post by Ethan Mollick is a follow-up to his essential 2023 post, “The Homework Apocalypse.” He provides updates on how and why we should be rethinking assessment and the role of AI given the latest developments in the technology.
“The Potential of Using AI to Improve Student Learning in STEM: Now and in the Future” by the Committee for Advancing Discovery Research in Education
Activities
Evaluate the Pedagogy of Specialized AI Tools. Create a free account in an “AI for educators” tool like MagicSchool, Diffit, or Brisk AI. Generate lesson plans, rubrics, feedback, etc. in your area of expertise. How valuable is the initial output? How do you know? If it’s not valuable to you, how much additional input from you would be required for the output to meet your standards?
Evaluate the Pedagogy of General Purpose Bots. Share some teaching materials (either your own or AI-generated) with a general purpose chatbot and ask the bot to evaluate the materials against evidence-based standards of teaching and learning. Be specific about the research or theoretical framework you would like the bot to use in evaluating the materials (cognitive psychology, culturally responsive pedagogy, literacy and numeracy development, etc.). What do you notice? What do you wonder?
Pretend to be a Student. Use generative AI the way you know, or imagine, your students do, especially to assist with the work of a course you teach. What are the different ways you might use generative AI? Which uses support the learning, and which do not? How would you assess AI’s output? Which factors might a student consider when making a decision about AI use?
Almost two years ago, I wrote that the most important thing educators can do to respond to AI is to learn how to talk to students about it. I deeply believe that this is still the case. Our conversations with students about AI need to be more open, more generative, and more forward-thinking. For that, we need to build our AI capacity. I hope this guide empowers people to feel confident starting those conversations.
Upcoming Ways to Connect With Me
Speaking, Facilitation, and Consultation
If you want to learn more about my work with schools and nonprofits, take a look at my website and reach out for a conversation. I’d love to hear about what you’re working on.
In-Person Events
June 6. I’ll be delivering a keynote and facilitating two workshops (one on AI, one on student-centered assessment) at the STLinSTL conference at MICDS in St. Louis, MO, USA.
June 9. I'll be facilitating workshops on AI at the Summer Teaching and Learning Institute at Randolph School in Huntsville, AL, USA.
Books, not Links!
For those who want to read a book to learn more about the field of AI, here are some recommendations:
Atlas of AI by Kate Crawford
Co-Intelligence by Ethan Mollick
The Coming Wave by Mustafa Suleyman
Unmasking AI by Joy Buolamwini