This is part three of the “Looking Ahead with AI” series. You can find part one, about teachers, here, and part two, about students, here.
A few things have happened in AI since my last post two weeks ago:
OpenAI released GPT-4o, which will make GPT-4's more sophisticated output and multimodal capabilities available to everyone, for free.
Google is redesigning its search function around AI, which will change search for all of us, if it ever actually works.
Scarlett Johansson called out OpenAI’s shenanigans around using a likeness of her voice, emphasizing intellectual property concerns around AI and further illustrating the arrogance of technology companies. (It seems that Sam Altman and I have one thing in common: an obsession with the movie “Her.” But, I think we have different understandings of its message.)
And these are just the biggest stories! The last two weeks reflect the conditions under which school leaders have to make decisions about AI: a rapidly evolving landscape where keeping track of AI developments, much less digesting them, is difficult. Yet, decisions need to be made. About academic policy. About recommending or purchasing AI tools (or not). About AI literacy curriculum. About privacy and cybersecurity. The list goes on.
The feeling of urgency around making these decisions is a tempting distraction from the deeper work that needs to be done first, which is articulating a clear vision for AI in school. Looking ahead, I hope leaders prioritize clarifying their school’s vision over making decisions.
Of course leaders need to make decisions about AI. But, if we make decisions without vision, we run the risk that they will lack coherence, or work against each other, or confuse the people they are meant to help. If nothing else, decisions without vision are missed opportunities, where instead of moving us forward, they just move us to the next decision.
What’s more, leaders cannot make every decision about AI themselves. With an innovation this complex and this powerful, decision-making must be distributed and it must be coherent. We can’t control AI use in schools; instead, we should provide our community with the tools to make good decisions themselves. A vision can be a lens people use to clarify how they see AI and how they want to use it.
But First: Build Your AI Literacy
Hands-on experience is one of the best strategies for thinking practically about AI and moving beyond the hype that surrounds it. If you are an administrator who has never or barely used a chatbot before, set a goal for the next month: spend a total of two hours using chatbots to support your work (not the specialized education “wrappers” for teachers, but the general bots like ChatGPT, Claude, and Gemini). I have an earlier post with 13 ideas for school leaders to try.
What is a Vision?
A vision is a concise description of a future state your community is working towards. I like this distinction between a vision and a mission:
The goal of a vision is to inspire people to work together with collective purpose and to empower them to make decisions in service of that vision. A strong vision is essential to effective change. The culture, structure, and pedagogy that are the pillars of a school’s work should clearly reflect its mission, but changes to those pillars should be driven by a vision for the future of that mission.
Crafting a vision is both reflective and generative, and it can be collaborative. As leadership teams gather for retreats and planning meetings, consider taking the time to look to the past, present, and future.
Look to the Past
Counterintuitively, it can be helpful to start the work of vision-crafting by looking to your school’s past.
Identify “non-negotiables” in your school’s identity and work. Look at current or past strategic plans. If you have a Portrait of a Graduate or a similar resource, reacquaint yourself with it and the process by which you arrived at it. What are the values and ideas that drive your school’s identity? What are the learning outcomes you prioritize? In what ways might AI make those ideas more important? In what ways might it threaten them?
Reacquaint yourself with existing policies. Look at your policies on assessment, academic integrity, acceptable use of technology, etc. How and why were these policies created? How well has your community internalized these guidelines? When was the last time you reviewed and discussed these policies with your community (beyond the annual re-publishing of a handbook)? Which existing policies already apply to the use of AI, especially when it comes to academic assistance and support?
Reflect on your history with technological innovation. How did your school handle previous technological innovations like the internet, smartphones, and social media? Look back on communications, policies, committee work, etc. What did you do well? What would you have done differently?
Look to the Present
When it comes to learning about the present state of their schools, leaders can learn a lot from the techniques of journalism and user research. Get curious and start talking to the people dealing with AI on a day-to-day basis. Do not rely solely on the people who come to you, nor on committees or task forces; those are self-selected groups. Seek out voices you don’t usually hear. Ask open-ended questions. Listen to the answers.
Learn who the early AI adopters are and how they are using AI creatively and productively. Learn what drives the perspectives of the skeptics. Compare AI use at your school to what data tells us about teachers and students. Share what you find through writing, meetings, or informal conversations. Listening to people builds trust, and trust is a prerequisite for healthy teams and organizations, especially at times of change.
Take the same approach to the world beyond school. Reach out to contacts outside education (your community of parents can be a wonderful resource) to learn how AI is affecting their industries. Especially given that this is an election year in the U.S., learn about the social and civic implications of AI, not just its impact on the workforce.
Look to the Future
Looking to the future is not about making predictions and it is not about idealizing or catastrophizing the impact of AI. It is about practicing strategic foresight, where leadership teams educate themselves on trends, data, stakeholder experience, and research, then use that knowledge to create a variety of possible future scenarios. These scenarios should help teams identify major changes they should anticipate and prepare for.
To keep the Scarlett Johansson theme going in this post, the movie “Her” is a wonderful example of this kind of scenario: it paints a detailed picture of a future that is a logical evolution from where we are now. However you feel about that future, you cannot deny that it's a possibility. What should schools know and be able to do in anticipation of this future? What might alternative futures look like?
I facilitate strategic foresight sessions with boards and leadership teams, and it can be a creative and even fun way to take a serious approach to AI visioning. By forcing ourselves to be concrete and vivid, we can become clearer not just on the futures we face, but also on which ones we want to work towards. It's important to remember that we do have agency in an AI world; not everything is within our control, but many things are.
Is Your Vision an AI Vision?
As you go through this process, be open to the idea that the vision you end up with may not be about AI. It might simply involve AI. I've worked with schools for which AI has renewed their commitment to a new vision for assessment. I've worked with schools that are considering re-envisioning how they define and teach academic integrity, ethical citizenship, and technology/media literacy. It could very well be that what you end up with isn't a standalone AI vision, but a bigger, deeper vision that AI has inspired you to pursue.
Upcoming Ways to Connect with Me
Speaking, Facilitation, and Consultation. If you want to learn more about my school-based workshops and consulting work, reach out for a conversation at eric@erichudson.co or take a look at my website. I’d love to hear about what you’re working on!
Conferences. On June 17, I will be at “HopCon 2024” at the Hopkins School in New Haven, CT, USA. I’ll be delivering the general session and facilitating a hands-on workshop: “AI For Educators: Efficiency, Effectiveness, And Innovation.”
AI Workshops. I’ll be on a team of facilitators for “AI, Democracy, and Design,” a design intensive for educators being held June 20-21 in San Jose, CA, USA (offered via CATDC).
Leadership Institutes. I’m re-teaming with the amazing Kawai Lai for two leadership institutes in August (both offered via CATDC): “Cultivating Trust and Collaboration: A Roadmap for Senior Leadership Teams” will be held in San Francisco, CA, USA, from August 12-13 and “Unlocking Your Facilitation Potential” will be held in Los Angeles, CA, USA, from August 15-16.
Links!
I’m such a fan of Bryan Alexander and the way he thinks. Read his overview of how we are responding to AI culturally, which includes a ton of wonderful examples.
ChatGPT is a fairly accurate grader of writing with minimal training. Does that make us more comfortable with students and teachers using it for this purpose? Or less?
Emily Pitts Donahoe looks back on one year of teaching an ungraded writing class.
Bots are breaking the hiring process. What other application processes could they break?
“In a way, we are all Scarlett Johansson, waiting to be confronted with an uncanny reflection of ourselves that was created without our permission and from which we will reap no benefit.” Kyle Chayka on AI.