For this post, I wanted to write about the other thing (besides AI) I’ve been learning about for the past few months: trust.
I became interested in trust at the onset of the coronavirus pandemic in 2020, the start of what’s become an extended period of instability in education. At that time, I started working intensively with schools on a wide range of topics (assessment, instructional design, leadership, change management, etc.). Over the years, no matter the topic or practice I’ve been brought in to discuss, trust keeps coming up: trust in leadership, trust in students, trust in colleagues, trust in institutions. In some cases, there is appreciation for how high-trust cultures have gotten schools through these hard years. In most cases, though, there has been a sense that trust has been damaged in some way and needs to be repaired. Before that repair happens, adopting new or innovative ideas is going to be hard.
So, I’ve been absorbing a lot of material about trust (and I would love more recommendations: please leave yours in the comments!). What I’ve learned is that trust is one of just a few non-negotiable elements of learning and growth, something that has to be present no matter the age or experience of the learner and no matter the style or tactics of the teacher or leader.
What Is Trust? Why Does It Matter?
In his recent book How Trust Works, Peter Kim writes that “real trust requires a willingness to be vulnerable based on the belief that others won’t let you down, even though they could.” Whether we are talking about adults or children, supervisory relationships or peer relationships, schools or other types of organizations, the research is clear that vulnerability is evidence of trust.
As Brené Brown famously taught us, vulnerability is a risk. In being vulnerable, we accept the risk of failure, of exposing incompetence, or of changing others’ perceptions of us. We cannot overstate the importance of vulnerability—and, thus, risk—in learning new things. In his research, Stephen Chew has found that “students seek trust in teachers when taking courses that are both challenging and required for their major or that cover a topic in which students feel anxious and insecure. Minoritized students who feel outside the mainstream culture of the campus will be especially sensitive to the trustworthiness of their teachers.”
Chew makes a distinction between trust, which is based on perceptions of integrity and competence, and rapport, which is based on personal connection. Both matter for learning, but “trust is the more potent variable”: in Chew’s studies, teacher competence and integrity mattered more than likability when it came to students’ willingness to take on effortful learning activities.
Trust is important for both professional and student learning. For example, in their book Trust in Schools, Anthony Bryk and Barbara Schneider studied schools in Chicago, IL, USA, and found that low-trust environments correlated with failed school reform efforts and weaker teacher performance. In her very helpful dissertation “Student Trust in Teachers,” Regina Bankole found that students’ trust in their teachers had a positive effect on their willingness to accept and pursue high academic expectations.
Ron Ritchhart calls this “institutional mirroring”: A school that nurtures a professional culture of inquiry and innovation will see educators reflect that culture in their classrooms. If the adults in a school do not feel a high level of trust, students are less likely to trust as well.
How Do We Build Trust?
Building trust is a complex, multi-faceted endeavor. Frameworks abound, to the point where I’ve started a spreadsheet to track the different ways researchers define what we need from others in order to trust them. Here’s a word cloud of what I have so far (the larger the word, the more frequently it appears across frameworks):
I’ll write more about building, breaking, and repairing trust in the future, but I do want to briefly highlight three elements that showed up across my reading.
Competence. Frances Frei and Anne Morriss define competence as people having faith in the rigor of your ideas and your ability to deliver on them. This combination of knowledge and skills is important. For example, Stephen Chew found that students do not view content expertise as sufficient for competence: they need to believe that their teachers know how to translate that expertise into meaningful and helpful learning activities.
Integrity. Integrity manifests in how aligned our behaviors are with our words. Consider this line from The Thin Book of Trust by Charles Feltman: “Simply put, people too often fail to recognize that when they express their intentions, expectations, desires, beliefs and values, they aren’t just describing themselves, they are creating expectations about their future behavior in the minds of those who listen to them” (emphasis is Feltman’s). People must see consistency and morality in our behavior in order to see us as having integrity.
Care. I’m taking some editorial license here by naming care as one of the three most important factors. It isn’t one of the largest words above, but so many of the smaller words are written about as elements of care. For example, in Culturally Responsive Teaching and the Brain, Zaretta Hammond writes that students must trust their teachers in order to take on cognitively complex work, and that care and trust are inextricably linked: “We have to not only care about students in a general sense but also actively care for them in a physical and emotional sense” (emphasis is Hammond’s). If people do not sense a sincere investment in their identity, experience, and potential, they do not sense care and are less likely to trust.
What Trust Isn’t
Research shows that policies and rules that are meant to reduce risk simultaneously reduce the need for, and thus the possibility of, trust.
I loved this example from Kim’s book: In one study, ten day care centers were finding that many parents were showing up after the designated time for pickup at the end of the day, causing staff to have to stay late. The researchers had the centers impose a monetary fine for tardiness, hypothesizing that a fine would improve parent timeliness for pickup. The result? More than twice as many parents started showing up late! Kim explains: “Their late arrivals were no longer an embarrassing violation of social norms that made the lives of day care teachers more difficult but instead an accepted activity with a reasonable price tag.”
Using policy to encourage certain behavior can result in what Francis Fukuyama calls “trust substitutes”: “People who do not trust one another will end up cooperating only under a system of formal rules and regulations, which have to be negotiated, agreed to, litigated, and enforced, sometimes by coercive means.”
Organizations create lengthy handbooks, detailed rules, and monitoring systems in order to be clear about the behavior they seek and to lower the possibility that a violation will occur. The problem? Culture becomes transactional, not relational. As Kim writes, research shows excessive risk mitigation “can foster the belief that others would only act in a trustworthy manner when they are compelled to do so and that they would reveal their true untrustworthy selves when given a chance.”
This has real costs for organizations. In The Speed of Trust, Stephen M. R. Covey names seven “Low-Trust Organizational Taxes,” indicators that an organization prioritizes compliance and risk aversion over trust: redundancy, bureaucracy, politics, disengagement, turnover, churn, and fraud. These are not just negative outcomes of low trust; they are signals to people within and outside the organization that it’s a low-trust environment.
OK, Maybe There Is an AI Connection
I thought I would get through this whole post without mentioning AI, but writing it has made me think about how so many conversations we have about AI are conversations about cheating, which are really conversations about competence and integrity (both ours and that of students), which are really conversations about trust.
A perfect example is the debate about AI detectors (which really shouldn’t be a debate: they don’t work). Using an AI detector might lower the risk that a teacher mistakes AI work for student work, and it might deter students from using AI to cheat. But, what does using an AI detector signal about our trust in students? What is the impact of a false positive on a student’s trust in us?
AI is making schools feel vulnerable in this area, understandably so. What I have learned is that the way we react to that vulnerability will affect the level of trust in our cultures and our ability to do the important work of exploring AI beyond the limited realm of cheating. How do we prioritize protecting our trust in students and each other as we try to craft positions and policies on AI?
Like so much that has to do with AI, this is a both/and situation, a polarity to be managed, not a problem to be solved. We can be explicit about our position on AI and trust educators and students to explore the tool curiously and responsibly in their own ways.
I’ll leave you with a few more words from Peter Kim:
“For most of us, when someone trusts us, we want to prove them right.”
I’m going to keep learning about trust. What resources should I explore next?
Upcoming Workshops
Join me for some live learning about AI.
I’m so happy to continue my partnership with the California Teacher Development Collaborative (CATDC) to offer online workshops on AI. On October 24, I’ll be facilitating “Talking with Students about AI,” and on November 7, I’ll be facilitating “AI, Assessment, and the Question of Rigor.” These are open to all, whether or not you live/work in California.

I’ll be presenting at the Innovative Learning Conference at the Nueva School in San Mateo, CA, USA, October 26-27. My session is titled “Redesigning Assessments with AI and Agency in Mind.”
Links!
I have not stopped thinking about this Washington Post article since I read it last month: “Her students reported her for a lesson on race. Can she trust them again?” The article is excellent, but I highly recommend listening to the accompanying podcast episode, where you hear directly from teachers and students.
This conversation between Tressie McMillan Cottom, one of my favorite scholars and writers, and Emily Drabinski, the president of the American Library Association, will remind you both of the experiences and learning that libraries make possible and of the cost of America’s fading trust in institutions.
Dan Meyer on what technology companies misunderstand (and what excellent teachers understand deeply) about what “personalized learning” means and looks like.
Would you trust AI more if you knew it was following a citizen-written constitution? Anthropic experimented with letting the public write rules for Claude.
A really interesting post from the CEO of Medium about how the company has refused to allow AI to be trained on writing that’s published on its platform.
Thanks for reading! If you enjoyed this post, I hope you’ll share it with others. It’s free to subscribe to Learning on Purpose. If you have feedback or just want to connect, you can always reach me at eric@erichudson.co.
I just finished Hernan Diaz’s novel TRUST and while it’s a smidge uneven across its varied parts, it’s a great consideration of narrative and genre and how both impact the reliability of the story told. It would be a good and different type of resource on this topic of trust. I predict you are already reading it! ❤️
PS I, too, was filled with thoughts after listening to the podcast interview about libraries. Would love for you to take up that topic and the teaching of research skills.
Love to see Diaz's novel getting a plug.
Eric, if you haven't seen Alberto Romero's post about AI detectors from back in the summer, you should check it out. It really helped shape my thinking on the issue. He has a good follow-up post about proactive strategies teachers can use in lieu of detectors that is also worth checking out.
https://open.substack.com/pub/thealgorithmicbridge/p/the-ai-generated-education-issue?r=2l25hp&utm_campaign=post&utm_medium=web