Happy May, all —
Tributes to Gatsby’s centenary have been many this year, but my favorite so far is the one featured this week, which traces Gatsby’s popularity across culture back to its role in high school English classes. See the first feature for excellent receipts on this thesis. Also in the features, the Constructive Dialogue Institute offers practical guidance for leaders aiming to build a culture of dialogue in challenging times. The guidance is aimed at higher education leaders, but it is adaptable to younger grades.
Also this week, Harvard is in the news — not just for renaming its DEI office and for the results of its antisemitism and anti-Muslim task forces, but also for that study that tracked graduates over decades to understand what leads to a life of happiness. It’s a good reminder every time I read it.
Also: find a thoughtful post by Larry Cuban on what students expect from teachers (which is different from a “portrait of a teacher”). It offers a compelling framing for how we prepare ourselves as educators for the classroom and what our job expectations might look like. See also posts on smartphone bans, math curriculum, and more — in addition to important developments in AI.
On the topic of AI, I’ll be on a live virtual chat this coming Tuesday exploring the implications of the recent Executive Order on AI in schools. This Executive Order will markedly accelerate the adoption of AI in schools in the US, and both teachers and school leaders will want to get ahead of it as much as possible. Thanks to the Middle States Association for pulling this session together. It’s Tuesday, May 6 at 11am. You can register here.
Also, I’ll be at the third annual AI Symposium at the Loomis Chaffee School on June 3, co-leading three sessions: on learning science and AI, on AI supporting your passions as a teacher, and on how to draw clear boundaries for students on AI for learning. I’m also looking forward to Annie Murphy Paul’s keynote.
Hope to see you at either or both, and in the meantime, enjoy this issue!
Peter
Browse and search over 14,000 curated articles from past issues online:
“‘My whole theory of writing I can sum up in one sentence,’ Fitzgerald wrote, in 1920. ‘An author ought to write for the youth of his own generation, the critics of the next, and the schoolmasters for ever afterward.’”
“By constructive dialogue, we mean exchanges where participants engage across lines of difference with intellectual rigor and mutual respect. This approach rejects both uncritical agreement and unproductive conflict in favor of learning-focused engagement… This report is designed to guide higher education leaders—including presidents, vice presidents, provosts, and leaders of task forces and civic centers—in undertaking this difficult work.”
“Teachers often overlook (or forget) that students also have a list of what behaviors, knowledge, and skills they expect of their teachers. And just like teacher expectations, student expectations matter… Beginning in kindergarten (or preschool), over the years students develop views of what a “good” teacher (and teaching) are. That is where all of the classroom rules emblazoned on whiteboards or wall posters come into play. By the time students are in high school, then, they have implicit models in their heads of who “good” teachers are, what they expect of students, and what they do in organizing and teaching a class.”
“Though it started out as a futuristic-sounding niche proposition, 3D-printed construction is really taking off throughout the United States and the variety of projects being printed is remarkable. Following the construction of a Walmart extension, a Marine barracks, and even an experimental Mars habitat, the latest example of the cutting-edge technology comes from the USA's first 3D-printed Starbucks coffee shop.”
The juxtaposition couldn’t be more clear. Common Sense Media and Stanford released this week a report stating clearly: “Social AI companions pose unacceptable risks to teens and children under 18.” And in the same week, the New York Times reports: “Google Plans to Roll Out its A.I. Chatbot to Children Under 13.” Not good — we are not ready for this. Now, it may be that the Google chatbot is not intended specifically as a social companion, but the distinction — already fuzzy for adults — will likely be even more unclear for kids.
I (obviously) believe there is enormous potential for the responsible use of AI — and it is already being realized by the most proactively engaged users — but the race by companies to capture the largest and youngest market is introducing enormous risk. In the current climate, I don’t believe this rush to consumers and students is going to slow, which only raises the pressure and stakes for educators and families to offer the best preparation we can for ourselves and kids. In this way, the recent Executive Order calling for the acceleration of AI into schools is a good thing — if in schools we are sure to center the responsible use of AI and help students build the right literacies and self-management skills to lead to more positive and prosocial uses of AI in their work and life. (I look forward to exploring this and other topics with the panel hosted by the Middle States Association this coming Tuesday at 11am.)
Many other important reads this week, including OpenAI’s sycophancy tweak, developments in the field, the importance of critical use of AI, and the urgency of understanding how this new technology actually works — which even the people who have created it don’t fully know. What a wild and weird time we are living in. (For those scholars of Anglo-Saxon (Old) English: the word “weird” comes from the Anglo-Saxon “wyrd,” meaning “fateful.” I almost always mean it that way when I use it.)
This and more, enjoy!
Peter
“Google plans to roll out its Gemini artificial intelligence chatbot next week for children under 13 who have parent-managed Google accounts, as tech companies vie to attract young users with A.I. products. “Gemini Apps will soon be available for your child,” the company said in an email this week to the parent of an 8-year-old. “That means your child will be able to use Gemini” to ask questions, get homework help and make up stories.”
“What I’m suggesting here is not necessarily resisting the technology. We’re almost at a point where that would be akin to resisting the internet or electricity in the classroom… But the way we use technology, and the form the technology takes in the future, is not set in stone. The line between ubiquity and inevitability may seem thin, but it represents the difference between a technology that is simply widespread and one that has been accepted as necessary, unchangeable, and permanent.”
“People outside the field are often surprised and alarmed to learn that we do not understand how our own AI creations work. They are right to be concerned: this lack of understanding is essentially unprecedented in the history of technology.”
“Since OpenAI released ChatGPT in November 2022, Middle States has been helping schools to navigate a world of increasingly abundant AI. The Executive Order may feel disruptive, but you can do three things to position your school to thrive:”
“AI automates execution, forcing us to compete on higher-level cognitive skills – strategy, creativity, taste, critical analysis, and the relentless patience to iterate and refine ideas beyond the AI's initial output. We get a "return on the relinquished task," much like how offloading navigation via GPS frees up mental bandwidth for other things.”
“Will students benefit from AI? From their perspective, tremendously. Will there still be professors? Yes, but fewer of them, and those who remain will need to be technologists as much as they are scholars. Will jobs be lost? Yes, sadly and permanently.”
“Children shouldn’t speak with companion chatbots because such interactions risk self harm and could exacerbate mental health problems and addiction. That’s according to a risk assessment by children’s advocacy group Common Sense Media conducted with input from a lab at the Stanford University School of Medicine.”
“Social AI companions have unacceptable risks for teen users and should not be used by anyone under the age of 18. These social AI companions are designed to create emotional attachment and dependency. This is particularly concerning for developing adolescent brains that may struggle to maintain healthy boundaries between AI and human relationships.”
Every week I send out articles I encounter from around the web. Subject matter ranges from hard knowledge about teaching to research about creativity and cognitive science to stories from other industries that, by analogy, inform what we do as educators. This breadth helps us see our work in new ways.
Readers include teachers, school leaders, university overseers, conference organizers, think tank workers, startup founders, nonprofit leaders, and people who are simply interested in what’s happening in education. They say it helps them keep tabs on what matters most in the conversation surrounding schools, teaching, learning, and more.
– Peter Nilsson