A weekly collection of education-related news from around the web.

Educator’s Notebook #503 (October 12, 2025)

INTRODUCTION

  • Dear friends,

    I am so happy to share here (first!) that the book that I have been working on with my co-author Maya Bialik is up for pre-order: Irreplaceable: How AI Changes Everything (and Nothing) About Teaching and Learning.

    This is a book for teachers, school leaders, and those who support teachers and school leaders. Due out in January, it explores and offers guidance on how to use AI effectively in schools. There’s so much to say about what’s in it:

    • It starts with a brief history of technologies that have changed education in the past — and then shows how the core of education has remained timeless, or irreplaceable.
    • It’s not an argument simply for or against AI; it’s a description of when and why to use AI, and when and why not to.
    • It offers clear lenses for organizing use of AI: how it can function as an assistant for teachers, an assistant for students, and an assistant in the classroom.
    • It engages the risks of AI — risks to learning, risks to equity, risks to human-centered schools — and offers tools for mitigating them.
    • It provides deep insight into research about what works in education — elements of planning, instruction, feedback, and more — and then shows how AI can support and deepen what works.
    • It provides clear principles and examples for designing lessons with AI, sharing not just the underlying learning science, but also inherited wisdom from educators.
    • It also examines current limitations of AI, while illustrating how the technology is expected to progress in the years ahead — so schools and leaders can anticipate what is coming.
    • And perhaps most useful for fostering constructive dialogue about AI in your community, it outlines the value systems and teaching philosophies that lead to different perspectives on AI — and by mapping them, helps build bridges across those perspectives, facilitating discussions within your school and its broader community.

    Maya and I have written this book from our various experiences and perspectives not only as teachers, school leaders, and researchers, but also as technology founders, award-winning digital innovators, and artists. The book is meant to be both principled and practical.

    Perhaps most affirming to us as authors is this: we pitched the book in late 2023, and the principles that we outlined then are even more relevant today than they were two years ago. In this period of rapid change, this stability has confirmed to us that while so much is indeed changing around us, the heart of what happens in schools is timeless. And in this time of change, while we engage with new technologies by cultural necessity (and curiosity, too!) we want enduring truths to guide our work. This book aims to provide that guide.

    In the weeks ahead, I will be sharing more about the book in advance of its release in January, including samples of what’s inside, posts with material that we couldn’t fit in the book, and more. We would be deeply grateful if you would pre-order a copy from Amazon.

    With gratitude,

    Peter


    Also this week… an excellent issue:

    In the features, find two posts about admissions: one from university counselors on how to effectively use AI to write letters of recommendation, and another from the university admissions perspective about how easy it is to spot student essays written with AI.

    In the Character section this week, Tim Dasey explores different kinds of empathy, and he reflects on what kind of empathy might be increasingly important in a digitally saturated world.

    For more analog teaching, see the post in Pedagogy from the Chronicle of Higher Ed about what makes the classroom special. It has some interesting echoes with last week’s post about the tendency in some circles to want to make education more fun. This discussion touches a deeper nerve: the tension between the instinct to personalize, facilitating more of what students want, and the instinct to stretch students out of their own wants and needs and into the perspectives, wants, and needs of others. How much of education is about strengthening what is already in students, and how much is about stretching them beyond it? More on this in a future post.

    In the meantime, enjoy these articles and more!

    Peter

    PS. This week I’ll be sharing excerpts of the book at MASS CUE (Massachusetts Computer Using Educators). Drop me a line if you’ll be there — it would be great to say hello! Other events in the months ahead:

    • October 15-16 – MASS CUE (Foxborough, MA)
    • October 23 – NAMT (NYC)
    • January 11-14 – FETC (Orlando, FL)
    • February 25-27 – NAIS (Seattle, WA)
    • March 9-12 – SXSW EDU (Austin, TX)


    Browse and search over 15,000 curated articles from past issues online:

    Subscribe to the Educator’s Notebook

FEATURES

    • Edutopia
    • 09/29/25

    “As technology has evolved, I’ve begun to incorporate artificial intelligence (AI) into my workflow. However, when it comes to recommendations, that is a delicate process. Teachers and counselors know the key personal details about students that can make a real difference in the application process. While I use AI tools in some parts of my process, there are some aspects that remain (and for me will always be) human-driven. In this post, I’ll share some examples and resources that can help make writing letters of recommendation easier—with and without technology.”

    • Cornell
    • 09/25/25

    “Researchers in the Cornell Ann S. Bowers College of Computing and Information Science compared 30,000 college application essays written by humans to ones written by eight popular large language models (LLMs), AI models that process and generate text, like ChatGPT. Even when they specified a person’s race, gender and geographic location in the prompt, the models spit out highly uniform text that was easy to distinguish from actual human writing.”

ARTS

    • Jared Harbour
    • 10/05/25

    “What I’m saying is this: trust your audience enough to make something real. Something that reflects actual human experience, not a focus-grouped approximation of it. That can be joyful. That can be funny. That can be wildly entertaining. But it has to come from a place of genuine necessity—from artists who need to say this thing, in this way, right now.”

CHARACTER

    • Sweet GrAIpes
    • 10/08/25

    “The same AI might behave cautiously when discussing medical topics but exploratively when tackling creative writing. Effective users recognize these shifts and adjust their approach accordingly. That’s cognitive work, building and updating mental models of how the system operates in different contexts… Cognitive empathy manifests as systematic pattern recognition. You notice that ChatGPT tends toward sycophancy, agreeing with user statements even when they’re wrong, validating rather than challenging.”

CURRICULUM

DIVERSITY/INCLUSION

    • EdWeek
    • 10/06/25

    “You might call these “political origin stories”—tales of forming worldviews—about which attendees could then ask follow-up questions and reflect what they had heard (I go into greater detail in my book, Learning to Depolarize). Those who shared stories felt rejuvenated and buoyed by the experience, while many of those who listened found themselves similarly refreshed. “Listening became a gift,” said one attendee of the gatherings. “Colleagues left the space excited and invigorated by our exchanges,” said another. The feedback was overwhelmingly positive.”

HIGHER ED

HUMANITIES

INTERNATIONAL

LEADERSHIP

PD

PEDAGOGY

    • Chronicle of Higher Ed
    • 09/25/25

    “So much of what you’re teaching is how not to be paralyzed by difficulty. How do you deal with something that you find off-putting, either because you hate it on sight or because it’s difficult? What actually put off a lot of students about The House of Mirth is just the 19th-century diction. The analogy in the book is that it’s like running on the beach — most people find running on sand very tiring, and a few people love it. The meta-skill here is to ask the right question. Instead of, How am I outraged or bored or put off by this book? The first question should be, What kind of a reader does this book want me to be? What does it want me to care about?”

SOCIAL MEDIA

    • Washington Post
    • 10/07/25

    “After a month, more than three-quarters of the users who had typically spent roughly 30 minutes per day on TikTok were now spending nearly twice that much time on the app, on average. A few months later, these users’ daily watch times reached 70 minutes, on average. For some, their time spent scrolling every day more than tripled or quadrupled.”

    • Etymology Nerd
    • 10/05/25

STEM

SUSTAINABILITY

TECH

VISUAL DESIGN

WORKPLACE

    • EdWeek
    • 10/02/25

    “The newest and most ambitious of these experiments go beyond gauging performance based primarily on test scores to consider a teacher’s classroom practice, peer collaboration, and willingness to tackle high-need subjects, students, and communities.”

GENERAL

A.I. UPDATE

  • As AI technology develops and spreads further into our technological lives, human perspectives on it continue to diversify — and grow in amplitude.

    In the features, find a sweeping and increasingly evidence-based look at existential risk from AI. Also find a good procedural look at what an LLM is actually doing when it reviews data — useful functional literacy for anyone using AI to synthesize large documents.

    Further, see in the Education section how California is making AI available to all students and faculty in its community colleges. Meanwhile, others in schools are focused on literacy for both students and teachers. And multiple outlets are starting to recognize that what matters most is that students and teachers have ongoing conversations about AI. I’m a firm supporter of this last approach. The technology continues to evolve, valuable uses take time to develop, societal norms shift, boundaries require adjustment — and all of this happens best when we are in ongoing discussion.

    These and more, enjoy!

    Peter

    What do students use AI for? See the latest survey from the College Board in the Education section.

FEATURES
    • New York Times
    • 10/10/25

    “In the course of quantifying the risks of A.I., I was hoping that I would realize my fears were ridiculous. Instead, the opposite happened: The more I moved from apocalyptic hypotheticals to concrete real-world findings, the more concerned I became. All of the elements of Dr. Bengio’s doomsday scenario were coming into existence. A.I. was getting smarter and more capable. It was learning how to tell its overseers what they wanted to hear. It was getting good at lying. And it was getting exponentially better at complex tasks.”

    • Educating AI
    • 10/02/25

    “For decades, districts have used analytical tools. When you ran a report in your Student Information System or built a pivot table in Excel, the process was straightforward: You decided what to measure (attendance below 80%, GPA drop over 0.5 points), The system counted and calculated based on your rules, You got results you could verify (show me the 23 students who meet these criteria)… Here’s what’s different when an AI “analyzes” your data: It doesn’t find patterns in your data. It generates text about your data based on patterns from its training. Let me unpack that:”

TECH/AI: EDUCATION

TECH/AI: ETHICS AND RISK

TECH/AI: INDUSTRY DEVELOPMENT

TECH/AI: SOCIAL

TECH/AI: USES AND APPLICATIONS

TECH/AI: GENERAL

    • SSRN
    • 10/03/25

    “Given current AI capabilities, the public already supports automating 30% of occupations. When AI is described as outperforming humans at lower cost, support for automation nearly doubles to 58% of occupations. Yet a narrow subset (12%)—including caregiving, therapy, and spiritual leadership—remains categorically off-limits because such automation is seen as morally repugnant. This shift reveals that for most occupations, resistance to AI is rooted in performance concerns that fade as AI capabilities improve, rather than principled objections about what work must remain human.”

Issues

Every week I send out articles I encounter from around the web. Subject matter ranges from hard knowledge about teaching to research about creativity and cognitive science to stories from other industries that, by analogy, inform what we do as educators. This breadth helps us see our work in new ways.

Readers include teachers, school leaders, university overseers, conference organizers, think tank workers, startup founders, nonprofit leaders, and people who are simply interested in what’s happening in education. They say it helps them keep tabs on what matters most in the conversation surrounding schools, teaching, learning, and more.

Peter Nilsson

Subscribe
