Talking about the weather
“Life is not just information. Life is everything information cannot capture – that’s why we live it.”
Umair Haque, British Economist and Author
It feels like we’re having the same conversations again and again about AI at the moment. So many questions, but the same promises about productivity and the future benefits of how we harness new technologies.
Every day, I’m reminded of a certain type of AI mindset. Specifically, that everything should become more standardised and efficient. The idea of ‘codification.’
Codifying is a form of rules-based simplification. In technology, it works on the basis that creating machine-based rules can enable new automated models and processes. In the context of AI, it’s the process of converting knowledge and data into structured formats that can be understood, processed, and used by new generative systems and machine learning.
The averaging of education
As an example, there’s a big push for AI in education. In 2024, the UK government published its own user research and technical reports including insights from teachers, leaders and pupils on the potential uses of generative AI. What I specifically noticed was the explicit call around the need for a codification of the national curriculum.
People with lived experience of work in education are already debating the impact and use cases for generative AI in the sector. This article highlights some of that debate and uses the term ‘buttonification’ – describing how curriculum content is being created using new AI functions. For example, where lesson plans can be generated at the push of a button with increased automated standardisation. The promise here is to save teachers time and to relieve pressure on the system. However, there’s a tension as big tech pushes to automate and standardise in this way.
The best teachers I’ve known realise the potential of students. They find ways to unlock their curiosity, interests and motivations to learn. From what I understand about generative AI technologies, a universal averaging and automation of learning experiences is not the same as the deeper learning and care that come through the relationships teachers and educators build with students – getting to know them, working with them, and tailoring learning to their needs on both emotional and practical levels. Teaching them to think critically, not just to learn facts.
If you don’t see this, I can only point to my own real-life experiences, emotions and the journey of supporting my children through the UK education system. Most recently through key points of learning like GCSE exams and A-Levels.
To be clear, I’m not saying new technologies can’t play a role in education. But the codification question, why and what we seek to make more efficient, is an important one.
What is it that’s being sold to us when AI promises productivity, efficiency and, ultimately, cost savings in a sector as diverse as education? I think it’s the codification mindset… A push for everything to be reduced to something simpler, a cheaper, more ‘averaged’ level of transaction. And the other question is who benefits from this? Where do the savings, profit and value go? Back into education, or mostly to big tech and consultancies, rather than into creating the next generation of brilliant human educators?
The codification mindset
There are two contrasting stories that I’ve used recently to illustrate and think more deeply about the codification mindset.
Firstly, in the book Do/Pause: You are Not a To Do List by Robert Poynton, the author shares a story about visiting a rural region in Spain – Arenas de San Pedro. He describes how people there still live at their own pace, and how when they meet each day in the town square they rarely say anything new. There, talk is not about relaying information; it is a way of seeing others and being with them. He describes how these “conversations are conversational; often repetitive, almost ritual.”
This reminded me of some of my own experiences, including people I’ve connected with where conversations were more ritual and repetitive – and of the importance of those connections. It might be a situation where you’re calling someone simply to keep in touch. I think it’s something historically ingrained in our culture: in Britain, we famously talk about the weather. It’s the idea of community and connection that doesn’t have to lead anywhere, and that isn’t judged by whether any additional value can be extracted from those moments.

We’re exploring something here that is deliberately non-transactional: how so much of the care in society and communities is built around a deeper level of connecting with others.
In contrast, I recently read a Financial Times interview with the Otter.ai CEO, Sam Liang. In the interview, there’s an elevator pitch, a vision, set out to “imagine a world in which somebody is recording their entire day-to-day existence on Otter.” This is framed as a way to unlock valuable data from meetings and conversations. Not just at work, but in every moment of our lives.
Imagine. This is not being pitched as dystopian science fiction. It’s another example of a push to codify everything.
In the interview, Sam Liang goes on to give the example of how he uses Otter when having conversations with his adult sons who have left home. How “whenever I have a call, I see that as very precious and I use Otter to capture it.”
I’m not sure this is the future I want.
The real work of being a human
Codification isn’t wrong in itself. It’s how we develop and establish rules as part of law-making, and it’s how we establish and maintain governance around systems. It’s part of how our organisations work. But, the mindset around codification to train and build new technologies, and how this shapes future products and services, is more important now than ever.
A quote that has deeply resonated with me this year is from a thread shared by the American Novelist Celeste Ng:
“So many things right now […] come from a deep-seated fear of the annoying, complicated, difficult emotional work of being a human who’s engaging with other humans…”
This was a response to a thread about AI on Bluesky, amid the increasing promises and hype around everything from AI writing to AI friends… A problem that Mark Zuckerberg and Meta are actively trying to solve.
Celeste Ng has been an important creative voice in responding to some of the AI hype around creativity and its impact on artists. She goes on to describe this emotional work of being a human as “tempting to avoid” but, most importantly, as something we have to be prepared to do.
My view is that many of the products, innovations and promises of AI technologies right now work against what makes life worth living: the annoying, complicated, emotional work that makes us who we are. Or they offer the absence of the kind of learning we really need.
It’s important that we don’t lose sight of the real connections in our lives and the meaning of talking about the weather. Especially with how these moments can continue to exist and belong alongside what can and will become increasingly automated and standardised.
This is my blog where I’ve been writing for 20 years. You can follow all of my posts by subscribing to this RSS feed. You can also find me on Bluesky and LinkedIn.