Ben Holliday

Good admin and everything you don’t have to automate

I talked about the AI codification mindset in my last post. I wanted to share some further notes about productivity and, specifically, the importance of sensemaking in front-line work.

The current promises of AI investment for organisations are largely framed and sold as productivity and efficiency – the ability for professional practices to operate more cheaply, increase system capacity, and meet growing service demand.

I’m not yet seeing public sector teams or organisations that are less busy. Applied in this context, the impact of AI has so far looked more like work expanding to fill whatever spare capacity is created, potentially helping to deal with existing backlogs in government. The introduction of new technologies is also where new processes and tasks emerge, including the need for new checks and balances – all potentially offsetting true productivity gains.

The choice to automate

An argument I’ve heard made more recently is that automating should be a choice, at least at the level of individual productivity. Most importantly, if processes are important to your work, and if they’re how you or your teams think or deliberate, then you shouldn’t feel pressured to automate or standardise them.

This Tech Policy article talks about this point:

“The productivity myth suggests that anything we spend time on is up for automation — that any time we spend can and should be freed up for the sake of having even more time for other activities or pursuits — which can also be automated. The importance and value of thinking about our work and why we do it is waved away as a distraction. The goal of writing, this myth suggests, is filling a page rather than the process of thought that a completed page represents.”

The need to reimagine work

One of the issues here is failing to reimagine how we work before investing to increase our productivity. I’ve written about this before as the need for service design before automation. Without that reimagining, we’re still optimising processes that rely on spreadsheets, email chains and linear workflows that copy information back and forth across teams and departments, rather than rethinking them through a lens of real-time collaboration and data sharing.

This type of ‘layered automation’, and the lack of reimagining, also points to how we’re increasingly defining our work: for a world that values outputs over process, rather than the reality of understanding and learning through doing – like how I think about my own process of writing. Writing is thinking.

Sensemaking and social work

Many of the jobs involved in front-line services and care navigate worlds of professional practice that require sensemaking. This is the process of making sense of an individual situation, especially given new developments, and building on a practitioner’s knowledge and prior experiences. It’s the work of people in front-line services to observe, consider and translate the needs of those they provide care for.

It’s been interesting to observe the growth of transcription AI tools like Magic Notes. I read yesterday in a post by Beam CEO Alex Stephany that Magic Notes is now being used by most social work teams.

This is an area that touches on early learnings from FutureGov* work focused on the significant admin burden on professionals in front-line social care. In discovery research with three London boroughs, our teams found that practitioners were spending, on average, 60% of their time on data entry, note-taking and other admin, leaving little time to spend with the families and young people most in need.

I’m not at all opposed to the use of these technologies. Auto-transcription was recognised as useful in early pilots with councils long before it was packaged and sold as AI – and the latest transcription tools go further, with functionality to analyse, summarise and even create action plans from automated note-taking. However, there are important considerations, and I think there is now counter-evidence about how these technologies impact professional work.

Reflecting different viewpoints, this 2024 article and research from Community Care includes interesting feedback from social workers. Importantly, it highlights how note-taking can be seen as a key social work skill that aids reflection and decision-making:

“It is often the act of writing notes that prompts the thinking and reflection that prompts action, that moves practitioners from simply recording ‘what [happened]’ to thinking about ‘so what’ and ‘now what’.”

The slowing down that comes with note-taking, the process of deliberating over appropriate responses and actions, is part of this work. That’s why it’s important, and why simplifying and standardising has its own cost beyond any immediate time savings.

The article also makes the point that electronic patient records were originally introduced to free up time by moving us away from paper record keeping, yet reached the point where social workers were spending more and more time in front of screens. It’s a reminder that tech interventions don’t always lead to the changes and efficiencies we imagine.

No clear path

Today, with the latest tools and technologies, it’s important to remember that these aren’t straightforward efficiencies. There is no clear path, especially if the time saved on immediate admin tasks creates new demands or new admin at other points in the system.

This reminds me of stories I’ve heard of other front-line workers having to spend time fixing automated transcriptions and notes, sometimes in situations that could have serious consequences. In one example, also from 2024, this Wired article describes how OpenAI’s Whisper transcription tool fabricated text in medical and business settings despite warnings against such use. However, the counter-argument here will always be that the technology and its capabilities are improving all the time, reducing error rates.

Admin that requires care

To bring us back to where I started: human experience and judgement isn’t binary in the way that the drive for productivity through codification and standardisation assumes.

Any AI-based productivity gains cannot come at the expense of the equity of outcomes and experiences that our systems of care deliver. Good admin, including enough time for professional thinking and deliberation, is what this demands.


*FutureGov became TPXimpact in 2021.

This is my blog where I’ve been writing for 20 years. You can follow all of my posts by subscribing to this RSS feed. You can also find me on Bluesky and LinkedIn.