It happened on a Sunday morning, which already felt like part of the problem. (Strictly speaking, it happened the night before, while I was out walking in town with Suze; I only noticed it on Sunday morning.) It was the Sunday before Christmas, when everything in Sandwich slows to a snail's pace. I'd lost my business card somewhere in town.

One of those quiet, unsettling moments where you stop what you're doing and mentally retrace your steps, hoping the answer will simply present itself. It didn't. The card was gone. Shops were closed, offices shut, phones unanswered. The town itself felt paused. The only thing still "open," as it turned out, was AI.

I should say from the outset that I don't dislike AI. Quite the opposite. I admire how far it's come and what it's capable of. It's clever, efficient, and often genuinely helpful. I've used it to save time, organise thoughts, untangle complicated ideas, and make sense of things that would otherwise feel overwhelming. So when I opened the bank's online chat system, I wasn't already frustrated or suspicious. I assumed it would be simple. Logical. Efficient.

It wasn't. What followed felt like a strange kind of digital obstacle course. A question. A menu. A clarification. Another menu. A slightly reworded version of the same question I'd already answered. Each step, on its own, made sense. Taken together, they formed a maze.

I kept thinking, surely now I'll be passed to a person. After all, this wasn't a casual enquiry. I wasn't checking a balance or asking about opening hours. I'd lost a card. Access to money. Security. Peace of mind. These aren't abstract issues.

Twenty-five minutes later, I finally reached a human being. By then, the relief was real, but so was the irritation.
Not irritation at technology itself, but at the experience of being held at arm's length from the one thing I actually needed in that moment: a calm, competent person who could simply say, "Right, let's sort this." And that was exactly what happened. Once I reached a human, the issue was resolved quickly and efficiently. Which raised an uncomfortable question: if the solution required a person anyway, why did it take so long to get there?

It reminded me of something Eugi mentioned in response to an earlier reflection: that while technology has made life simpler in many ways, it has also made it more complicated. I've found myself thinking about that comment more than once, because this situation felt like a perfect illustration. Nothing about the problem was especially complex. And yet the experience itself felt heavier than it needed to be.

That's the thing I keep circling back to. AI is excellent at handling processes, but it still struggles with context. It doesn't feel urgency the way people do. It doesn't sense the quiet edge of anxiety, or recognise that what's needed is reassurance rather than efficiency. It follows its structure faithfully, even when the situation calls for flexibility or discretion.

There's something slightly unsettling about being in a moment of stress and having to prove, repeatedly, that your problem qualifies as serious enough. The system isn't unkind, but it's not kind either. It's neutral, procedural, and patient in a way that doesn't always align with how humans experience time when they're worried.

And maybe that's where a lot of the modern discomfort creeps in. Technology has undeniably made life easier in countless ways. I can bank, book appointments, manage work, communicate, and plan without leaving my chair. Tasks that once took hours can now be completed in minutes. But alongside that convenience, a new kind of complexity has quietly taken root.
One where we're constantly navigating systems, interfaces, passwords, prompts, and automated decisions. One where efficiency sometimes comes at the cost of reassurance. In theory, these systems are designed to help. In practice, they can feel like gatekeepers, particularly when they're built to filter, delay, or reduce human contact rather than prioritise it when it truly matters.

I don't think the answer is less AI. That feels like the wrong conclusion. The technology itself isn't the problem. The issue lies in how and where it's deployed, and in the assumptions built into it. A lost card, a health concern, a billing error, a moment of vulnerability: these are not ideal places for rigid automation. These are moments when people are already slightly off balance, already managing a quiet sense of risk. Progress shouldn't feel like friction. And support shouldn't feel like something you have to earn by selecting the correct combination of options.

What struck me most, in the end, was how ordinary the human interaction felt once I finally reached it. Calm. Clear. Reassuring. No drama, no complexity. Just someone doing their job well. Which only reinforced the thought that sometimes the simplest solution is also the most appropriate one.

I'm not longing for the past, and I'm not suggesting we roll back modern systems or conveniences. I'm simply noticing something: that in our rush to make everything smarter, faster, and more efficient, we may be overlooking the moments that still call for something slower and more human. Not everything needs optimising. Some things just need listening to. And sometimes, especially on a quiet Sunday when something important has gone missing, what you really want isn't a perfect system. It's a person on the other end who can say, "You're through now. Let's take care of this."
Why Human Touch Matters in a Digital World
Thanks for the mention, Rory, and I hear ya! I've been trying to resolve an issue with my cellphone carrier for a month. Sometimes, AI just doesn't get the gist of the problem. I hate to think about a future where AI becomes our only "go-to".