ChatGPT cannot and should not replace friends giving advice
On disconnection and loneliness
The other day, I was having coffee alfresco on a quiet London street with a work acquaintance. We were both a sweaty mess thanks to the ongoing and surprisingly consistent hot weather but, because of the sadomasochistic relationship we British have with the sun, we refused to waste a drop of it on air-con. As sweat dripped down our chests, we started talking about AI and how maybe it wasn’t in fact the devil.
“I asked it for advice the other day on whether a man likes me,” I said sotto voce, as if I were discussing armed robbery, “and the advice was actually pretty good.” She nodded and said in similarly hushed tones that she’d found the same. “It’s sometimes better than the advice my friends give me,” she murmured.
I agreed, and forgot all about it, until yesterday, when I came back from a short break abroad with some new friends. It was a rare, precious weekend from which I came back fizzing with laughter and in-jokes, feeling as if I’d known them forever rather than a few months – and, in the case of one person, a few days.
We had talked about everything from the serious to the silly, and on my arrival back in London I received a series of messages from an old friend that were a bit upsetting. “Is anyone free to chat?” I texted, and two of them called me immediately. We talked through what had upset me, they gave me some gentle advice, and I felt held. With them by my side, I felt safe, listened to and supported. I might not follow their advice exactly, but I had options, and that made me feel good.
No offence to AI, but there is no artificial intelligence that can replicate this feeling. And my worry is that as we start to use it for advice – first telling ourselves that it’s ‘for fun’, and then increasingly as more problems arise – we will either use it to replace a fundamental pillar of friendship (problem-solving) or to avoid a fundamental aspect of friendship (truth-telling).



