Whenever we discuss the pitfalls of artificial intelligence (AI), one consistent and reassuring message is that it’ll never be able to replace the human touch.
We’re told that, for all its benefits, it simply cannot provide the same level of empathy and understanding that a living, breathing person can.
But there is evidence to suggest that the tide might be turning.
From natural language analysis to predictive modelling, AI-powered tools can now help advisers identify signs of vulnerability in clients.
They do this by using sentiment analysis to trawl emails, call transcripts, meeting notes or chat logs for signals that suggest someone might be struggling.
FE Fundinfo’s Jodie Gallagher says this is increasingly being used in the profession to detect the emotional tone of conversations.
This means it has the potential to flag subtle cues, such as confusion or distress, that may otherwise have been missed by the adviser.
Meanwhile, transaction-monitoring systems can spot unusual financial activity that may indicate cognitive decline or financial pressure.
For example, a client asking repetitive questions, displaying confusion over basic concepts or appearing unusually hesitant in written or spoken communication could trigger an alert for further review.
But it doesn’t stop there.
Gallagher’s colleague Stephen Ford, head of UK wealth management at FE Fundinfo, says advisers can take it a step further still.
He explains that they can take these notes and ask ChatGPT or other programmes to come up with a list of suggested questions they should be asking about the client’s circumstances and general wellbeing.
Just for a bit of fun – and out of curiosity – I decided to test it out for myself.
I told ChatGPT that I had a lot of self-doubt and felt like a failure (things I have experienced at times, as someone who suffers from anxiety).
Much like an influencer who is not qualified to give regulated financial advice, it started by making clear it was not a substitute for a therapist or a GP, and that its recommendations were only that – recommendations.
It did, however, say it was there to listen and asked if I wanted to offload.
It told me it was natural to feel that way sometimes and gave me some practical self-help tips to boost self-esteem and build confidence.
I know it’s not a person. I know all it has done is its job: scour the internet and other resources for information, aggregate the best content and summarise it for me.
But you know what, it actually made me feel a bit more positive.
Sometimes, as humans, all we need is someone to listen. Someone to reassure us the way we are feeling is normal. And it did that pretty effectively. Even if it is a robot.
Would it have evoked the same response as if I had spoken to a psychologist? Probably not, but it didn’t pretend to be anything other than what it is.
I can genuinely see how it could help someone, perhaps someone who had no one to talk to when they were feeling at their lowest, to feel better about themselves.
AI won’t replace humans – but anyone who says it can’t show empathy and understanding is mistaken.
Some predict that by 2029 – just four years away – ‘super AI’ will be in place: the point at which artificial intelligence is as intelligent as, if not more intelligent than, the human mind.
Advisers who were already fearing AI might take their jobs may understandably be a little concerned by this latest development.
But as with all things AI-related, what can be perceived as a threat can also present a unique opportunity.
These tools aren’t meant to replace human judgement, experts say, but to prompt timely conversations and ensure firms fulfil their Consumer Duty obligations.
Vulnerability isn’t always obvious, and as Amanda Newman Smith explores in her cover feature, that often makes it extremely difficult to spot.
Having an ‘assistant’ that can flag potential problems and suggest ways to tackle them can only be a good thing for the adviser.
Last month, NatWest announced it was investing heavily in its AI platform, Serene, to help detect the early signs of financial distress and offer personalised support.
And you can bet it won’t be the last, as the Financial Conduct Authority’s focus on vulnerability ramps up throughout the rest of this year and beyond.
Financial advice firms that aren’t yet using AI to spot signs of vulnerability among their clients are not only missing a trick but in real danger of being left behind.
It allows advisers to intervene sooner, document their decisions better and, ultimately, provide more tailored support at the time the client needs it most.
And, in my view, that’s a cause for celebration, not undue worry.