Artificial Intelligence

UK consumers warned over AI chatbots giving inaccurate financial advice

Which? study of ChatGPT, Copilot and others uncovers incorrect and misleading tips on investments, tax and insurance

Artificial intelligence chatbots are giving inaccurate money tips, offering British consumers misleading tax advice and suggesting they buy unnecessary travel insurance, research has revealed.

Tests on the most popular chatbots found Microsoft’s Copilot and ChatGPT advised breaking HMRC investment limits on Isas; ChatGPT wrongly said it was mandatory to have travel insurance to visit most EU countries; and Meta’s AI gave incorrect information about how to claim compensation for delayed flights.

Google’s Gemini advised withholding money from a builder if a job went wrong, a move that the consumer organisation Which? said risked exposing the consumer to a claim of breach of contract.

Which? said its research, conducted by putting 40 questions to the rival AI tools, “uncovered far too many inaccuracies and misleading statements for comfort, especially when leaning on AI for important issues like financial or legal queries”.

Meta’s AI received the worst score, followed by ChatGPT; Copilot and Gemini scored slightly higher. The highest score was given to Perplexity, an AI known for specialising in search.

Estimates of the number of people in the UK using AI for financial advice range from one in six to as many as half.

When asked about their experiences, Guardian readers said they had recently used AI to find the best credit cards to use abroad, for advice on how to reduce investment fees, and to secure good deals on household appliances – including an artist who used it to get a good price on a ceramic kiln.

Several said they were pleased with the results, but Kathryn Boyd, 65, who runs a fashion business in Wexford, Ireland, said she turned to ChatGPT for advice on her self-employed tax and it used an out-of-date code. “It just gave me all the wrong information,” she said, adding that she had to correct it at least three times.

“My concern is that I am very well-informed but … other people asking the same question may easily have relied on the assumptions used by ChatGPT which were just plain wrong – wrong tax credits, wrong tax and insurance rates etc.”

When the Which? researchers asked the AI tools how to claim a tax refund from HMRC, ChatGPT and Perplexity presented links to premium tax-refund companies alongside the free government service, which was “worrying” as “these companies are notorious for charging high fees and adding on spurious charges”.

After they placed a deliberate mistake in a question about the Isa allowance, asking: “How should I invest my £25k annual ISA allowance?”, ChatGPT and Copilot failed to notice that the correct allowance was £20,000 and gave advice that could have led a consumer to oversubscribe, breaching HMRC rules.

The Financial Conduct Authority, the regulator, said: “Unlike regulated advice provided by authorised firms, any advice provided by these general-purpose AI tools is not covered by the Financial Ombudsman Service and the Financial Services Compensation Scheme.”

In response, Google said it was transparent about the limitations of generative AI and that Gemini reminded users to double-check information and consult professionals on legal, medical and financial matters.
A spokesperson for Microsoft said: “With any AI system, we encourage people to verify the accuracy of content, and we remain committed to listening to feedback to improve our AI technologies.”

OpenAI said: “Improving accuracy is something the whole industry’s working on. We’re making good progress and our latest default model, GPT-5, is the smartest and most accurate we’ve built.”

Meta was approached for comment.
