When GPT hallucinates: Doctors warn against using AI as it makes up information about cancer

Apr 5, 2023 - 09:30

A team of doctors in the UK are advising other doctors and patients not to use ChatGPT for medical guidance after a recent study showed it made up statistics and other information when asked about cancer.

One out of every ten queries about breast cancer screening was answered incorrectly by the AI chatbots tested, which included ChatGPT and Bing AI, both of which use OpenAI's GPT models. For those who are unaware, the paid version of ChatGPT uses GPT-4, whereas the free version uses GPT-3.5. Microsoft's Bing AI, on the other hand, uses GPT-4.

Even the answers that the chatbots got right were not as 'comprehensive' as those found through a basic Google search. The chatbots truncated their answers and left out some vital pieces of information.

Some startling revelations
The study was carried out at the University of Maryland School of Medicine. In it, researchers asked both ChatGPT and Bing AI 25 questions revolving around screening for breast cancer.

The researchers had to ask the chatbots to generate responses three times, as the responses were often vague and non-descriptive. Once the researchers had the final set of answers, these were reviewed by three mammography-trained doctors.
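The study's protocol, posing each question to a chatbot several times and collecting the answers for expert review, is straightforward to reproduce in code. Below is a minimal sketch in Python, assuming the official openai package; the model name and the sample questions are illustrative placeholders, not the study's actual materials.

```python
# Minimal sketch of the study's query-repetition protocol. Assumes the
# official `openai` Python package; the model name and questions are
# illustrative placeholders, not the study's actual materials.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

questions = [
    "At what age should I start getting screening mammograms?",
    "How often should a screening mammogram be repeated?",
    # ... the study used 25 breast-cancer-screening questions in total
]

ATTEMPTS = 3  # the researchers had each question answered three times

responses = {}
for question in questions:
    responses[question] = []
    for _ in range(ATTEMPTS):
        reply = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": question}],
        )
        responses[question].append(reply.choices[0].message.content)

# The collected answers are not graded here: in the study, that step
# was done manually by three mammography-trained doctors.
```

Note that the grading step is the part that cannot be automated away; the chatbot's fluency says nothing about whether its answers are medically sound.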

Of the responses generated, about 88 per cent were ones that a potential patient could comprehend and act on, whereas the rest were incoherent. However, even the responses within that 88 per cent had some major red flags.

ChatGPT also gave conflicting answers to queries about the chance of developing breast cancer and where to get a mammogram. The research discovered that answers “varied considerably” each time the same query was asked.
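The run-to-run variation the researchers describe is consistent with how these chatbots work: by default they sample their replies from a probability distribution, so the same prompt can produce different answers each time. As an illustration, assuming the same openai package as in the sketch above, a developer can lower the sampling temperature to make repeated answers more consistent, although this does nothing to make them more accurate:

```python
# Sketch: lowering the sampling temperature to make repeated answers
# more consistent. Assumes the official `openai` package; the model
# name is a placeholder. temperature=0 makes the output close to
# deterministic, but not any more accurate.
from openai import OpenAI

client = OpenAI()

reply = client.chat.completions.create(
    model="gpt-3.5-turbo",
    temperature=0,
    messages=[{"role": "user", "content": "Where can I get a mammogram?"}],
)
print(reply.choices[0].message.content)
```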

Making up evidence to support their answers
One frightening thing the study revealed was that while Bing AI cited some sources, which included a few sketchy ones, ChatGPT "created" fake journal papers in some instances to back up its assertions.

Moreover, not all of the responses in that 88 per cent were correct: the three doctors reviewing the information had to flag a considerable number of responses as "inaccurate" or even "fictitious."

This incident is another reminder that users and potential customers of AI chatbots should proceed with caution, because these applications still have the propensity to hallucinate, or make things up.

“We’ve seen in our experience that ChatGPT sometimes makes up fake journal papers or health societies to back its claims,” said the study’s co-author, Dr Paul Yi.

Overall, the results were positive
The results, which were published in the journal Radiology, also revealed that a straightforward Google search yielded a more comprehensive response.

ChatGPT, according to lead author Dr Hana Haver, relied on just one set of guidelines, those issued by the American Cancer Society, and did not mention the differing recommendations issued by the Centers for Disease Control and Prevention or the US Preventive Services Task Force.

Dr Yi, on the other hand, indicated that the findings were generally favourable, with ChatGPT accurately responding to questions about breast cancer symptoms, the age groups at risk, the cost of treatment, and how frequently mammograms are recommended.

He described the percentage of correct responses as "pretty amazing," with the "added benefit of summarising information into an easily digestible form for consumers to easily understand."
