Can ChatGPT give reliable cancer advice?

Credit: Jonathan Kemper/Unsplash

The Study in a Nutshell

Researchers from Brigham and Women’s Hospital wanted to see how reliable ChatGPT, a chatbot powered by artificial intelligence (AI), is when giving advice on cancer treatment.

They compared the chatbot’s suggestions to the National Comprehensive Cancer Network (NCCN) guidelines that doctors widely use, focusing on three common types of cancer: breast, prostate, and lung cancer.

They found that while the chatbot’s answers usually included some correct advice, they often contained incorrect or incomplete suggestions as well.

Why Does This Matter?

Cancer is a life-changing diagnosis. It’s natural for patients to want all the information they can get. Many turn to the internet for answers, and chatbots like ChatGPT offer an interactive way to ask questions.

But with health, the stakes are high. Bad advice could have serious consequences. The researchers found that ChatGPT was not 100% reliable in giving correct cancer treatment advice.

It’s not just about being right or wrong, though. Treating cancer is more complicated than that. Even the right treatment for one person may not be the best for someone else.

Each patient’s condition is unique, shaped by a host of factors like age, overall health, and how far the disease has progressed.

That’s why it’s crucial to consult healthcare professionals, like doctors and specialists, for personalized advice.

What the Results Tell Us

The researchers looked closely at 26 unique cancer cases and prompted the chatbot several times for each one, phrasing the question in different ways. In nearly all of the tests, ChatGPT included at least one recommendation that matched the accepted guidelines.

But in about one-third of its responses, it also included suggestions that were either incomplete or flat-out wrong.

For instance, for advanced breast cancer, the chatbot might recommend surgery but leave out that other treatments like chemotherapy might also be needed.

In a few cases, the chatbot came up with advice that was totally off the mark and not part of any standard treatment guidelines, like recommending treatments that haven’t been proven yet.

The researchers called these “hallucinations,” and they could be risky. Imagine being told that a certain treatment could cure your cancer when, in reality, it couldn’t.

That could not only be a waste of time and money but also possibly dangerous.

Final Thoughts

The study brings attention to the limits of AI chatbots like ChatGPT in the medical field. It’s a wake-up call for everyone.

For patients, it’s a reminder that while chatbots can offer quick answers, they aren’t a replacement for real medical advice.

For healthcare providers, it’s a nudge to be aware that their patients might be using these tools, and to caution those patients about the tools’ limitations.

This isn’t to say that AI has no place in healthcare. In fact, the field is making rapid progress: AI tools are being developed to help doctors read X-rays more accurately, to assist in the search for new medicines, and much more.

But when it comes to patient-specific medical advice, especially something as critical as cancer treatment, chatbots still have a long way to go.

So, what’s next? The researchers are planning more studies to understand how well patients and doctors can tell the difference between advice given by a chatbot and advice given by a real medical professional.

They also plan to run ChatGPT through further tests with different and more detailed cases.

All in all, the study shows us that while AI can be a helpful starting point for information, it should never be the final word, especially for something as serious as cancer. Always consult a healthcare professional for medical advice.

The study was published in JAMA Oncology.
