The AI tool can improve access to information, creating more efficient patient engagement
The healthcare space has long embraced artificial intelligence, deploying it in myriad ways: improving diagnostics and decision-making, accelerating drug discovery, enhancing patient monitoring, facilitating personalized medicine, and automating administrative tasks. AI can help radiologists detect abnormalities in medical images, analyze patient data to identify individuals who may be at risk for certain conditions, identify new uses for existing drugs, or predict potential side effects.
AI is what turns the huge amount of data now flowing into healthcare, making up roughly 30% of the world’s data volume, into something actionable, and now there’s a new AI-based tool on the market that's really going to change the game: ChatGPT.
ChatGPT, as most people probably know by now, is a natural language processing tool driven by AI that can answer questions and assist users with tasks, including composing emails, writing essays and articles, and even writing code. Basically, it's a chatbot that draws on pre-existing knowledge and pulls it together in a way that's easy to understand. It's also learning rapidly: this is technology that has already passed an exam in Wharton's MBA program, passed the US medical licensing exam, and can pass the bar exam, and it's only a few months old.
It’s not difficult to see how this technology can be transformative in healthcare for both patients and doctors: it has the potential to change the way care is delivered by improving access to information, creating more efficient patient engagement, and improving healthcare delivery.
Benefits for patients
The internet long ago changed the way patients interact with their healthcare and their doctor by giving them access to much more information than they ever had before: 89% of patients already Google their health symptoms before going to their doctor. For example, people in Idaho often Google E.coli symptoms, while morning sickness was the most Googled symptom in Utah, and loss of sleep was the most Googled symptom in New York.
The problem is that the results of those searches are going to be generic and one-size-fits-all; if you Google "I can't sleep," it's going to tell you the same thing it tells everyone else, whether or not that applies to you and your specific situation. You might have sleep apnea, a mental health condition, or some other medical condition that makes sleep more difficult, but Google doesn't know that; how could it?
ChatGPT, on the other hand, could provide information that relates to your specific health concerns and symptoms, and offer up treatment options based on your own medical history, if it were integrated with your electronic health record. That, of course, would be a game changer for patients who want help but don't necessarily need to see a doctor, or who might not know they need to see a doctor until someone or something, like ChatGPT, tells them they do. The tool could effectively triage patients, saving them, and their doctor, both time and money.
It can also help patients understand how to use new technologies they may be unfamiliar with, such as telemedicine and virtual care. While remote care became a more popular way of receiving care during the pandemic, plenty of patients, especially those who are older and less tech savvy, need education on how to use these tools most effectively. That's what ChatGPT can provide, while also serving as a virtual assistant that helps patients schedule appointments and manage their health information.
ChatGPT can even be used for monitoring patient health remotely, allowing healthcare professionals to detect potential health problems before they become serious. For example, ChatGPT could monitor vital signs and alert healthcare professionals as that data comes in if there are any significant changes. Such technology is also especially important for people who lack access to healthcare, such as patients in rural areas or developing countries, who may not have access to medical professionals but could still use ChatGPT to receive medical advice and information.
Ultimately, for patients, ChatGPT can be a tool to help them understand a complicated and difficult-to-navigate healthcare system: 40% of patients are confused by their medical bills, and more than 95% of patients can't name the most common side effects of the drugs they are prescribed. This is the type of information that ChatGPT can serve up for them, meaning a human doesn't have to take time out of their day to do so.
Benefits for doctors
All of those aforementioned tasks that ChatGPT can handle for the patient, from answering their questions to helping care for them remotely, are tasks that would then be taken off the physician's plate, allowing them to focus more on people who actually need help. That's always been the ultimate goal of AI in healthcare: to make doctors better at their jobs, rather than replacing them.
Just like patients, doctors can ask ChatGPT questions related to medical conditions, treatments, medications, and procedures, and it can help diagnose patients by inputting symptoms and medical history, suggesting potential diagnoses and assisting doctors in determining the appropriate tests or treatments.
An even bigger drain on healthcare professionals, though, is paperwork: in 2021, doctors reported spending an average of 15.6 hours per week on paperwork and other administrative tasks, up from around 10 hours per week in 2018. These also happen to be the routine tasks that ChatGPT can automate, such as scheduling appointments, handling medication refills, and alerting patients, again helping to reduce the physician workload.
Even more importantly, ChatGPT can provide physicians with access to important medical information, including treatment options and drug interactions, as well as improved data analytics, helping to collect and analyze patient data, which can be used to improve healthcare outcomes and identify trends and patterns that may be useful for research.
Now, there are some obvious potential problems with the use of this type of technology, not least of which is security when it comes to an individual's health information; protections need to be put into place to make sure that data remains confidential and safe.
There's also the issue of people taking this type of health advice from an AI instead of a doctor; ChatGPT cannot, and should not, replace the human element of healthcare entirely. It should be used as a tool to enhance the quality and accessibility of care, and there need to be clear warnings for patients that they should consult a doctor before taking any kind of action. Doctors, too, should take caution when receiving care advice from a tool like ChatGPT.
You're only as good as your input, and ChatGPT has its own value system, which can matter a great deal when it gives advice. For instance, when asked if "gender dysphoria is a mental disorder," ChatGPT says: "It's important to note that the classification of gender dysphoria as a medical condition does not imply that being transgender is inherently a mental disorder." This is a very progressive view of this mental illness. Anyone seeking medical guidance from ChatGPT on this topic would be led down the wrong path.
On the patient side, this may not be as big of a problem, as patients are pretty wary about how this technology should be used: a recent survey from Pew shows that a significant portion of them are not ready for AI to be a part of the healthcare system, and that cuts across every demographic group. Of the 11,004 panelists surveyed, 60% said they would feel uncomfortable if their own healthcare provider relied on AI to do things like diagnose disease and recommend treatments, while only 39% said they would feel comfortable.
They also don't believe it will improve their health outcomes, with only 38% saying that AI will have a positive impact, 33% saying it would lead to worse outcomes, and the other 27% saying it wouldn’t make much difference either way. A majority also believe that the use of artificial intelligence will make the patient-provider relationship worse, and only 13% say it would be better.
Ultimately, though, the inclusion of ChatGPT in healthcare is inevitable, and it should do what every other AI tool does: make healthcare more efficient. Like any other tool, it needs to prevent more unnecessary patient volume than it creates; if it winds up scaring patients into going to the hospital when they don't have to, then it's creating more problems than it solves. ChatGPT simply serves up information, but that information needs to be useful in a way that helps create a better healthcare system, not a worse one.