If you own a smartphone, or have been around someone who does, then you've had at least some experience with artificial intelligence at this point. Any robot that you can talk to, and that will talk back to you, like Siri, is a form of AI.
This type of technology is evolving extremely fast, and now we are starting to see it being used in more industries and verticals, everything from automotive to entertainment. One very important vertical that is beginning to take advantage of this technology is healthcare, and it is here where it may have the biggest impact.
Sense.ly, an avatar-based platform that helps clinicians better manage their chronic care patients, has announced that it has partnered with Expect Labs, the company behind MindMeld, a platform for building AI-powered, voice-driven applications, to bring artificial intelligence to its app.
The app currently uses a Virtual Medical Assistant, named Molly, to help patients with their questions regarding their care.
"The way she is deployed today, patients in various hospitals and clinics can use Molly to follow-on with their chronic disease for long term maintenance. She follows up to see how they are, access them for risk, offer them education and insights and can tell them if they need to see somebody, or go to the hospital," Adam Odessky, told me in an interview.
The app would then connect with a real doctor or nurse to answer any other questions, if necessary.
"With artificial intelligence, and the MindMeld integration, she is now much smarter. She can offer more help and more advice. Instead of being followed up with, and pre-screened, with a script, now patients can speak to the application, and ask any question."
That can be anything from where to find the nearest doctor, to more information regarding medications or symptoms. With AI, Molly can be more interactive, and give patients more information:
"We have increased the usage and versalitility so patients can interact more, and then reduce costs in the heathcare system because they are less likely to go to emergency room," Odessky said.
These questions that are being answered, he told me, don't actually require a real doctor or nurse to answer them. The app has already reduced patient time by 20% in places where it has been used, and the company expects that number to go even higher with this integration.
"We used to connect them to a live nurse or doctor, but a lot of questions can simply be answered by an algorithm. The more AI features, the higher the automation rate and savings we will achieve," he said. "We can also flag risk based on words choice, and this will benefit clinicians who can then intervene earlier."
This integration is not only a leap forward for the healthcare industry, but the artificial intelligence industry as well, Tim Tuttle, Chief Executive Officer and founder of Expect Labs, told me.
The company has been working for over three years on technology that can help understand both written and spoken language, he said, but it has only been in the past year that the space has really taken off.
"The market is exploding right now and people are just starting to become aware of this. Siri has been available since 2011, but it is not until the last six to 12 months that it has started to really grow," said Tuttle. "Google has said that as much as 10% of search is coming through voice, and Apple announced that Siri has just over 1 billion inquires a week, which is amazing when you see that that number was negligible a year ago. What this is causing in the industry is that you have to create voice driven functionality in all apps."
MindMeld had already been seeing integration in certain verticals, including automotive, set-top boxes and e-commerce, but only recently in the medical industry. And there are a few reasons why healthcare is so important to the future of AI.
First, Tuttle said, there is the sheer size of the healthcare industry: in the U.S. alone, $3.8 trillion was spent on healthcare in 2014. But the two industries also fit well together in terms of who uses them.
"Another reasons why AI tech works very well is that many types of questions that patients often ask their nurses or physicians are exactly the same questions that recent machines have gotten really good at answering, so the promise of improving quality of care is really strong," he told me.
"Also, the demographics of the people who seem to rely on voice activated machines, the ones who use Siri heavily, many of them are the people who need a lot of healthcare. They are elderly, they might have partial or full disability, and are visually impaired or have their motor skills challenged. It can be hard to type on a small keyboard for them."
Odessky agreed with this point, noting that the ones already using Sense.ly tend to be on the older end of the spectrum.
"This is what we have seen in practice in hospitals. We primarily deal with patients 50, 60 or 70, who have one or more chronic conditions. They are not technically savvy, and their first computing device is a smartphone or tablet," he said. "For them a virtual nurse is a natural interface, and its better than tough."
The company has seen very high engagement results, with as high as 80% of patients using Molly completing all conversations assigned to them without dropping off.
Both Odessky and Tuttle see this collaboration between the two companies as just the beginning for both industries, and a sign of things to come.
For Odessky, putting AI into Molly is the first step toward making it so that technology can play a much larger role in the lives of patients, comparing it to the movie Her, in which everyone walks around with their own personal assistant, acting as their companion.
"We won't ever replace doctors, or people spending their lives working with patients, but we will get to a place where technology can understand nuance and where it is sufficiently intelligent and insightful enough to keep them happy and informed, especially elderly patients, who are lonely and need something like that," he told me.
"On the economic side, we are seeing a major graying of the population, especially in Japan. The same thing will happen in China in 10 to 20 years, and we are already seeing the baby boomers getting older in this country. The problem is already big enough that we have a major shortage of doctors, and even more so with nurses, to scale the current system. It is basically going to be impossible, and we will have to rely on machines to provide the necessary medical work to provide continuous care and attention, which we don't have right now."
As for artificial intelligence, this is another sign to Tuttle of what has already been happening, and will continue to happen: every industry is now starting to need to incorporate AI and voice recognition.
"Over the next decade AI is going to become a core component in every app and device that gets built. It will start accomplishing tasks that we thought only years ago that only humans were capable of. It will be like a Star Trek computer. You will be able to talk to it, and it will be able to understand with human-like accuracy. This is a problem that will laregely be solved in next five years," he said.
"As for where you'll see this tech show up first, we are already seeing it on smartphones, and we going to see that this tech becomes a fundamental feature of not just smartphones apps, but enterprise apps, for the connected home, and wearables. It's very likely that in five years or less it will be routine to talk to your smartphone and every app. You'll talk to your car, and most rooms in your house will be answer questions."