By Deborah Borfitz
July 23, 2020 | Amazon’s popular chatbot has progressed far beyond music streaming and weather reports since its 2014 market debut. When the COVID-19 pandemic hit and emergency rooms began filling with people worried they had caught the virus, the Mayo Clinic quickly developed an Alexa “skill”—an app-like, voice-driven capability that can be user-enabled on the device—to help stem the tide with answers to frequently asked questions about symptoms and who should get tested. It joined an earlier, Mayo-built first-aid skill offering Alexa users self-care guidance for everything from treating a burn to administering CPR.
Since “Mayo Clinic Answers on COVID-19” launched in the Amazon Skills store in mid-April (a separate Canadian version became available earlier this month), it has had 478 interactive sessions, while the nearly three-year-old first aid skill has racked up 7,892 sessions, according to a Mayo spokesperson. Mayo has the only COVID-19 skill on Alexa and “pursued a special approval for this based on our working relationship.”
The Mayo Clinic doesn’t have a business associate agreement (BAA) with Alexa that would make the skills compliant under the Health Insurance Portability and Accountability Act (HIPAA) because Mayo doesn’t gather protected health information for the skills it has produced, the spokesperson says. But it may seek a BAA in the future, “once we determine how best to use voice technology within the practice.”
As has already been proven in industries outside of healthcare, “human beings want to interact with various businesses in the simplest way possible,” says cardiologist Steve Ommen, M.D., medical director of the Mayo Clinic Center for Connected Care, who also leads Mayo Clinic’s digital care department. The ubiquity of smart speakers has also shown that people like the convenience of asking questions without having to use their thumbs.
“We think voice command interactions are likely to stay,” says Ommen. Multiple Alexa skills are currently undergoing testing to determine if they can help further Mayo’s quest to make healthcare less episodic and democratize the availability of expert care. Alexa also has research value in better understanding user experiences and concerns during the current pandemic.
The assumption is that technical solutions, including chatbots and virtual avatars, will become more integrated with telemedicine efforts moving forward, Ommen adds. Ideally, that integration will happen automatically, so patients don’t need to launch an app or otherwise re-explain everything a second time.
Such front-end simplicity requires handling a string of backend complexities, he continues. “To deliver healthcare or health advice probably requires multiple systems that don’t inherently work with one another, so we would have to build the wrapper that goes around them so to users it feels like one thing.”
Ommen uses the analogy of a car that has been seamlessly assembled by an auto manufacturer, so all the end consumer must do is step on the gas and go. Similarly, patients can click to make an appointment with a doctor after a symptom check by a virtual nurse because of several interconnected technical solutions working behind the scenes.
Mayo has a long list of collaborators enabling this connected care universe, says Ommen, including Epic (electronic health record), Zoom (HIPAA-compliant web and video conferencing platform), InTouch (now part of Teladoc Health, another video solution used for some acute care telemedicine activities), Google (cloud computing) and various vendors for in-home monitors and sensors. Much of the interoperability work is done in-house, with assistance from healthcare platform developer Validic in aggregating and normalizing data from different remote monitoring devices.
Seemingly easy Q&A interactions with a chatbot require building a higher level of compute behind the processing, either on the device or in the cloud, says David Ryan, general manager of health and life sciences at Intel. Artificial intelligence (AI) can “look for patterns in the data and highlight the anomalies for further review by a clinician. The software learns what normal looks like and, with enough data and processing, [helps] to predict a downstream event, alerting the patient or the clinician in time to prevent the issue.”
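Ryan’s description of software that “learns what normal looks like” can be illustrated with a minimal sketch. This is not Intel’s implementation; it is a simplified stand-in that treats a patient’s historical readings as the baseline and flags values that deviate sharply from it.

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from
    the series mean -- a simplified stand-in for 'learning what normal
    looks like' and surfacing anomalies for clinician review."""
    if len(readings) < 2:
        return []
    mu, sigma = mean(readings), stdev(readings)
    if sigma == 0:
        return []
    return [r for r in readings if abs(r - mu) / sigma > threshold]

# A week of resting heart rates with one outlier
hr = [62, 64, 63, 61, 65, 118, 63]
print(flag_anomalies(hr, threshold=2.0))  # [118]
```

A production system would of course model normality per patient and over time rather than from a single short series, but the shape of the idea is the same: establish a baseline, then alert on departures from it.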
At the physician practice level, Alexa is best suited to providing continuity of care when a software-as-a-service telehealth platform, such as Updox, is being used by doctors to manage their patients remotely, says Codrin Arsene, CEO of Digital Authority Partners. That is, the platform could be leveraged to “bridge the gap” between virtual visits by having it be the conduit for specific medical advice and recommendations as well as the feedback physicians need to make sound clinical decisions.
Physicians typically have a long-term relationship with their patients and a stake in how well their chronic diseases get managed, Arsene says. Providers working for a telemedicine service, such as Teladoc or Babylon Health, wouldn’t necessarily have the same incentive to invest in these types of tools because patient visits tend to be more transactional in nature.
Digital Authority Partners is currently working with a large insurance company to create Alexa skills embedded with doctors' orders so the digital assistant will remind patients to take specific drugs at precisely the right time. Similarly, it has been tapped by the manufacturer of a blood pressure device to ensure its Alexa skill will prompt people to take their blood pressure as frequently as instructed by their doctor.
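The “right time” logic behind such reminder skills can be sketched in a few lines. The drug names, schedule, and reminder window below are illustrative, not drawn from either client’s actual skill.

```python
from datetime import datetime, time

# Hypothetical dose schedule -- drug names and times are illustrative.
SCHEDULE = {"lisinopril": time(8, 0), "metformin": time(20, 0)}

def doses_due(now, window_minutes=30):
    """Return drugs whose scheduled time falls within the next
    `window_minutes` -- the moment a reminder should fire."""
    due = []
    for drug, t in SCHEDULE.items():
        dose_dt = now.replace(hour=t.hour, minute=t.minute,
                              second=0, microsecond=0)
        minutes_away = (dose_dt - now).total_seconds() / 60
        if 0 <= minutes_away <= window_minutes:
            due.append(drug)
    return due

print(doses_due(datetime(2020, 7, 23, 7, 45)))  # ['lisinopril']
```

The same pattern extends to the blood pressure skill: replace the dose schedule with the measurement frequency a doctor has prescribed.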
Like any device with an internet connection, chatbots or home digital assistants could theoretically serve as a patient portal from which people make appointments, fill their prescriptions, and pay their bills, says Ryan. “People can be intimidated if they have to download an app or register on a new system, so as much as it can be what they are used to… simplifies the adoption.”
“We have [also] seen chatbots be an effective way for health systems to let patients self-triage and not overwhelm the nurse call lines during the pandemic,” Ryan says. In a report released in April, Gartner highlighted six virtual assistants being developed around the globe that are using natural language conversational interfaces and AI technologies to provide COVID-19-specific education, screening, triage and home monitoring.
‘Lane Drift Detection’
The human voice could perhaps be used as a biomarker for detecting disease, a possibility now under investigation at Mayo, says Ommen. Human ears are imprecise instruments relative to the inaudible raw digital signals that Alexa’s microphone can capture for processing by higher-level AI analysis.
These voice recordings could be used for “lane drift detection” to indicate when people aren’t doing as well as they should be, he adds, continuing with his car analogy. AI-read electrocardiograms would work much the same way by predicting if someone is going to have an abnormal heart rhythm in the next five years. “If your heart rate is going up, your blood pressure is going down, and your voice is stressed, maybe you’re veering out of your health lane and need a little nudge to get back [on track].”
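Ommen’s “veering out of your health lane” example amounts to a multi-signal rule: nudge the patient only when several streams drift in the wrong direction at once. A minimal sketch, with entirely hypothetical thresholds and a made-up voice-stress score standing in for the vocal biomarker under investigation:

```python
def lane_drift_nudge(hr_trend, bp_trend, voice_stress):
    """Hypothetical 'lane drift' check. hr_trend and bp_trend are
    week-over-week deltas; voice_stress is a 0-1 score from an
    imagined vocal-biomarker model. All thresholds are illustrative."""
    drifting = 0
    if hr_trend > 5:        # resting heart rate climbing (bpm/week)
        drifting += 1
    if bp_trend < -5:       # blood pressure falling (mmHg/week)
        drifting += 1
    if voice_stress > 0.7:  # stressed-sounding voice
        drifting += 1
    return drifting >= 2    # nudge only on corroborating signals

print(lane_drift_nudge(hr_trend=8, bp_trend=-7, voice_stress=0.4))  # True
```

Requiring corroborating signals before nudging mirrors the lane-drift metaphor: one wobble is noise, a sustained drift across signals is a pattern.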
A vocal biomarker might be developed in several ways, says Ommen, including having Alexa record the signals and ship them to a separate server for analysis, with findings reported back to patients. Alternatively, the algorithms could theoretically be baked into the Alexa software itself. The question is how much to load into a single device versus using it only for what it was purposefully built and relying on other software solutions to expand its capabilities. Mayo faces a similar dilemma with Molly, its self-service avatar.
Building an Alexa skill is a straightforward proposition based on “if this, then that” programming logic and the availability of a database from which to fetch the information, says Arsene. Many scenario-specific skills are available on the open market, and some of their creators have signed a BAA with Amazon, allowing Alexa to directly interact with patient information.
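The “if this, then that” structure Arsene describes can be sketched without the Alexa Skills Kit SDK: a recognized intent maps to a database lookup. The intent names and answers below are hypothetical.

```python
# Minimal "if this, then that" intent router. Intent names and
# answers are hypothetical, and the "database" is an in-memory dict.
FAQ_DB = {
    "covid_symptoms": "Common symptoms include fever, cough, and fatigue.",
    "testing_criteria": "Testing is recommended if you have symptoms "
                        "or a known exposure.",
}

def handle_intent(intent_name):
    # If this intent arrives, then fetch that answer from the database.
    answer = FAQ_DB.get(intent_name)
    if answer is None:
        return "Sorry, I don't know that yet. Your question has been logged."
    return answer

print(handle_intent("covid_symptoms"))
```

In a real skill, Alexa’s natural-language layer resolves a spoken question to an intent name before this kind of routing ever runs; the developer’s job is largely to keep the mapping and the database current.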
A single skill can be built to suit multiple Q&A-type scenarios—e.g., reminding patients to take their medications and their blood pressure and blood sugar readings, as well as count their calories or steps, with notifications signaled by the circling of Alexa’s light ring. Arsene’s approach would be to home in on a chronic disease frequently seen by a practice where patients struggle to comply with their doctor’s treatment plan. The practice could either build in-house or outsource the dashboard database that handles the information flow to and from specific patients, who can verbally report at-home test results and confirm they’re following instructions.
The skill can also be programmed so Alexa instructs patients to re-take their blood pressure a second or third time if the first reading is abnormally high, Arsene notes, after which Alexa would alert the doctor of a potential impending heart attack. In this sense, Alexa would be “helping with literally preventing a death,” he says.
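The retake-and-escalate flow Arsene describes reduces to a small loop: re-prompt on a high reading, escalate only if the highs persist. The thresholds and retake count below are illustrative, not clinical guidance.

```python
def check_blood_pressure(readings, systolic_limit=180, max_retakes=2):
    """Walk through successive systolic readings: re-prompt after a
    high one, escalate if it stays high through all retakes.
    Thresholds are illustrative, not clinical guidance."""
    prompts = []
    for i, systolic in enumerate(readings):
        if systolic < systolic_limit:
            return prompts, "ok"
        if i < max_retakes:
            prompts.append("Your reading is high. Please take it again.")
        else:
            return prompts, "alert_doctor"
    return prompts, "awaiting_retake"

prompts, status = check_blood_pressure([195, 192, 190])
print(status)  # alert_doctor
```

Note that a single high reading followed by a normal retake resolves to “ok,” which is exactly the white-coat-style false alarm the retake step is meant to filter out.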
The skills Digital Authority Partners builds for medical practices are “diluted” because they don’t have access to data feeds from wearable devices tracking fitness-related metrics, such as blood pressure and heart rate, so data of that type can’t automatically flow into a dashboard, he notes.
Interactions with Alexa “shouldn’t be 20 questions back and forth,” advises Arsene, but quickly confirm that patients are doing what their physician instructed. Data collection at home tends to be more unbiased than in the office, where “white coat syndrome” can raise blood pressure levels above the normal range in otherwise healthy people. Patients are also more likely to fess up to Alexa than their doctor if they aren’t following orders, he adds.
User experience with an Alexa skill can vary widely depending on who is handling the data analysis on the back end to ensure all the relevant questions are answerable by Alexa—and in the potentially numerous ways each of those questions might be asked, Arsene says. When Alexa doesn’t know an answer, or answers poorly, it’s usually because no one is monitoring the program for unexpected errors and quickly fixing them.
“No one can anticipate all the different ways that a person is going to ask a question of Alexa,” he continues. “That’s why Alexa captures the specific words used to ask a question, so developers can see a variation of a question they didn’t plug in the correct answer for and… augment the skill [accordingly].”
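The capture-and-augment loop Arsene describes can be sketched as a fallback logger: unmatched phrasings are recorded so developers can map them to answers later. The names here are hypothetical and are not part of Amazon’s actual developer tooling.

```python
# Sketch of the capture-and-augment loop: unmatched phrasings are
# logged so developers can review them and extend the skill. All
# names are hypothetical, not real Alexa developer tooling.
KNOWN_UTTERANCES = {
    "what are covid symptoms": "symptoms_answer",
    "do i need a test": "testing_answer",
}
unmatched_log = []

def resolve(utterance):
    key = utterance.lower().strip("?! .")
    if key in KNOWN_UTTERANCES:
        return KNOWN_UTTERANCES[key]
    unmatched_log.append(utterance)  # surfaced to developers for review
    return "fallback_answer"

resolve("Do I need a test?")
resolve("Should my kid get swabbed?")  # new phrasing -> logged
print(unmatched_log)
```

Periodically reviewing that log and folding new phrasings into the known-utterance set is the maintenance work Arsene argues separates skills that answer well from skills that don’t.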
The onus is on providers to monitor the information flow for specific patients (e.g., who is taking their meds or on the verge of a heart attack) and overall caseload (e.g., new ways people are asking the same question or worrisome trends), says Ryan, drawing a parallel with what has been happening with remote patient monitoring of many COVID-suspected and -confirmed patients. “The most efficient health systems have software in place to create a virtual command center, enabling monitoring at scale,” he adds, citing Houston Methodist’s use of Medical Informatics’ Sickbay platform for its own ICU. “We are seeing something similar to this being used by patients uploading data from their homes.”
Intel expects rapid growth of at-home monitoring solutions, and digital assistants could well be an “integral part of this ecosystem,” says Ryan. “Why have a separate monitoring gateway if a digital assistant is already present in the home?”
Many health systems are already using cloud-based services and hybrid models with on-premises activity, Ryan adds, although it is unclear if they will move quickly to take advantage of this sort of convenient care. Alternatively, third-party monitoring solutions may emerge to provide the backend infrastructure, in much the same way as telemedicine providers Teladoc, Doctor on Demand, MD Live and American Well have done.
The “home hospital” model—where most care happens outside of traditional healthcare settings—is already becoming a reality, says Ryan. “All forward-looking health systems are developing or implementing hospital-at-home strategies, some of which enable as-needed X-rays or blood draws in the home by trained clinicians.” According to McKinsey & Company, he notes, up to $250 billion of current U.S. healthcare spending could potentially be virtualized. That’s equivalent to 20% of all office, outpatient, and home health spending across Medicare, Medicaid, and commercially insured populations.
The healthcare industry has been notoriously slow to innovate, but pressures from outside the system—from groups like Haven (the joint healthcare venture of Amazon, J.P. Morgan and Berkshire Hathaway) and the Partnership for Artificial Intelligence, Automation and Robotics in Healthcare (whose top leader is the founding CEO of the American Telemedicine Association), as well as the ongoing pandemic—are forcing it to change, says Arsene. Data silos have been hindering the delivery of care and efforts to lower costs. “What these partnerships are telling the world is that I’m fed up with your lack of interoperability and I’m going to step in and build my own thing with the assumption that you are going to catch up.”
Alexa represents the “dumbest” possible version of AI, continues Arsene, but meaningful data is amassing on the platform that could be used to create decision-support tools. More sophisticated algorithms in healthcare require systems for accessing the data, which is why success with AI is limited largely to radiology. AI can’t easily be layered on top of case notes in electronic medical records the way it can on CT scans because of inconsistencies in how data gets collected.
It could prove difficult for providers to integrate data pools from Alexa with those generated by telemedicine platforms, Arsene says. The way data are being collected and catalogued varies “substantially” from one doctor to the next because of the lengthy list of available telemedicine software options—none of which were designed to talk to one another.