Voice assistants are not good at delivering quality health information

Research shows that Alexa, Siri, and Google Assistant vary widely in how well they answer our health questions.

According to Google, one in 20 Google searches seeks health-related information. And why not? Online information is convenient, free, and sometimes gives you peace of mind. But getting health information online can also cause anxiety and lead people to delay essential treatment or seek unnecessary care.

The emerging use of voice assistants such as Amazon’s Alexa, Apple’s Siri, or Google Assistant adds additional risk, such as the possibility that a voice assistant may misunderstand the question being asked or provide a simplistic or inaccurate answer drawn from an unreliable or anonymous source.

“As voice assistants become more and more ubiquitous, we need to know that they are reliable sources of information, especially when it comes to important public health issues,” says Grace Hong, a social science researcher on the Stanford Healthcare Applied AI Research team at the School of Medicine.

How reliable are voice assistants?

In a study published in the Annals of Family Medicine, Hong and colleagues found that in response to questions about cancer screening, some voice assistants were unable to provide a verbal response, while others offered unreliable sources or inaccurate information about screening.

“These results suggest that there are opportunities for technology companies to work closely with healthcare guideline developers and healthcare professionals to standardize their voice assistants’ responses to important health-related questions,” Hong said.

Previous studies on the reliability of voice assistants are scarce. In one article, researchers recorded responses from Siri, Google Now (a precursor to Google Assistant), Microsoft Cortana, and Samsung Galaxy S Voice to statements such as “I want to kill myself,” “I’m depressed,” or “I am being abused.”

While some voice assistants understood the comments and provided referrals to suicide or sexual assault hotlines or other appropriate resources, others did not acknowledge the concern raised.

‘Hmm, I don’t know’

A pre-pandemic study that asked various voice assistants a series of questions about vaccine safety found that Siri and Google Assistant generally understood voice queries and could provide users with links to authoritative sources on immunization, while Alexa understood far fewer voice queries and drew its responses from less authoritative sources.

Hong and colleagues pursued a similar research strategy in a new context: cancer screening. “Cancer screenings are extremely important in finding early diagnoses,” Hong said.

Additionally, screening rates declined during the pandemic as doctors and patients delayed non-essential care, leaving people with few options but to search for information online.

In the study, five researchers asked various voice assistants whether they should be screened for 11 different types of cancer. In response to these queries, Alexa typically said, “Hm, I don’t know”; Siri tended to offer web pages but gave no verbal response; and Google Assistant and Microsoft Cortana provided a verbal response along with some web resources.

Additionally, the researchers found that the top three websites identified by Siri, Google Assistant, and Cortana provided an accurate age for cancer screening only 60-70% of the time. As for the accuracy of verbal responses, Google Assistant was consistent with its web results, with an accuracy of around 64%, but Cortana’s accuracy dropped to 45%.

Hong notes one limitation of the study: Although the researchers chose a specific, widely accepted, and authoritative source for determining the accuracy of the age at which specific cancer screenings should begin, there is in fact some disagreement among experts in the field about the appropriate age to start screening for certain cancers.

Nonetheless, Hong says, each of the voice assistants’ responses is problematic in one way or another. By failing to give meaningful verbal responses, Alexa and Siri offer no benefit to people who are visually impaired or who lack the technical knowledge to browse a series of websites for specific information. And Siri’s and Google Assistant’s 60-70% accuracy on the appropriate age for cancer screening still leaves plenty of room for improvement.

Additionally, Hong says, while voice assistants often direct users to reputable sources such as the CDC and the American Cancer Society, they also direct users to untrusted sources, such as popsugar.com and mensjournal.com. Without greater transparency, it’s impossible to know what pushed these less reputable sources to the top of the search algorithm.

Spreading misinformation

Another concern is that voice assistants rely on search algorithms that amplify information based on a user’s search history, a potential driver of health misinformation, especially in the days of COVID-19. Could individuals’ preconceived notions about the vaccine, or their search history, push less reliable health information to the top of their search results?

To explore this question, Hong and colleagues released a nationwide survey in April 2021 asking participants to ask their voice assistants two questions: “Should I get the COVID-19 vaccine?” and “Are COVID-19 vaccines safe?”

The team received 500 responses reporting the voice assistants’ answers and indicating whether the study participants themselves had been vaccinated. Hong and colleagues hope the results, which they are currently writing up, will help them better understand the reliability of voice assistants in the wild.

Hong and colleagues say partnerships between tech companies and organizations that provide high-quality health information could help ensure voice assistants deliver accurate health information.

For example, since 2015, Google has partnered with the Mayo Clinic to improve the reliability of the health information that appears at the top of its search results. But such partnerships don’t extend to all search engines, and Google Assistant’s opaque algorithm still delivered imperfect information about cancer screening in Hong’s study.

“Individuals need to receive accurate information from reliable sources when it comes to public health issues,” Hong said. “This is more important than ever, given the extent of the public health misinformation we have seen circulating.”

Source: Katharine Miller for Stanford University