Research conducted by Assistant Professor Jessie Chin’s Adaptive Cognition and Interaction Design Lab (ACTION) in the School of Information Sciences at the University of Illinois Urbana-Champaign provided the foundation for an article recently published in the Journal of Medical Internet Research. Ph.D. student Tre Tomaszewski is the first author on the peer-reviewed article, “Identifying False Human Papillomavirus (HPV) Vaccine Information and Corresponding Risk Perceptions from Twitter: Advanced Predictive Models.”
According to the researchers, uptake of the HPV vaccine remains low even though its effectiveness has been established for more than a decade. Their new article examines how this vaccination gap can be traced to misinformation about the vaccine's risks.
“If we can understand the contents of these misconceptions, we can craft more effective and targeted health messaging, which directly addresses and alleviates the concerns found in misconceptions about various public health topics,” said Tomaszewski.
Tomaszewski uses the analogy of an infectious disease outbreak to characterize the spread of misinformation about vaccination, colloquially called an infodemic. Detecting misinformation is a mitigation method that reduces further spread after an “outbreak” has begun, he said. Understanding the types of concerns people have regarding public health measures, such as HPV vaccination, could lead to improved health messaging from credible sources.
“If we can target root causes—reasons people believe misinformation in the first place—through methods akin to those we devised, health messaging can provide valid information prior to the exposure of misinformation. Continuing the analogy of a disease, this pre-exposure to valid information can act as a psychological ‘inoculation’ from the known falsehoods,” he said. “Of course, while the analogy of misinformation as a disease or epidemic is useful for conceptualizing the problem, it is imperfect and should not be taken too literally, as goes for most analogies.”
For their study, the research team used machine learning and natural language processing to develop a series of models to identify and examine true and false HPV vaccine–related information on Twitter. Once a model was developed that could reliably detect misinformation, the researchers could automatically classify messages, creating a much larger data set.
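As a rough illustration of this kind of pipeline (not the authors' actual models, which the article describes only at a high level), a baseline tweet classifier might look something like the following sketch. The example tweets, labels, and the TF-IDF plus logistic-regression approach are assumptions made for illustration:

```python
# Illustrative sketch only: a simple TF-IDF + logistic regression baseline for
# flagging likely HPV vaccine misinformation. The example tweets and labels are
# hypothetical; the published study used more advanced predictive models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labeled seed data (1 = misinformation, 0 = valid information).
tweets = [
    "The HPV vaccine is effective at preventing cervical cancer",
    "Get the HPV vaccine to protect against HPV-related cancers",
    "The HPV vaccine causes infertility in teenage girls",
    "HPV shots damage the nervous system",
]
labels = [0, 0, 1, 1]

# Train a baseline classifier on the labeled seed set.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(tweets, labels)

# Apply the trained model to unlabeled messages to grow the labeled data set.
new_tweets = ["Doctors say the HPV vaccine prevents cancer"]
print(model.predict(new_tweets))  # e.g. [0] -> classified as valid information
```

Once a classifier like this performs reliably on held-out labeled tweets, it can be run over a much larger stream of messages, which is what lets the researchers scale up their data set.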
“We were able to extract cause-and-effect statements in a process called ‘causal mining.’ This resulted in sets of concepts (or misconceptions) related to a given ‘cause’ term,” said Tomaszewski.
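Causal mining can be implemented in many ways; the following simplified, pattern-based version is an assumption made for illustration rather than the study's exact method. It matches a few explicit causal connectives and groups the extracted effect phrases under their cause terms:

```python
# Simplified, pattern-based cause-effect extraction. Real causal-mining systems
# are far more sophisticated; this only matches a few explicit causal connectives.
import re
from collections import defaultdict

CAUSAL_PATTERN = re.compile(
    r"(?P<cause>.+?)\s+(?:causes|leads to|results in)\s+(?P<effect>.+)",
    re.IGNORECASE,
)

def extract_causal_pairs(messages):
    """Group effect phrases by the cause phrase they were linked to."""
    pairs = defaultdict(list)
    for text in messages:
        match = CAUSAL_PATTERN.search(text)
        if match:
            pairs[match["cause"].strip().lower()].append(match["effect"].strip().lower())
    return pairs

messages = [
    "HPV vaccination leads to protection against cervical cancer",
    "The HPV vaccine causes infertility",  # a known falsehood
]
print(dict(extract_causal_pairs(messages)))
```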
The researchers found that in valid messages, the cause term “HPV vaccination” was frequently linked to effect terms in the category “effective” (expressing the vaccine’s efficacy) but also to “cancer” (since the vaccine helps prevent cancers that may develop over time from an HPV infection). HPV vaccine misinformation, by contrast, was linked to concerns about infertility and issues with the nervous system. After the messages were categorized as positive or negative cause-effect statements, the research team found that misinformation strongly favors negative-leaning, “loss-framed” messaging.
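One crude way to operationalize that gain/loss split is to label each extracted effect phrase by the valence of its terms. The keyword lists below are hypothetical and stand in for the study's actual coding of positive versus negative cause-effect statements:

```python
# Crude gain/loss framing labels for extracted effect phrases, using hypothetical
# keyword lists. The study's actual categorization scheme was more involved.
GAIN_TERMS = {"effective", "protection", "prevents", "safe"}
LOSS_TERMS = {"infertility", "damage", "harm", "death", "injury"}

def framing(effect_phrase: str) -> str:
    words = set(effect_phrase.lower().split())
    if words & LOSS_TERMS:
        return "loss-framed"
    if words & GAIN_TERMS:
        return "gain-framed"
    return "neutral"

print(framing("protection against cervical cancer"))  # gain-framed
print(framing("infertility"))                         # loss-framed
```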
“Misinformation tends to be more fear-provoking, which is known to capture attention,” said Tomaszewski.