Computer Model Seeks to Explain the Spread of Misinformation, and Suggest Countermeasures

It starts with a superspreader, works its way through a network of interactions, and ultimately leaves no one untouched. Those exposed before may feel only mild effects when a different variant comes along.

No, it is not a virus. It is the contagious spread of misinformation and disinformation, the latter entirely intended to deceive.

Tufts University researchers have now come up with a computer model that remarkably reflects the way misinformation spreads in real life. The researchers say the work may provide insight into how to protect people from the current contagion of misinformation that threatens public health and the health of democracy.

“Our society has been grappling with widespread belief in conspiracy theories, increasing political polarization, and distrust of scientific findings,” said Nicholas Rabb, a Ph.D. student in computer science at Tufts School of Engineering and lead author of the study, which appeared Jan. 7 in the journal PLOS ONE. “This model could help us get a handle on how misinformation and conspiracy theories spread, and help come up with strategies to counter them.”

Scientists who study the spread of information often take a page from epidemiologists, modeling the spread of false beliefs the way they model a disease spreading through a social network. Most of these models, however, treat everyone in the network as equally likely to take up any new belief passed on to them by their contacts.

Instead, the Tufts University researchers based their model on the idea that our pre-existing beliefs strongly influence whether we accept new information. Many people reject factual, evidence-based information if it sits too far from what they already believe. Healthcare workers have commented on the strength of this effect, noting that some patients dying of COVID cling to the belief that COVID does not exist.

To account for this in their model, the researchers assigned a “belief” to each individual in the simulated social network. They represented each individual’s belief with a number from 0 to 6, where 0 represents strong disbelief and 6 represents strong belief. The numbers can stand for beliefs about any issue.

For example, the number 0 might represent the strong disbelief that COVID vaccines help and are safe, while the number 6 might represent the strong belief that COVID vaccines are in fact safe and effective.

The model then creates a vast network of virtual individuals, as well as virtual institutional sources that originate much of the information cascading through the network. In real life, these could be news media, churches, governments, or social media influencers – basically the super-spreaders of information.

The model starts with an institutional source injecting information into the network. If an individual receives information close to their current belief – say, a 5 compared with their current 6 – they have a higher probability of updating that belief to 5. If the incoming information differs sharply from what they already believe – say, a 2 compared with a 6 – they are likely to reject it outright and hold on to their level-6 belief.
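To make that update rule concrete, here is a minimal Python sketch of how a distance-based acceptance rule on the 0-to-6 belief scale might look. The function names, the cutoff of three belief levels, and the probability formula are illustrative assumptions, not the exact formulas used in the study.

    import random

    # Beliefs are encoded on the 0-6 scale described above:
    # 0 = strong disbelief, 6 = strong belief in a given proposition.
    BELIEF_LEVELS = range(7)

    def acceptance_probability(current_belief: int, message_belief: int) -> float:
        """Illustrative rule: the further a message sits from what an agent
        already believes, the less likely the agent is to adopt it.
        (The actual functional form in the study may differ.)"""
        distance = abs(current_belief - message_belief)
        return max(0.0, 1.0 - distance / 3.0)  # assumed cutoff: ignore messages 3+ levels away

    def maybe_update(current_belief: int, message_belief: int) -> int:
        """The agent either adopts the incoming belief value or keeps its own."""
        if random.random() < acceptance_probability(current_belief, message_belief):
            return message_belief
        return current_belief

    # Example: an agent at 6 receiving a message at 5 often moves to 5,
    # while a message at 2 is always rejected (distance 4 exceeds the assumed cutoff).
    print(maybe_update(6, 5))  # frequently returns 5
    print(maybe_update(6, 2))  # returns 6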

Other factors, such as the proportion of their contacts sending them the information (in effect, peer pressure) or the level of trust in the source, can also influence how individuals update their beliefs. Run across a population-scale network, the model then provides a dynamic view of how prevalent misinformation becomes and how stubbornly it persists.
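Continuing the same hypothetical sketch, peer pressure and trust in the source could be folded in as multipliers on the acceptance probability. The specific weights below are invented for illustration and are not taken from the paper.

    def acceptance_probability_social(current_belief: int,
                                      message_belief: int,
                                      fraction_of_contacts_sending: float,
                                      trust_in_source: float) -> float:
        """Illustrative extension: belief distance still dominates, but peer
        pressure (the fraction of an agent's contacts sending the same message)
        and trust in the source scale the result. The weights are assumptions."""
        distance = abs(current_belief - message_belief)
        base = max(0.0, 1.0 - distance / 3.0)
        social_boost = 0.5 + 0.5 * fraction_of_contacts_sending  # 0.5-1.0 multiplier
        return min(1.0, base * social_boost * trust_in_source)

    # A message one level away, pushed by 80% of contacts from a highly trusted source:
    print(acceptance_probability_social(6, 5, 0.8, 0.9))  # roughly 0.54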

Future improvements to the model will take into account new findings from both network science and psychology, and will compare the model’s results with real-world surveys and network structures over time.

While the current model assumes that beliefs change only incrementally, other scenarios could be built in to capture larger shifts – for example, a jump from 3 to 6 when a dramatic event befalls an influencer who then pleads with their followers to change their minds.

Over time, the computer model can be made more sophisticated to accurately reflect what is happening on the ground, say the researchers, who in addition to Rabb include faculty advisor Lenore Cowen, professor of computer science; computer scientist Matthias Scheutz; and J.P. de Ruiter, professor of psychology and computer science.

“It has become very clear that simply broadcasting factual information may not be enough to make an impact on the public mindset, particularly among those who are locked into a belief system that is not based on facts,” Cowen said. “Our initial efforts to incorporate that insight into our models of the mechanics by which disinformation spreads through society may teach us how to bring the public conversation back to facts and evidence.”
