Monday, March 20, 2023

On social media platforms, more sharing means less caring about accuracy

"The revelation of the truth, even if it's unpleasant, is beneficial"  






The Role of Default Settings in Online Searches: Challenging Google Dominance

A meaningful share of consumers will opt out of Google as their default search engine, particularly if prompted early and often.


Duke University Libraries: “…ChatGPT is an Artificial Intelligence Chatbot developed by OpenAI and launched for public use in November 2022. While other AI chatbots are also in development by tech giants such as Google, Apple, and Microsoft, OpenAI’s early rollout has eclipsed the others for now – with the site reaching more than 100 million users in 2 months. For some perspective, this is faster widespread adoption than TikTok, Instagram, and many other popular apps.”



On social media platforms, more sharing means less caring about accuracy

MIT News: “…As a social media user, you can be eager to share content. You can also try to judge whether it is true or not. But for many people it is difficult to prioritize both these things at once. That’s the conclusion of a new experiment led by MIT scholars, which finds that even considering whether or not to share news items on social media reduces people’s ability to tell truths from falsehoods. The study involved asking people to assess whether various news headlines were accurate. But if participants were first asked whether they would share that content, they were 35 percent worse at telling truths from falsehoods.



Participants were also 18 percent less successful at discerning truth when asked about sharing immediately after evaluating the headlines’ accuracy. “Just asking people whether they want to share things makes them more likely to believe headlines they wouldn’t otherwise have believed, and less likely to believe headlines they would have believed,” says David Rand, a professor at the MIT Sloan School of Management and co-author of a new paper detailing the study’s results. “Thinking about sharing just mixes them up.”

The results suggest an essential tension between sharing and accuracy in the realm of social media. While people’s willingness to share news content and their ability to judge it accurately can both be bolstered separately, the study suggests the two things do not positively reinforce each other when considered at the same time. “The second you ask people about accuracy, you’re prompting them, and the second you ask about sharing, you’re prompting them,” says Ziv Epstein, a PhD student in the Human Dynamics group at the MIT Media Lab and another of the paper’s co-authors. “If you ask about sharing and accuracy at the same time, it can undermine people’s capacity for truth discernment.”

  • The paper, “The social media context interferes with truth discernment,” is published today in Science Advances. The authors are Epstein; Nathaniel Sirlin, a research assistant at MIT Sloan; Antonio Arechar, a professor at the Center for Research and Teaching in Economics in Mexico; Gordon Pennycook, an associate professor at the University of Regina; and Rand, who is the Erwin H. Schell Professor, a professor of management science and of brain and cognitive sciences, and the director of MIT’s Applied Cooperation Team.”


How to Take Back Control of What You Read on the Internet

The Atlantic – “Social-media algorithms show us what they want us to see, not what we want to see. But there is an alternative. By Yair Rosenberg
“The social-media web is built on a lie. Platforms such as Facebook, Instagram, and Twitter enticed countless users to join with the promise that they could see everything their friends or favorite celebrities posted in one convenient location. Over time, though, the sites were carefully calibrated to filter what users saw—regardless of their stated preferences—in order to manipulate their attention and keep them on the platform. 
Algorithmic timelines quietly replaced chronological ones, until our social-media feeds no longer took direction from us, but rather directed us where they wanted us to go. Lately, this deception has become more transparent. Last month, Elon Musk reportedly had his engineers alter Twitter’s algorithm so that it fed his own tweets to the platform’s users, whether they followed him or not. (Musk denies having done so.) This might seem to say more about Musk’s vanity than about social media in its entirety.
But in his typically crass way, Musk was just making obvious what was always the case for his industry. Meta did the same when it launched Meta Verified, a subscription service that promised it would provide paying users with “increased visibility and reach.” These developments underscore a stark reality: As long as we rely on social-media sites to curate what we read, we allow them to control what we read, and their interests are not our interests. 
Fortunately, there already exists a long-standing alternative that provides users with what social media does not deliver: RSS…” For researchers who have employed #RSS for multidisciplinary research for decades, this is not news, but a good reminder that new applications are not necessarily better applications, and that ownership and control over critical technology impact the scope and validity of our work. We need to continue to be vigilant and exercise choice.
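Part of what makes RSS attractive as an alternative is how little machinery it requires: a feed is just an XML document listing items, and a reader pulls titles and links from it directly, with no ranking algorithm in between. A minimal sketch in Python, using only the standard library and a hypothetical inline feed (real readers would fetch the XML from a site's feed URL):

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal RSS 2.0 feed, inlined for illustration.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First post</title>
      <link>https://example.com/1</link>
    </item>
    <item>
      <title>Second post</title>
      <link>https://example.com/2</link>
    </item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed,
    in the order the publisher listed them -- no reordering, no filtering."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_feed(SAMPLE_FEED):
    print(title, "->", link)
```

The point of the sketch is the absence of a curation step: the reader sees every item the publisher emitted, in publication order, which is exactly the property the algorithmic timelines described above took away.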
 See also beSpacific on Mastodon – Newsie Social – where I ‘toot’ daily on many issues including research and technology.