Thursday, March 05, 2020

Timeless Story of Human Stories


GETTING COVERAGE: Here are some of the most common mistakes when it comes to media pitching, and how you can stop tripping up.

 Every now and then, you’ll encounter a website that forces you to register to view it. Rather than give the website your real email address – often an invitation to spam – you can use a shared-login service instead:
bugmenot.com


Almost nobody living in Europe 4,500 years ago is an ancestor of modern Europeans. Previous populations were replaced by the Yamnaya people, who migrated westward from the Asian steppes. (Who We Are and How We Got Here)

People born between 1963 and 1965 were 0.2% to 0.5% less likely to drive to work in the year 2000 than people born immediately before and after those years. That’s because this cohort’s first experience with driving came during the price shocks of the 1979 oil crisis. The commuting habits formed during those formative years have lingered for decades. (“Formative Experiences and the Price of Gasoline”)

Native Americans are more closely related to the French than they are to people from East Asia. Both Native Americans and the French can trace their ancestry to a now-extinct population from northern Eurasia. Some of their descendants migrated east across the Bering Land Bridge; others migrated west to northern Europe. Those who remained were replaced by other populations. (Who We Are and How We Got Here)



What exactly was that Bloomberg video?

A recent campaign video from U.S. presidential candidate Michael Bloomberg, portraying other Democratic candidates as dazed and confused in response to a question he posed during last week’s Las Vegas debate, generated considerable discussion in the misinformation and media worlds.
The video wasn’t a fake, exactly, but it was edited in a misleading way. The imagery in the footage wasn’t doctored, but it was a deceptive composite. Sound effects played the moment for laughs, but it didn’t quite feel like parody either.
“Is it political spin or disinformation?” asked Vox.
During the debate, Bloomberg declared that he was the only one on the stage who had started a business, and asked the others if that was fair. The video then pasted together puzzled facial expressions of the candidates from other moments in the debate to make it seem as if they were stumped. The sound of chirping crickets was added as background noise.
As The Washington Post’s Fact Checker noted, the video stretched what was in reality a brief moment into 22 seconds, making it appear as if the other candidates were stunned by Bloomberg’s question.
To its credit, The Post saw this kind of thing coming last year when it put together its guide to manipulated video. The guide offers a nomenclature for manipulations so that whenever one of these videos appears, viewers can use the appropriate terminology to identify it. These are important distinctions for fact-checkers, journalists and others who are trying to explain to audiences exactly how they’re being manipulated. And because video is getting easier to edit, we can expect to see more of these simple manipulations in the future.
In fact, we already are. The infamous slowed-down video of House Speaker Nancy Pelosi from last year was one example. More recently there was another mashup that showed Pelosi tearing up President Donald Trump’s State of the Union address at key moments in the speech. She did rip up her paper copy, but at the end, not in response to the passages shown in the video. 
In awarding the Bloomberg video four Pinocchios for deceptive editing, The Post Fact Checker noted the likelihood of more to come, suggesting its judgment was partly pre-emptive.
“We’re taking a tough line on manipulated campaign videos before viewers are flooded with so many fakes that they have trouble knowing what is true,” wrote The Post’s Glenn Kessler. “The Bloomberg campaign should label this as a parody or else take the video down.”
The Bloomberg campaign did tell another Post reporter that the video was meant to be humorous, saying “there were obviously no crickets on the debate stage.”
To date, much of the debate over these video manipulations has focused on how social media platforms should respond to them. Twitter and Facebook appear divided, The Verge reported. But while the platforms try to figure out how to react to what’s already out there, campaigns with money and digital savvy are pushing into new territory, creating more ways to expand the boundaries of acceptability in political advertising.
— Susan Benkelman, API

. . . technology

  • The assumption that video is more persuasive than text has fueled concern about deepfakes and their contribution to information pollution. But researchers from the Massachusetts Institute of Technology ran an experiment that casts doubt on that assumption.
    • Video, they found, had a small positive effect on the persuasiveness of and engagement with non-political content, but not political content. For the TL;DR crowd, one of the researchers, David G. Rand, a professor of management science and brain and cognitive sciences, has a nice summary thread on Twitter.

. . . politics

  • U.S. intelligence officials worry that the two parties are distorting and weaponizing concerns about Russian election interference so much that the public will not have a full appreciation of the actual threat, NBC News reported.
  • First Draft said this week that an operation it set up to investigate French online information ahead of the municipal elections in March shows that information disorder is quickly evolving and becoming more complex.

. . . the future of news

  • Official data about the number of people infected or killed by the 2019 coronavirus in Iran is ambiguous, frustrating the work of journalists and others tracking the spread of the illness. Some sources report more than 50 deaths in the city of Qom alone, a figure the government denies. The lack of reliable data resembles the situation in China.
    • Contributing to the suspicion, Iran’s deputy health minister tested positive for the novel coronavirus just a day after downplaying the outbreak.
       
  • In the period around President Trump’s June 2017 announcement that the United States was leaving the Paris climate agreement, an army of Twitter bots influenced the online conversation about the issue, according to a new study by researchers at Brown University. 
    • The Guardian, which first reported on the study, said that on an average day during the period studied, 25% of all tweets about the climate crisis came from bots. 
    • Thomas Marlow, a doctoral candidate at Brown who led the study, told The Guardian it came about because he and his colleagues were “always kind of wondering why there’s persistent levels of denial about something that the science is more or less settled on.”

It’s been almost two months since the Wuhan Municipal Health Commission in China officially announced the first death caused by the 2019 coronavirus, and the world still doesn’t know exactly how this lethal disease originated. So there is plenty of room for falsehoods to grow.
One of many conspiracy theories surrounding the virus involves Charles Lieber, chair of Harvard University’s Department of Chemistry and Chemical Biology. Coincidentally, Lieber was arrested in January on charges of misleading federal authorities about funds he allegedly received from Wuhan University of Technology and his connections to a Chinese government-sponsored recruitment program. 
A gigantic conspiracy theory followed, blaming him for creating the new coronavirus as a bioweapon developed in a lab. Last week, the fact-checker Snopes debunked the theory, concluding there is no evidence that Lieber has any known connection to the coronavirus outbreak.
What we liked: Fact-checkers can play an important role in protecting reputations and even lives. Lieber is suffering online harassment and threats based on this conspiracy theory. On Twitter, it is easy to find people suggesting he “should be injected with his own coronavirus,” for example.
— Cristina Tardáguila, IFCN 
1. In Turkey, the fact-checker Teyit developed a project to debunk an anti-vaccination bestseller. Now this effort will receive extra funding from the IFCN.
2. The New York Times took a deep dive into news literacy among members of Generation Z.
3. Experts from different health organizations met at the National Academies of Sciences, Engineering, and Medicine on Saturday to discuss ways to better communicate scientific facts. Cristina listed six tips she heard at the meeting.
4. The Columbus (Ohio) Dispatch put together a guide for readers on election disinformation. It includes a handy glossary.
5. Facebook announced that it would spend $2 million to support social science research on misinformation and polarization related to social communication technologies. Quartz’s take: “Facebook threw some petty cash at misinformation research.”
6. Facebook is clashing with election officials in at least five states over getting accurate information about voting, USA Today reported.
via Cristina and Susan