Lake Tahoe Suffocates With Smoke New York Times. A wake up call that the squillionaires that hang there are guaranteed to ignore.
People who recovered from a bout of Covid-19 during one of the earlier waves of the pandemic appear to have a lower risk of contracting the delta variant than those who got two doses of the vaccine from Pfizer Inc. and BioNTech SE.
Pew: “Americans have long debated the boundaries of free speech, from what is and isn’t protected by the First Amendment to discussions about “political correctness” and, more recently, “cancel culture.” The internet has amplified these debates and fostered new questions about tone and tenor in recent years. Here’s a look at how adults in the United States see these and related issues, based on Pew Research Center surveys…”
Michael Pascoe: What happens next isn’t ‘freedom’ – it’s triage The New Daily. Australia. All it takes is one defector, and everybody in the defector’s network is “living with it.”
Stories on clip above: US Marines officer relieved of duties after video seeking ‘accountability’ over Afghanistan Guardian and Active duty Marine battalion commander is relieved of duties after posting furious video rant hammering senior leaders for not admitting ‘we messed this up’ Daily Mail (Alison L)
Who profits from the Kabul suicide bombing? Asia Times
Like Ordering Pizza London Review of Books (Anthony L)
Ars Technica – Researchers see if they can remove sensitive data without retraining AI from scratch: “Companies of all kinds use machine learning to analyze people’s desires, dislikes, or faces. Some researchers are now asking a different question: How can we make machines forget? A nascent area of computer science dubbed machine unlearning seeks ways to induce selective amnesia in artificial intelligence software. The goal is to remove all trace of a particular person or data point from a machine-learning system, without affecting its performance. If made practical, the concept could give people more control over their data and the value derived from it. Although users can already ask some companies to delete personal data, they are generally in the dark about what algorithms their information helped tune or train. Machine unlearning could make it possible for a person to withdraw both their data and a company’s ability to profit from it. Although intuitive to anyone who has rued what they shared online, that notion of artificial amnesia requires some new ideas in computer science. Companies spend millions of dollars training machine-learning algorithms to recognize faces or rank social posts, because the algorithms often can solve a problem more quickly than human coders alone. But once trained, a machine-learning system is not easily altered, or even understood. The conventional way to remove the influence of a particular data point is to rebuild a system from the beginning, a potentially costly exercise. “This research aims to find some middle ground,” says Aaron Roth, a professor at the University of Pennsylvania who is working on machine unlearning. “Can we remove all influence of someone’s data when they ask to delete it, but avoid the full cost of retraining from scratch?”…”
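One concrete “middle ground” the unlearning literature explores is sharded training (the SISA approach of Bourtoule et al.): split the training data into shards, train a sub-model per shard, and ensemble their predictions. Deleting a data point then only requires retraining the one shard that contained it, not the whole system. The sketch below is purely illustrative — a toy nearest-centroid ensemble with hypothetical names, not the researchers’ actual code — but it shows the mechanism:

```python
# Toy sketch of sharded machine unlearning (in the spirit of SISA,
# Bourtoule et al.). Illustrative only: class and method names are
# invented for this example, and the per-shard "model" is just a set
# of per-class feature centroids.

class ShardedCentroids:
    """Ensemble of per-shard models; forgetting retrains one shard."""

    def __init__(self, n_shards=4):
        self.n_shards = n_shards
        self.shards = [[] for _ in range(n_shards)]  # lists of (x, y)
        self.models = [None] * n_shards              # per-shard centroids

    def _fit_shard(self, i):
        # "Training" = compute the mean feature vector per class label.
        sums, counts = {}, {}
        for x, y in self.shards[i]:
            s = sums.setdefault(y, [0.0] * len(x))
            for j, v in enumerate(x):
                s[j] += v
            counts[y] = counts.get(y, 0) + 1
        self.models[i] = {y: [v / counts[y] for v in s]
                          for y, s in sums.items()}

    def fit(self, X, Y):
        # Round-robin assignment of examples to shards, then train each.
        for k, (x, y) in enumerate(zip(X, Y)):
            self.shards[k % self.n_shards].append((list(x), y))
        for i in range(self.n_shards):
            self._fit_shard(i)

    def forget(self, x, y):
        # Remove one example and retrain ONLY its shard -- the point of
        # the scheme: no full retraining from scratch.
        for i, shard in enumerate(self.shards):
            if (list(x), y) in shard:
                shard.remove((list(x), y))
                self._fit_shard(i)
                return i  # index of the single shard retrained
        return None

    def predict(self, x):
        # Majority vote across shard models (nearest centroid per shard).
        votes = {}
        for m in self.models:
            if not m:
                continue
            best = min(m, key=lambda y: sum((a - b) ** 2
                                            for a, b in zip(x, m[y])))
            votes[best] = votes.get(best, 0) + 1
        return max(votes, key=votes.get)
```

The trade-off is the one Roth alludes to: deletion is cheap (one shard), but the ensemble is usually somewhat less accurate than a single model trained on all the data, and the point’s influence is provably gone only because it never touched the other shards.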