Thursday, February 27, 2025

DOGE Has ‘God Mode’ Access to Government Data


DOGE Claimed It Saved $8 Billion in One Contract. It Was Actually $8 Million.

The New York Times: “The biggest single line item on the website of Elon Musk’s cost-cutting team appears to include an error. The Department of Government Efficiency, the federal cost-cutting initiative championed by Elon Musk, published on Monday a list of government contracts it has canceled, together amounting to about $16 billion in savings itemized on a new ‘wall of receipts’ on its website.

Almost half of those line-item savings could be attributed to a single $8 billion contract for the Immigration and Customs Enforcement agency. But it appears that the DOGE list vastly overstated the actual intended value of that contract.”

The Atlantic, DOGE Has ‘God Mode’ Access to Government Data [unpaywalled]: “The president’s special commission now has an unprecedented ability to view and manipulate information at many federal agencies.

The employee’s account, along with the accounts of several others across federal agencies, offers the clearest portrait yet of just how deep DOGE has burrowed into the systems of the federal government—and the sensitive information of countless Americans. In the coming weeks, the team is expected to enter IT systems at the CDC and Federal Aviation Administration, and it already has done so at NASA, according to sources we’ve spoken with at each of those agencies. At least one DOGE ally appears to be working to open back doors into systems used throughout the federal government. 

Thomas Shedd, a former Tesla engineer who was recently appointed director of the Technology Transformation Services, requested privileged access to 19 different IT systems administered by teams within TTS, according to two federal workers we spoke with who are familiar with his request. With this level of control, Shedd would be able to not only view and modify federal data, but also grant and revoke access to other people.

 (In a written statement, Will Powell, the acting press secretary for the General Services Administration, of which TTS is a part, said Shedd needs this level of access to rapidly identify “areas for optimization and efficiencies” and insisted that he is working with “appropriate GSA officials” to follow established protocols.)”


AI can now model and design the genetic code for all domains of life with Evo 2


“Arc Institute researchers have developed a machine learning model called Evo 2 that is trained on the DNA of over 100,000 species across the entire tree of life. Its deep understanding of biological code means that Evo 2 can identify patterns in gene sequences across disparate organisms that experimental researchers would need years to uncover. The model can accurately identify disease-causing mutations in human genes and is capable of designing new genomes that are as long as the genomes of simple bacteria. 

Evo 2’s developers—made up of scientists from Arc Institute and NVIDIA, convening collaborators across Stanford University, UC Berkeley, and UC San Francisco—will post details about the model as a preprint on February 19, 2025, accompanied by a user-friendly interface called Evo Designer. The Evo 2 code is publicly accessible from Arc’s GitHub, and is also integrated into the NVIDIA BioNeMo framework, as part of a collaboration between Arc Institute and NVIDIA to accelerate scientific research. Arc Institute also worked with AI research lab Goodfire to develop a mechanistic interpretability visualizer that uncovers the key biological features and patterns the model learns to recognize in genomic sequences. 

The Evo team is sharing its training data, training and inference code, and model weights to release the largest-scale, fully open source AI model to date. Building on its predecessor Evo 1, which was trained entirely on single-cell genomes, Evo 2 is the largest artificial intelligence model in biology to date, trained on over 9.3 trillion nucleotides—the building blocks that make up DNA or RNA—from over 128,000 whole genomes as well as metagenomic data. In addition to an expanded collection of bacterial, archaeal, and phage genomes, Evo 2 includes information from humans, plants, and other single-celled and multi-cellular species in the eukaryotic domain of life.
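The variant-effect prediction described above is commonly done with genomic language models by comparing how likely the model finds a reference sequence versus a mutated one. The sketch below illustrates that scoring logic only; the `toy_log_likelihood` function is a hypothetical stand-in, not Evo 2's actual API (the real model and inference code are on Arc's GitHub and in NVIDIA BioNeMo), and the probability values are invented for illustration.

```python
# Sketch of zero-shot variant-effect scoring with a nucleotide language model.
# A trained model like Evo 2 would supply context-dependent log-likelihoods;
# here a toy per-base probability table keeps the example self-contained.
import math

def toy_log_likelihood(seq: str) -> float:
    """Hypothetical stand-in for a genomic LM: sum of per-base log-probs.
    Unknown characters get a tiny probability so they are heavily penalized."""
    probs = {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}
    return sum(math.log(probs.get(base, 1e-6)) for base in seq)

def variant_effect_score(ref: str, pos: int, alt: str,
                         log_likelihood=toy_log_likelihood) -> float:
    """Score a single-nucleotide variant as the log-likelihood ratio
    LL(mutated) - LL(reference); more negative suggests more disruptive."""
    mutated = ref[:pos] + alt + ref[pos + 1:]
    return log_likelihood(mutated) - log_likelihood(ref)

ref = "ATGGCCATTGTAATGGGCCGC"
print(variant_effect_score(ref, pos=3, alt="N"))  # unknown base, strongly penalized
```

With a real model in place of the toy function, the same likelihood-ratio idea lets a single forward pass rank candidate mutations without any task-specific training.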

 “Our development of Evo 1 and Evo 2 represents a key moment in the emerging field of generative biology, as the models have enabled machines to read, write, and think in the language of nucleotides,” says Patrick Hsu (@pdhsu), Arc Institute Co-Founder, Arc Core Investigator, an Assistant Professor of Bioengineering and Deb Faculty Fellow at the University of California, Berkeley, and a co-senior author on the Evo 2 preprint. “Evo 2 has a generalist understanding of the tree of life that’s useful for a multitude of tasks, from predicting disease-causing mutations to designing potential code for artificial life. We’re excited to see what the research community builds on top of these foundation models.”