
Wastewater Methane Gaps Found in National Climate Reports

“If you don’t know exactly how much emissions you have, then it’s really difficult to make effective policies or technologies or methods to reduce the emissions,” said Dr. Z. Jason Ren. [https://www.labroots.com/trending/earth-and-the-environment/…-reports-2](https://www.labroots.com/trending/earth-and-the-environment/…-reports-2)


Are national climate reports missing crucial data points regarding wastewater greenhouse gas (GHG) emissions? This is what a recent study published in Nature Climate Change hopes to address as a team of researchers investigated the accuracy of national inventory reports (NIRs) for wastewater GHG. This study has the potential to help researchers, climate scientists, legislators, and the public better understand the methods for tracking climate change and steps that can be taken to fill the gaps in report lapses.

For the study, the researchers obtained wastewater GHG emissions data from 38 countries with the goal of ascertaining existing data gaps in NIRs. The motivation for the study stems from the lack of consistent data-collection methods and from the large variations that occur across years and global regions. The overarching goal was to ascertain where the data gaps exist and how to fill them.

In the end, the researchers discovered massive data gaps in wastewater GHG emissions, including an unreported gap of 52.0–73.2 million metric tons (MMT) of CO2-equivalent (CO2e) annually across the 38 countries. Additionally, they found a global gap of 94–150 MMT CO2e annually.

Engineered immune therapy could help fight brain aging

Researchers at Stanford University engineered a modified version of the immune protein interleukin-10 (IL-10) that retains only its anti-inflammatory properties while eliminating its pro-inflammatory ones. When injected into aged mice, this modified protein stimulated the growth of new neurons and improved performance on memory and learning tasks, such as maze navigation and object recognition. The study, published in Immunity, suggests that age-related cognitive decline is linked to the accumulation of exhausted T-lymphocytes in the brain, chronic inflammation, and impaired microglial function — all of which reduce neurogenesis. The findings indicate that selectively modulating immune signaling could open new avenues for treating neurodegenerative diseases. The team plans to further investigate the protein’s mechanisms and explore ways to target specific cell types more precisely to minimize potential side effects.


A modified immune protein developed by Stanford researchers points to a novel strategy for combating age-related cognitive decline.

The unbearable hardness of deciding about magic

Identifying the boundary between classical and quantum computation is a central challenge in quantum information. In multi-qubit systems, entanglement and magic are the key resources underlying genuinely quantum behaviour. While entanglement is well understood, magic — essential for universal quantum computation — remains relatively poorly characterised. Here we show that determining membership in the stabilizer polytope, which defines the free states of magic-state resource theory, requires super-exponential time $\exp( n^2)$ in the number of qubits $n$, even approximately. We reduce the problem to solving a $3$-SAT instance on $n^2$ variables and, by invoking the exponential time hypothesis, the result follows. As a consequence, both quantifying and certifying magic are fundamentally intractable: any magic monotone for general states must be super-exponentially hard to compute, and deciding whether an operator is a valid magic witness is equally difficult. As a corollary, we establish the robustness of magic as computationally optimal among monotones. This barrier extends even to classically simulable regimes: deciding whether a state lies in the convex hull of states generated by a logarithmic number of non-Clifford gates is also super-exponentially hard. Together, these results reveal intrinsic computational limits on assessing classical simulability, distilling pathological magic states, and ultimately probing and exploiting magic as a quantum resource.
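While the abstract shows that stabilizer-polytope membership is super-exponentially hard for general $n$-qubit states, the single-qubit case is tractable and makes the object concrete: for one qubit, the stabilizer polytope is the octahedron $|x| + |y| + |z| \le 1$ in Bloch coordinates, whose vertices are the six stabilizer states. The following minimal sketch (not from the paper; function names are illustrative) checks membership for this one-qubit case only:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def in_stabilizer_polytope_1q(rho: np.ndarray, tol: float = 1e-9) -> bool:
    """One-qubit special case: the stabilizer polytope is the octahedron
    |x| + |y| + |z| <= 1 in Bloch coordinates (x, y, z)."""
    x = np.real(np.trace(rho @ X))
    y = np.real(np.trace(rho @ Y))
    z = np.real(np.trace(rho @ Z))
    return bool(abs(x) + abs(y) + abs(z) <= 1 + tol)

# Maximally mixed state: Bloch vector (0, 0, 0), clearly free (magic-less)
rho_mixed = np.eye(2) / 2

# |T>-type magic state: Bloch vector (1,1,1)/sqrt(3),
# so |x| + |y| + |z| = sqrt(3) > 1 -- outside the polytope
v = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
rho_T = 0.5 * (np.eye(2) + v[0] * X + v[1] * Y + v[2] * Z)
```

The hardness result in the abstract says precisely that no analogue of this simple inequality test can scale: for $n$ qubits the polytope has a super-exponential facet description, and even approximate membership testing requires $\exp(n^2)$ time under the exponential time hypothesis.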

AI Is The 21st Century Force Multiplier

Please see my latest Forbes article and have a great weekend! — Chuck Brooks.

#artificialIntelligence #ai #future #tech Forbes


AI is redefining power, productivity, security, and sovereignty. Dual-use, convergent, and autonomous AI is the 21st-century force multiplier. Not only is technology advancing, but civilization is about to change.

The term “artificial intelligence” was coined at the 1956 Dartmouth Conference, and Alan Turing and other pioneers shaped the field’s early conceptualization. The first systems relied on symbolic logic and deterministic rules. Expert systems excelled in narrow domains but struggled in dynamic, uncertain environments; fragility, limited computational capacity, and poor data accessibility brought on the “AI winters.”

⚖️ We Are All Middle Managers of Aliens Now: On the 2026 International AI Safety Report — and why you should read it

Review of International AI Safety Report 2026.


Heliox unpacks the 2026 International AI Safety Report — the definitive global scientific consensus on AI risk — in forty minutes of evidence-grounded, empathetically framed conversation. From jagged AI genius to geopolitical fracture to cognitive atrophy, this episode makes the most consequential technology document of 2026 genuinely accessible.

The Virtual Biotech: A Multi-Agent AI Framework for Therapeutic Discovery and Development

Drug discovery and development requires integrating diverse evidence across biological scales and data modalities. However, relevant data, tools, and expertise remain fragmented across teams and organizations, making integration difficult. To address these challenges, we introduce the Virtual Biotech, a coordinated team of AI agents that mirrors the structure of human therapeutic research organizations to support end-to-end computational discovery. The Virtual Biotech is led by a Chief Scientific Officer agent that receives scientific queries, delegates them to domain-specialized scientist agents, and integrates their outputs through data-driven reasoning. Scientist agents leverage complementary tools and knowledge sources spanning statistical genetics, functional genomics, pathways and interactions, chemoinformatics, disease biology, and clinical data. We showcase the Virtual Biotech across three translational applications. First, the agents autonomously annotated and analyzed outcomes from 55,984 clinical trials to identify genomic features of drug targets associated with trial success. More than 37,000 clinical-trialist agents curated structured trial outcomes and linked targets to multi-omic annotations, including cell-type-specific features derived by the agents from single-cell RNA-sequencing atlases. The agents discovered that drugs targeting cell-type-specific genes were 40% more likely to progress from Phase I to Phase II and 48% more likely to reach market (Phase IV), while exhibiting 32% lower adverse event rates. Second, the Virtual Biotech evaluated B7-H3 as a lung cancer target, integrating statistical genetics, single-cell, spatial, and clinicogenomic evidence to propose an antibody–drug conjugate strategy while identifying key liabilities and differentiation opportunities. 
Third, the platform analyzed a terminated ulcerative colitis trial targeting OSMRβ to infer potential failure mechanisms and proposed biomarker-guided enrollment strategies to address precision-medicine gaps. Together, these results illustrate how the Virtual Biotech can enable more transparent, efficient, and comprehensive multi-scale therapeutic analyses, helping to accelerate early-stage drug discovery workflows while keeping human scientists in the loop.
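The organizational pattern the abstract describes — a Chief Scientific Officer agent that receives a query, delegates to domain-specialized scientist agents, and integrates their outputs — can be sketched as simple dispatch. All class names, method names, and agent behaviors below are hypothetical illustrations of that pattern, not the paper's implementation:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ScientistAgent:
    """A domain specialist; `analyze` maps a query to a domain-specific finding.
    In the real system this would wrap an LLM with domain tools and knowledge."""
    domain: str
    analyze: Callable[[str], str]

class ChiefScientificOfficer:
    """Receives a scientific query, routes it to the relevant specialists,
    and collects their outputs for integration."""
    def __init__(self, team: List[ScientistAgent]):
        self.team = {agent.domain: agent for agent in team}

    def answer(self, query: str, domains: List[str]) -> Dict[str, str]:
        findings: Dict[str, str] = {}
        for domain in domains:
            findings[domain] = self.team[domain].analyze(query)
        return findings

# Toy stand-ins for the statistical-genetics and chemoinformatics agents
cso = ChiefScientificOfficer([
    ScientistAgent("genetics", lambda q: f"GWAS evidence relevant to: {q}"),
    ScientistAgent("chemoinformatics", lambda q: f"Ligandability assessment for: {q}"),
])

report = cso.answer("Evaluate B7-H3 in lung cancer", ["genetics", "chemoinformatics"])
```

In the paper's framing, each specialist would be backed by its own tools (statistical genetics, single-cell atlases, clinical trial databases), and the CSO layer performs the data-driven integration rather than the simple collection shown here.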

The authors have declared no competing interest.
