
Medicine’s next leap: Delivering gene therapies exactly where they’re needed

A quiet revolution is underway in modern medicine: Drug development is aiming to move from managing disease to correcting it through RNA and gene-editing therapies. But delivering these treatments safely and precisely to the right cells remains a major hurdle—especially in hard-to-target organs like the brain and kidneys.

Now, researchers led by a University of Ottawa Faculty of Medicine team offer highly compelling evidence that an elegant, nature-inspired solution lies in ultra-tiny, bubble-like structures called small extracellular vesicles (sEVs). These metabolic messengers, refined over millions of years of evolution, carry RNA—a nucleic acid that is a chemical cousin of DNA—and other molecules between cells.

In a nutshell, the research team’s new findings show that not all sEVs are alike: their cell of origin determines where they travel, with certain vesicles naturally targeting specific tissues in the body.

At just four nanometers thick, this metal starts behaving in a way physicists did not expect

Researchers at the University of Minnesota Twin Cities have discovered a powerful new way to control the electronic behavior of a metal: by manipulating the atomic properties of materials where they meet. The study, published in Nature Communications, demonstrates that interfacial polarization can tune the surface work function of metallic ruthenium dioxide (RuO2) by more than 1 electron volt (eV), a substantial shift for a metal's work function, simply by adjusting film thickness at the nanometer scale.

“We often think of polarization as something that belongs to insulators or ferroelectrics—not metals,” said Bharat Jalan, professor and Shell Chair in the Department of Chemical Engineering and Materials Science at the University of Minnesota. “Our work shows that, through careful interface design, you can stabilize polarization in a metallic system and use it as a knob to tune electronic properties. This opens an entirely new way of thinking about controlling metals.”

The effect is strongest when the metal layer is about 4 nanometers thick, roughly the width of a single strand of DNA. At this size, the film shifts from being “stretched” (strained) by the material underneath it to a more “relaxed” state. This transition shows that the way atoms are packed together has a direct, measurable impact on the metal's electronic behavior.
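A 1 eV shift in work function is anything but small for electron emission. As a rough illustration, not taken from the study itself but from the standard Richardson–Dushman picture of thermionic emission, emitted current density scales as exp(−φ/k_BT), so the sketch below estimates how strongly a 1 eV work-function reduction rescales that exponential factor at room temperature:

```python
import math

# Richardson-Dushman: thermionic current density J ~ T^2 * exp(-phi / (k_B * T)),
# so lowering the work function by delta_phi multiplies J by exp(delta_phi / (k_B * T)).
k_B = 8.617e-5        # Boltzmann constant, in eV/K
T = 300.0             # room temperature, in K
delta_phi = 1.0       # eV; the order of the shift reported for the RuO2 films

boost = math.exp(delta_phi / (k_B * T))
print(f"Emission boost from a 1 eV work-function drop at 300 K: {boost:.2e}")
```

Although 1 eV sounds small in everyday terms, it is roughly forty times the thermal energy at room temperature (about 0.026 eV), which is why it acts as such a powerful "knob" for electronic behavior.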

Specially designed material combines light and electricity to remove PFAS from water without harmful byproducts

Researchers at Clarkson University have reported a breakthrough in tackling per- and polyfluoroalkyl substances (PFAS), a group of widely used “forever chemicals” that are difficult to remove from water and have raised growing environmental and public health concerns. The study, published in Nature Communications, was led by Associate Professor Yang Yang and his team in the Department of Civil and Environmental Engineering. It presents a new method for breaking down PFAS that could improve the treatment of contaminated water in real-world conditions.

A faster, greener method to recycle lithium-ion batteries can also ease supply chain issues

As global demand for lithium-ion batteries continues to surge, a team of Rice University researchers has developed a faster, more energy-efficient way to recover critical minerals from spent batteries, potentially easing supply chain pressures and reducing environmental harm.

In a new study published in Small, researchers from Rice’s Department of Materials Science and Nanoengineering introduce a class of water-based solutions that can extract valuable metals from battery waste in minutes rather than hours. The work centers on aqueous solutions of amino chlorides, which mimic the performance of commonly studied green solvents like deep eutectics, while avoiding their key limitations.

“Traditional recycling methods often rely on harsh acids or slow, energy-intensive processes,” said the study’s first author, Simon M. King, a sophomore studying chemical and biomolecular engineering who completed this work as a summer research fellow at the Rice Advanced Materials Institute. “What we’ve shown is that you can achieve rapid, high-efficiency metal recovery using a much simpler, water-based system.”

Scientists Teach AI To Think Like a Professional Chemist

Researchers have developed a framework that interprets chemical strategy as language, opening a new path for AI-assisted discovery. Designing molecules is one of the most difficult tasks in chemistry: whether creating new medicines or advanced materials, each compound must be built through a careful, step-by-step design process.

The Next Chip Breakthrough Is Not a Machine


Timestamps:
00:00 — The Limits of Light
07:44 — The Chemistry Hack. How It Works.

My Podcast on Apple: https://podcasts.apple.com/at/podcast…
Podcast on Spotify: https://open.spotify.com/show/3drr7A8…
Newsletter: https://anastasiintech.substack.com
LinkedIn: / anastasiintech
Instagram: / anastasi.in.tech
Patreon: / anastasiintech


Mars dust storms are sparking electricity and rewriting the planet’s chemistry

Mars may look like a quiet, dusty world, but it’s actually buzzing with hidden electrical activity. Powerful dust storms and swirling dust devils generate static electricity strong enough to spark faint glowing discharges across the planet, triggering chemical reactions that reshape its surface and atmosphere. Scientists have now shown that these tiny lightning-like events can create a surprising mix of chemicals—including chlorine compounds and carbonates—and even leave behind distinct isotopic “fingerprints.”

Mars is often portrayed as a dry, lifeless desert, but it is far more active than it appears. Its thin atmosphere and dusty terrain create an environment where constant motion generates electrical energy. Dust storms and spinning dust devils sweep across the surface, continually reshaping the landscape and driving processes that scientists are only beginning to fully understand.

Planetary scientist Alian Wang has been studying this phenomenon in depth. In a series of studies, including recent work published in Earth and Planetary Science Letters, she has examined how these electrically charged dust activities influence the chemistry of Mars, particularly through their impact on isotopes.

Solar reactor uses old battery acid to turn plastic waste into clean hydrogen

Researchers have developed a solar-powered reactor that breaks down hard-to-recycle forms of plastic waste, such as drink bottles, nylon textiles and polyurethane foams, using acid recovered from old car batteries, and converts that waste into clean hydrogen fuel and valuable industrial chemicals. The results are reported in the journal Joule.

The reactor, developed by researchers from the University of Cambridge, is powered by the energy from the sun, and could be a cheaper, more sustainable alternative to current chemical-based recycling methods. The team says their method could create a circular system where one waste stream solves another.

Global plastic production is more than 400 million tons per year, yet only 18% is recycled. The rest is burned, landfilled, or leaks into ecosystems. The researchers believe that their method, known as solar-powered acid photoreforming, could become part of the solution to the global mountain of plastic waste.

Quantum-informed machine learning for predicting spatiotemporal chaos with practical quantum advantage

Ultimately, QIML proves that we don’t need a fully fault-tolerant quantum computer to see results. By using quantum processors to learn the complex “rules” of chaos, we can give classical computers the boost they need to make reliable, long-term predictions about the most turbulent environments in the natural world.


Modeling high-dimensional dynamical systems remains one of the most persistent challenges in computational science. Partial differential equations (PDEs) provide the mathematical backbone for describing a wide range of nonlinear, spatiotemporal processes across scientific and engineering domains (1–3). However, high-dimensional systems are notoriously sensitive to initial conditions and the floating-point numbers used to compute them (4–7), making it highly challenging to extract stable, predictive models from data. Modern machine learning (ML) techniques often struggle in this regime: While they may fit short-term trajectories, they fail to learn the invariant statistical properties that govern long-term system behavior. These challenges are compounded in high-dimensional settings, where data are highly nonlinear and contain complex multiscale spatiotemporal correlations.
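The sensitivity to initial conditions described above is easy to demonstrate even in one dimension. The toy sketch below uses the logistic map, a standard chaotic example rather than a system from this paper: two trajectories that start one part in a billion apart diverge to order one within a few dozen steps.

```python
def logistic(x, r=4.0):
    # Logistic map at r = 4: fully chaotic on [0, 1].
    return r * x * (1.0 - x)

def trajectory(x0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

a = trajectory(0.4)
b = trajectory(0.4 + 1e-9)          # perturb the start by one part in a billion
gaps = [abs(x - y) for x, y in zip(a, b)]

print(f"initial gap: {gaps[0]:.1e}, largest gap over 60 steps: {max(gaps):.2f}")
```

Fitting the short-term behavior of such a system is feasible, but any pointwise forecast beyond the divergence horizon is meaningless, which is why the authors emphasize learning invariant statistical properties rather than individual trajectories.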

ML has seen transformative success in domains such as large language models (8, 9), computer vision (10, 11), and weather forecasting (12–15), and it is increasingly being adopted in scientific disciplines under the umbrella of scientific ML (16). In fluid mechanics, in particular, ML has been used to model complex flow phenomena, including wall modeling (17, 18), subgrid-scale turbulence (19, 20), and direct flow field generation (21, 22). Physics-informed neural networks (23, 24) attempt to inject domain knowledge into the learning process, yet even these models struggle with the long-term stability and generalization that high-dimensional dynamical systems demand. To address this, generative models such as generative adversarial networks (25) and operator-learning architectures such as DeepONet (26) and Fourier neural operators (FNO) (27) have been proposed. While neural operators offer discretization invariance and strong representational power for PDE-based systems, they still suffer from error accumulation and prediction divergence over long horizons, particularly in turbulent and other chaotic regimes (28, 29). Recent work, such as DySLIM (30), enhances stability by leveraging invariant statistical measures. However, these methods depend on estimating such measures from trajectory samples, which can be computationally intensive and inaccurate for chaotic systems, especially in high-dimensional cases.

These limitations have prompted exploration into alternative computational paradigms. Quantum machine learning (QML) has emerged as a possible candidate due to its ability to represent and manipulate high-dimensional probability distributions in Hilbert space (31). Quantum circuits can exploit entanglement and interference to express rich, nonlocal statistical dependencies using fewer parameters than their classical counterparts, which makes them well suited for capturing invariant measures in high-dimensional dynamical systems, where long-range correlations and multimodal distributions frequently arise (32). QML and quantum-inspired ML have already demonstrated potential in fields such as quantum chemistry (33, 34), combinatorial optimization (35, 36), and generative modeling (37, 38). However, the field is constrained on two fronts: Fully quantum approaches are limited by noisy intermediate-scale quantum (NISQ) hardware noise and scalability (39), while quantum-inspired algorithms, being classical simulations, cannot natively leverage crucial quantum effects such as entanglement to efficiently represent the complex, nonlocal correlations found in such systems. These challenges limit the standalone utility of QML in scientific applications today. Instead, hybrid quantum-classical models provide a promising compromise, where quantum submodules work together with classical learning pipelines to improve expressivity, data efficiency, and physical fidelity. In quantum chemistry, this hybrid paradigm has proven feasible, notably through quantum mechanical/molecular mechanical coupling (40, 41), where classical force fields are augmented with quantum corrections. Within such frameworks, techniques such as quantum-selected configuration interaction (42) have been used to enhance accuracy while keeping the quantum resource requirements tractable.

In the broader landscape of quantum computational fluid dynamics, progress has been made toward developing full quantum solvers for nonlinear PDEs. Recent works by Liu et al. (43) and Sanavio et al. (44, 45) have successfully applied Carleman linearization to the lattice Boltzmann equation, offering a promising pathway for simulating fluid flows at moderate Reynolds numbers. These approaches, typically using algorithms such as Harrow-Hassidim-Lloyd (HHL) (46), promise exponential speedups but generally necessitate deep circuits and fault-tolerant hardware.

Quantum-enhanced machine learning (QEML) combines the representational richness of quantum models with the scalability of classical learning. By leveraging uniquely quantum properties such as superposition and entanglement, QEML can explore richer feature spaces and capture complex correlations that are challenging for purely classical models. Recent successes in quantum-enhanced drug discovery (37), where hybrid quantum-classical generative models have produced experimentally validated candidates rivaling state-of-the-art classical methods, demonstrate the practical potential of QEML even before full quantum advantage is achieved. Despite these strengths, practical barriers remain. QEML pipelines require repeated quantum-classical communication during training and rely on costly quantum data-embedding and measurement steps, which slow computation and limit accessibility across research institutions.
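The "repeated quantum-classical communication" in such pipelines typically takes the form of a loop: the quantum device evaluates an expectation value, a classical optimizer updates the circuit parameters, and the cycle repeats. The sketch below is a minimal single-qubit example, simulated classically; in a real QEML pipeline, `expval_z` would stand for a batch of circuit executions on quantum hardware. The parameter-shift gradient rule used here is a standard QML technique, not this paper's specific method.

```python
import math

def expval_z(theta):
    # Classically simulated single-qubit "circuit": apply RY(theta) to |0>,
    # then measure <Z>. For this circuit, <Z> = cos(theta).
    return math.cos(theta)

def parameter_shift_grad(theta):
    # Parameter-shift rule: the exact gradient of <Z> with respect to theta,
    # obtained from two extra circuit evaluations (no finite differences).
    return 0.5 * (expval_z(theta + math.pi / 2) - expval_z(theta - math.pi / 2))

theta, lr = 0.1, 0.4
for _ in range(100):
    theta -= lr * parameter_shift_grad(theta)   # classical optimizer step

# The loss <Z> is minimized at theta = pi, where <Z> = -1.
print(f"theta = {theta:.4f}, <Z> = {expval_z(theta):.4f}")
```

Each gradient step costs two extra circuit evaluations per parameter, on top of the data-embedding and measurement overhead; this per-iteration quantum cost is exactly the practical barrier the paragraph above identifies.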

How a chemical reaction triggers brain inflammation in Alzheimer’s disease

The brain has its own immune system, which detects threats and mounts a defense. A growing body of evidence has shown that in Alzheimer’s disease, those immune cells are chronically overactivated, causing inflammation that damages the connections between brain cells.

Now, in a preclinical study using human Alzheimer’s brain cells, scientists at Scripps Research have identified a molecular switch—and potential drug target—responsible for driving that chronic inflammation.

The research, published in Cell Chemical Biology on April 23, 2026, centers on a protein called STING, which normally functions as part of the immune system’s early-warning system. In the brains of people with Alzheimer’s, the team discovered that STING undergoes a chemical modification known as S-nitrosylation (or SNO, a reaction involving sulfur, oxygen and nitrogen) that promotes its overactivation. Blocking this chemical change to STING in a mouse model of the disease decreased neuroinflammation.
