
Earth’s Magnetic Field as Dark-Matter Sensor

One candidate for dark matter is a subatomic particle carrying a tiny electric charge many times smaller than that of the electron. This so-called millicharged dark matter would presumably interact with Earth’s magnetic field, generating potentially observable time variations in the magnetic field on Earth’s surface. A new study of archived data looked for this signal but came up empty [1]. The research has thus placed strict limits on the properties that a millicharged dark-matter particle could have if it has a small mass (in the range of 10⁻¹⁸ to 10⁻¹⁵ eV/c²).

Dark matter can’t have a typical electric charge, as it would interact too strongly with normal matter. But a small charge is possible and could produce features in line with dark-matter models. Astrophysicists have looked for evidence of millicharged dark matter in stellar evolution data, as such particles could cause stars to cool faster than expected. No such signal has been seen, ruling out a large portion of millicharged-dark-matter parameter space.

Lei Wu from Nanjing Normal University in China and colleagues have explored another potential signal in the geomagnetic field. According to the team’s calculations, low-mass millicharged particles could annihilate each other in the presence of the planet’s background magnetic field, producing an effective electric current that would generate its own magnetic field. This dark-matter-induced field would be small (roughly a million times weaker than Earth’s field), but it might be detectable owing to its peculiar time variation (at frequencies below 1 Hz). The researchers failed to find such a signal in previously collected geomagnetic observations. The absence rules out low-mass dark-matter charges over a wide range extending down to 10⁻³⁰ times the electron charge. Such a small charge may seem implausible, but “nature sometimes surprises us,” Wu says.
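
The search itself amounts to hunting for a narrow, persistent spectral line in long magnetometer records. As a rough illustration of that generic idea (not the study’s actual pipeline; the sampling rate, record length, synthetic noise, and detection threshold below are all assumptions made for this example), one can scan the power spectrum of a geomagnetic time series for sub-hertz peaks:

```python
# Illustrative sketch only, not the analysis pipeline of the study: scan a
# magnetometer time series for a narrow sub-hertz spectral line of the generic
# kind described above. The sampling rate, record length, synthetic noise, and
# detection threshold are all assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)
fs = 1.0                                  # assume one sample per second
t = np.arange(0, 10 * 86400, 1.0 / fs)    # ten days of data (assumption)
b = rng.normal(0.0, 1.0, t.size)          # stand-in for ordinary field variation (nT)

# Power spectrum; with 1 Hz sampling, every resolvable frequency lies below the
# 0.5 Hz Nyquist limit, i.e. inside the sub-hertz band where the signal would sit.
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
power = np.abs(np.fft.rfft(b)) ** 2 / t.size

threshold = 30.0 * np.median(power[1:])   # crude line-detection threshold (assumption)
lines = freqs[1:][power[1:] > threshold]
print("candidate narrow lines (Hz):", lines)
```

With noise alone, the candidate list typically comes back empty, which is the same qualitative outcome as the null result reported by the team.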

‘Goldilocks size’ rhodium clusters advance reusable heterogeneous catalysts for hydroformylation

Recent research has demonstrated that a rhodium (Rh) cluster of an optimal, intermediate size—neither too small nor too large—exhibits the highest catalytic activity in hydroformylation reactions. Echoing the idea of a “just right” balance, the study identifies this so-called “Goldilocks size” as crucial for maximizing catalyst efficiency. The study is published in ACS Catalysis, where it was featured as the cover story.

Led by Professor Kwangjin An from the School of Energy and Chemical Engineering at UNIST, in collaboration with Professor Jeong Woo Han from Seoul National University, the research demonstrates that when Rh exists as a cluster of about 10 atoms, it outperforms both single-atom and nanoparticle forms in reaction speed and activity.

Hydroformylation is a vital industrial process used for producing raw materials for plastics, detergents, and other chemicals. Currently, many Rh catalysts are homogeneous—dissolved in liquids—which complicates separation and recycling. This challenge has driven efforts to develop solid, heterogeneous Rh catalysts that are easier to recover and reuse.
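
For readers unfamiliar with the reaction: hydroformylation adds carbon monoxide and hydrogen across the double bond of an alkene to give an aldehyde. A generic scheme (not specific to the catalysts in this study) is

$$\mathrm{RCH{=}CH_2 + CO + H_2} \;\longrightarrow\; \mathrm{RCH_2CH_2CHO}\ \text{(linear)} \quad\text{or}\quad \mathrm{RCH(CHO)CH_3}\ \text{(branched)}.$$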

Thinking on different wavelengths: New approach to circuit design introduces next-level quantum computing

Quantum computing represents a potential breakthrough technology that could far surpass the technical limitations of modern-day computing systems for some tasks. However, putting together practical, large-scale quantum computers remains challenging, particularly because of the complex and delicate techniques involved.

In some quantum computing systems, single ions (charged atoms such as strontium) are trapped and exposed to electromagnetic fields, including laser light, to produce the effects used to perform calculations. Such circuits require many different wavelengths of light to be introduced at different positions in the device, meaning that numerous laser beams have to be properly arranged and delivered to their designated areas. In these cases, the practical difficulty of routing many different beams of light within a limited space becomes a real obstacle.

To address this, researchers from The University of Osaka investigated new ways to deliver light within a limited space. Their work produced a power-efficient nanophotonic circuit in which optical fibers attached to waveguides deliver six different laser beams to their destinations. The findings have been published in APL Quantum.

A strange in-between state of matter is finally observed

When materials become just one atom thick, melting no longer follows the familiar rules. Instead of jumping straight from solid to liquid, an unusual in-between state emerges, where atomic positions loosen like a liquid but still keep some solid-like order. Scientists at the University of Vienna have now captured this elusive “hexatic” phase in real time by filming an ultra-thin silver iodide crystal as it melted inside a protective graphene sandwich.
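
A standard way to quantify the order that survives in such a phase, drawn from the general theory of two-dimensional melting rather than from this specific experiment, is the bond-orientational order parameter

$$\psi_6(j) = \frac{1}{n_j} \sum_{k=1}^{n_j} e^{\,i\,6\,\theta_{jk}},$$

where the sum runs over the $n_j$ nearest neighbors of atom $j$ and $\theta_{jk}$ is the angle of the bond to neighbor $k$. In a hexatic phase this sixfold orientational order remains correlated over long distances even though the regular lattice positions, and with them the translational order, have been lost.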

Physicists built a perfect conductor from ultracold atoms

Scientists have built a quantum “wire” where atoms collide endlessly—but energy and motion never slow down. Researchers at TU Wien have discovered a quantum system where energy and mass move with perfect efficiency. In an ultracold gas of atoms confined to a single line, countless collisions occur—but nothing slows down. Instead of diffusing like heat in metal, motion travels cleanly and undiminished, much as in a Newton’s cradle. The finding reveals a striking form of transport that breaks the usual rules of resistance.

In everyday physics, transport describes how things move from one place to another. Electric charge flows through wires, heat spreads through metal, and water travels through pipes. In each case, scientists can measure how easily charge, energy, or mass moves through a material. Under normal conditions, that movement is slowed by friction and collisions, creating resistance that weakens or eventually stops the flow.

Researchers at TU Wien have now demonstrated a rare exception. In a carefully designed experiment, they observed a physical system in which transport does not degrade at all.
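
A textbook way to phrase the distinction (a general criterion, not a figure from the study): in ordinary diffusive transport a localized disturbance spreads only as the square root of time, whereas in loss-free ballistic transport it propagates at a constant speed,

$$\langle x^2(t)\rangle_{\mathrm{diffusive}} \simeq 2Dt \qquad \text{versus} \qquad \langle x^2(t)\rangle_{\mathrm{ballistic}} \propto t^2 .$$

The behavior described above corresponds to the second case: collisions redistribute the atoms, but the flow of energy and mass is not degraded.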

Swimming in a shared medium makes particles synchronize without touching

Several years ago, scientists discovered that a single microscopic particle could rock back and forth on its own under a steady electric field. The result was curious, but lonely. Now, Northwestern University engineers have discovered what happens when many of those particles come together. The answer looks less like ordinary physics and more like mystifying, flawlessly timed choreography.

The study appears in the journal Nature Communications.

In the work, the team found that groups of tiny particles suspended in liquid oscillate together, keeping time as though they somehow sense one another’s motion. Nearby particles fall into sync, forming clusters that appear to sway in unison—rocking back and forth with striking coordination.
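
The collective rhythm described here is reminiscent of classic synchronization physics. As a toy illustration only, the Kuramoto model shows how oscillators coupled through a shared mean field fall into step; it is not the model used in the Northwestern study, and every parameter below is arbitrary:

```python
# Toy illustration only: the Kuramoto model, a classic textbook picture of how
# oscillators coupled through a shared mean field fall into step. It is not the
# model used in the Northwestern study, and every parameter here is arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n, coupling, dt, steps = 200, 1.5, 0.01, 5000
omega = rng.normal(0.0, 0.3, n)           # each oscillator's natural frequency
theta = rng.uniform(0.0, 2.0 * np.pi, n)  # random initial phases

for _ in range(steps):
    mean_field = np.mean(np.exp(1j * theta))   # the shared "medium" every oscillator feels
    r, psi = np.abs(mean_field), np.angle(mean_field)
    theta += dt * (omega + coupling * r * np.sin(psi - theta))

# r close to 1 means the population has locked into a common rhythm
print(f"final phase coherence r = {np.abs(np.mean(np.exp(1j * theta))):.2f}")
```

When the coupling is strong enough, the coherence value climbs from near zero toward one, the same qualitative signature of oscillators keeping time together.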

Collaboration of elementary particles: How teamwork among photon pairs overcomes quantum errors

Some things are easier to achieve if you’re not alone. As researchers from the University of Rostock, Germany, have shown, this very human insight also applies to the most fundamental building blocks of nature.

At its very core, quantum mechanics postulates that everything is made of elementary particles, which cannot be split up into even smaller units. This made Ph.D. candidate Vera Neef, first author of the recent publication “Pairing particles into holonomies,” wonder: “What can two particles accomplish only by working as a team? Can they jointly achieve something that is impossible for one particle alone?”

AI makes quantum field theories computable

An old puzzle in particle physics has been solved: How can quantum field theories be best formulated on a lattice to optimally simulate them on a computer? The answer comes from AI.

Quantum field theories are the foundation of modern physics. They tell us how particles behave and how their interactions can be described. However, many complicated questions in particle physics cannot be answered simply with pen and paper, but only through extremely complex quantum field theory computer simulations.

This presents exceptionally complex problems: Quantum field theories can be formulated in different ways on a computer. In principle, all of them yield the same physical predictions—but in radically different ways. Some variants are computationally unusable, inaccurate, or inefficient, while others are surprisingly practical. For decades, researchers have been searching for the optimal way to embed quantum field theories in computer simulations. Now, a team from TU Wien, together with teams from the U.S. and Switzerland, has shown that artificial intelligence can bring about tremendous progress in this area. Their paper is published in Physical Review Letters.
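
To make the underlying task concrete: putting a field theory on a computer typically means discretizing it on a lattice and sampling field configurations, for example with a Metropolis algorithm. The sketch below is a generic, deliberately simple version of that idea for a scalar field in one dimension; it is not the discretization, action, or AI-optimized scheme from the paper, and the lattice size and couplings are arbitrary.

```python
# Generic sketch of the basic idea of a lattice formulation: a scalar field on a
# one-dimensional periodic lattice, sampled with a simple Metropolis algorithm.
# This is deliberately minimal and is NOT the discretization, action, or
# AI-optimized scheme from the paper; lattice size and couplings are arbitrary.
import numpy as np

rng = np.random.default_rng(2)
n_sites, mass2, lam = 64, 0.5, 0.1        # lattice size and couplings (arbitrary)
phi = np.zeros(n_sites)                   # the field, one value per lattice site

def action(field):
    # Discretized Euclidean action: nearest-neighbor kinetic term + potential
    kinetic = 0.5 * np.sum((np.roll(field, -1) - field) ** 2)
    potential = np.sum(0.5 * mass2 * field ** 2 + lam * field ** 4)
    return kinetic + potential

for sweep in range(500):                  # Monte Carlo sweeps over the lattice
    for i in range(n_sites):
        old, s_old = phi[i], action(phi)
        phi[i] = old + rng.normal(0.0, 0.5)            # propose a local change
        if rng.random() > np.exp(min(0.0, s_old - action(phi))):
            phi[i] = old                               # reject: keep the old value

print(f"<phi^2> in the final configuration: {np.mean(phi ** 2):.3f}")
```

The “different formulations” mentioned above correspond to different choices of discretized action that agree in the continuum limit but differ greatly in how well they can be simulated.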

From fleeting to stable: Scientists uncover recipe for new carbon dioxide-based energetic materials

When materials are compressed, their atoms are forced into unusual arrangements that do not normally exist under everyday conditions. These configurations are often fleeting: when the pressure is released, the atoms typically relax back to a stable low-pressure state. Only a few very specific materials, like diamond, retain their high-pressure structure after returning to room temperature and atmospheric pressure.

But locking those atomic arrangements in place under ambient conditions could create new classes of useful materials with a wide range of potential applications. One particularly compelling example is energetic materials, which are useful for propellants and explosives.

In a study published in Communications Chemistry, researchers at Lawrence Livermore National Laboratory (LLNL) identified a first-of-its-kind carbon dioxide-equivalent polymer that can be recovered from high-pressure conditions.

Watching atoms roam before they decay

Together with an international team, researchers from the Molecular Physics Department at the Fritz Haber Institute have revealed how atoms rearrange themselves before releasing low-energy electrons in a decay process initiated by X-ray irradiation. For the first time, they have gained detailed insights into the timing of the process—shedding light on related radiation damage mechanisms. Their research is published in the Journal of the American Chemical Society.

High-energy radiation, for example in the X-ray range, can cause damage to our cells. This is because energetic radiation can excite atoms and molecules, which then often decay—meaning that biomolecules are destroyed and larger biological units can lose their function. There is a wide variety of such decay processes, and studying them is of great interest in order to better understand and avert radiation damage.

In the study, researchers from the Molecular Physics Department, together with international partners, investigated a radiation-induced decay process that plays a key role in radiation chemistry and biological damage processes: electron-transfer-mediated decay (ETMD). In this process, one atom is excited by irradiation. Afterward, this atom relaxes by stealing an electron from a neighbor, while the released energy ionizes yet another nearby atom.
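
Written schematically, and assuming (as is typical for ETMD) that the irradiated atom, labeled A here, is left ionized while B and C are its neighbors, the sequence reads

$$\mathrm{A^{+} + B + C} \;\longrightarrow\; \mathrm{A + B^{+} + C^{+} + e^{-}_{slow}},$$

where the electron transferred from B neutralizes A, and the energy released in that step ionizes C, ejecting the low-energy electron mentioned above.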
