A New Model for Particle Charging

As flour, plastic dust, and other powdery particles get blown through factory ducts, they become charged through contact with each other and with duct walls. To avoid discharges that could ignite explosions, ducts are metallic and grounded. Still, particles remain an explosive threat if they reach a silo while charged. The microphysics of contact charging is an active area of research, as is the quest to understand the phenomenon as it plays out on larger scales in dust storms, volcanic plumes, and processing plants. Now Holger Grosshans of the German National Metrology Institute in Braunschweig and his collaborators have developed a contact-charging model that can cope with particles and walls made of different materials [1]. What’s more, the model is compatible with computational approaches used to analyze large-scale turbulent flows.

The model treats particles’ acquisition of electric charge from each other and their surroundings as a stochastic process—one that involves some randomness. The resulting charge distributions depend on the amount of charge transferred per impact and other nanoscale parameters that would be tedious to measure for each system. Fortunately, Grosshans and his collaborators found that if they determined all parameters for one system in a controlled experiment, they could readily adjust the parameters to suit other systems.
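The paper's actual parameterization is not reproduced in this summary, but the stochastic idea can be illustrated with a toy model: each impact transfers a small, randomly drawn amount of charge whose mean depends on the material pair and shrinks as the particle approaches a saturation charge. All names and numbers below are hypothetical placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative material-pair parameters (hypothetical values, not from the paper):
# mean charge transferred per impact [fC] and an equilibrium (saturation) charge [fC].
PAIR_PARAMS = {
    ("polymer", "steel_wall"): {"dq_mean": 0.5, "q_eq": 50.0},
    ("polymer", "polymer"):    {"dq_mean": 0.1, "q_eq": 10.0},
}

def impact_charge(q, pair, noise=0.3):
    """Return the particle charge after one impact.

    The transferred charge is drawn from a normal distribution whose mean
    shrinks as the particle approaches its equilibrium charge, so repeated
    impacts drive the charge toward q_eq rather than growing without bound.
    """
    p = PAIR_PARAMS[pair]
    mean_dq = p["dq_mean"] * (1.0 - q / p["q_eq"])
    dq = rng.normal(mean_dq, noise * abs(p["dq_mean"]))
    return q + dq

# Charge one particle through 200 wall impacts.
q = 0.0
for _ in range(200):
    q = impact_charge(q, ("polymer", "steel_wall"))
print(f"charge after 200 impacts: {q:.1f} fC")
```

Because each impact only requires a single random draw and a couple of arithmetic operations, a per-collision update of this kind is cheap enough to piggyback on an existing particle-tracking simulation, consistent with the small overhead reported below.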

To test their model, the researchers coupled it to a popular fluid-dynamics solver and simulated 300,000 polymer microparticles stirred by a turbulent flow while confined between four walls. The combination reproduced the complex charging patterns observed in lab experiments—and it did so efficiently: The charging model added less than 0.01% to the simulation’s computational cost.

A Laser Built for Nuclear Timekeeping

Researchers have designed and demonstrated an ultraviolet laser that removes a major bottleneck in the development of a nuclear clock.

Whereas ordinary atomic clocks keep time using transitions of electrons in atoms, a prospective nuclear clock would harness a transition between states of the nucleus. Compared with electronic transitions, nuclear ones are much less sensitive to environmental disturbances, which would potentially give nuclear clocks unprecedented precision and stability. Such devices could improve GPS systems and enable more sensitive probes of fundamental physics. The main hurdle has been that nuclear transitions are extremely difficult to drive controllably using existing laser technology. Now Qi Xiao at Tsinghua University in China and colleagues have proposed and realized an intense single-frequency ultraviolet laser that can achieve such driving for thorium-229 nuclei [1, 2]. Beyond timekeeping, the team’s laser platform could find uses across quantum information science, condensed-matter physics, and high-resolution spectroscopy.

For most nuclear transitions, the energy difference between the two states lies in the kilo-electron-volt to mega-electron-volt range. Consequently, such transitions are inaccessible to today’s high-precision lasers, which typically deliver photons with energies of only a few electron volts. A long-known exception is the transition between the ground state and the first excited state of thorium-229 nuclei. Indirect measurements over the past 50 years have gradually pinned down that transition’s energy to only about 8.4 eV. As a result, this transition is being actively investigated as a candidate for a nuclear clock.
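To see why this energy sits at the edge of laser technology, note that photon energy and wavelength are related by λ = hc/E. With hc ≈ 1240 eV·nm, a roughly 8.4 eV transition corresponds to a wavelength near 148 nm, deep in the vacuum ultraviolet:

$$
\lambda = \frac{hc}{E} \approx \frac{1240\ \text{eV}\cdot\text{nm}}{8.4\ \text{eV}} \approx 148\ \text{nm}.
$$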

Water-based electrolyte helps create safer and long-lasting Zn-Mn batteries

Many countries worldwide are increasingly investing in new infrastructure that enables the production of electricity from renewable energy sources, particularly wind and sunlight. To make the most of these energy sources, one must also be able to reliably store the excess energy generated during periods of intense sunlight or wind so that it can be used later, when it is needed.

One promising type of battery for this purpose is based on zinc-manganese (Zn-Mn) chemistry and uses an aqueous (i.e., water-based) electrolyte instead of flammable organic electrolytes. These batteries rely on processes known as electrodeposition and dissolution, via which solid materials form on and dissolve from the electrodes as the battery charges and discharges.

In Zn-Mn batteries, Zn serves as the anode (i.e., the electrode that releases electrons) and manganese dioxide (MnO₂) as the cathode (i.e., the electrode that accepts electrons). A key chemical reaction underlying their operation, known as the MnO₂/Mn²⁺ conversion reaction, typically occurs only under acidic conditions.
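For context, the textbook half-reactions usually written for this chemistry (standard values, not figures from the study) show why acidic conditions matter and where the cell voltage comes from:

$$\text{Cathode:}\quad \mathrm{MnO_2 + 4H^+ + 2e^- \rightleftharpoons Mn^{2+} + 2H_2O}, \qquad E^\circ \approx +1.23\ \text{V}$$
$$\text{Anode:}\quad \mathrm{Zn^{2+} + 2e^- \rightleftharpoons Zn}, \qquad E^\circ \approx -0.76\ \text{V}$$
$$E^\circ_\text{cell} \approx 1.23\ \text{V} - (-0.76\ \text{V}) \approx 2.0\ \text{V}$$

The cathode reaction consumes H⁺, which is why the MnO₂/Mn²⁺ conversion normally requires an acidic electrolyte.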

Bio-inspired chip helps robots and self-driving cars react faster to movement

Robots and self-driving cars could soon benefit from a new kind of brain-inspired hardware that is reported to detect movement and react faster than a human. A new study published in the journal Nature Communications details how an international team built a neuromorphic temporal-attention hardware system to speed up automated driving decisions.

The problem with current robotic vision and self-driving vehicles is a significant delay in processing what they see. While today’s top AI programs can recognize objects accurately, the calculations are so complex that they can take up to half a second to complete. That may not sound like much, but at highway speeds even a one-second delay means a car travels about 27 meters before it even begins to react. That is far too slow a reaction time.
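The 27-meter figure follows from simple kinematics if "highway speed" is taken to be about 100 km/h (an assumption for illustration; the article does not state the speed):

$$v \approx 100\ \tfrac{\text{km}}{\text{h}} \approx 27.8\ \tfrac{\text{m}}{\text{s}}, \qquad d = v\,t \approx 27.8\ \tfrac{\text{m}}{\text{s}} \times 1\ \text{s} \approx 28\ \text{m},$$

and even the half-second processing delay alone corresponds to roughly 14 m of travel before any response begins.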

To solve this problem, the team worked on a hardware solution rather than tinkering with software, modeling it on how human vision works. When we view a situation, our visual system doesn’t analyze every detail at once. It first detects changes in brightness and movement, then processes the more complex details later.
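As a software caricature of that change-first idea (purely illustrative, not the team's hardware), a minimal sketch might flag only the pixels whose brightness changed between frames and hand just those regions to a slower, detailed recognition stage:

```python
import numpy as np

def changed_regions(prev_frame, frame, threshold=0.1):
    """Flag pixels whose brightness changed noticeably between two frames.

    A cheap difference pass marks where motion happened; only those regions
    would then be passed to a slower, detailed recognition stage.
    """
    diff = np.abs(frame.astype(float) - prev_frame.astype(float)) / 255.0
    return diff > threshold  # boolean mask of "something moved here"

# Two toy 4x4 grayscale frames: one bright patch moves right by one pixel.
prev_frame = np.zeros((4, 4), dtype=np.uint8)
frame = np.zeros((4, 4), dtype=np.uint8)
prev_frame[1, 1] = 200
frame[1, 2] = 200

mask = changed_regions(prev_frame, frame)
print(np.argwhere(mask))  # pixel coordinates to forward for detailed analysis
```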

Only humans have chins: Study shows it’s an evolutionary accident

Dashiell Hammett mentioned Sam Spade’s jutting chin in the opening sentence of his novel, “The Maltese Falcon.” Spade’s chin was among the facial features Hammett used to describe his fictional detective’s appearance, but starting with that distinctive chin was—at least from an evolutionary perspective—an unintentional redundancy, since every chin is distinctive in the sense that humans are the only primates to possess that physical characteristic.

Chimpanzees, humans’ closest living relatives, do not have a chin. Neither did Neanderthals, Denisovans, or any other extinct human species. Humans, it turns out, have a unique capacity to “take it on the chin” because we’re uniquely in possession of that physical feature. That exclusive nature makes the chin well suited for identifying Homo sapiens in the fossil record.

In simplest terms, a chin is a bony projection of the lower jaw. So why is it there? How and why did it evolve?

New experiments suggest Earth’s core contains up to 45 oceans’ worth of hydrogen

Scientists have long known that Earth’s core is mostly made of iron, but the density is not high enough for it to be pure iron, meaning lighter elements exist in the core, as well. In particular, it’s suspected to be a major reservoir of hydrogen. A new study, published in Nature Communications, supports this idea with results suggesting the core contains up to 45 oceans’ worth of hydrogen. These results also challenge the idea that most of Earth’s water was delivered by comets early on.
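To put "45 oceans' worth" in perspective, a back-of-the-envelope estimate using commonly quoted figures (not numbers from the paper): the oceans hold roughly 1.4 × 10²¹ kg of water, of which hydrogen makes up about 2/18 by mass, so

$$m_{\mathrm{H,\,ocean}} \approx \tfrac{2}{18}\times 1.4\times10^{21}\ \text{kg} \approx 1.6\times10^{20}\ \text{kg}, \qquad 45\, m_{\mathrm{H,\,ocean}} \approx 7\times10^{21}\ \text{kg},$$

which is only a few tenths of a percent of the core's roughly 1.9 × 10²⁴ kg mass.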

Because of the extreme conditions in Earth’s core and its distance from the surface, analyzing its composition presents difficulties. Additionally, many techniques are inadequate for resolving hydrogen because it is the lightest and smallest element. Earlier estimates relied on indirect methods, such as inferring hydrogen composition from lattice expansion in iron hydrides. These difficulties have led to highly uncertain estimates of hydrogen in the core, spanning four orders of magnitude.

The team involved in the new study took a different approach, using laser-heated diamond anvil cells to simulate high-pressure, high-temperature core conditions, up to 111 GPa and around 5,100 K. The team placed core-like iron samples and hydrous silicate glass, representing Earth’s early magma oceans, in the diamond anvil cells to induce melting, similar to conditions in the core.

The origin of magic numbers: Why some atomic nuclei are unusually stable

For the first time, physicists have developed a model that explains the origins of unusually stable magic nuclei based directly on the interactions between their protons and neutrons. Published in Physical Review Letters, the research could help scientists better understand the exotic properties of heavy atomic nuclei and the fundamental forces that hold them together.

While every chemical element is defined by a fixed number of protons in its atomic nucleus, the number of neutrons it contains is far less constrained. For almost every known element, there are at least two different nuclear configurations, or isotopes, which vary only in their number of neutrons.

However, if the number of protons and neutrons becomes too unbalanced in either direction, the nucleus becomes unstable. These radioactive nuclei grow increasingly rare as the imbalance increases, and heavier elements tend to have fewer stable isotopes. Yet for certain specific numbers of protons and neutrons (collectively known as “nucleons”), some isotopes are found to be exceptionally stable, for reasons that physicists have struggled to fully explain.
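The article does not list the magic numbers themselves; the established values are 2, 8, 20, 28, 50, 82, and 126, and nuclei that are magic in both protons and neutrons (such as helium-4, oxygen-16, and lead-208) are especially tightly bound. A tiny Python sketch makes the bookkeeping explicit:

```python
# Shell-model magic numbers for protons (Z) and neutrons (N).
MAGIC = {2, 8, 20, 28, 50, 82, 126}

def magicity(Z, N):
    """Classify a nucleus by how many of its nucleon counts are magic."""
    hits = (Z in MAGIC) + (N in MAGIC)
    return {0: "not magic", 1: "singly magic", 2: "doubly magic"}[hits]

# A few well-known examples.
for name, Z, N in [("He-4", 2, 2), ("O-16", 8, 8), ("Ca-48", 20, 28),
                   ("Sn-132", 50, 82), ("Pb-208", 82, 126)]:
    print(name, magicity(Z, N))
```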

Ammonia leaks can be spotted in under two seconds using new alveoli-inspired droplet sensor

Researchers from Guangxi University, China, have developed a new gas sensor that detects ammonia with a record speed of 1.4 seconds. The sensor’s design mimics the structure of alveoli—the tiny air sacs in human lungs—while relying on a triboelectric nanogenerator (TENG) that converts mechanical energy into electrical energy. The sensor uses a process that is driven by A-droplets, which are tiny water droplets containing a trapped air bubble. These droplets exploit ammonia’s affinity for water to rapidly capture NH₃ when it is present.

When an ammonia-laden droplet falls onto the sensor, its mechanical impact completes an electrical circuit, generating signals that are converted into accurate gas measurements at a speed that exceeds that of existing ammonia gas sensors.

To take detection precision a step further, the team integrated an AI model that analyzes electrical signals and converts them into time-frequency images. After training on these images, the system classified ammonia into five concentration levels (0–200 ppm), achieving up to 98.4% detection accuracy.
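The article gives no implementation details, but the described pipeline (electrical signal → time-frequency image → classifier over five concentration bins) can be sketched generically. Everything below, including the bin edges, the nearest-centroid classifier, and the synthetic signals, is a hypothetical stand-in rather than the team's model:

```python
import numpy as np
from scipy.signal import spectrogram

def to_tf_image(voltage, fs=1000.0):
    """Turn a 1-D sensor voltage trace into a time-frequency image."""
    _, _, Sxx = spectrogram(voltage, fs=fs, nperseg=64)
    return np.log1p(Sxx)  # log scaling keeps weak features visible

def nearest_centroid_predict(image, centroids):
    """Assign an image to the concentration bin with the closest mean image."""
    flat = image.ravel()
    dists = [np.linalg.norm(flat - c.ravel()) for c in centroids]
    return int(np.argmin(dists))

# Hypothetical concentration bins spanning the 0-200 ppm range from the article.
BINS_PPM = [(0, 40), (40, 80), (80, 120), (120, 160), (160, 200)]

# Synthetic stand-ins for trained class centroids and a new measurement.
rng = np.random.default_rng(1)
signal = rng.normal(size=2000)                         # placeholder voltage trace
centroids = [to_tf_image(rng.normal(size=2000)) for _ in BINS_PPM]
bin_index = nearest_centroid_predict(to_tf_image(signal), centroids)
print("predicted bin:", BINS_PPM[bin_index], "ppm")
```

In practice the reported 98.4% accuracy presumably comes from a trained neural network on real measurements; the nearest-centroid step here just stands in for whatever classifier consumes the time-frequency images.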

A familiar magnet gets stranger: Why cobalt’s topological states could matter for spintronics

The element cobalt has long been considered a typical ferromagnet with no secrets left to reveal. However, an international team led by HZB researcher Dr. Jaime Sánchez-Barriga has now uncovered complex topological features in its electronic structure. Spin-resolved measurements of the band structure (spin-ARPES) at BESSY II revealed entangled energy bands that cross each other along extended paths in specific crystallographic directions, even at room temperature. As a result, cobalt can be considered a highly tunable and unexpectedly rich topological platform, opening new perspectives for exploiting magnetic topological states in future information technologies.

The findings are published in the journal Communications Materials.

Cobalt is an elemental ferromagnet, and its properties and crystal structure have long been known. However, the team has now discovered that cobalt hosts an unexpectedly rich topological electronic structure that remains robust at room temperature, revealing a surprising new level of quantum complexity in this material.

Electronic friction can be tuned and switched off

Researchers in China have isolated the effects of electronic friction, showing for the first time how the subtle drag force it imparts at sliding interfaces can be controlled. They demonstrate that it can be tuned by applying a voltage, or switched off entirely simply by applying mechanical pressure. The results, published in Physical Review X, could inform new designs that allow engineers to fine-tune the drag forces materials experience as they slide over each other.

In engineering, friction causes materials to wear and degrade over time, and also causes useful energy to be wasted as heat. While this problem can be mitigated through lubricants and smoother surfaces, friction can also arise from deeper, more subtle effects.

Among these is an effect that can occur at metallic or chemically active surfaces as they slide past one another. In these cases, atomic nuclei in one surface can transfer some of their energy to electrons in the other surface, exciting them to higher energy levels. This energy transfer manifests as a drag force that increases with sliding velocity: an effect known as “electronic friction.”
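In the simplest and most common modeling convention (a generic description, not necessarily the formulation used in the new paper), electronic friction appears as a drag force proportional to the sliding velocity,

$$F_{\mathrm{el}} = -\eta\, v,$$

where the electronic friction coefficient η measures how strongly the moving atoms couple to electron-hole excitations in the neighboring surface. In this picture, tuning the interface with a voltage or switching the effect off with pressure amounts to changing, or zeroing, η.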
