
Silicon quantum processor detects single-qubit errors while preserving entanglement

Quantum computers are alternative computing devices that process information by leveraging quantum mechanical effects, such as entanglement between particles. Entanglement establishes a link between particles that allows them to share states in such a way that measuring one particle instantly affects the others, irrespective of the distance between them.

Quantum computers could, in principle, outperform classical computers in some optimization and computational tasks. However, they are also known to be highly sensitive to environmental disturbances (i.e., noise), which can cause quantum errors and adversely affect computations.

Researchers at the International Quantum Academy, Southern University of Science and Technology, and Hefei National Laboratory have developed a new approach to detect these errors in a silicon-based quantum processor. This error detection strategy, presented in a paper published in Nature Electronics, was found to successfully detect quantum errors in silicon qubits, while also preserving entanglement after their detection.

Using light to probe fractional charges in a fractional Chern insulator

In some quantum materials, which are materials governed by quantum mechanical effects, interactions between charged particles (i.e., electrons) can prompt the creation of quasiparticles called anyons, which carry only a fraction of an electron’s charge (i.e., fractional charge) and obey fractional quantum statistics.

A well-known phenomenon characterized by the emergence of anyons is the so-called fractional quantum Hall effect (FQHE). This effect can emerge in two-dimensional (2D) electron gases under strong magnetic fields and is marked by quantum states in which electrons strongly interact with each other.

Recent studies showed that a similar effect, known as the fractional quantum anomalous Hall (FQAH) effect, can also arise in the absence of magnetic fields, in quantum phases of matter known as fractional Chern insulators (FCIs). The FQAH effect was realized for the first time using bilayer twisted molybdenum ditelluride (tMoTe₂)—a moiré superlattice that has a characteristic lattice pattern and a slight twist angle between constituent layers.

Amaterasu Particle That Broke Physics Has Finally Been Explained

A mysterious, extremely energetic particle, known as the Amaterasu particle, was detected arriving from a distant region of space. Scientists have now proposed explanations for its origin, potentially tracing it back to a starburst galaxy such as Messier 82.

Questions to inspire discussion.

Understanding Ultra-High Energy Cosmic Rays.

🔬 Q: What makes the Amaterasu particle exceptionally powerful? A: The Amaterasu particle detected in Utah in 2021 carries roughly 40 million times more energy than particles produced in the most powerful human-built accelerators—equivalent to the energy of a baseball traveling at 100 km/h compressed into a single subatomic particle—making it one of the most energetic particles ever detected.

Solving the Origin Mystery.

🎯 Q: Where did scientists determine the Amaterasu particle actually originated? A: A 2026 study by Max Planck Institute scientists using approximate Bayesian computation and 3D magnetic field simulations traced the particle’s origin to a starburst galaxy like Messier 82, located 12 million light-years away, rather than the initially suspected Local Void, a region with only six known galaxies.

Scientists Continue to Trace the Origin of the Mysterious “Amaterasu” Cosmic Ray Particle

When the Amaterasu particle entered Earth’s atmosphere, the TAP array in Utah recorded an energy level of more than 240 exa-electronvolts (EeV). Such particles are exceedingly rare and are thought to originate in some of the most extreme cosmic environments. At the time of its detection, scientists were not sure if it was a proton, a light atomic nucleus, or a heavy (iron) atomic nucleus. Research into its origin pointed toward the Local Void, a vast region of space adjacent to the Local Group that has few known galaxies or objects.
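As a back-of-envelope check on these figures, the 240 EeV energy can be converted to joules and compared with the kinetic energy of a baseball (a minimal sketch; the 0.145 kg baseball mass is an assumed regulation value, not from the article):

```python
# Back-of-envelope check: 240 EeV expressed in joules, and the speed a
# baseball would need to carry the same kinetic energy.
EV_TO_JOULES = 1.602176634e-19  # CODATA: joules per electronvolt

energy_eev = 240.0                                # particle energy in EeV
energy_joules = energy_eev * 1e18 * EV_TO_JOULES  # 1 EeV = 1e18 eV
print(f"240 EeV = {energy_joules:.1f} J")         # ~38.5 J

# Speed of a regulation baseball (assumed 0.145 kg) with the same energy,
# from KE = (1/2) m v^2:
mass_kg = 0.145
speed_ms = (2 * energy_joules / mass_kg) ** 0.5
print(f"Equivalent baseball speed: {speed_ms * 3.6:.0f} km/h")  # ~83 km/h
```

This lands in the same ballpark as the popular "baseball at 100 km/h" comparison: tens of joules of macroscopic kinetic energy packed into one subatomic particle.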

This posed a mystery for astronomers, as the region is largely devoid of sources capable of producing such energetic particles. Reconstructing the energy of cosmic-ray particles is already difficult, making the search for their sources using statistical models particularly challenging. Capel and Bourriche addressed this by combining advanced simulations with modern statistical methods (Approximate Bayesian Computation) to generate three-dimensional maps of cosmic-ray propagation and their interactions with magnetic fields in the Milky Way.
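Approximate Bayesian Computation is useful precisely when, as here, the forward simulation is easy but a likelihood function is intractable. The following is a generic rejection-sampling sketch of the idea with a toy one-parameter model—all names and the model itself are illustrative, not the authors' actual propagation pipeline:

```python
import random

def simulate_deflection(distance_mpc, rng):
    """Toy forward model (illustrative only): the simulated arrival
    deflection grows with source distance, plus Gaussian noise."""
    return 0.5 * distance_mpc + rng.gauss(0.0, 1.0)

def abc_rejection(observed, prior_low, prior_high, tolerance, n_draws, seed=0):
    """ABC rejection sampling: keep prior draws whose simulated summary
    statistic lands within `tolerance` of the observed one."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(prior_low, prior_high)   # draw from the prior
        simulated = simulate_deflection(theta, rng)  # run the forward model
        if abs(simulated - observed) < tolerance:    # compare to observation
            accepted.append(theta)
    return accepted  # samples approximating the posterior over theta

posterior = abc_rejection(observed=6.0, prior_low=0.0, prior_high=30.0,
                          tolerance=0.5, n_draws=20000)
print(f"{len(posterior)} accepted; posterior mean distance "
      f"~{sum(posterior) / len(posterior):.1f} Mpc")
```

The accepted draws approximate the posterior without ever evaluating a likelihood; the real analysis replaces the toy forward model with full 3D simulations of cosmic-ray propagation through galactic magnetic fields.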

The IceCube experiment is ready to uncover more secrets of the universe

The name “IceCube” not only serves as the title of the experiment, but also describes its appearance. Embedded in the transparent ice of the South Pole, a three-dimensional grid of more than 5,000 extremely sensitive light sensors forms a giant cube with a volume of one cubic kilometer. This unique arrangement serves as an observatory for detecting neutrinos, the most difficult elementary particles to detect.

To be detected, neutrinos must interact with matter, creating charged particles whose light can be measured. These light measurements can be used to determine information about the properties of the neutrinos. However, the probability of neutrinos interacting with matter is extremely low, so they usually pass through it without leaving a trace, which makes their detection considerably more difficult.

For this reason, a large detector volume is required to increase the probability of interaction, and state-of-the-art technology is crucial for detecting such rare interactions.
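An order-of-magnitude estimate makes the need for a cubic kilometer concrete. In a thin-target approximation, the interaction probability is P ≈ n·σ·L, where n is the target nucleon density, σ the neutrino-nucleon cross-section, and L the path length. The cross-section below is an assumed illustrative value (roughly TeV-scale); real values depend strongly on neutrino energy:

```python
# Order-of-magnitude sketch: probability that a neutrino interacts while
# crossing 1 km of ice. The cross-section is an assumed illustrative value.
AVOGADRO = 6.02214076e23      # nucleons per gram (A/Z ~ 1 matter)
ICE_DENSITY_G_CM3 = 0.92      # density of glacial ice, g/cm^3
CROSS_SECTION_CM2 = 1e-35     # assumed neutrino-nucleon cross-section

nucleons_per_cm3 = ICE_DENSITY_G_CM3 * AVOGADRO   # target number density
path_cm = 1e5                                      # 1 km of ice in cm

# Thin-target approximation P ~ n * sigma * L (valid because P << 1):
probability = nucleons_per_cm3 * CROSS_SECTION_CM2 * path_cm
print(f"Interaction probability over 1 km of ice: ~{probability:.1e}")
```

Under these assumptions, only about one neutrino in a few million interacts over a full kilometer of ice, which is why the detector must instrument such an enormous volume and watch a huge flux of neutrinos to record any events at all.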

IceCube upgrade adds six deep sensor strings to detect lower-energy neutrinos

Since 2010, the IceCube Observatory at the Amundsen-Scott South Pole Station has been delivering groundbreaking measurements of high-energy cosmic neutrinos. It consists of many detectors embedded in a volume of Antarctic ice measuring approximately one cubic kilometer. IceCube has now been upgraded with new optical modules to enable it to measure lower-energy neutrinos as well. Researchers at the Karlsruhe Institute of Technology (KIT) made a significant contribution to this expansion.

IceCube measures high-energy neutrinos in an ice volume of one cubic kilometer. Because neutrinos themselves do not emit any signals, the tracks of muons and other secondary particles are measured precisely instead. Muons are elementary particles sometimes produced by the interaction of neutrinos with ice. Unlike neutrinos, muons carry an electric charge. On their way through the ice, they produce a characteristic cone of light, which is detected by highly sensitive sensors.

Now, 51 researchers from around the world have installed six new strings of novel sensors at depths of up to 2,400 meters in the Antarctic ice, thereby expanding the IceCube experiment to also measure low-energy neutrinos.

AI captures particle accelerator behavior to optimize machine performance

Keeping high-power particle accelerators at peak performance requires advanced and precise control systems. For example, the primary research machine at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility features hundreds of fine-tuned components that accelerate electrons to 99.999% the speed of light.

The electrons get this boost from radiofrequency waves within a series of resonant structures known as cavities, which become superconducting at temperatures colder than deep space.
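A minimal sketch shows what the quoted speed of 99.999% of the speed of light implies relativistically (the electron rest energy used below is the standard 0.511 MeV value, not from the article):

```python
# Relativistic gamma factor at the quoted speed of 99.999% of c,
# gamma = 1 / sqrt(1 - beta^2):
beta = 0.99999                        # v/c as quoted in the text
gamma = 1.0 / (1.0 - beta**2) ** 0.5
print(f"gamma = {gamma:.0f}")         # ~224

# Total energy of an electron at this speed (rest energy 0.511 MeV):
electron_rest_mev = 0.511
print(f"electron energy ~ {gamma * electron_rest_mev:.0f} MeV")
```

At such gamma factors, tiny changes in speed correspond to large changes in energy, which is one reason the radiofrequency cavities must be tuned so precisely.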

These cavities form the backbone of Jefferson Lab’s Continuous Electron Beam Accelerator Facility (CEBAF), a unique DOE Office of Science user facility supporting the research of more than 1,650 nuclear physicists from around the globe. CEBAF also holds the distinction of being the world’s first large-scale installation and application of this superconducting radiofrequency (SRF) technology.

Nanolaser on a chip could cut computer energy use in half

Researchers at DTU have developed a nanolaser that could be the key to much faster and much more energy-efficient computers, phones, and data centers. The technology offers the prospect of placing thousands of the new lasers on a single microchip, opening a digital future in which data is transmitted not by electrical signals but by light particles (photons). The invention has been published in the journal Science Advances.

“The nanolaser opens up the possibility of creating a new generation of components that combine high performance with minimal size. This could be in information technology, for example, where ultra-small and energy-efficient lasers can reduce energy consumption in computers, or in the development of sensors for the health care sector, where the nanolaser’s extreme light concentration can deliver high-resolution images and ultrasensitive biosensors,” says DTU professor Jesper Mørk, who co-authored the paper together with, among others, Drs. Meng Xiong and Yi Yu from DTU Electro.
