
Novel approach to quantum error correction portends a scalable future for quantum computing

A University of Sydney quantum physicist has developed a new approach to quantum error correction that could significantly reduce the number of physical qubits required to build large-scale, fault-tolerant quantum computers. The study, co-authored by Dr. Dominic Williamson from the School of Physics, is titled “Low-overhead fault-tolerant quantum computation by gauging logical operators” and published in Nature Physics.

The work was done while Dr. Williamson was on sabbatical at global technology firm IBM in California. Elements of the new design have been integrated into IBM's plans for building large-scale quantum computers.

“We’re at a point where theory and experiment are beginning to align,” Dr. Williamson said. “The big question now is how to design quantum computers that can be scaled efficiently to solve useful problems. Our work provides a promising blueprint.”

Quantum coherence could be preserved at large scales in realistic environments

Quantum states are notoriously fragile, and can be destroyed simply through interactions, measurements, and exposure to their surrounding environments. In a new theoretical study published in Physical Review X, Rohan Mittal and colleagues at the University of Cologne have discovered a new way to protect quantum behavior on large scales within systems driven far from equilibrium. Their results could have promising implications for the design of more robust quantum devices.

When quantum many-body systems are driven out of equilibrium, they undergo decoherence, causing quantum correlations and superpositions to break down. Even when such a system is built from entirely quantum components, the effect can cause its behavior to become indistinguishable from that of a classical system on larger scales, making it unsuitable for technologies such as quantum computing or sensing.

So far, researchers have attempted to solve the decoherence problem by fine-tuning two independent parameters: one to push the system to the boundary between two distinct quantum phases, and another to ensure that quantum coherence is maintained at this boundary. In practice, however, the need to account for two parameters simultaneously has made this approach both fragile and experimentally daunting.

Researchers Uncover Mining Operation Using ISO Lures to Spread RATs and Crypto Miners

As recently observed in the FAUX#ELEVATE campaign, “WinRing0x64.sys,” a legitimate, signed, but vulnerable Windows kernel driver, is abused to obtain kernel-level hardware access and modify CPU settings to boost mining hash rates. The driver has been abused in many cryptojacking campaigns over the years, and support for it was added to XMRig miners in December 2019.

Elastic said it also identified another campaign that leads to the deployment of SilentCryptoMiner. Besides using direct system calls to evade detection, the miner disables Windows Sleep and Hibernate modes, sets up persistence via a scheduled task, and uses the “Winring0.sys” driver to fine-tune the CPU for mining operations.

Another notable component of the attack is a watchdog process that ensures the malicious artifacts and persistence mechanisms are restored in the event they are deleted. The campaign is estimated to have accrued 27.88 XMR ($9,392) across four tracked wallets, indicating that the operation is yielding consistent financial returns to the attacker.
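
Because the same signed driver keeps reappearing across these campaigns, its presence makes a simple host-level indicator. The snippet below is a minimal, illustrative triage check, not Elastic's detection logic; it assumes a Windows host and relies only on the built-in driverquery command.

```python
# Minimal triage sketch (assumption: Windows host). Lists installed kernel
# drivers via the built-in "driverquery" tool and flags any whose name
# contains "winring0", the vulnerable driver abused in these campaigns.
import subprocess

def winring0_present() -> bool:
    result = subprocess.run(
        ["driverquery", "/FO", "CSV"],   # CSV output: one driver per line
        capture_output=True, text=True, check=True,
    )
    return "winring0" in result.stdout.lower()

if __name__ == "__main__":
    print("WinRing0 driver installed:", winring0_present())
```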

Ending the Sun’s Monopoly: The Future of Stellarator Fusion — Brian Berzin, CEO, Thea Energy

With Brian Berzin, Co-Founder & CEO of Thea Energy.

What if we could build a fusion reactor that runs continuously—without the instability issues that have plagued the field for years?

Brian Berzin is the Co-Founder and CEO of Thea Energy (https://thea.energy/), a next-generation fusion company focused on advancing stellarator technology—one of the most promising but historically underexplored approaches to magnetic confinement fusion.

Brian brings a unique combination of deep technical and financial expertise, with a background spanning electrical engineering, venture capital, private equity, and investment banking.

Prior to founding Thea Energy, Brian served as Vice President of Strategy at General Fusion, where he helped shape commercialization strategy and led engagement with global capital markets during a pivotal period for privately funded fusion.

‘What’s your salary? I told him, and he said no problem, we’ll double. And those days are gone:’ Listening to game dev legends reminiscing in 1989 about the ‘golden days of computer games’ already being over is a trip

This would be like us saying ‘Remember the good old days of, uh, 2016?’

World’s largest quantum circuit simulation for quantum chemistry achieved on 1,024 GPUs

A joint research team from the Center for Quantum Information and Quantum Biology (QIQB) at The University of Osaka and Fixstars Corporation has demonstrated one of the world’s largest classical simulations of iterative quantum phase estimation (IQPE) circuits for quantum chemistry on up to 1,024 GPUs, surpassing the previous 40-qubit limit. The result expands the scale of molecular systems available for the development and validation of quantum algorithms for future fault-tolerant quantum computers, supporting progress toward industrial applications in drug discovery and materials development.

The paper was presented at NVIDIA GTC 2026, held in San Jose, California, March 16–19, 2026.

Overcoming unresolved challenges in drug discovery and developing new materials to address climate change will require advanced quantum chemical calculations beyond the reach of current technology. Against this backdrop, fault-tolerant quantum computers (FTQC) are widely anticipated as a key enabling technology, making it increasingly important to develop and validate, ahead of their deployment, the quantum algorithms that will eventually run on such systems.
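
For readers unfamiliar with IQPE: it estimates an eigenphase one binary digit at a time, reusing a single ancilla qubit and feeding previously measured bits back as a corrective rotation before each measurement. The sketch below is a minimal single-eigenstate illustration of that loop in plain NumPy, not the team's GPU simulator; the function name, bit count, and test phase are illustrative.

```python
# Minimal sketch of iterative quantum phase estimation (IQPE) for a single
# eigenstate with eigenvalue exp(2*pi*i*phi). One ancilla qubit is reused;
# digits of phi are read out least-significant first, with earlier results
# fed back as a corrective rotation. Illustrative only.
import numpy as np

def iqpe(phi_true: float, n_bits: int, rng=None) -> float:
    rng = rng or np.random.default_rng(0)
    bits = [0] * (n_bits + 1)                 # bits[j] = j-th binary digit of phi
    for j in range(n_bits, 0, -1):            # least significant digit first
        # Phase kicked onto the ancilla by controlled-U^(2^(j-1))
        theta = 2 * np.pi * (2 ** (j - 1)) * phi_true
        # Feedback rotation built from the digits already measured
        feedback = -2 * np.pi * sum(
            bits[l] * 2 ** (j - l - 1) for l in range(j + 1, n_bits + 1)
        )
        p1 = np.sin((theta + feedback) / 2) ** 2   # P(measure 1) after final H
        bits[j] = int(rng.random() < p1)
    return sum(bits[j] * 2 ** (-j) for j in range(1, n_bits + 1))

print(iqpe(0.34375, n_bits=8))   # 0.34375 = 0.01011 in binary, recovered exactly
```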

Engineers improve infrared devices using century-old materials

After decades of intense research, surprises in the realm of semiconductors—materials used in microchips to control electrical currents—are few and far between. But with a pair of published papers, materials engineers at Stanford University debut a promising approach to using a well-studied semiconductor to improve infrared light-emitting diodes and sensors. They say the approach could lead to smaller, sleeker, and less expensive infrared technologies for environmental, medical, and industrial uses.

“We taught an old dog new tricks,” said senior author Kunal Mukherjee, an assistant professor of materials science and engineering at the Stanford School of Engineering, putting the work’s importance in perspective. “The so-called IV–VI materials we’re working with—lead selenide and lead tin selenide—are more than a hundred years old. They are among the oldest semiconductors historically recorded. We found a way to integrate them with modern technology to produce a new type of infrared diode and to control the infrared light in important ways.”

The new diode emits infrared light in a desirable range of longer wavelengths (4,000–5,000 nanometers), well suited to sensing gases in the air (think greenhouse gases in the sky) or in medical settings (think carbon dioxide meters).
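
For a rough sense of scale (standard physics constants, not figures from the Stanford papers), photons in that 4,000–5,000 nanometer band carry roughly 0.25–0.31 electronvolts, squarely in the mid-infrared where molecules such as carbon dioxide absorb strongly (the well-known CO2 band near 4.26 micrometers).

```python
# Back-of-envelope photon energies for the quoted 4,000-5,000 nm range.
# Constants are standard physics values, not data from the papers.
H = 6.626e-34     # Planck constant, J*s
C = 2.998e8       # speed of light, m/s
EV = 1.602e-19    # joules per electronvolt

for wavelength_nm in (4000, 5000):
    energy_ev = H * C / (wavelength_nm * 1e-9) / EV
    print(f"{wavelength_nm} nm -> {energy_ev:.2f} eV")
# 4000 nm -> 0.31 eV, 5000 nm -> 0.25 eV
```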

Social media feeds: Algorithm redesign could break echo chambers and reduce online polarization

Scroll through social media long enough and a pattern emerges. Pause on a post questioning climate change or taking a hard line on a political issue, and the platform is quick to respond—serving up more of the same viewpoints, delivered with growing confidence and certainty.

That feedback loop is the architecture of an echo chamber: a space where familiar ideas are amplified, dissenting voices fade, and beliefs can harden rather than evolve.

But new research from the University of Rochester has found that echo chambers might not be a fact of online life. Published in IEEE Transactions on Affective Computing, the study argues that they are partly a design choice—one that could be softened with a surprisingly modest change: introducing more randomness into what people see.
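
The paper's specific redesign isn't reproduced here, but the general idea of injecting randomness into ranking is easy to sketch. In the hypothetical snippet below, each feed slot is filled by the highest-engagement remaining post except that, with some probability epsilon, a uniformly random post is shown instead; the function name, epsilon value, and scoring callback are illustrative assumptions, not the study's algorithm.

```python
# Hypothetical sketch: an engagement-ranked feed with a tunable dose of
# randomness. Not the Rochester study's algorithm; names and defaults are
# illustrative.
import random

def build_feed(candidates, engagement_score, k=20, epsilon=0.2):
    remaining = sorted(candidates, key=engagement_score, reverse=True)
    feed = []
    for _ in range(min(k, len(remaining))):
        if random.random() < epsilon:
            # Exploration: surface a uniformly random remaining post
            pick = remaining.pop(random.randrange(len(remaining)))
        else:
            # Exploitation: surface the top-scoring remaining post
            pick = remaining.pop(0)
        feed.append(pick)
    return feed

# Example: posts are (id, predicted_engagement) pairs
posts = [(i, random.random()) for i in range(100)]
print(build_feed(posts, engagement_score=lambda p: p[1], k=5))
```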

Useful quantum computers could be built with as few as 10,000 qubits, team finds

Quantum computers of the future may be closer to reality thanks to new research from Caltech and Oratomic, a Caltech-linked start-up company. Theorists and experimentalists teamed up to develop a new approach for reducing the errors that riddle today’s rudimentary quantum computers. Whereas these machines were previously thought to require millions of qubits to work properly (qubits being the quantum equivalent of the 1s and 0s in classical computers), the new results indicate that a fully realized quantum computer could be built with as few as 10,000 to 20,000 qubits. The need for fewer qubits means that quantum computers could, in theory, be operational by the end of the decade.

The team proposes a new quantum error-correction architecture that is significantly more efficient than previous approaches. Quantum error correction is a process by which extra, redundant qubits are introduced to correct errors, or faults, enabling the ultimate goal in the field: fault-tolerant quantum computing.
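
The new architecture itself isn't reproduced here, but the underlying idea of redundancy is easy to see in the textbook three-bit repetition code: encode one logical bit into three physical bits and correct any single flip by majority vote. The sketch below is only that classical toy model, an illustrative stand-in for far more sophisticated quantum codes, not the Caltech and Oratomic scheme.

```python
# Toy model of redundancy-based error correction: the classical three-bit
# repetition code. Real quantum codes are far more involved; this only
# illustrates why extra (qu)bits can buy back reliability.
import random

def encode(bit):                 # 0 -> [0,0,0], 1 -> [1,1,1]
    return [bit] * 3

def noisy(bits, p_flip=0.05):    # each physical bit flips with probability p_flip
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):                # majority vote corrects any single flip
    return int(sum(bits) >= 2)

trials = 100_000
errors = sum(decode(noisy(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f}  (physical rate was 0.05)")
```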

The results exploit special properties of quantum computing platforms built out of neutral atoms, which serve as the qubits. Alternative platforms in development include superconducting circuits and trapped ions (ions are charged whereas neutral atoms are not). In a neutral atom system, laser beams known as optical tweezers are used to arrange atoms into qubit arrays. Manuel Endres, a professor of physics at Caltech, and his colleagues recently created the largest qubit array ever assembled, containing 6,100 trapped neutral atoms.
