
Direct 3D printing of nanolasers can boost optical computing and quantum security

In future high-tech industries, such as high-speed optical computing for massive AI, quantum cryptographic communication, and ultra-high-resolution augmented reality (AR) displays, nanolasers—which process information using light—are gaining significant attention as core components for next-generation semiconductors.

A research team has proposed a new manufacturing technology for placing nanolasers, which process information in spaces thinner than a human hair, at high density on semiconductor chips.

A joint research team led by Professor Ji Tae Kim from the Department of Mechanical Engineering and Professor Junsuk Rho from POSTECH has developed an ultra-fine 3D printing technology capable of creating “vertical nanolasers,” a key component for ultra-high-density optical integrated circuits.

Advanced quantum detectors are reinventing the search for dark matter

When it comes to understanding the universe, what we know is only a sliver of the whole picture.

Dark matter and dark energy make up about 95% of the universe, leaving only 5% “ordinary matter,” or what we can see. Dr. Rupak Mahapatra, an experimental particle physicist at Texas A&M University, designs highly advanced semiconductor detectors with cryogenic quantum sensors, powering experiments worldwide and pushing the boundaries to explore this most profound mystery.

Mahapatra likens our understanding of the universe—or lack thereof—to an old parable: “It’s like trying to describe an elephant by only touching its tail. We sense something massive and complex, but we’re only grasping a tiny part of it.”

Solving quantum computing’s longstanding ‘no cloning’ problem with an encryption workaround

A team of researchers at the University of Waterloo has made a breakthrough in quantum computing that elegantly bypasses the fundamental “no cloning” problem. The research, “Encrypted Qubits can be Cloned,” appears in Physical Review Letters.

Quantum computing is an exciting technological frontier, where information is stored and processed in tiny units called qubits. Qubits can be stored, for example, in individual electrons, photons (particles of light), atoms, ions or tiny currents.
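The “no cloning” problem the Waterloo team works around is a theorem that follows from the linearity of quantum mechanics. A standard textbook sketch, in general notation not specific to this paper:

```latex
% Suppose a unitary U could copy an arbitrary qubit state:
%   U |\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle .
% Applied to a superposition |\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
% linearity forces
\[
  U\big(\alpha|0\rangle + \beta|1\rangle\big)|0\rangle
    = \alpha|0\rangle|0\rangle + \beta|1\rangle|1\rangle ,
\]
% whereas a true copy would be
\[
  |\psi\rangle|\psi\rangle
    = \alpha^{2}|00\rangle + \alpha\beta|01\rangle
    + \alpha\beta|10\rangle + \beta^{2}|11\rangle .
\]
% The two expressions agree only when \alpha\beta = 0, i.e., for the basis
% states themselves; no single device can clone an arbitrary unknown qubit.
```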

Universities, industry, and governments around the world are spending billions of dollars to perfect the technology for controlling these qubits so that they can be combined into large, reliable quantum computers. This technology will have powerful applications, including in cybersecurity, materials science, medical research and optimization.

New framework unifies space and time in quantum systems

Quantum mechanics and relativity are the two pillars of modern physics. However, for over a century, their treatment of space and time has remained fundamentally disconnected. Relativity unifies space and time into a single fabric called spacetime, describing it seamlessly. In contrast, traditional quantum theory employs different languages: quantum states (density matrix) for spatial systems and quantum channels for temporal evolution.
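The two “languages” can be written down explicitly. In standard notation (not taken from the paper), a spatial system at a single time is a density matrix, while evolution between two times is a quantum channel in Kraus form:

```latex
% State of a system at one moment: a density matrix
\[
  \rho \succeq 0, \qquad \operatorname{Tr}\rho = 1 .
\]
% Evolution between two moments: a completely positive, trace-preserving
% map with Kraus operators K_k
\[
  \mathcal{E}(\rho) = \sum_k K_k \rho K_k^{\dagger},
  \qquad \sum_k K_k^{\dagger} K_k = I .
\]
```

A unified framework seeks a single mathematical object that plays both of these roles at once.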

A recent breakthrough by Assistant Professor Seok Hyung Lie from the Department of Physics at UNIST offers a way to describe quantum correlations across both space and time within a single, unified framework. Assistant Professor Lie is first author, with Professor James Fullwood from Hainan University serving as the corresponding author. Their collaboration creates new tools that could significantly impact future studies in quantum science and beyond. The study has been published in Physical Review Letters.

In this study, the team developed a new theoretical approach that treats the entire timeline as one quantum state. This concept introduces what they call the multipartite quantum states over time. In essence, it allows us to describe quantum processes at different points in time as parts of a single, larger quantum state. This means that both spatially separated systems and systems separated in time can be analyzed using the same mathematical language.

Electrons that lag behind nuclei in 2D materials could pave way for novel electronics

One of the great successes of 20th-century physics was the quantum mechanical description of solids. This allowed scientists to understand for the first time how and why certain materials conduct electric current and how these properties could be purposefully modified. For instance, semiconductors such as silicon could be used to produce transistors, which revolutionized electronics and made modern computers possible.

To be able to mathematically capture the complex interplay between electrons and atomic nuclei and their motions in a solid, physicists had to make some simplifications. They assumed, for example, that the light electrons in an atom follow the motion of the much heavier atomic nuclei in a crystal lattice without any delay. For several decades, this Born-Oppenheimer approximation worked well.

Error-correction technology to turn quantum computing into real-world power

Ripples spreading across a calm lake after raindrops fall, and the way ripples from different drops overlap as they travel outward, offer one image that helps us picture how a quantum computer handles information.

Unlike conventional computers, which process digital data as “0 or 1,” quantum computers can process information in an in-between state where it is “both 0 and 1.” These quantum states behave like waves: they can overlap, reinforcing one another or canceling one another out. In computations that exploit this property, states that lead to the correct answer are amplified, while states that lead to wrong answers are suppressed.
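The wave-like bookkeeping described above can be sketched in a few lines of numpy. This is a minimal illustration of amplitude interference, not code from any of the projects covered here:

```python
import numpy as np

# A qubit state is a length-2 complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition: "both 0 and 1".
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposition = H @ ket0          # amplitudes (1/sqrt(2), 1/sqrt(2))

# Applying H again makes the two paths interfere: the |1> amplitudes
# cancel (destructive interference) while the |0> amplitudes reinforce.
back = H @ superposition

probabilities = np.abs(back) ** 2
print(probabilities.real)         # ~[1, 0]: the "wrong" outcome is suppressed
```

The cancellation of the |1> amplitudes in the second step is exactly the interference the paragraph describes: amplitudes, unlike ordinary probabilities, can subtract as well as add.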

Thanks to this interference between waves, a quantum computer can sift through many candidate answers at once. Our everyday computers take time because they evaluate each candidate one by one. Quantum computers, by contrast, can narrow down the answer in a single sweep—earning them the reputation of “dream machines” that could solve in an instant problems that might take hundreds of years on today’s computers.

Making sense of quantum gravity in five dimensions

Quantum theory and Einstein’s theory of general relativity are two of the greatest successes in modern physics. Each works extremely well in its own domain: Quantum theory explains how atoms and particles behave, while general relativity describes gravity and the structure of spacetime. However, despite many decades of effort, scientists still do not have a satisfying theory that combines both into one clear picture of reality.

Most common approaches assume that gravity must also be described using quantum ideas. As physicist Richard Feynman once said, “We’re in trouble if we believe in quantum mechanics but don’t quantize gravity.” Yet quantum theory itself has deep unresolved problems. It does not clearly explain how measurements lead to definite outcomes, and it relies on strange ideas that clash with everyday experience, such as objects seemingly behaving like both waves and particles, and apparent nonlocal connections between distant systems.

These puzzles become even sharper because of Bell’s theorem. This theorem shows that no theory based on ordinary ideas—such as locality, an objective reality, and freely chosen measurements—can fully match the predictions of quantum theory within our usual four-dimensional view of space and time. These quantum predictions have been repeatedly confirmed in tests of entanglement, first discussed by Einstein, Podolsky, and Rosen (EPR). As a result, simple classical explanations limited to ordinary four-dimensional spacetime cannot fully account for what we observe.
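Bell’s theorem is commonly quantified by the CHSH inequality. The standard bounds, textbook material rather than anything specific to this article, are:

```latex
% Correlations E(a,b) between measurement settings a, a' (one side)
% and b, b' (the other side) are combined into
\[
  S = E(a,b) + E(a,b') + E(a',b) - E(a',b') .
\]
% Any local hidden-variable theory obeys
\[
  |S| \le 2 ,
\]
% while quantum mechanics on an entangled pair reaches Tsirelson's bound
\[
  |S| \le 2\sqrt{2} \approx 2.83 .
\]
```

Measured values above 2 in entanglement experiments are what rule out the “ordinary ideas” the paragraph lists.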

Fault-tolerant quantum computing: Novel protocol efficiently reduces resource cost

Quantum computers, systems that process information leveraging quantum mechanical effects, could soon outperform classical computers on some complex computational problems. These computers rely on qubits, units of quantum information that share states with each other via a quantum mechanical effect known as entanglement.

Qubits are highly susceptible to noise in their surroundings, which can disrupt their quantum states and lead to computation errors. Quantum engineers have thus been trying to devise effective strategies to achieve fault-tolerant quantum computation, or in other words, to correct errors that arise when quantum computers process information.

Existing approaches work either by reducing the number of extra physical qubits needed per logical qubit (i.e., space overhead) or by reducing the number of physical operations needed to perform a single logical operation (i.e., time overhead). Effectively tackling both goals together, which would enable more scalable systems and faster computations, has so far proved challenging.
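Space overhead is easiest to see in the classical repetition code, the simplest error-correcting scheme. This is an analogy only; real fault-tolerant quantum computing uses quantum codes, but the bookkeeping is the same: three physical bits protect one logical bit, a 3x space overhead.

```python
from collections import Counter

def encode(bit):
    """Encode one logical bit as three physical bits (space overhead: 3x)."""
    return [bit] * 3

def decode(bits):
    """Majority vote recovers the logical bit despite one flipped copy."""
    return Counter(bits).most_common(1)[0][0]

codeword = encode(1)        # [1, 1, 1]
codeword[0] ^= 1            # noise flips one physical bit -> [0, 1, 1]
print(decode(codeword))     # prints 1: the logical bit survives the error
```

The time-overhead side of the trade-off shows up here too: every logical operation must be applied to all three physical copies, so protecting against errors makes each logical step more expensive.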

Metal–metal bonded molecule achieves stable spin qubit state, opening path toward quantum computing materials

Researchers at Kumamoto University, in collaboration with colleagues in South Korea and Taiwan, have discovered that a unique cobalt-based molecule with metal–metal bonds can function as a spin quantum bit (spin qubit)—a fundamental unit for future quantum computers. The findings provide a new design strategy for molecular materials used in quantum information technologies.

The study is published in the journal Chemical Communications.
