
Quantum Calculations Boosted By Doubling Computational Space For Complex Molecules

Researchers have developed a new computational method, DOCI-QSCI-AFQMC, which accurately simulates complex molecular systems by effectively doubling the number of orbitals considered in standard quantum simulations and overcoming limitations of existing single-reference techniques, as demonstrated through successful modelling of chemical bonds and reactions.

Major earthquakes are just as random as smaller ones

For obvious reasons, it would be useful to predict when an earthquake is going to occur. It has long been suspected that large quakes in the Himalayas follow a fairly predictable cycle, but nature, as it turns out, is not so accommodating. A new study published in the journal Science Advances shows that massive earthquakes are just as random as small ones. A team of researchers led by Zakaria Ghazoui-Schaus at the British Antarctic Survey reached this conclusion after analyzing sediments from Lake Rara in Western Nepal.

The team extracted a 4-meter-long sediment core from the bottom of the lake and identified 50 sediment layers spanning 6,000 years. Whenever a major quake shakes the region, underwater landslides create layers of sediment called turbidites. These deposits are characterized by coarse materials that settle first, followed by sand, then silt, and finally clay. Each layer is essentially a snapshot of an individual earthquake, although such layers can also result from floods and slope failures.

To confirm that these layers were caused by quakes, the team compared them with modern records and computer models. They concluded that only a quake of magnitude 6.5 or higher could trigger underwater landslides. Radiocarbon dating of organic material within each layer revealed roughly when each of the major quakes occurred.

A microfluidic chip monitors gases using integrated, motionless pumps

A new microscale gas chromatography system integrates all fluidic components into a single chip for the first time. The design eliminates the need for valves by using three Knudsen pumps, which move gas molecules with heat differentials rather than moving parts, according to a new University of Michigan Engineering study published in Microsystems & Nanoengineering. The monolithic gas sampling and analysis system, or monoGSA for short, could offer reliable, low-cost monitoring for industrial chemical or pharmaceutical synthesis, natural gas pipelines, or even at-home air quality.

Gas chromatography has long been considered the gold standard for measuring and quantifying volatile organic compounds—gases emitted from industrial processes, fuels, household products and more. Recently, micro gas chromatography miniaturized the technology to briefcase-size or smaller, bringing gas analysis from the laboratory to the source.

Most micro gas chromatography systems use pumps and valves to move gas molecules from an input port to a preconcentrator, which extracts and concentrates samples, then from the preconcentrator to a column for chemical separation, and then to the detector and finally to an exhaust port. Up to this point, pumps and valves have been fabricated and assembled separately, which increases device size, assembly cost and risk of failure at connection points.
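The Knudsen pumps mentioned above exploit thermal transpiration: in a channel narrow enough that gas molecules collide with the walls more often than with each other, a temperature gradient alone drives flow from the cold end toward the hot end, with no moving parts. At zero net flow, the limiting pressure ratio per stage is sqrt(T_hot/T_cold). A back-of-envelope sketch in Python (the temperatures here are illustrative, not values from the paper):

```python
import math

def knudsen_pressure_ratio(t_cold_k: float, t_hot_k: float) -> float:
    """Limiting zero-flow pressure ratio for thermal transpiration in the
    free-molecular regime: P_hot / P_cold = sqrt(T_hot / T_cold)."""
    return math.sqrt(t_hot_k / t_cold_k)

# Hypothetical example: one end at room temperature (295 K), the other
# heated to 395 K. The heat differential alone biases molecular flux
# toward the hot end; no valves or moving parts are required.
ratio = knudsen_pressure_ratio(295.0, 395.0)
print(f"Max pressure ratio per stage: {ratio:.3f}")   # ~1.157

# Stages can be cascaded; n identical stages multiply the ratio.
print(f"Three cascaded stages: {ratio**3:.3f}")       # ~1.549
```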

Nanolaser on a chip could cut computer energy use in half

Researchers at DTU have developed a nanolaser that could be the key to much faster and more energy-efficient computers, phones, and data centers. The technology opens the prospect of placing thousands of the new lasers on a single microchip, pointing toward a digital future in which data is transmitted not as electrical signals but as particles of light: photons. The invention has been published in the journal Science Advances.

“The nanolaser opens up the possibility of creating a new generation of components that combine high performance with minimal size. This could be in information technology, for example, where ultra-small and energy-efficient lasers can reduce energy consumption in computers, or in the development of sensors for the health care sector, where the nanolaser’s extreme light concentration can deliver high-resolution images and ultrasensitive biosensors,” says DTU professor Jesper Mørk, who co-authored the paper together with, among others, Drs. Meng Xiong and Yi Yu from DTU Electro.

A new microscope for the quantum age: Single nanoscale scan measures four key material properties

Physicists in Leiden have built a microscope that can measure no fewer than four key properties of a material in a single scan, all with nanoscale precision. The instrument can even examine complete quantum chips, accelerating research and innovation in the field of quantum materials. The study is published in the journal Nano Letters.

Temperature, magnetism, structure, and electrical properties. These are the material characteristics that this new microscope reveals. “It almost feels like having a superpower,” says Matthijs Rog, a Ph.D. student in Kaveh Lahabi’s research group. “You look at a sample and see not only its shape but also the electrical currents, heat, and magnetism within it.”

Kaveh Lahabi, who leads the group, says, “This microscope removes the experimental bottlenecks that have long limited the study of quantum materials. This is not an idealized technique—it works on the systems we actually want to understand. Furthermore, the sensitivity of our measurements tends to impress a lot of my physicist colleagues.”

Physicists develop new protocol for building photonic graph states

Physicists have long recognized the value of photonic graph states in quantum information processing. However, the difficulty of making these graph states has left this value largely untapped. In a step forward for the field, researchers from The Grainger College of Engineering at the University of Illinois Urbana-Champaign have proposed a new scheme they term “emit-then-add” for producing highly entangled states of many photons that can work with current hardware. Published in npj Quantum Information, their strategy lays the groundwork for a wide range of quantum-enhanced operations, including measurement-based quantum computing.

Entanglement is a key resource for faster and more secure computation and communication. But creating large entangled states of more than two photons is challenging because of the losses inherent in optical systems: most photon sources have a low probability of producing a photon that survives to the point of detection. Any attempt to build a large entangled state is therefore riddled with missing photons, which break the state apart. And identifying the missing spots would require detecting the photons, a process that is itself destructive and precludes going back to fill those spots.
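To see why losses bite so hard: if each photon independently survives to detection with probability p, an N-photon state survives intact with probability p^N, which decays exponentially in N. A quick illustration (the survival probability here is an assumed placeholder, not a measured figure):

```python
# Exponential fragility of large photonic states under loss.
# p is a hypothetical per-photon survival probability.
p = 0.9

for n in (2, 10, 50):
    survival = p ** n  # all n photons must survive independently
    print(f"{n:2d} photons: intact-state probability = {survival:.3g}")

# Even at 90% per-photon survival, a 50-photon state arrives intact
# only ~0.5% of the time (0.81, 0.349, and 0.00515, respectively).
```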

To circumvent this challenge, a team led by Associate Professor of Physics Elizabeth Goldschmidt and Professor of Electrical and Computer Engineering Eric Chitambar began with a different mindset.

Cutting down on quantum-dot crosstalk: Precise measurements expose a new challenge

Devices that can confine individual electrons are potential building blocks for quantum information systems. But the electrons must be protected from external disturbances. RIKEN researchers have now shown how quantum information encoded in a quantum dot can be degraded by crosstalk from nearby quantum dots, a finding with implications for developing quantum information devices based on quantum dots.

Quantum computers process information using so-called qubits: physical systems whose behavior is governed by the laws of quantum mechanics. An electron, if it can be isolated and controlled, is one example of a qubit platform with great potential.

One way of controlling an electron is to use a quantum dot. These tiny structures trap charged particles using electric fields at the tips of metal electrodes separated by just a few tens of nanometers.
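For a rough sense of scale, a textbook particle-in-a-box estimate shows why such small structures have discrete, addressable energy levels: the spacing E2 - E1 = 3ħ²π²/(2mL²) grows as the confinement length L shrinks. A sketch (the 40 nm width and the free-electron mass are illustrative assumptions; real dots use the host material's smaller effective mass, which widens the spacing further):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # free-electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def box_level(n: int, width_m: float, mass: float = M_E) -> float:
    """Energy of level n for a 1D infinite square well of the given width."""
    return (n**2 * math.pi**2 * HBAR**2) / (2 * mass * width_m**2)

L = 40e-9  # hypothetical dot size: a few tens of nanometers
spacing_mev = (box_level(2, L) - box_level(1, L)) / EV * 1e3
print(f"E2 - E1 for a {L*1e9:.0f} nm well: {spacing_mev:.2f} meV")  # ~0.71 meV
```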

Rolling out the carpet for spin qubits with new chip architecture

Researchers at QuTech in Delft, The Netherlands, have developed a new chip architecture that could make it easier to test and scale up quantum processors based on semiconductor spin qubits. The platform, called QARPET (Qubit-Array Research Platform for Engineering and Testing) and reported in Nature Electronics, allows hundreds of qubits to be characterized on the same test chip under the same operating conditions used in quantum computing experiments.

“With such a complex, tightly packed quantum chip, things really start to resemble the traditional semiconductor industry,” states researcher Giordano Scappucci.

When viewed under a microscope, the structure of the QARPET chip appears almost woven. Fabricating it was itself a stress test of the team's engineering capabilities.

Apple fixes zero-day flaw used in ‘extremely sophisticated’ attacks

Apple has released security updates to fix a zero-day vulnerability that was exploited in an “extremely sophisticated attack” targeting specific individuals.

Tracked as CVE-2026-20700, the flaw is an arbitrary code execution vulnerability in dyld, the Dynamic Link Editor used by Apple operating systems, including iOS, iPadOS, macOS, tvOS, watchOS, and visionOS.

Apple’s security bulletin warns that an attacker with memory write capability may be able to execute arbitrary code on affected devices.

NuMA promotes constitutive heterochromatin compaction by stabilizing linker histone H1 on chromatin

The nucleosome repeat length (NRL) was calculated using NRLfinder, as described in a previous publication.33 Briefly, read lengths were extracted and converted into a frequency histogram, which was then smoothed using a digital 6th-order Butterworth filter with zero phase shift and a cutoff frequency of 0.04 cycles/read. This cutoff was empirically optimized to reduce noise from mononucleosomal DNA winding artifacts. Local minima and maxima were identified from the first derivative of the filtered histogram, with the second peak maximum corresponding to the dinucleosomal periodicity. The NRL shift between conditions (e.g., control vs. NuMA-depleted HCT116 cells) was calculated as the mean difference between the first two peak maxima of each sample. All analyses were performed in Python 3.9 with the NumPy, SciPy, and Matplotlib libraries.
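A minimal reimplementation of the smoothing-and-peak-calling step, assuming 1-bp histogram bins so that the 0.04 cycles/read cutoff maps directly onto SciPy's per-sample frequency; this is a sketch of the procedure described above, not the actual NRLfinder code:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def fragment_peaks(read_lengths, max_len=1000, cutoff=0.04, order=6):
    """Smooth a fragment-length histogram and return its peak positions.

    Sketch of the steps described above: build a frequency histogram of
    read lengths (1-bp bins assumed), low-pass it with a 6th-order
    Butterworth filter applied forward and backward (zero phase shift),
    then call peaks on the smoothed curve.
    """
    hist, _ = np.histogram(read_lengths, bins=np.arange(0, max_len + 1))
    b, a = butter(order, cutoff, fs=1.0)  # cutoff in cycles per 1-bp bin
    smoothed = filtfilt(b, a, hist.astype(float))
    # Small prominence threshold to reject residual ripple; this heuristic
    # is our addition, not part of the published description.
    peaks, _ = find_peaks(smoothed, prominence=smoothed.max() * 0.05)
    return peaks  # peaks[1] approximates the dinucleosomal maximum

# The NRL shift between two conditions would then be computed from the
# spacing between the first two peak maxima of each sample.
```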

For chromatin-state modeling, we used ChromHMM (v.1.19).32 The ATAC-seq and RNA-seq input data reported in this manuscript were generated as described above. Additional input data, including ChIP-seq for CTCF, H3K4me3, H3K27me3, H3K4me1, H3K36me3, and H3K9me3, were downloaded from ENCODE (https://www.encodeproject.org). Briefly, raw BAM files were downloaded and replicates were combined. The BinarizeBam and LearnModel tools in ChromHMM were used to generate the chromatin-state model with default settings. Emission parameters were visualized in R.
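For orientation, the two ChromHMM steps correspond to invocations along these lines; the sketch below drives them from Python, with placeholder paths, memory flag, and state count (none of these values are specified in the text, so consult the ChromHMM documentation for exact usage):

```python
import subprocess

# Hypothetical path; ChromHMM is distributed as a Java jar.
JAR = "ChromHMM/ChromHMM.jar"

# Step 1: binarize the combined BAM files for each mark.
subprocess.run([
    "java", "-mx8000M", "-jar", JAR, "BinarizeBam",
    "CHROMSIZES/hg38.txt",    # chromosome-sizes file for the assembly
    "bam_dir",                # directory of replicate-combined BAMs
    "cellmarkfiletable.txt",  # tab-separated cell / mark / file table
    "binarized_dir",
], check=True)

# Step 2: learn the chromatin-state model with default settings.
subprocess.run([
    "java", "-mx8000M", "-jar", JAR, "LearnModel",
    "binarized_dir", "model_dir",
    "10",    # number of states (placeholder; not given in the text)
    "hg38",  # genome assembly (placeholder)
], check=True)
```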
