
Back in 2018, a scientist from the University of Texas at Austin proposed a protocol to generate randomness in a way that could be certified as truly unpredictable. That scientist, Scott Aaronson, now sees that idea become a working reality. “When I first proposed my certified randomness protocol in 2018, I had no idea how long I’d need to wait to see an experimental demonstration of it,” said Aaronson, who now directs a quantum center at a major university.

The experiment was carried out on a cutting-edge 56-qubit quantum computer, accessed remotely over the internet. The machine belongs to a company that recently made a significant upgrade to its system. The research team included experts from a large bank’s tech lab, national research centers, and universities.

To generate certified randomness, the team used a method called random circuit sampling, or RCS. The idea is to feed the quantum computer a series of tough problems, known as challenge circuits. The computer must solve them by choosing among many possible outcomes in a way that’s impossible to predict. Then, classical supercomputers step in to confirm whether the answers are genuinely random or not.
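The challenge-and-verify loop can be illustrated with a toy sketch. Everything below is illustrative, not the team's actual protocol: the verifier derives each challenge's ideal output distribution from a seed, then scores the returned samples with a linear cross-entropy-style statistic. Samples drawn honestly from the ideal distribution score above the uniform baseline of 1/N, while samples faked without solving the circuit do not.

```python
import hashlib
import random

N_OUTCOMES = 256  # toy 8-bit "circuit" output space

def ideal_distribution(seed):
    # Toy stand-in for a challenge circuit's output distribution:
    # hash the seed with each outcome to get pseudo-random weights.
    weights = [
        int.from_bytes(hashlib.sha256(f"{seed}:{x}".encode()).digest()[:4], "big") + 1
        for x in range(N_OUTCOMES)
    ]
    total = sum(weights)
    return [w / total for w in weights]

def xeb_score(seed, samples):
    # Linear cross-entropy-style check: average the ideal probability
    # of each returned outcome.  Honest samples concentrate on likely
    # outcomes and beat the uniform baseline of 1 / N_OUTCOMES.
    probs = ideal_distribution(seed)
    return sum(probs[s] for s in samples) / len(samples)

rng = random.Random(42)
seed = rng.getrandbits(64)
probs = ideal_distribution(seed)

honest = rng.choices(range(N_OUTCOMES), weights=probs, k=2000)
spoofed = [rng.randrange(N_OUTCOMES) for _ in range(2000)]  # no device at all

print(xeb_score(seed, honest))   # noticeably above 1 / 256
print(xeb_score(seed, spoofed))  # near the 1 / 256 baseline
```

In the real protocol the circuits are hard enough that only a quantum computer can respond quickly with high-scoring samples, while the classical verification itself still demands supercomputer time.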

The boundaries of computing are shifting as biology fuses with technology. At the center of this new frontier is an emerging concept: a liquid computer powered by DNA. With the ability to support more than 100 billion unique circuits, this system could soon transform how we detect and diagnose disease.

While DNA is best known for encoding life, researchers are now exploring its potential as a computing tool. A team led by Dr. Fei Wang at Shanghai Jiao Tong University believes DNA can do much more than carry genetic instructions.

Their study, recently published in Nature, reveals how DNA molecules could become the core components of new computing systems. Rather than just holding genetic data, DNA could behave like wires, instructions, or even electrons inside biological circuits.
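One common way to reason about such molecular circuits, shown here as a conceptual sketch rather than anything from the paper itself, is to treat DNA strands as signals whose concentrations stand in for logic levels: a gate releases its output strand only when the input strands cross a threshold.

```python
# Conceptual sketch only (not from the Nature study): DNA logic is
# often modeled as strand concentrations, with gates that release an
# output strand once inputs exceed a threshold.  ON/OFF here means
# relative concentration, not voltage.
ON, OFF = 1.0, 0.0
THRESHOLD = 0.6

def and_gate(a, b):
    # Two-input AND: both input strands must displace the gate's
    # protective strands, so the output tracks the scarcer input.
    return ON if min(a, b) > THRESHOLD else OFF

def or_gate(a, b):
    # OR: either input strand alone can trigger the release.
    return ON if max(a, b) > THRESHOLD else OFF

def circuit(a, b, c):
    # Small composite circuit: (a AND b) OR c
    return or_gate(and_gate(a, b), c)

for a, b, c in [(ON, ON, OFF), (ON, OFF, OFF), (OFF, OFF, ON)]:
    print((a, b, c), "->", circuit(a, b, c))
```

The appeal of the approach is composability: the output strand of one gate is simply the input strand of the next, which is what lets a single liquid system host enormous numbers of distinct circuits.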

In 2025, China tech is no longer just catching up—it’s rewriting the rules. From quantum computers that outperform U.S. supercomputers to humanoid robots priced for mass adoption, China tech is accelerating at a pace few imagined. In this video, Top 10 Discoveries Official explores the 8 cutting-edge breakthroughs that prove China tech is reshaping transportation, AI, clean energy, and even brain-computer interfaces. While the West debates and regulates, China tech builds—from driverless taxis and flying cars to homegrown AI chips and thorium reactors. Watch now to understand why the future might not be written in Silicon Valley, but in Shenzhen.


Researchers at IBM and Lockheed Martin combined high-performance computing with quantum computing to accurately model the electronic structure of methylene, an 'open-shell' molecule that has long been a hurdle for classical computing. This is the first application of the sample-based quantum diagonalization (SQD) technique to open-shell systems, a press release said.

Quantum computing, which promises computations at speeds unimaginable even for today's fastest supercomputers, is the next frontier of computing. By leveraging quantum states as quantum bits, these computers go beyond the computational capabilities humanity has had access to in the past and open up new research areas.

We’re announcing the world’s first scalable, error-corrected, end-to-end computational chemistry workflow. With this, we are entering the future of computational chemistry.

Quantum computers are uniquely equipped to perform the complex computations that describe chemical reactions – computations that are so complex they are impossible even with the world’s most powerful supercomputers.

However, realizing this potential is a herculean task: one must first build a large-scale, universal, fully fault-tolerant quantum computer – something nobody in our industry has done yet. We are the farthest along that path, as our roadmap, and our robust body of research, proves. At the moment, we have the world’s most powerful quantum processors, and are moving quickly towards universal fault tolerance. Our commitment to building the best quantum computers is proven again and again in our world-leading results.

Plasma—the electrically charged fourth state of matter—is at the heart of many important industrial processes, including those used to make computer chips and coat materials.

Simulating those plasmas can be challenging, however, because millions of math operations must be performed for thousands of points in the simulation, many times per second. Even with the world’s fastest supercomputers, scientists have struggled to create a kinetic simulation—which considers individual particles—that is detailed and fast enough to help them improve those manufacturing processes.
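The cost the passage describes comes from the inner loop of any kinetic code: every particle must be pushed through the fields at every time step. The sketch below is illustrative only, not PPPL's method; it pushes electrons through a prescribed RF field, whereas a real inductively coupled plasma code also solves the field equations self-consistently each step.

```python
import math

# Illustrative kinetic inner loop (not PPPL's implicit method).
Q_OVER_M = -1.76e11   # electron charge-to-mass ratio, C/kg
DT = 1.0e-11          # time step, s

def push(particles, e_field, t):
    # Update every particle: velocity from the local field, then position.
    for p in particles:
        p["v"] += Q_OVER_M * e_field(p["x"], t) * DT
        p["x"] += p["v"] * DT

def rf_field(x, t, amp=100.0, freq=13.56e6):
    # 13.56 MHz is the standard industrial RF frequency.
    return amp * math.sin(2 * math.pi * freq * t) * math.cos(x / 0.01)

particles = [{"x": i * 1.0e-4, "v": 0.0} for i in range(10_000)]
for step in range(100):
    push(particles, rf_field, step * DT)
```

Even this stripped-down loop does one million particle updates per hundred steps; production runs track far more particles, for far more steps, with field solves in between, which is why stability and efficiency improvements matter.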

Now, a new method offers improved stability and efficiency for kinetic simulations of what's known as inductively coupled plasmas. The method was implemented in a simulation code developed as part of a public-private partnership between the U.S. Department of Energy's Princeton Plasma Physics Laboratory (PPPL) and chip equipment maker Applied Materials Inc., which is already using the tool. Researchers from the University of Alberta, PPPL and Los Alamos National Laboratory contributed to the project.

A research team from the Department of Energy's Oak Ridge National Laboratory, in collaboration with North Carolina State University, has developed a simulation capable of predicting how tens of thousands of electrons move in materials in real time, meaning the natural time of the physical process rather than compute time.

The project reflects a longstanding partnership between ORNL and NCSU, combining ORNL’s expertise in time-dependent quantum methods with NCSU’s advanced quantum simulation platform developed under the leadership of Professor Jerry Bernholc.

Using the Oak Ridge Leadership Computing Facility’s Frontier supercomputer, the world’s first to break the exascale barrier, the research team developed a real-time, time-dependent density functional theory, or RT-TDDFT, capability within the open-source Real-space Multigrid, or RMG, code to model systems of up to 24,000 electrons.
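The core operation in any real-time TDDFT code is stepping the electronic wavefunctions forward in time. As a toy sketch (not the RMG implementation), the snippet below propagates a single two-level system with a Crank-Nicolson step, the same style of norm-conserving time stepping that RT-TDDFT applies to thousands of orbitals at once.

```python
import math

# Toy real-time propagation of a two-level system with
# H = [[0, W], [W, 0]] and hbar = 1 (not the RMG code).
OMEGA = 1.0
DT = 0.001

def cn_step(psi):
    # Crank-Nicolson: (I + i*H*dt/2) psi_new = (I - i*H*dt/2) psi.
    # For this 2x2 H the linear solve is closed-form; the step is
    # unitary, so the wavefunction norm is conserved at every step.
    a = 1j * OMEGA * DT / 2
    b0 = psi[0] - a * psi[1]
    b1 = psi[1] - a * psi[0]
    det = 1 - a * a
    return [(b0 - a * b1) / det, (b1 - a * b0) / det]

psi = [1 + 0j, 0 + 0j]             # start fully in state 0
steps = round((math.pi / 2) / DT)  # propagate to t = pi/2
for _ in range(steps):
    psi = cn_step(psi)

norm = abs(psi[0]) ** 2 + abs(psi[1]) ** 2
pop1 = abs(psi[1]) ** 2            # Rabi transfer: near 1 at t = pi/2
```

Scaling this single 2x2 solve to 24,000 interacting electrons on a real-space grid, with the Hamiltonian itself changing every step, is what requires an exascale machine like Frontier.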

Analyzing massive datasets from nuclear physics experiments can take hours or days to process, but researchers are working to radically reduce that time to mere seconds using special software being developed at the Department of Energy’s Lawrence Berkeley and Oak Ridge national laboratories.

DELERIA—short for Distributed Event-Level Experiment Readout and Integrated Analysis—is a novel software platform designed specifically to support the GRETA spectrometer, a cutting-edge instrument for nuclear physics experiments. The Gamma Ray Energy Tracking Array (GRETA) is currently under construction at Berkeley Lab and is scheduled to be installed in 2026 at the Facility for Rare Isotope Beams (FRIB) at Michigan State University.

The software will enable GRETA to stream data directly to the nation’s leading computing centers with the goal of analyzing large datasets in seconds. The data will be sent via the Energy Sciences Network, or ESnet. This will allow researchers to make critical adjustments to the experiment as it is taking place, leading to increased scientific productivity with significantly faster, more accurate results.
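Conceptually, event-level streaming means analyzing detector events as they arrive rather than writing the full dataset to disk and processing it in batch afterward. The toy sketch below is not DELERIA itself; it just shows the pattern: events flow through a queue to an analysis worker that keeps a running histogram up to date during the run.

```python
import queue
import random
import threading

# Toy event-level streaming (not DELERIA): detector events go onto a
# queue as they arrive, and a worker updates a running energy
# histogram continuously, so results exist while the run is live.
events = queue.Queue()
histogram = [0] * 64    # coarse energy bins
DONE = object()         # sentinel to stop the worker

def analyzer():
    while True:
        energy = events.get()
        if energy is DONE:
            break
        histogram[min(63, max(0, int(energy)))] += 1

worker = threading.Thread(target=analyzer)
worker.start()

rng = random.Random(0)
for _ in range(10_000):           # the "detector" producing events
    events.put(rng.gauss(32, 6))  # toy gamma-ray energy, arb. units
events.put(DONE)
worker.join()

peak_bin = max(range(64), key=lambda i: histogram[i])
print(peak_bin)  # near the simulated line at bin 32
```

In the real system the queue is replaced by ESnet links to remote computing centers, and the "worker" is a fleet of analysis jobs, but the payoff is the same: experimenters see results in time to adjust the experiment.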

Can AI speed up aspects of the scientific process? Microsoft appears to think so.

At the company’s Build 2025 conference on Monday, Microsoft announced Microsoft Discovery, a platform that taps agentic AI to “transform the [scientific] discovery process,” according to a press release provided to TechCrunch. Microsoft Discovery is “extensible,” Microsoft says, and can handle certain science-related workloads “end-to-end.”

“Microsoft Discovery is an enterprise agentic platform that helps accelerate research and discovery by transforming the entire discovery process with agentic AI — from scientific knowledge reasoning to hypothesis formulation, candidate generation, and simulation and analysis,” explains Microsoft in its release. “The platform enables scientists and researchers to collaborate with a team of specialized AI agents to help drive scientific outcomes with speed, scale, and accuracy using the latest innovations in AI and supercomputing.”

China has begun launching satellites for a giant computer network in space, according to the China Aerospace Science and Technology Corporation.

Newsweek contacted the company and the United States Space Force for comment.

Why It Matters

Space is an increasing frontier for competition between China and the United States. Putting a computer network in space marks a step change from using satellites only for sensing and communications while leaving them dependent on their connections to Earth for data processing.