
The world’s fastest supercomputer, El Capitan, can reach a peak performance of 2.746 exaFLOPS, making it the planet’s third exascale computer.

Scientists at Argonne National Laboratory have leveraged the Frontier supercomputer at Oak Ridge National Laboratory to create an unprecedented simulation of the universe, encompassing a span of 10 billion light years and incorporating complex physics models.

This monumental achievement allows for new insights into galaxy formation and cosmic evolution, showcasing the profound capabilities of exascale computing.

Breakthrough in Universe Simulation.

Researchers have pioneered the use of parallel computing on graphics cards to simulate acoustic turbulence. This type of simulation, which previously required a supercomputer, can now be performed on a standard personal computer. The discovery will make weather forecasting models more accurate while enabling the use of turbulence theory in various fields of physics, such as astrophysics, to calculate the trajectories and propagation speeds of acoustic waves in the universe. The research was published in Physical Review Letters.
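The study’s own solver isn’t reproduced in the article. As a rough illustration of why this kind of simulation maps so well onto graphics cards, the sketch below advances a Burgers-type nonlinear wave (a standard stand-in for nonlinear acoustics, not the authors’ actual model) using purely vectorized array operations; swapping NumPy for CuPy would run the identical code on a GPU. The grid size, time step, and damping are illustrative choices.

```python
import numpy as np  # swap for `import cupy as np` to run on a GPU

N = 4096                       # grid points (illustrative)
length = 2 * np.pi             # periodic domain length
dx = length / N
dt = 5e-5                      # time step chosen inside stability limits
nu = 1e-2                      # viscosity: keeps the forming shock resolved

x = np.linspace(0.0, length, N, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(3 * x)    # large-amplitude initial wave

for _ in range(10_000):
    # Centered finite differences via np.roll: every update is a fused,
    # data-parallel array operation, exactly the pattern GPUs accelerate.
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u = u + dt * (-u * ux + nu * uxx)   # Burgers-type nonlinear advection

print("field min/max after evolution:", float(u.min()), float(u.max()))
```

Because each grid point is updated independently from its neighbors’ previous values, the same code parallelizes across thousands of GPU cores with no algorithmic changes, which is the essence of the approach described above.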

Turbulence is the complex, chaotic behavior of fluids, gases, or nonlinear waves in various physical systems. For example, turbulence at the ocean surface can be caused by wind or wind-driven currents, while turbulence of laser radiation in optics occurs as light is scattered by lenses. Turbulence can also occur in sound waves that propagate chaotically in certain media, such as superfluid helium.

In the 1970s, Soviet scientists proposed that turbulence occurs when sound waves deviate from equilibrium and reach large amplitudes. The theory of wave turbulence applies to many other wave systems, including magnetohydrodynamic waves in the ionospheres of stars and giant planets, and perhaps even waves in the early universe. Until recently, however, it has been nearly impossible to predict the propagation patterns of nonlinear, chaotically moving acoustic and other waves because of the high computational complexity involved.

Astronomers have released a set of more than a million simulated images showcasing the cosmos as NASA’s upcoming Nancy Grace Roman Space Telescope will see it. This preview will help scientists explore Roman’s myriad science goals.

“We used a supercomputer to create a synthetic universe and simulated billions of years of evolution, tracing every photon’s path all the way from each cosmic object to Roman’s detectors,” said Michael Troxel, an associate professor of physics at Duke University in Durham, North Carolina, who led the simulation campaign. “This is the largest, deepest, most realistic synthetic survey of a mock universe available today.”

The project, called OpenUniverse, relied on the now-retired Theta supercomputer at the U.S. Department of Energy’s (DOE’s) Argonne National Laboratory in Illinois. In just nine days, the supercomputer accomplished a process that would take over 6,000 years on a typical computer.
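As a quick sanity check on that figure, the implied speedup can be computed directly (assuming “over 6,000 years” refers to wall-clock time on a single typical machine):

```python
theta_days = 9                      # reported Theta runtime
typical_years = 6_000               # reported single-machine estimate
typical_days = typical_years * 365.25

speedup = typical_days / theta_days
print(f"implied speedup: ~{speedup:,.0f}x")   # ~243,500x
```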

In today’s AI news, coding might go this year from one of the most sought-after skills on the job market to one that can be fully automated. On Friday’s episode of the Joe Rogan Experience, Mark Zuckerberg said that Meta and some of the biggest companies in the tech industry are already working toward this.

In other advancements, NovaSky, a team of researchers based out of UC Berkeley’s Sky Computing Lab, released Sky-T1-32B-Preview, a reasoning model that’s competitive with an earlier version of OpenAI’s o1. “Remarkably, Sky-T1-32B-Preview was trained for less than $450,” the team wrote in a blog post, “demonstrating that it is possible to replicate high-level reasoning capabilities affordably and efficiently.”

And no company has capitalized on the AI revolution more dramatically than Nvidia. The world’s leading high-performance GPU maker has used its ballooning fortunes to significantly increase its investments in all sorts of startups, particularly AI startups.

Meanwhile, Sir Keir Starmer has green-lit a plan to use the immigration system to recruit a new wave of AI experts and to loosen data-mining regulations to help Britain lead the world in the new technology. The recruitment of thousands of new AI experts by the government and the private sector is part of a 50-point plan to transform Britain with the technology.

In videos, newly deployed at Lawrence Livermore National Laboratory, El Capitan, the National Nuclear Security Administration’s (NNSA) first exascale supercomputer, is setting new benchmarks in computing power. At 2.79 exaFLOPS of peak performance, El Capitan’s unprecedented capabilities are already impacting scientific computing and making the previously unimaginable a reality.

The mention of gravity and quantum in the same sentence often elicits discomfort from theoretical physicists, yet the effects of gravity on quantum information systems cannot be ignored. In a recently announced collaboration between the University of Connecticut, Google Quantum AI, and the Nordic Institute for Theoretical Physics (NORDITA), researchers explored the interplay of these two domains, quantifying the nontrivial effects of gravity on transmon qubits.

Led by Alexander Balatsky of UConn’s Quantum Initiative, along with Google’s Pedram Roushan and NORDITA researchers Patrick Wong and Joris Schaltegger, the study focuses on the gravitational redshift. This phenomenon slightly detunes the energy levels of qubits based on their position in a gravitational field. While negligible for a single qubit, this effect becomes measurable when scaled.

While quantum computers can be effectively protected from electromagnetic radiation, quantum technology cannot at this point be shielded from the effects of gravity, barring some innovative antigravitic device expansive enough to hold an entire quantum computer. The team demonstrated that gravitational interactions create a universal dephasing channel, disrupting the coherence required for quantum operations. However, these same interactions could also be used to develop highly sensitive gravitational sensors.
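The announcement doesn’t include code, and the exact channel derived in the paper isn’t reproduced here; the sketch below uses the standard textbook phase-damping channel (with an arbitrary, illustrative dephasing probability p) to show how dephasing erodes a qubit’s off-diagonal coherence while leaving its populations intact.

```python
import numpy as np

# Phase-damping (dephasing) channel on one qubit:
#   rho -> K0 rho K0^† + K1 rho K1^†,  K0 = sqrt(1-p)*I,  K1 = sqrt(p)*Z
p = 0.1                                   # illustrative dephasing probability
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
K0, K1 = np.sqrt(1 - p) * I2, np.sqrt(p) * Z

# Start in |+> = (|0> + |1>)/sqrt(2): maximal coherence.
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
rho = plus @ plus.T

for step in range(5):
    rho = K0 @ rho @ K0.T + K1 @ rho @ K1.T
    # Populations (diagonal) are untouched; the coherence (off-diagonal)
    # shrinks by a factor (1 - 2p) per application.
    print(f"step {step + 1}: coherence = {rho[0, 1]:.4f}")
```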

“Our research reveals that the same finely tuned qubits engineered to process information can serve as precise sensors—so sensitive, in fact, that future quantum chips may double as practical gravity sensors. This approach is opening a new frontier in quantum technology,” the researchers noted.

To explore these effects, the researchers modeled the gravitational redshift’s impact on energy-level splitting in transmon qubits. Gravitational redshift, a phenomenon predicted by Einstein’s general theory of relativity, occurs when light or electromagnetic waves traveling away from a massive object lose energy and shift to longer wavelengths. This happens because gravity alters the flow of time, causing clocks closer to a massive object to tick more slowly than those farther away.
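For a sense of scale, the weak-field redshift formula Δf/f ≈ gΔh/c² can be applied directly. The qubit frequency, height difference, and time window below are illustrative assumptions, not parameters from the study.

```python
import math

g = 9.81        # m/s^2, Earth's surface gravity
c = 2.998e8     # m/s, speed of light
f = 5e9         # Hz, assumed transmon transition frequency (illustrative)
dh = 0.01       # m, assumed 1 cm height difference between qubits

frac_shift = g * dh / c**2        # fractional detuning, df/f ~ g*dh/c^2
df = f * frac_shift               # absolute detuning in Hz
print(f"fractional shift: {frac_shift:.2e}")   # ~1.1e-18
print(f"detuning: {df:.2e} Hz")                # ~5.5e-9 Hz

# Relative phase accumulated between the two qubits over a time window T:
T = 100e-6                         # s, assumed operating window
phi = 2 * math.pi * df * T
print(f"phase over {T * 1e6:.0f} us: {phi:.2e} rad")  # tiny for one pair
```

The shift is vanishingly small for a single pair of qubits, but it is systematic rather than random, which is why, as the study argues, it becomes relevant at scale and can equally be read in reverse as a gravity measurement.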

A breakthrough in artificial intelligence.

Artificial Intelligence (AI) is a branch of computer science focused on creating systems that can perform tasks typically requiring human intelligence. These tasks include understanding natural language, recognizing patterns, solving problems, and learning from experience. AI technologies use algorithms and massive amounts of data to train models that can make decisions, automate processes, and improve over time through machine learning. The applications of AI are diverse, impacting fields such as healthcare, finance, automotive, and entertainment, fundamentally changing the way we interact with technology.
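As a deliberately tiny illustration of that “learning from experience” loop, the sketch below trains an off-the-shelf classifier on a small public dataset and measures how well it generalizes to unseen examples; the dataset and model are arbitrary illustrative choices, unrelated to any system mentioned above.

```python
# A model learns a decision rule from labeled examples, then is scored
# on examples it has never seen: the core train/evaluate cycle of ML.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)                    # "learning from experience"
print("held-out accuracy:", model.score(X_test, y_test))
```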