
The research conducted by Elena Hassinger, an expert in low-temperature physics working at ct.qmat—Complexity and Topology in Quantum Matter (a joint initiative of the universities of Würzburg and Dresden), has always been synonymous with extreme cold.

In 2021, she discovered the unconventional superconductor cerium-rhodium-arsenic (CeRh2As2). Superconductors normally have just one phase of resistance-free electron transport, which occurs below a certain critical temperature. However, as reported in the academic journal Science, CeRh2As2 is so far the only known quantum material to exhibit two distinct superconducting states.

Lossless current conduction in superconductors has remained a central focus in solid-state physics for decades and has emerged as a significant prospect for the future of power engineering. The discovery of a second superconducting phase in CeRh2As2, which results from an asymmetric crystal structure around the cerium atom (the rest of the crystal structure is completely symmetrical), positions this compound as a prime candidate for use in topological quantum computing.

It is still unclear whether and how quantum computing might prove useful in solving known large-scale classical machine learning problems. Here, the authors show that variants of known quantum algorithms for solving differential equations can provide an advantage in solving some instances of stochastic gradient descent dynamics.
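As context for what "stochastic gradient descent dynamics" means here, the following is a minimal classical sketch (not the paper's quantum algorithm) of SGD on a one-dimensional quadratic. In the continuum limit, updates like these trace out a stochastic differential equation, which is the kind of dynamics that quantum differential-equation solvers target. All names and parameters below are illustrative.

```python
import random

def sgd(grad, x0, lr=0.1, steps=200, noise=0.01, seed=0):
    """Plain stochastic gradient descent: follow a noisy gradient estimate."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        g = grad(x) + rng.gauss(0.0, noise)  # noisy, minibatch-style gradient
        x -= lr * g
    return x

# minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3)
x_star = sgd(lambda x: 2 * (x - 3), x0=0.0)
print(f"converged near: {x_star:.2f}")
```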

Before delving into the prospects of the Fifth Industrial Revolution, let’s reflect on the legacy of its predecessor. The Fourth Industrial Revolution, characterised by the fusion of digital, physical, and biological systems, has already transformed the way we live and work. It brought us AI, blockchain, the Internet of Things, and more. However, it also raised concerns about automation’s impact on employment and privacy, leaving us with a mixed legacy.

The promise of the Fifth Industrial Revolution

The Fifth Industrial Revolution represents a quantum leap forward. At its core, it combines AI, advanced biotechnology, nanotechnology, and quantum computing to usher in a new era of possibilities. One of its most compelling promises is the extension of human life. With breakthroughs in genetic engineering, regenerative medicine, and AI-driven healthcare, we are inching closer to not just treating diseases but preventing them altogether. It’s a vision where aging is not an inevitability, but a challenge to overcome.

A new study in Physical Review Letters illuminates the intricacies of energy exchanges within bipartite quantum systems, offering profound insights into quantum coherence, pure dephasing effects, and the potential impact on future quantum technologies.

In quantum systems, the behavior of particles is governed by probability distributions and wave functions, adding layers of complexity to the understanding of energy exchanges.

The exploration of energy exchanges in quantum systems inherently involves tackling the complexities arising from quantum effects and the scales at which quantum systems operate, which introduce extreme sensitivity.
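Pure dephasing, one of the effects the study examines, has a simple single-qubit caricature: coherences (the off-diagonal density-matrix elements) decay while populations stay fixed. A minimal sketch, assuming an exponential decay at a rate gamma (all values illustrative):

```python
import math

def dephase(rho, gamma, t):
    """Pure dephasing on one qubit: off-diagonal elements (coherences)
    shrink by exp(-gamma*t); diagonal populations are untouched."""
    decay = math.exp(-gamma * t)
    return [[rho[0][0],         rho[0][1] * decay],
            [rho[1][0] * decay, rho[1][1]]]

# equal superposition state |+><+| as a 2x2 density matrix
rho0 = [[0.5, 0.5], [0.5, 0.5]]
rho_t = dephase(rho0, gamma=1.0, t=2.0)
print(f"populations: {rho_t[0][0]:.2f}, {rho_t[1][1]:.2f}")
print(f"coherence:   {abs(rho_t[0][1]):.3f}")
```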

A study led by the University of Oxford has used the power of machine learning to overcome a key challenge affecting quantum devices. For the first time, the findings reveal a way to close the “reality gap”: the difference between predicted and observed behavior from quantum devices. The results have been published in Physical Review X.

Quantum computing could supercharge a wealth of applications, from climate modeling and financial forecasting to drug discovery and artificial intelligence. But this will require effective ways to scale up and combine individual quantum devices (also called qubits). A major barrier to this is inherent variability, where even apparently identical units exhibit different behaviors.

Functional variability is presumed to be caused by nanoscale imperfections in the materials from which quantum devices are made. Since there is no way to measure these imperfections directly, this internal disorder cannot be captured in simulations, leading to the gap between predicted and observed outcomes.
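The reality-gap argument can be illustrated with a toy model (hypothetical numbers and device response, not the Oxford group's actual model): hidden disorder shifts each device's threshold voltage, while the simulator, blind to that disorder, predicts the same response for every device.

```python
import random

def measured_current(v_gate, disorder, v_th=0.5):
    """Toy turn-on curve: hidden disorder shifts the threshold voltage."""
    return max(0.0, v_gate - (v_th + disorder))

def simulated_current(v_gate, v_th=0.5):
    """The simulator has no access to the disorder, so it assumes none."""
    return max(0.0, v_gate - v_th)

random.seed(1)
disorders = [random.gauss(0.0, 0.05) for _ in range(3)]  # three "identical" devices
for i, d in enumerate(disorders):
    obs, pred = measured_current(0.7, d), simulated_current(0.7)
    print(f"device {i}: predicted={pred:.3f}, observed={obs:.3f}, gap={obs - pred:+.3f}")
```

Same nominal device, same simulation, but each physical unit responds differently: that per-device discrepancy is the gap a learned disorder model would need to close.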



When we look at scientific progress, especially in physics, it can seem like all the great discoveries lie behind us. Since the revolutions of Einstein’s theory of relativity and quantum mechanics, physicists have been struggling to find a way to make them fit together with little to no success. Tim Palmer argues that the answer to this stalemate lies in chaos theory.

Revisiting a book by John Horgan, science communicator and theoretical physicist Sabine Hossenfelder recently asked on her YouTube channel whether we are facing the end of science. It might seem like a rhetorical question — it’s not possible for science to really end — but she concludes that we are in dire need of some new paradigms in physics, and seemingly unable to arrive at them. We are yet to solve the deep ongoing mysteries of the dark universe and still haven’t convincingly synthesised quantum and gravitational physics. She suggests that ideas from chaos theory might hold some of the answers, and therefore the ability to rejuvenate science. I think she’s right.

Many physicists – perhaps most – might think this is surely a silly idea. After all, chaotic systems are describable by elementary classical Newtonian dynamics. The phenomenon of chaos can be illustrated by taking the simplest of dynamical systems, the pendulum, and simply adding a second pivot into its swinging arm. The motion of the tip of the pendulum arm is hard to predict, being sensitive to its exact starting conditions – the hallmark of chaos. Fascinating yes, but surely, if we have learned anything over the last 100 years it is this: we are not going to make progress in fundamental physics by going back to elementary classical dynamics.
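That sensitivity to starting conditions can be checked numerically. Below is a minimal sketch of a double pendulum with equal unit masses and arm lengths, integrated with a fourth-order Runge–Kutta scheme; two runs whose starting angles differ by one billionth of a radian end up macroscopically far apart after ten simulated seconds. The initial angles and step size are illustrative choices.

```python
import math

G = 9.81  # gravity; both arms have unit length and unit mass

def derivs(state):
    """Equations of motion for a double pendulum (m1=m2=1, l1=l2=1)."""
    t1, w1, t2, w2 = state
    d = t1 - t2
    den = 3.0 - math.cos(2 * d)
    a1 = (-3 * G * math.sin(t1) - G * math.sin(t1 - 2 * t2)
          - 2 * math.sin(d) * (w2**2 + w1**2 * math.cos(d))) / den
    a2 = (2 * math.sin(d) * (2 * w1**2 + 2 * G * math.cos(t1)
          + w2**2 * math.cos(d))) / den
    return (w1, a1, w2, a2)

def rk4_step(state, dt):
    """One classic fourth-order Runge-Kutta step."""
    def shift(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = derivs(state)
    k2 = derivs(shift(state, k1, dt / 2))
    k3 = derivs(shift(state, k2, dt / 2))
    k4 = derivs(shift(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + e)
                 for s, a, b, c, e in zip(state, k1, k2, k3, k4))

def simulate(theta1, steps=10000, dt=0.001):
    state = (theta1, 0.0, 2.0, 0.0)  # both arms raised high, at rest
    for _ in range(steps):
        state = rk4_step(state, dt)
    return state

a = simulate(2.0)
b = simulate(2.0 + 1e-9)  # nudge the first angle by a billionth of a radian
gap = abs(a[0] - b[0])
print(f"angle separation after 10 s: {gap:.3e}")
```

The tiny initial offset is amplified exponentially, which is exactly why long-range prediction of a chaotic system fails in practice even though its equations are fully deterministic.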

The first functional semiconductor made from graphene has been created at the Georgia Institute of Technology. This could enable smaller and faster electronic devices and may have applications for quantum computing.

Credit: Georgia Institute of Technology.

Semiconductors, which are materials that conduct electricity under specific conditions, are foundational components of electronic devices like the chips in your computer, laptop, and smartphone. For many decades, their transistors have been getting smaller and more densely packed – a trend known as Moore's Law. This has enabled gigantic leaps in a vast range of technologies, from general computing speeds and video game graphics to the resolution of medical scans and the sensitivity of astronomical observatories.
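The doubling trend can be made concrete with a back-of-the-envelope extrapolation. The baseline below (the Intel 4004's roughly 2,300 transistors in 1971) is a commonly cited figure, and the two-year doubling period is the usual rule-of-thumb statement of Moore's Law, not an exact physical constant.

```python
def transistor_estimate(count_start, year_start, year_end, doubling_years=2.0):
    """Moore's-law extrapolation: count doubles every ~doubling_years."""
    doublings = (year_end - year_start) / doubling_years
    return count_start * 2 ** doublings

# 50 years of doubling from the Intel 4004's ~2,300 transistors (1971):
print(f"{transistor_estimate(2300, 1971, 2021):,.0f}")
```

Fifty years is 25 doublings, a factor of about 33 million, landing in the tens of billions of transistors, which is the right order of magnitude for recent flagship chips.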