
‘It seemed to defy the laws of physics’: The everlasting ‘memory crystals’ that could slash data centre emissions

In the face of rising emissions from data centres, researchers are turning to micro-explosions in glass, and using DNA to solve big data’s big problem.

Mathematicians make a breakthrough on 2,000-year-old problem of curves

From the article:

“A Rule for Every Curve”

That’s where the new proof comes in. Its authors present a formula that can be applied to any curve in the mathematical universe, whatever its degree. It doesn’t say precisely how many rational points that curve has, but it gives an upper limit on what that number can be.

Previous formulas of this kind either didn’t apply to all curves or depended on the specific equation used to define them. The new formula is something mathematicians have hoped for since Faltings’s proof, a “uniform” statement that applies to all curves without depending on the coefficients in their equations. “This one statement gives us a broad sweep of understanding,” Mazur says.

It depends on only two things. The first is the degree of the polynomial that defines the curve—the higher the degree is, the weaker the statement becomes. The second thing the formula depends on is called the “Jacobian variety,” a special surface that can be constructed from any curve. Jacobian varieties are interesting in their own right, and the formula offers a tantalizing path for studying them as well.


Since ancient Greece, researchers have tried to isolate special rational points on curves. Now they have the first ever formula that applies uniformly to all curves.

Quantum algorithm beats classical tools on complement sampling tasks

Quantum computers—devices that process information using quantum mechanical effects—have long been expected to outperform classical systems on certain tasks. Over the past few decades, researchers have worked to rigorously demonstrate such advantages, ideally in ways that are provable, verifiable and experimentally realizable.

A team of researchers working at Quantinuum in the United Kingdom and QuSoft in the Netherlands has now developed a quantum algorithm that solves a specific sampling task—known as complement sampling—dramatically more efficiently than any classical algorithm. Their paper, published in Physical Review Letters, establishes a provable and verifiable quantum advantage in sample complexity: the number of samples required to solve a problem.

“We stumbled upon the core result of this work by chance while working on a different project,” Harry Buhrman, co-author of the paper, told Phys.org. “We had a set of items and two quantum states: one formed from half of the items, the other formed from the remaining half. Even though the two states are fundamentally distinct, we showed that a quantum computer may find it hard to tell which one it is given. Surprisingly, however, we then realized that transforming one state into the other is always easy, because a simple operation can swap between them.”
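The task Buhrman describes can be made concrete with a toy classical baseline. This sketch is illustrative only — the universe size, set sizes, and strategy are my own, not from the paper. A classical sampler that only sees random draws from a hidden set S must observe most of S before any unseen element can reliably be called a member of the complement, which is why an advantage in sample complexity is possible here.

```python
import random

random.seed(0)

# Toy complement-sampling setup: split a universe of N items into a
# hidden set S and its complement. A classical strategy sees only
# draws from S and must output an element of the complement.
N = 16
S = set(random.sample(range(N), N // 2))

def classical_complement_guess(num_samples):
    """Naive classical strategy: collect draws from S, then guess a
    not-yet-seen element. With few samples, many elements of S remain
    unseen, so the guess often lands back inside S."""
    seen = {random.choice(sorted(S)) for _ in range(num_samples)}
    unseen = [x for x in range(N) if x not in seen]
    return random.choice(unseen)

guess = classical_complement_guess(4)
print(guess, guess in (set(range(N)) - S))
```

The point of the sketch is the failure mode: until roughly |S| samples have been drawn, "unseen" does not mean "in the complement," whereas the quantum algorithm sidesteps sample collection by transforming one state into the other directly.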

Quantum computers go high-dimensional with a four-state photon gate

A collaboration between TU Wien and research groups in China has produced a crucial building block for a new kind of quantum computer: the realization of a novel type of quantum logic gate makes it possible to carry out quantum computations on pairs of photons that are each in four different quantum states, or combinations thereof. The advancement is an important milestone for optical quantum computers. The study has now been published in Nature Photonics.

The basic idea of quantum computers is simple: While a classical computer only works with the values “0” and “1,” quantum physics allows for arbitrary combinations of these states. In a certain sense, a quantum bit (“qubit”) can be in the states 0 and 1 simultaneously. This makes it possible to develop algorithms that can solve some problems much faster than a comparable classical computer.

However, such superpositions can in principle involve more than two states. Depending on what degree of freedom one considers, a quantum system such as a photon may not just have two different settings—two different outcomes of a potential measurement—but many. In this case, one refers to the system as a “qudit” rather than a “qubit.”
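The jump from qubits to four-level qudits can be seen in a few lines of linear algebra. This is my own illustration, not code from the study: a qubit state is a normalized vector in a 2-dimensional space, a four-level qudit ("ququart") in a 4-dimensional one, and a pair of ququarts spans 16 dimensions versus 4 for a qubit pair.

```python
import numpy as np

# A qubit lives in C^2; a four-level qudit ("ququart") lives in C^4.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition of |0>, |1>
ququart = np.full(4, 0.5)                   # equal superposition of |0>..|3>

# Both are valid states: squared amplitudes sum to 1.
assert np.isclose(np.sum(np.abs(qubit) ** 2), 1.0)
assert np.isclose(np.sum(np.abs(ququart) ** 2), 1.0)

# A pair of ququarts occupies a 16-dimensional joint space, versus 4
# for a pair of qubits -- which is why a gate acting on two four-state
# photons packs in more computation per photon pair.
pair = np.kron(ququart, ququart)
print(pair.size)  # 16
```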

GOOD LUCK, HAVE FUN, DON’T DIE — Welcome To The Perfect Prison

Gore Verbinski’s Good Luck, Have Fun, Don’t Die hits like a nasty mirror held up at the worst possible angle. On paper, the setup sounds almost playful: a “Man From the Future” drops into a diner in Los Angeles and has to recruit the exact combination of disgruntled strangers for a one-night mission to stop a rogue AI. But the horror isn’t metal skeletons and laser fire. It’s the idea that the end of humanity doesn’t arrive with an explosion. It arrives with an upgrade. A perfectly tuned stream of algorithmic entertainment that doesn’t merely distract people—it replaces them. A manufactured paradise so frictionless, so gratifying, so chemically rewarding, that the messy, strenuous, inconvenient act of being human starts to feel obsolete.


AI in Pathology Fails Without Pathologists

🧠 AI in pathology cannot succeed without pathologists. As computational pathology advances, clinical expertise remains the critical link between algorithms and real-world impact.

In this discussion, Diana Montezuma, Pathologist and Head of R&D at IMP Diagnostics, explains why pathologist involvement is essential to building AI tools that are usable, clinically relevant, and truly valuable in practice.

👉 Read the discussion:


Pathologists play a key role in AI development for pathology – providing the expertise needed to bridge data and clinical application. To discuss this role and its importance in the development of computational pathology tools, we connected with Diana Montezuma, Pathologist and Head of the R&D Unit at IMP Diagnostics.

From your perspective, what is the most important contribution that diagnosticians bring to AI and algorithm development?

Pathologists bring essential clinical expertise and practical insight to any computational pathology project. Without their involvement, such initiatives risk becoming disconnected from real-world practice and ultimately failing to deliver meaningful clinical value.

Machine learning algorithm fully reconstructs LHC particle collisions

The CMS Collaboration has shown, for the first time, that machine learning can be used to fully reconstruct particle collisions at the LHC. This new approach can reconstruct collisions more quickly and precisely than traditional methods, helping physicists better understand LHC data. The paper has been submitted to the European Physical Journal C and is currently available on the arXiv preprint server.

Each proton–proton collision at the LHC sprays out a complex pattern of particles that must be carefully reconstructed to allow physicists to study what really happened. For more than a decade, CMS has used a particle-flow (PF) algorithm, which combines information from the experiment’s different detectors, to identify each particle produced in a collision. Although this method works remarkably well, it relies on a long chain of hand-crafted rules designed by physicists.

The new CMS machine-learning-based particle-flow (MLPF) algorithm approaches the task fundamentally differently, replacing much of the rigid hand-crafted logic with a single model trained directly on simulated collisions. Instead of being told how to reconstruct particles, the algorithm learns how particles look in the detectors, like how humans learn to recognize faces without memorizing explicit rules.
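As a loose analogy for the shift this paragraph describes — from a chain of hand-crafted rules to a model fit on simulated data — here is a hypothetical toy classifier on two invented detector features. It is not the MLPF model (which is a large neural network trained on full simulated collisions); it only illustrates the idea of learning a decision rule from labeled simulation instead of writing it by hand.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented training data: two made-up detector features per object
# (think calorimeter energy fraction vs. track momentum), with
# "electron-like" objects clustered near (1.0, 0.2) and "hadron-like"
# objects near (0.3, 1.0). Real MLPF inputs are far richer.
n = 200
electrons = rng.normal([1.0, 0.2], 0.1, size=(n, 2))
hadrons = rng.normal([0.3, 1.0], 0.1, size=(n, 2))
X = np.hstack([np.vstack([electrons, hadrons]), np.ones((2 * n, 1))])
y = np.array([1.0] * n + [0.0] * n)

# One model fit to labeled simulation replaces the hand-written rule chain.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(features):
    """Classify an object from its features using the learned weights."""
    score = np.append(features, 1.0) @ w
    return "electron-like" if score > 0.5 else "hadron-like"

print(predict([1.0, 0.2]))  # electron-like
print(predict([0.3, 1.0]))  # hadron-like
```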

The Truth About Merging With AI

Will humans one day merge with artificial intelligence? Futurist Ray Kurzweil predicts a coming “singularity” where humans upload their minds into digital systems, expanding intelligence and potentially achieving immortality. But critics argue that consciousness, creativity, love, and spiritual awareness cannot be reduced to algorithms. This discussion explores brain-computer interfaces, quantum mechanics and the mind, the Ship of Theseus identity paradox, and whether a digital copy of your brain would actually be you. Is AI-driven immortality possible—or does it misunderstand what it means to be human?

Every year the Center sponsors COSM, an exclusive national summit on the converging technologies remaking the world as we know it. Visit COSM.TECH (https://cosm.tech/) for information on COSM 2025, November 19–21 at the beautiful Hilton Scottsdale Resort and Spas in Scottsdale, AZ. Registration will launch in mid-July.

The mission of the Walter Bradley Center for Natural and Artificial Intelligence at Discovery Institute is to explore the benefits as well as the challenges raised by artificial intelligence (AI) in light of the enduring truth of human exceptionalism. People know at a fundamental level that they are not machines. But faulty thinking can cause people to assent to views that in their heart of hearts they know to be untrue. The Bradley Center seeks to help individuals—and our society at large—to realize that we are not machines while at the same time helping to put machines (especially computers and AI) in proper perspective.


Comparative single-cell lineage bias in human and murine hematopoietic stem cells

A comparative single-cell analysis reveals similarities and differences in lineage bias between human and murine hematopoietic stem cells. This work deepens our understanding of how lineage commitment is regulated across species and provides a valuable framework for translating insights from mouse models to human hematopoiesis.


The commitment of hematopoietic stem cells (HSC) to myeloid, erythroid, and lymphoid lineages is influenced by microenvironmental cues, and governed by cell-intrinsic and epigenetic characteristics that are unique to the HSC population. To investigate the nature of lineage commitment bias in human HSC, mitochondrial single-cell assay for transposase-accessible chromatin (ATAC)-sequencing was used to identify somatic mutations in mitochondrial DNA to act as natural genetic barcodes for tracking the ex vivo differentiation potential of HSC to mature cells. Clonal lineages of human CD34+ cells and their mature progeny were normally distributed across the hematopoietic lineage tree without evidence of significant skewing. To investigate commitment bias in vivo, mice were transplanted with limited numbers of long-term HSC (LT-HSC). Variation in the ratio of myeloid and lymphoid cells between donors was suggestive of a skewed output but was not altered by increasing numbers of LT-HSC. These data suggest that the variation in myeloid and lymphoid engraftment is a stochastic process dominated by the irradiated recipient niche with minor contributions from cell-intrinsic lineage biases of LT-HSC.

Hematopoietic stem cells (HSC) are classically considered to have the capacity for complete regeneration of the hematopoietic compartment. More recent analyses indicate additional complexity and heterogeneity in the HSC compartment, with lineage-restricted or lineage-biased HSC considered a feature of mammalian hematopoiesis [1–13]. A partial differential equation model to study relationships between hematopoietic stem and progenitor cells (HSPC) emphasizes that myeloid bias cannot be accounted for solely by short-term HSC bias during inflammation but rather involves a combination of HSC and progenitor cell biases [14]. Central to the concept of lineage bias is an assumption that cells used for studying HSC commitment are HSC and not multipotent progenitors or lineage-committed progenitors. Changes in differentiation of cells downstream of the long-term HSC (LT-HSC) must also be evaluated when considering the potential lineage bias of a LT-HSC.
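The barcoding logic in the abstract — cells sharing a somatic mtDNA variant are assigned to the same clone, and each clone's output across lineages is then tallied — can be sketched as a simple grouping. The variant names and counts below are invented for illustration and do not come from the study.

```python
from collections import defaultdict

# Hypothetical single-cell records: each cell carries a detected mtDNA
# variant (its natural barcode) and a mature-lineage call. Values are
# illustrative, not data from the paper.
cells = [
    {"variant": "m.3243A>G", "lineage": "myeloid"},
    {"variant": "m.3243A>G", "lineage": "erythroid"},
    {"variant": "m.1555A>G", "lineage": "lymphoid"},
    {"variant": "m.3243A>G", "lineage": "lymphoid"},
]

# Group cells into clones by shared variant, then tally lineage output.
clones = defaultdict(lambda: defaultdict(int))
for cell in cells:
    clones[cell["variant"]][cell["lineage"]] += 1

# A clone spread evenly across lineages shows no skew -- the pattern
# the abstract reports for clonal lineages of human CD34+ cells.
print(dict(clones["m.3243A>G"]))  # {'myeloid': 1, 'erythroid': 1, 'lymphoid': 1}
```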

Q-Day: Catastrophic For Businesses Ignoring Quantum-Resistant Encryption



Quantum computing is not merely a frontier of innovation; it is a countdown. Q-Day is the pivotal moment when scalable quantum computers undermine the cryptographic underpinnings of our digital realm. It is approaching more rapidly than many comprehend.

For corporations and governmental entities reliant on outdated encryption methods, Q-Day will not herald a smooth transition; it may signify a digital catastrophe.

Comprehending Q-Day: The Quantum Reckoning

Q-Day arrives when quantum machines using Shor’s algorithm can dismantle public-key encryption within minutes—a task that classical supercomputers would require billions of years to accomplish.
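Shor's algorithm threatens public-key cryptography because factoring reduces to period finding. The sketch below performs the period finding by brute force — the one step a quantum computer speeds up exponentially — and then recovers a factor exactly as Shor's reduction prescribes. It is a classical illustration of the reduction, not a quantum implementation, and the tiny modulus is for demonstration only.

```python
from math import gcd

def order(a, n):
    """Find the multiplicative order r of a mod n, i.e. the smallest
    r with a**r = 1 (mod n). This brute-force search is the step
    Shor's algorithm replaces with a polynomial-time quantum routine."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a=2):
    """Classical sketch of Shor's reduction: from the period r of
    a mod n, derive a nontrivial factor via gcd(a**(r/2) - 1, n)."""
    if gcd(a, n) != 1:
        return gcd(a, n)          # lucky guess already shares a factor
    r = order(a, n)
    if r % 2:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, n)

print(shor_factor(15))  # 3
```

With a real RSA modulus (thousands of bits), `order` is hopeless classically — that brute-force loop is precisely the "billions of years" in the paragraph above — while a fault-tolerant quantum computer would complete the equivalent step in polynomial time.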
