
The case for an antimatter Manhattan project

Chemical rockets have taken us to the moon and back, but traveling to the stars demands something more powerful. SpaceX's Starship can lift extraordinary masses to orbit and send payloads throughout the solar system using its chemical rockets, but it cannot fly to nearby stars at 30% of light speed and land. For missions beyond our local region of space, we need something fundamentally more energetic than chemical combustion, and physics offers exactly that: antimatter.

When antimatter encounters ordinary matter, the two annihilate completely, converting mass directly into energy according to Einstein's equation E=mc². That c² term is approximately 10¹⁷ in SI units, an almost incomprehensibly large multiplier. This makes antimatter roughly 1,000 times more energetic per kilogram than nuclear fission, the most powerful energy source currently in practical use.
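
A rough back-of-the-envelope check makes that comparison concrete. The sketch below uses standard physical constants and an approximate textbook fission yield for U-235; the numbers are illustrative, not taken from the article.

```python
# Energy density of matter-antimatter annihilation vs. uranium fission.
# The U-235 yield is an approximate textbook value (~200 MeV per fission,
# roughly 8e13 J per kg of fuel); the comparison is order-of-magnitude only.
C = 2.998e8                              # speed of light, m/s
ANNIHILATION_J_PER_KG = C ** 2           # E = mc^2: joules per kg of annihilated mass
FISSION_U235_J_PER_KG = 8.2e13           # approx. joules per kg of U-235 fissioned

print(f"annihilation: {ANNIHILATION_J_PER_KG:.2e} J/kg")   # ~9e16, the ~1e17 figure above
print(f"fission:      {FISSION_U235_J_PER_KG:.2e} J/kg")
print(f"ratio:        ~{ANNIHILATION_J_PER_KG / FISSION_U235_J_PER_KG:.0f}x")  # roughly 1,000x
```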

As a source of energy, antimatter could potentially enable spacecraft to reach nearby stars at significant fractions of the speed of light. A detailed technical analysis by Casey Handmer, CEO of Terraform Industries, outlines how humanity could develop practical antimatter propulsion within existing spaceflight budgets, requiring breakthroughs in three critical areas: production efficiency, reliable storage systems, and engine designs that can safely harness the most energetic fuel physically possible.

Advances in spacecraft control: New algorithm guarantees precision under extreme disturbances

An international team of researchers has unveiled a spacecraft attitude control system that can guarantee precise stabilization and maneuvering within a predefined time, even under extreme and unpredictable space disturbances.

Published in IEEE Transactions on Industrial Electronics, the study titled “Predefined-Time Disturbance Observer-Based Attitude Tracking Control for Spacecraft: A Solution for Arbitrary Disturbances” was led by Dr. Nguyen Xuan-Mung of Sejong University (South Korea), alongside colleagues from China and Taiwan.

This French company signs with a US data‑centre giant to build the world’s first reactor of its kind

As artificial intelligence devours electricity, a quiet nuclear revolution is taking shape deep below future data centers.

Across Europe, tech firms are staring at an uncomfortable equation: soaring digital demand, power grids near saturation, and climate goals that leave little room for more fossil fuels. A young French company now claims it can rewrite that equation with a compact reactor that hides underground and feeds on nuclear waste.

Quantum Algorithm Solves Metabolic Modeling Test

A Japanese research team from Keio University demonstrated that a quantum algorithm can solve a core metabolic-modeling problem, marking one of the earliest applications of quantum computing to a biological system. The study shows quantum methods can map how cells use energy and resources.

Flux balance analysis is a method widely used in systems biology to estimate how a cell moves material through metabolic pathways. It treats the cell as a network of reactions constrained by mass balance laws, finding reaction rates that maximize biological objectives like growth or ATP production.
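
Under the hood, flux balance analysis is a linear program: maximize an objective flux subject to steady-state mass balance (S v = 0) and reaction capacity bounds. The toy three-reaction network below is a minimal sketch solved with a classical solver; it is not the Keio team's model or their quantum algorithm.

```python
# Minimal flux balance analysis (FBA) sketch on a toy network:
#   R1: nutrient uptake -> A,  R2: A -> B,  R3: B -> biomass (the objective)
# maximize c^T v  subject to  S v = 0  (mass balance)  and  lb <= v <= ub.
import numpy as np
from scipy.optimize import linprog

S = np.array([
    [1, -1,  0],   # metabolite A: produced by R1, consumed by R2
    [0,  1, -1],   # metabolite B: produced by R2, consumed by R3
])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake R1 capped at 10 flux units
objective = np.array([0, 0, -1])           # linprog minimizes, so negate biomass flux

result = linprog(objective, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes (R1, R2, R3):", result.x)   # all 10: uptake limits growth
```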

The demonstration ran on a simulator rather than physical hardware, though the model followed the structure of quantum machines expected in the first wave of fault-tolerant systems. The simulation used only six qubits.

Mindscape 242 | David Krakauer on Complexity, Agency, and Information

Patreon: https://www.patreon.com/seanmcarroll.
Blog post with audio player, show notes, and transcript: https://www.preposterousuniverse.com/podcast/2023/07/10/242-…formation/

Complexity scientists have been able to make an impressive amount of progress despite the fact that there is not universal agreement about what “complexity” actually is. We know it when we see it, perhaps, but there are a number of aspects to the phenomenon, and different researchers will naturally focus on their favorites. Today’s guest, David Krakauer, is president of the Santa Fe Institute and a longtime researcher in complexity. He points the finger at the concept of agency. A ball rolling down a hill just mindlessly obeys equations of motion, but a complex system gathers information and uses it to adapt. We talk about what that means and how to think about the current state of complexity science.

David Krakauer received his D.Phil. in evolutionary biology from Oxford University. He is currently President and William H. Miller Professor of Complex Systems at the Santa Fe Institute. Previously he was at the University of Wisconsin, Madison, where he was the founding director of the Wisconsin Institute for Discovery and the Co-director of the Center for Complexity and Collective Computation. He was included in Wired magazine’s list of “50 People Who Will Change the World.”

Mindscape Podcast playlist: https://www.youtube.com/playlist?list=PLrxfgDEc2NxY_fRExpDXr87tzRbPCaA5x.
Sean Carroll channel: https://www.youtube.com/c/seancarroll.

#podcast #ideas #science #philosophy #culture

Association of blood-based DNA methylation of lncRNAs with Alzheimer’s disease diagnosis

DNA methylation has shown great potential in Alzheimer’s disease (AD) blood diagnosis. However, the ability of long non-coding RNAs (lncRNAs), which can be modified by DNA methylation, to serve as noninvasive biomarkers for AD diagnosis remains unclear.

We performed logistic regression analysis of DNA methylation data from the blood of patients with AD and normal controls to identify epigenetically regulated (ER) lncRNAs. Through five machine learning algorithms, we prioritized ER lncRNAs associated with AD diagnosis. An AD blood diagnosis model was constructed based on lncRNA methylation in the Australian Imaging, Biomarkers, and Lifestyle (AIBL) cohort and verified in two large blood-based studies, the European collaboration for the discovery of novel biomarkers for Alzheimer's disease (AddNeuroMed) and the Alzheimer's Disease Neuroimaging Initiative (ADNI). In addition, the potential biological functions and clinical associations of the lncRNAs were explored, and their neuropathological roles in AD brain tissue were estimated via cross-tissue analysis.
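
As a schematic of the kind of pipeline described above, the sketch below fits a logistic-regression diagnostic model on methylation-style features and scores it with ROC AUC. The data are simulated placeholders, not AIBL, AddNeuroMed, or ADNI measurements, and the feature count is only an assumption.

```python
# Schematic diagnostic-model sketch: logistic regression on blood methylation
# features (simulated here), evaluated by ROC AUC on held-out subjects.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_subjects, n_probes = 300, 15                       # e.g. 15 prioritized ER lncRNA probes
X = rng.beta(2, 5, size=(n_subjects, n_probes))      # methylation beta values in [0, 1]
y = rng.integers(0, 2, size=n_subjects)              # 1 = AD, 0 = control (random labels here)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"held-out ROC AUC: {auc:.2f}")                # ~0.5 by construction on random labels
```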

We characterized the ER lncRNA landscape in AD blood, which is strongly related to AD occurrence and progression. Fifteen ER lncRNAs were prioritized to construct an AD blood diagnostic model and nomogram. The receiver operating characteristic (ROC) curve and the decision and calibration curves show that the model has good predictive performance. We found that the lncRNAs and their targets were correlated with AD clinical features. Moreover, cross-tissue analysis revealed that the lncRNA ENSG0000029584 plays both diagnostic and neuropathological roles in AD.

BrainBody-LLM algorithm helps robots mimic human-like planning and movement

Large language models (LLMs), such as the model underpinning OpenAI's ChatGPT, are now widely used for a broad range of tasks, from sourcing information to generating text in different languages and even writing code. Many scientists and engineers have also started using these models to conduct research or advance other technologies.

In the context of robotics, LLMs have been found to be promising for the creation of robot policies derived from a user’s instructions. Policies are essentially “rules” that a robot needs to follow to correctly perform desired actions.

Researchers at NYU Tandon School of Engineering recently introduced a new algorithm called BrainBody-LLM, which leverages LLMs to plan and refine the execution of a robot’s actions. The new algorithm, presented in a paper published in Advanced Robotics Research, draws inspiration from how the human brain plans actions and fine-tunes the body’s movements over time.
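
To make the "brain plans, body executes and feeds back" idea concrete, here is a minimal conceptual sketch of such a closed loop. The function names, stubbed planner, and error strings are hypothetical illustrations, not the BrainBody-LLM algorithm itself.

```python
# Conceptual plan -> execute -> refine loop: a high-level planner ("brain")
# proposes steps, a low-level controller ("body") runs them and reports errors,
# and the feedback is folded into the next planning round.
from typing import List

def llm_plan(task: str, feedback: str = "") -> List[str]:
    """Stand-in for an LLM call that turns a task plus feedback into robot actions."""
    steps = ["move_to(object)", "grasp(object)", "move_to(target)", "release()"]
    if "grasp failed" in feedback:
        steps.insert(1, "reorient_gripper()")        # refine the plan using body feedback
    return steps

def body_execute(step: str, attempt: int) -> str:
    """Stand-in for the low-level controller; returns 'ok' or an error message."""
    if step == "grasp(object)" and attempt == 0:
        return "grasp failed: object slipped"
    return "ok"

task, feedback = "place the cup on the shelf", ""
for attempt in range(3):
    plan = llm_plan(task, feedback)
    errors = [body_execute(step, attempt) for step in plan]
    feedback = "; ".join(e for e in errors if e != "ok")
    if not feedback:
        print(f"attempt {attempt}: plan succeeded -> {plan}")
        break
    print(f"attempt {attempt}: refining after feedback -> {feedback}")
```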

Laude × CSGE: Bill Joy — 50 Years of Advancements: Computing and Technology 1975–2025 (and beyond)

From the rise of numerical and symbolic computing to the future of AI, this talk traces five decades of breakthroughs and the challenges ahead.


Bill is the author of Berkeley UNIX, cofounder of Sun Microsystems, author of “Why the Future Doesn’t Need Us” (Wired, 2000), ex-cleantech VC at Kleiner Perkins, and investor in and unpaid advisor to Nodra.AI.

Talk Details.
50 Years of Advancements: Computing and Technology 1975–2025 (and beyond)

I came to UC Berkeley CS in 1975 as a graduate student expecting to do computer theory; Berkeley CS didn’t have a proper departmental computer, and I was tired of coding, having written a lot of numerical code for early supercomputers.

But it’s hard to make predictions, especially about the future. Berkeley soon had a VAX superminicomputer, I installed a port of UNIX and was upgrading the operating system, and the Internet and microprocessor boom beckoned.

JUPITER supercomputer simulates a universal quantum computer with 50 qubits

A team from the Jülich Supercomputing Centre, working with NVIDIA specialists, has achieved a major milestone in quantum research: for the first time, they have successfully simulated a universal quantum computer with 50 qubits, using JUPITER, Europe’s first exascale supercomputer, which began operation at Forschungszentrum Jülich in September. New memory and compression innovations made the breakthrough possible.

This accomplishment breaks the previous record of 48 qubits, set by Jülich scientists in 2019 on Japan’s K computer. The new result highlights the extraordinary capabilities of JUPITER and provides a powerful testbed for exploring and validating quantum algorithms.

Simulating quantum computers is essential for advancing future quantum technologies. These simulations let researchers check experimental findings and experiment with new algorithmic approaches long before quantum hardware becomes advanced enough to run them directly. Key examples include the Variational Quantum Eigensolver (VQE), which can analyze molecules and materials, and the Quantum Approximate Optimization Algorithm (QAOA), used to improve decision-making in fields such as logistics, finance, and artificial intelligence.

Recreating a quantum computer on conventional systems is extremely demanding. As the number of qubits grows, the number of possible quantum states rises at an exponential rate. Each added qubit doubles the amount of computing power and memory required.

Although a typical laptop can still simulate around 30 qubits, reaching 50 qubits requires about 2 petabytes of memory, which is roughly two million gigabytes. ‘Only the world’s largest supercomputers currently offer that much,’ says Prof. Kristel Michielsen, Director at the Jülich Supercomputing Centre. ‘This use case illustrates how closely progress in high-performance computing and quantum research are intertwined today.’
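
The scaling itself is easy to reproduce: an n-qubit state vector holds 2^n complex amplitudes, so memory doubles with every added qubit. In the sketch below, 16 bytes per amplitude corresponds to full double-precision complex values, while roughly 2 bytes per amplitude is an assumed compressed encoding that would put 50 qubits near the quoted ~2 petabyte figure.

```python
# State-vector memory for an n-qubit simulation: 2**n amplitudes, each stored
# at some number of bytes. 16 B = double-precision complex; 2 B is an assumed
# aggressive compression consistent with the ~2 PB figure quoted for 50 qubits.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 48, 50):
    full_gb = state_vector_bytes(n, 16) / 1e9
    packed_gb = state_vector_bytes(n, 2) / 1e9
    print(f"{n} qubits: {full_gb:>12,.0f} GB at 16 B/amp, {packed_gb:>12,.0f} GB at 2 B/amp")
# 30 qubits fits on a laptop; every extra qubit doubles both columns.
```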

The simulation replicates the intricate quantum physics of a real processor in full detail. Every operation – such as applying a quantum gate – affects more than 2 quadrillion complex numerical values, a ‘2’ with 15 zeros. These values must be synchronized across thousands of computing nodes in order to precisely replicate the functioning of a real quantum processor.


