
Scientists advance efforts to create ‘virtual cell lab’ as testing ground for future research with live cells

Using mathematical analysis of patterns of human and animal cell behavior, scientists say they have developed a computer program that mimics the behavior of such cells in any part of the body. Led by investigators at Indiana University, Johns Hopkins Medicine, the University of Maryland School of Medicine and Oregon Health & Science University, the new work was designed to advance ways of testing and predicting biological processes, drug responses and other cell dynamics before undertaking more costly experiments with live cells.

With further work on the program, the researchers say it could eventually serve as a “digital twin” for testing any drug’s effect on cancer or other conditions, gene-environment interactions during brain development, or any number of dynamic cellular and molecular processes in people where such studies are not possible.

The new study and examples of cell simulations are described online July 25 in the journal Cell.
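The paper’s program itself is not reproduced here, but as a rough illustration of what an agent-based “virtual cell” simulation involves, the following minimal Python sketch steps a population of cells that divide and die with probabilities that could, in principle, be fit to observed behavior patterns. All class names and rates are hypothetical and are not taken from the study.

```python
# Minimal, hypothetical sketch of an agent-based cell population simulation.
# Division and death probabilities are made-up placeholders, not values from the paper.
import random

class Cell:
    def __init__(self, divide_p=0.3, die_p=0.1):
        self.divide_p = divide_p   # per-step division probability (assumed)
        self.die_p = die_p         # per-step death probability (assumed)

def step(population):
    """Advance the population by one time step."""
    next_gen = []
    for cell in population:
        r = random.random()
        if r < cell.die_p:
            continue                                           # cell dies
        next_gen.append(cell)                                  # cell survives
        if r > 1.0 - cell.divide_p:
            next_gen.append(Cell(cell.divide_p, cell.die_p))   # cell divides
    return next_gen

population = [Cell() for _ in range(100)]
for t in range(10):
    population = step(population)
    print(f"step {t}: {len(population)} cells")
```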

NVIDIA Brings Reasoning Models Ranging from 1.5B to 32B Parameters to Consumers

Today, NVIDIA unveiled OpenReasoning-Nemotron, a quartet of distilled reasoning models with 1.5B, 7B, 14B, and 32B parameters, all derived from the 671B-parameter DeepSeek R1 0528. By compressing that massive teacher into four leaner Qwen2.5-based students, NVIDIA is making advanced reasoning experiments accessible even on standard gaming rigs, without hefty GPU bills or cloud dependencies. The key is not some elaborate trick but raw data: using the NeMo Skills pipeline, NVIDIA generated five million math, science, and code solutions, then fine-tuned each student purely with supervised learning. Already, the 32B model scores 89.2 on AIME24 and 73.8 on the HMMT February contest, while even the 1.5B variant manages a solid 55.5 and 31.5.
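For readers who want to try one of the smaller variants locally, a minimal sketch with Hugging Face Transformers might look like the following. The repository id used here is an assumption based on NVIDIA’s naming convention; check the model hub for the exact identifier before running it.

```python
# Hedged sketch: running a distilled reasoning model locally with Hugging Face Transformers.
# The repo id below is an assumed name, not confirmed by the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/OpenReasoning-Nemotron-1.5B"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Solve step by step: what is the sum of the first 50 positive integers?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```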

Beyond Space-Time and Quantum Mechanics

Nima Arkani-Hamed, Gopal Prasad Professor, School of Natural Sciences, Institute for Advanced Study

(June 28, 2025)


A tribute to Jim Simons in celebration of the importance of basic science and mathematics

Leaders in mathematics, science and philanthropy gathered on June 27, 2025, to remember the incredible contributions of Jim Simons and to inspire continued philanthropic support of basic research.

This Rope-Powered Robot Dog Built by a US Student Walks With Stunning Realism Thanks to a Brilliant Mathematical Design

IN A NUTSHELL
🐕 CARA is a robot dog created by a Purdue University student using innovative capstan drive technology.
🔧 The robot incorporates custom 3D-printed parts and high-strength materials like carbon fiber for durability and efficiency.
🤖 Advanced coding techniques such as inverse kinematics allow CARA to move with natural grace and agility.
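To give a flavor of the inverse kinematics mentioned above, here is a minimal two-link planar leg solver in Python. The link lengths and foot target are made-up values, and this is an illustration of the general technique, not CARA’s actual code.

```python
# Minimal sketch of two-link planar inverse kinematics, the kind of calculation a
# quadruped runs for each leg. All dimensions are hypothetical placeholders.
import math

def two_link_ik(x, y, l1=0.12, l2=0.12):
    """Return (hip, knee) joint angles in radians that place the foot at (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the knee angle
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_knee = max(-1.0, min(1.0, cos_knee))   # clamp for numerical safety
    knee = math.acos(cos_knee)
    # Hip angle: direction to the foot minus the offset from the bent knee
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee), l1 + l2 * math.cos(knee))
    return hip, knee

print(two_link_ik(0.10, -0.15))  # foot 10 cm forward, 15 cm below the hip
```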

Approach improves how new skills are taught to large language models

Researchers have developed a technique that significantly improves the performance of large language models without increasing the computational power necessary to fine-tune the models. The researchers demonstrated that their technique improves the performance of these models over previous techniques in tasks including commonsense reasoning, arithmetic reasoning, instruction following, code generation, and visual recognition.

Large language models are artificial intelligence systems that are pretrained on huge data sets. After pretraining, these models predict which words should follow each other in order to respond to user queries. However, the nonspecific nature of pretraining means that there is ample room for improvement with these models when the user queries are focused on specific topics, such as when a user requests the model to answer a math question or to write computer code.

“In order to improve a model’s ability to perform more specific tasks, you need to fine-tune the model,” says Tianfu Wu, co-corresponding author of a paper on the work and an associate professor of computer engineering at North Carolina State University.
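The article does not spell out the researchers’ new technique, but for readers unfamiliar with fine-tuning itself, the sketch below shows one common parameter-efficient approach, LoRA adapters via the Hugging Face peft library, in which only a small set of added weights is trained. It is offered purely as background; it is not the method described in the paper, and the base model and hyperparameters are placeholders.

```python
# Hedged sketch: parameter-efficient fine-tuning with LoRA adapters via the peft library.
# This illustrates fine-tuning in general, NOT the specific technique from the paper.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "gpt2"  # small placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# Inject low-rank adapter matrices; only these small matrices are trained,
# so the compute and memory cost of fine-tuning stays modest.
lora_cfg = LoraConfig(r=8, lora_alpha=16, target_modules=["c_attn"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # reports the small fraction of weights being updated
```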