Toggle light / dark theme

A Breakthrough That Cuts Blockchain Delays Nearly in Half

The idea of a fully connected digital world is quickly becoming real through the Internet of Things (IoT). This expanding network includes physical devices such as small sensors, autonomous vehicles, and industrial machines that collect and exchange data online.

Protecting this data from tampering is essential, which has led engineers to explore blockchain as a security solution. Although blockchain is widely known for its role in cryptocurrencies, its core function is as a decentralized digital ledger. Instead of data being controlled by a single organization, information is shared and maintained across many computers.

Microsoft links Windows 11 boot failures to failed December 2025 update

Microsoft has linked recent reports of Windows 11 boot failures after installing the January 2026 updates to previously failed attempts to install the December 2025 security update, which left systems in an “improper state.”

The boot failures were first reported earlier this month after users installed the January 2026 Patch Tuesday cumulative update, KB5074109, on Windows 11 versions 25H2 and 24H2.

After installing the update, impacted systems failed to start and displayed a BSOD crash screen with a stop error of “UNMOUNTABLE_BOOT_VOLUME” code.

Researchers Show AI Robots Vulnerable to Text Attacks

“I expect vision-language models to play a major role in future embodied AI systems,” said Dr. Alvaro Cardenas.


How can misleading texts negatively affect AI behavior? This is what a recently submitted study hopes to address as a team of researchers from the University of California, Santa Cruz and Johns Hopkins University investigated the potential security risks of embodied AI, which is AI fixed in a physical body that uses observations to adapt to its environment, as opposed to using text and data, and include cars and robots. This study has the potential to help scientists, engineers, and the public better understand the risks for AI and the steps to take to mitigate them.

For the study, the researchers introduced CHAI (Command Hijacking against embodied AI), which is designed to combat outside threats to embodied AI systems, including misleading text and imagery. Instead, CHAI employs counterattacks that embodied Ais can use to disseminate right from wrong regarding text and images. The researchers tested CHAI on a variety of AI-based systems, including drone emergency landing, autonomous driving, aerial object tracking, and robotic vehicles. In the end, the researchers discovered that CHAI successfully identified incoming attacks while emphasizing the need for enhancing security measures for embodied AI.

New sandbox escape flaw exposes n8n instances to RCE attacks

Two vulnerabilities in the n8n workflow automation platform could allow attackers to fully compromise affected instances, access sensitive data, and execute arbitrary code on the underlying host.

Identified as CVE-2026–1470 and CVE-2026–0863, the vulnerabilities were discovered and reported by researchers at DevSecOps company JFrog.

Despite requiring authentication, CVE-2026–1470 received a critical severity score of 9.9 out of 10. JFrog explained that the critical rating was due to arbitrary code execution occurring in n8n’s main node, which allows complete control over the n8n instance.

Radiowaves enable energy-efficient AI on edge devices without heavy hardware

As drones survey forests, robots navigate warehouses and sensors monitor city streets, more of the world’s decision-making is occurring autonomously on the edge—on the small devices that gather information at the ends of much larger networks.

But making that shift to edge computing is harder than it seems. Although artificial intelligence (AI) models continue to grow larger and smarter, the hardware inside these devices remains tiny.

Engineers typically have two options, neither are ideal. Storing an entire AI model on the device requires significant memory, data movement and computing power that drains batteries. Offloading the model to the cloud avoids those hardware constraints, but the back-and-forth introduces lag, burns energy and presents security risks.

/* */