
Machine learning accelerates plasma mirror design for high-power lasers

Plasma mirrors capable of withstanding the intensity of powerful lasers are being designed through an emerging machine learning framework. Researchers in Physics and Computer Science at the University of Strathclyde have pooled their knowledge of lasers and artificial intelligence to produce a technology that can dramatically reduce the time it takes to design advanced optical components for lasers—and could pave the way for new discoveries in science.

High-power lasers can be used to develop tools for health care, manufacturing and nuclear fusion. However, such lasers are becoming large and expensive because of the size of their optical components, which must be large to keep the laser beam intensity low enough to avoid damaging them. As the peak power of lasers increases, the diameters of mirrors and other optical components will need to grow from approximately one meter to more than 10 meters. These would weigh several tons, making them difficult and expensive to manufacture.

New AI system fixes 3D printing defects in real time

Additive manufacturing has transformed production by enabling customized, cost-effective products with minimal waste. However, because the majority of 3D printers operate as open-loop systems, they are notoriously prone to failure. Minor changes, like adjustments to nozzle size or print speed, can lead to print errors that mechanically weaken the part under production.

Traditionally, manufacturers fix these issues on a case-by-case basis, ultimately “babysitting” the printer to manually adjust parameters and test samples in an effort to figure out what went wrong.

AI streamlines deluge of data from particle collisions

Scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have developed a novel artificial intelligence (AI)-based method to dramatically tame the flood of data generated by particle detectors at modern accelerators. The new custom-built algorithm uses a neural network to intelligently compress collision data, adapting automatically to the density or “sparsity” of the signals it receives.

As described in a paper just published in the journal Patterns, the scientists used simulated data from sPHENIX, a particle detector at Brookhaven Lab’s Relativistic Heavy Ion Collider (RHIC), to demonstrate the algorithm’s potential to handle trillions of bits of detector data per second while preserving the fine details physicists need to explore the building blocks of matter.

The algorithm will help physicists gear up for a new era of streaming data acquisition, where every collision is recorded without pre-selecting which ones might be of interest. This will vastly expand the potential for more accurate measurements and unanticipated discoveries.
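The article doesn't detail Brookhaven's neural-network algorithm, but the reason sparse detector data compresses so well can be illustrated with a much simpler stand-in: encoding only the above-threshold hits as index/value pairs. This toy sketch (my own illustration, not the published method) shows how the achievable compression ratio scales directly with the sparsity of the signal:

```python
import numpy as np

def sparse_compress(frame: np.ndarray, threshold: float = 0.0):
    """Toy sparsity-aware compression: keep only above-threshold hits.

    Returns (indices, values), a representation of the significant
    entries. This is NOT the Brookhaven algorithm, just an illustration
    of why sparse detector data compresses well.
    """
    mask = np.abs(frame) > threshold
    indices = np.flatnonzero(mask).astype(np.uint32)
    values = frame.ravel()[indices].astype(np.float32)
    return indices, values

def compression_ratio(frame: np.ndarray, indices, values) -> float:
    """Original size divided by compressed size, in bytes."""
    original = frame.astype(np.float32).nbytes
    compressed = indices.nbytes + values.nbytes
    return original / compressed

# Simulate one detector "frame": 64x64 channels, ~2% occupancy.
rng = np.random.default_rng(0)
frame = np.zeros((64, 64))
hits = rng.choice(frame.size, size=80, replace=False)
frame.flat[hits] = rng.uniform(1.0, 10.0, size=80)

idx, vals = sparse_compress(frame)
ratio = compression_ratio(frame, idx, vals)
print(f"kept {len(vals)} of {frame.size} channels, ratio ~{ratio:.1f}x")
```

With 80 hits in 4,096 channels, the index/value encoding is about 25x smaller than the dense frame; a learned compressor like the one described can go further by exploiting correlations in the hit patterns rather than treating each hit independently.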

Researchers Find 341 Malicious ClawHub Skills Stealing Data from OpenClaw Users

A security audit of 2,857 skills on ClawHub has found 341 malicious skills across multiple campaigns, according to new findings from Koi Security, exposing users to new supply chain risks.

ClawHub is a marketplace designed to make it easy for OpenClaw users to find and install third-party skills. It’s an extension to the OpenClaw project, a self-hosted artificial intelligence (AI) assistant formerly known as both Clawdbot and Moltbot.

The analysis, which Koi conducted with the help of an OpenClaw bot named Alex, found that 335 skills use fake pre-requisites to install an Apple macOS stealer named Atomic Stealer (AMOS). This activity set has been codenamed ClawHavoc.

Mozilla announces switch to disable all Firefox AI features

In response to user feedback on AI integration, Mozilla announced today that the next Firefox release will let users disable AI features entirely or manage them individually.

The new “Block AI enhancements” toggle will be available in Firefox 148 on February 24 and will help block current and future generative AI features in the desktop browser from a single location. Users will also have the option to enable specific AI tools while keeping others disabled.

“We’ve heard from many who want nothing to do with AI. We’ve also heard from others who want AI tools that are genuinely useful. Listening to our community, alongside our ongoing commitment to offer choice, led us to build AI controls,” said Firefox head Ajit Varma.

Malicious MoltBot skills used to push password-stealing malware

More than 230 malicious packages for the personal AI assistant OpenClaw (formerly known as Moltbot and ClawdBot) have been published in less than a week on the tool’s official registry and on GitHub.

Called skills, the packages pretend to be legitimate tools to deliver malware that steals sensitive data, like API keys, wallet private keys, SSH credentials, and browser passwords.

Renamed from ClawdBot to Moltbot and then to OpenClaw in under a month, the project is a viral open-source AI assistant designed to run locally, with persistent memory and integrations with various resources (chat, email, local file system). Unless configured properly, the assistant introduces security risks.

2024–2026 Global Memory Supply Shortage

Following a severe market downturn in 2022–2023, the major memory manufacturers (Samsung Electronics, SK Hynix, and Micron Technology) implemented strategic production cuts to stabilize pricing.[4] By mid-2024, the rapid expansion of generative AI services had triggered unprecedented demand for specialized memory products, particularly High Bandwidth Memory (HBM) used in AI accelerators and data center GPUs.[5][6][7] Specialized components of chip-making technology are also under supply constraints due to high demand from AI applications. For example, glass cloth, a high-performance glass-fiber substrate used for power-efficient, high-speed data transfer and a crucial input to chip-making, is in a supply crisis: Nitto Boseki, the Japanese firm that dominates its production, cannot meet the increased demand, forcing chip-makers such as Qualcomm, Apple, Nvidia, and AMD to compete to secure supply for their chips.[8][9]

A 2024 McKinsey analysis projected that global demand for AI-ready data center capacity would grow at approximately 33% annually through 2030, with AI workloads consuming roughly 70% of total data center capacity by the decade’s end.[10]
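As a rough sanity check of what that projection implies (assuming 2024–2030 as the compounding window, which the analysis does not state explicitly), 33% annual growth compounds to a several-fold increase in capacity:

```python
# Illustrative compounding check; the six-year window is an assumption.
rate = 0.33
years = 2030 - 2024  # 6 compounding periods
growth = (1 + rate) ** years
print(f"~{growth:.1f}x total capacity growth")  # roughly 5.5x over the period
```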


Vehique: Hi!

I’m Gemechu. I’m a software engineer and AI builder finishing my Master’s in CS at LMU Los Angeles this May.

I’m looking to join a team, full-time or internship!

For context, I have hands-on experience shipping AI-powered products to production. I recently built https://www.vehique.ai/, a conversational vehicle marketplace from scratch — designed the multi-agent architecture, built the full stack, and scaled it to over 3,000 monthly users. Prior to that I have a couple of research engineering experiences at seed-stage startups.

I have experience building end to end, whether that’s the AI layer, backend and infra, or full-stack product work.

Looking to join where I can create impactful products.

If you’re hiring or know someone who might be, please feel free to reach out.

🌿 All my projects and experience are on my portfolio — https://gemechu.xyz/
