
A new method to better study microscopic plastics in the ocean

If you’ve been to your local beach, you may have noticed the wind tossing around litter such as an empty potato chip bag or a plastic straw. These plastics often make their way into the ocean, affecting not only marine life and the environment but also threatening food safety and human health.

Eventually, many of these plastics break down into microscopic sizes, making them hard for scientists to detect and measure. Researchers call these incredibly small fragments nanoplastics and microplastics because they are not visible to the naked eye. Now, in a multiorganizational effort led by the National Institute of Standards and Technology (NIST) and the European Commission’s Joint Research Centre (JRC), researchers are turning to a lower part of the food chain to solve this problem.

The researchers have developed a novel method that uses a filter-feeding marine species to collect these tiny plastics from ocean water. The team published its findings as a proof-of-principle study in the scientific journal Microplastics and Nanoplastics.

Nanotechnology Advance Enables Tinier Transistors With Extraordinary Performance

Atomically thin materials are a promising alternative to silicon-based transistors; now researchers can connect them more efficiently to other chip elements.

Moore’s Law, the famous prediction that the number of transistors that can be packed onto a microchip will double every couple of years, has been bumping into basic physical limits. These limits could bring decades of progress to a halt unless new approaches are found.
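To make the doubling concrete, here is a back-of-the-envelope sketch (a Python illustration, not from the article) that projects transistor counts from the 1971 Intel 4004, which had roughly 2,300 transistors, under an assumed two-year doubling period:

```python
# Back-of-the-envelope Moore's Law projection (illustrative assumption:
# counts double every two years, starting from the Intel 4004 in 1971).
BASE_YEAR, BASE_COUNT = 1971, 2_300   # Intel 4004: ~2,300 transistors

def projected_count(year: int, doubling_period: float = 2.0) -> float:
    """Transistors expected in `year` if counts double every `doubling_period` years."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / doubling_period)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_count(year):,.0f} transistors")
```

The 2021 projection lands in the tens of billions, roughly where today's largest chips sit; the article's point is that sustaining this curve with silicon alone is becoming physically impossible.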

One new direction being explored is the use of atomically thin materials instead of silicon as the basis for new transistors, but connecting those “2D” materials to other conventional electronic components has proved difficult.

Apple’s rivals may never be able to catch up to its powerful new chip

Early in the testing phase of Apple’s M1 chipset, a milestone product for the company, the processor was installed in a batch of Mac computers and given to staffers working on applications that demanded heavy processing power. It was a pivotal moment: the first time Apple had made its own chip for any of its computers, shifting away from years of using a one-size-fits-all option from Intel.

After multiple teams tested the devices for a few hours while working on tasks, they reported lightning-fast performance, but nearly all flagged an apparent problem. The MacBook Pro’s battery indicator, in the upper right-hand corner of the screen, was broken: it had barely moved despite running power-hungry programs, the company told CNN Business.

The gag, of course, is that the battery indicator was working just fine. The M1 chip was so efficient, according to Apple, that it showed no real strain, which became one of several major selling points for products that now carry the chip. (Apple promises 20 hours of battery life for its 13-inch M1 MacBook Pro, which it says is the longest battery life of any Mac to date.)

Silicon chips combine light and ultrasound for better signal processing

The continued growth of wireless and cellular data traffic relies heavily on light waves. Microwave photonics is the field of technology dedicated to distributing and processing electrical information signals by optical means. Compared with traditional solutions based on electronics alone, microwave photonic systems can handle massive amounts of data, and the field has therefore become increasingly important to 5G cellular networks and beyond. A primary task of microwave photonics is the realization of narrowband filters: the selection of specific data, at specific frequencies, out of the immense volumes carried over light.
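As a rough illustration of what a narrowband filter does, here is a toy digital-domain sketch (a Python analogy, not the on-chip photonic implementation; the sample rate and band edges are arbitrary assumptions) that isolates one narrow band from a signal containing two tones:

```python
# Toy narrowband filtering in the digital domain: keep only the 45-55 Hz
# band of a signal containing 50 Hz and 120 Hz tones, via an FFT mask.
import numpy as np

fs = 1_000.0                               # sample rate, Hz (assumed)
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
spectrum[(freqs < 45) | (freqs > 55)] = 0  # zero everything outside the band
filtered = np.fft.irfft(spectrum, n=len(signal))
# `filtered` now contains (approximately) just the 50 Hz component.
```

The photonic version performs the analogous selection on signals modulated onto light, where, as the article explains next, the required delays rather than the arithmetic are the hard part.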

Many photonic systems are built of discrete, separate components and long optical fiber paths. However, the cost, size, and production-volume requirements of advanced networks call for a new generation of microwave photonic systems realized on a chip. Integrated microwave photonic filters, particularly in silicon, are highly sought after. There is, however, a fundamental challenge: narrowband filters require that signals be delayed for comparatively long durations as part of their processing.

“Since light is so fast,” says Prof. Avi Zadok from Bar-Ilan University, Israel, “we run out of chip space before the necessary delays are accommodated. The required delays may reach over 100 nanoseconds. Such delays may appear short in terms of everyday experience; however, the optical paths that support them are over ten meters long. We cannot possibly fit such long paths into a silicon chip. Even if we could somehow fold that many meters into a certain layout, the optical power losses that would come with them would be prohibitive.”
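The arithmetic behind the quote is straightforward. A minimal sketch (in Python, with an assumed group index; the exact value depends on the waveguide) converts a target delay into the required optical path length:

```python
# Convert a target delay into the optical path length that produces it.
# The group index is an assumed, representative value; real silicon
# waveguides vary, and the required length scales inversely with it.
C = 299_792_458.0   # speed of light in vacuum, m/s

def delay_length_m(delay_s: float, group_index: float = 3.0) -> float:
    """Waveguide length needed so light takes `delay_s` seconds to traverse it."""
    return (C / group_index) * delay_s

print(f"{delay_length_m(100e-9):.1f} m")   # ~10.0 m for a 100 ns delay
```

With a group index near 3, a 100-nanosecond delay already demands about ten meters of waveguide, which is exactly the mismatch with chip-scale dimensions the quote describes.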

Despite Chip Shortage, Chip Innovation Is Booming

Even as a chip shortage is causing trouble for all sorts of industries, the semiconductor field is entering a surprising new era of creativity, from industry giants to innovative start-ups that are seeing a spike in funding from venture capitalists who traditionally avoided chip makers.

While a variety of industries struggle with supplies, semiconductor experts say there are plenty of new ideas and, most surprisingly, plenty of new start-ups.

Governments are deploying ‘wartime-like’ efforts to win the global semiconductor race

These clever semiconductors make our internet-connected world go round. In addition to iPhones and PlayStations, they underpin key national infrastructure and sophisticated weaponry.

But recently there haven’t been enough of them to meet demand.

The reasons for the ongoing global chip shortage, which is set to last into 2022 and possibly 2023, are complex and multifaceted. However, nations are planning to pump billions of dollars into semiconductors over the coming years in an effort to shore up supply chains and become more self-reliant, with money going toward new chip plants as well as research and development.

Brain-Computer Interface Translates Brain Signals Associated with Handwriting into Text

Researchers with the BrainGate Collaboration have deciphered the brain activity associated with handwriting. Working with a participant with paralysis, 65 years old at the time of the study, who has sensors implanted in his brain, they used an algorithm to identify letters as he attempted to write them, and the system then displayed the text on a screen. By attempting handwriting, the participant typed 90 characters per minute, more than double the previous record for typing with a brain-computer interface.
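For a sense of the pipeline's shape, here is a deliberately toy sketch in Python. The published system used a recurrent neural network decoder, not the nearest-template classifier below, and every value here is an illustrative assumption:

```python
# Toy handwriting-BCI decoding loop. Illustrative only: the real study
# used a recurrent neural network; this substitutes a nearest-template
# classifier, and all data here are simulated.
import numpy as np

rng = np.random.default_rng(0)
LETTERS = list("abc")        # tiny alphabet for the toy example
N_CHANNELS = 192             # assumed neural channel count

# Pretend per-letter firing-rate templates learned from training sessions.
templates = {ch: rng.normal(size=N_CHANNELS) for ch in LETTERS}

def decode_window(features):
    """Label a window of neural features with the nearest letter template."""
    return min(templates, key=lambda ch: np.linalg.norm(features - templates[ch]))

# Simulate a noisy attempt to write the letter "b" and decode it.
attempt = templates["b"] + rng.normal(scale=0.5, size=N_CHANNELS)
print(decode_window(attempt))   # -> "b"
```

The real decoder works on streaming neural features and must also segment continuous attempts into characters, but the basic loop of mapping a window of neural activity to a letter and appending it to on-screen text is the same.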

So far, a major focus of brain-computer interface research has been on restoring gross motor skills, such as reaching and grasping or point-and-click typing with a computer cursor.
