Archive for the ‘information science’ category: Page 74

May 18, 2023

A programmable surface plasmonic neural network to detect and process microwaves

Posted in categories: information science, robotics/AI

AI tools based on artificial neural networks (ANNs) are being introduced in a growing number of settings, helping humans to tackle many problems faster and more efficiently. While most of these algorithms run on conventional digital devices and computers, electronic engineers have been exploring the potential of running them on alternative platforms, such as diffractive optical devices.

A research team led by Prof. Tie Jun Cui at Southeast University in China has recently developed a new programmable neural network based on a so-called spoof surface plasmon polariton (SSPP), a surface electromagnetic wave that propagates along planar interfaces. This newly proposed surface plasmonic neural network (SPNN) architecture, introduced in a paper in Nature Electronics, can detect and process microwaves, which could be useful for wireless communication and other technological applications.

“In digital hardware research for the implementation of artificial neural networks, optical neural networks (ONNs) and diffractive deep neural networks recently emerged as promising solutions,” Qian Ma, one of the researchers who carried out the study, told Tech Xplore. “Previous research focusing on optical neural networks showed that simultaneous high-level programmability and nonlinear computing can be difficult to achieve. Therefore, these ONN devices usually lack programmability, or are only applied to simple recognition tasks (i.e., linear problems).”
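
Ma’s point about nonlinearity is easy to make concrete: without a nonlinear operation between layers, any stack of linear layers collapses into a single linear map, so a purely linear optical network can only solve linear problems. A minimal NumPy sketch, with hypothetical weight matrices rather than anything drawn from the SPNN hardware:

    import numpy as np

    rng = np.random.default_rng(0)

    # Two linear "layers" with no nonlinearity between them: their
    # composition is just one linear map, so depth adds nothing.
    W1 = rng.standard_normal((8, 16))
    W2 = rng.standard_normal((4, 8))
    x = rng.standard_normal(16)

    assert np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)

    # Insert a nonlinearity (here ReLU) and the collapse no longer
    # holds, which is what lets deep networks model nonlinear problems.
    relu = lambda v: np.maximum(v, 0.0)
    assert not np.allclose(W2 @ relu(W1 @ x), (W2 @ W1) @ x)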

May 17, 2023

How is human behaviour impacted by an unfair AI? A game of Tetris reveals all

Posted in categories: entertainment, information science, robotics/AI

A team of researchers puts a spin on Tetris and observes people as they play the game.

We live in a world run by machines. They make important decisions for us, such as whom to hire, who gets approved for a loan, and what content users see on social media. Machines and computer programs have an increasing influence over us, now more than ever, as artificial intelligence (AI) makes inroads into our lives in new ways. And this influence goes far beyond the person directly interacting with a machine.


A Cornell University-led experiment in which two people play a modified version of Tetris revealed that players who got fewer turns perceived the other player as less likable, regardless of whether a person or an algorithm allocated the turns.

Continue reading “How is human behaviour impacted by an unfair AI? A game of Tetris reveals all” »

May 16, 2023

Compression algorithms run on AI hardware to simulate nature’s most complex systems

Posted in categories: climatology, information science, robotics/AI, space

High-performance computing (HPC) has become an essential tool for processing large datasets and simulating nature’s most complex systems. However, researchers face difficulties in developing more intensive models because Moore’s Law—which states that computational power doubles every two years—is slowing, and memory bandwidth still cannot keep up with it. But scientists can speed up simulations of complex systems by using compression algorithms running on AI hardware.

A team led by computer scientist Hatem Ltaief is tackling this problem head-on by employing hardware designed for artificial intelligence (AI) to help scientists make their code more efficient. In a paper published in the journal High Performance Computing, they now report making simulations up to 150 times faster in the diverse fields of climate modeling, astronomy, seismic imaging and wireless communications.

Previously, Ltaief and co-workers showed that many scientists were riding the wave of hardware development and “over-solving” their models, carrying out lots of unnecessary calculations.
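
One way compression attacks that over-solving is low-rank approximation: when a matrix in a simulation is numerically low-rank, a truncated factorization preserves the result to the accuracy the science actually needs while cutting both memory traffic and arithmetic. Below is a rough NumPy sketch of the idea; it is a toy stand-in, not the authors’ tile-based implementation on AI accelerators:

    import numpy as np

    rng = np.random.default_rng(1)

    # A stand-in operator that is numerically low-rank, as many kernels
    # in climate, seismic, and astronomy codes turn out to be.
    n, k = 2000, 20
    A = rng.standard_normal((n, k)) @ rng.standard_normal((k, n))

    # Compress: keep only the top-k singular triplets.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k]

    x = rng.standard_normal(n)
    exact = A @ x                   # O(n^2) work per product
    approx = Uk @ (sk * (Vtk @ x))  # O(n*k) work per product

    # Near machine precision here, since A is numerically rank-k.
    print(np.linalg.norm(exact - approx) / np.linalg.norm(exact))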

May 16, 2023

Supercomputing simulations spot electron orbital signatures

Posted in categories: information science, mathematics, particle physics, quantum physics, supercomputing

No one will ever be able to see a purely mathematical construct such as a perfect sphere. But now, scientists using supercomputer simulations and atomic resolution microscopes have imaged the signatures of electron orbitals, which are defined by mathematical equations of quantum mechanics and predict where an atom’s electron is most likely to be.

Scientists at UT Austin, Princeton University, and ExxonMobil have directly observed the signatures of electron orbitals in two different transition-metal atoms, iron (Fe) and cobalt (Co), present in metal-phthalocyanines. Those signatures are apparent in the forces measured by atomic force microscopes, which often reflect the underlying orbitals and can be interpreted accordingly.

Continue reading “Supercomputing simulations spot electron orbital signatures” »
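
To unpack what “predict where an atom’s electron is most likely to be” means: the squared magnitude of an orbital’s wavefunction is a probability density. For hydrogen’s 1s orbital, a far simpler case than the Fe and Co d-orbitals probed in the study, the most probable radius falls at exactly one Bohr radius, which a few lines of NumPy confirm:

    import numpy as np

    # Hydrogen 1s orbital in atomic units: psi(r) = exp(-r) / sqrt(pi).
    # The radial probability density P(r) = 4*pi*r^2*|psi(r)|^2 gives
    # the likelihood of finding the electron at radius r.
    r = np.linspace(1e-4, 10.0, 100_000)
    P = 4.0 * r**2 * np.exp(-2.0 * r)

    print(r[np.argmax(P)])  # ~1.0, i.e., one Bohr radius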

May 16, 2023

Quantum Computing Algorithm Breakthrough Brings Practical Use Closer to Reality

Posted in categories: chemistry, computing, information science, quantum physics

Out of all common refrains in the world of computing, the phrase “if only software would catch up with hardware” would probably rank pretty high. And yet, software does sometimes catch up with hardware. In fact, it seems that this time, software can go as far as unlocking quantum computations for classical computers. That’s according to researchers with the RIKEN Center for Quantum Computing, Japan, who have published work on an algorithm that significantly accelerates a specific quantum computing workload. More significantly, the workload itself — simulating time-evolution operators — has applications in condensed matter physics and quantum chemistry, two fields that can unlock new worlds within our own.
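
For readers new to the term: a time-evolution operator advances a quantum state under a Hamiltonian H, and the standard way to make it computable is to break it into many small steps via the Trotter–Suzuki decomposition, roughly:

    U(t) = e^{-iHt}, \qquad
    e^{-i(H_1 + H_2)t} \approx \bigl( e^{-iH_1 t/n}\, e^{-iH_2 t/n} \bigr)^{n},
    \quad \text{with error } O\!\left(t^2/n\right)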

Normally, an improved algorithm wouldn’t be completely out of the ordinary; updates are everywhere, after all. Every app update, software update, or firmware upgrade is essentially bringing revised code that either solves problems or improves performance (hopefully). And improved algorithms are nice, as anyone with a graphics card from either AMD or NVIDIA can attest. But let’s face it: We’re used to being disappointed with performance updates.

May 15, 2023

Powering AI On Mobile Devices Requires New Math And Qualcomm Is Pioneering It

Posted in categories: information science, mathematics, mobile phones, robotics/AI, transportation

The feature image you see above was generated by an AI text-to-image model called Stable Diffusion, which typically runs in the cloud via a web browser, driven by data center servers with big power budgets and a ton of silicon horsepower. However, the image above was generated by Stable Diffusion running on a smartphone in airplane mode, with no connection to that cloud data center and no connectivity whatsoever. And the AI model rendering it was powered by a Qualcomm Snapdragon 8 Gen 2 mobile chip on a device that operates at under 7 watts or so.

It took Stable Diffusion only a few short phrases and 14.47 seconds to render this image.


This is an example of a 540p input-resolution image being scaled up to 4K resolution, which results in much cleaner lines, sharper textures, and a better overall experience. Though Qualcomm has a non-AI version of this available today, called Snapdragon GSR, someday in the future, mobile gaming enthusiasts are going to be treated to even better levels of image quality without sacrificing battery life, and with even higher frame rates.

Continue reading “Powering AI On Mobile Devices Requires New Math And Qualcomm Is Pioneering It” »
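
The arithmetic behind that upscaling trade-off is simple but dramatic: a 540p frame has one sixteenth the pixels of a 4K frame, so the GPU shades 16x fewer pixels per frame and the super-resolution pass fills in the rest:

    # Pixels rendered vs. pixels displayed when upscaling 540p to 4K.
    rendered = 960 * 540      # 518,400 pixels shaded by the GPU
    displayed = 3840 * 2160   # 8,294,400 pixels on screen
    print(displayed / rendered)  # 16.0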

May 15, 2023

Generative AI Breaks The Data Center: Data Center Infrastructure And Operating Costs Projected To Increase To Over $76 Billion By 2028

Posted in categories: business, information science, mobile phones, physics, robotics/AI


With the launch of Large Language Models (LLMs) for Generative Artificial Intelligence (GenAI), the world has become both enamored of and concerned about the potential of AI. Holding a conversation, passing a test, developing a research paper, or writing software code are tremendous feats of AI, but they are only the beginning of what GenAI will be able to accomplish over the next few years. All this innovative capability comes at a high cost in terms of processing performance and power consumption. So, while the potential for AI may be limitless, physics and costs may ultimately be the boundaries.

Tirias Research forecasts that on the current course, generative AI data center server infrastructure plus operating costs will exceed $76 billion by 2028, with growth challenging the business models and profitability of emergent services such as search, content creation, and business automation incorporating GenAI. For perspective, this cost is more than twice the estimated annual operating cost of Amazon’s cloud service AWS, which today holds one third of the cloud infrastructure services market according to Tirias Research estimates.

This forecast incorporates an aggressive 4X improvement in hardware compute performance, but this gain is overrun by a 50X increase in processing workloads, even with a rapid rate of innovation around inference algorithms and their efficiency. Neural Networks (NNs) designed to run at scale will be even more highly optimized and will continue to improve over time, which will increase each server’s capacity. However, this improvement is countered by increasing usage, more demanding use cases, and more sophisticated models with orders of magnitude more parameters. The cost and scale of GenAI will demand innovation in optimizing NNs and is likely to push the computational load out from data centers to client devices like PCs and smartphones.
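
A back-of-the-envelope version of that tension, using only the figures quoted above (the actual Tirias Research model has many more inputs):

    # Hardware gets 4x faster, but workloads grow 50x, so total server
    # capacity must still grow ~12.5x; costs scale with that residual.
    hardware_speedup = 4.0
    workload_growth = 50.0
    print(workload_growth / hardware_speedup)  # 12.5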

May 15, 2023

Dr. Emre Ozcan & Walid Mehanna — Merck KGaA, Darmstadt, Germany — Tech As A Force For Good In Health

Posted in categories: biotech/medical, business, governance, health, information science

Technology As A Force For Good In People’s Lives — Dr. Emre Ozcan, PhD, VP, Global Head of Digital Health & Walid Mehanna, Group Data Officer And Senior Vice President, Merck KGaA, Darmstadt, Germany.


EPISODE DISCLAIMER — At any time during this episode when anyone says Merck, in any context, it shall always be referring to Merck KGaA, Darmstadt, Germany.

Continue reading “Dr. Emre Ozcan & Walid Mehanna — Merck KGaA, Darmstadt, Germany — Tech As A Force For Good In Health” »

May 15, 2023

New Quantum Computer Algorithm Unlocks the Power of Atomic-Level Interactions

Posted in categories: chemistry, computing, information science, quantum physics

A novel protocol for quantum computers could reproduce the complex dynamics of quantum materials.

RIKEN researchers have created a hybrid quantum-computational algorithm that can efficiently calculate atomic-level interactions in complex materials. This innovation enables the use of smaller quantum computers or conventional ones to study condensed-matter physics and quantum chemistry, paving the way for new discoveries in these fields.

A quantum-computational algorithm that could be used to efficiently and accurately calculate atomic-level interactions in complex materials has been developed by RIKEN researchers. It has the potential to bring an unprecedented level of understanding to condensed-matter physics and quantum chemistry—an application of quantum computers first proposed by the brilliant physicist Richard Feynman in 1981.
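
As a classical toy version of the dynamics such algorithms must reproduce, here is a first-order Trotter splitting of a single-qubit Hamiltonian with two non-commuting terms, checked against the exact matrix exponential. This is a NumPy/SciPy sketch, not the RIKEN hybrid algorithm itself:

    import numpy as np
    from scipy.linalg import expm

    # Single qubit with H = X + Z, two non-commuting terms.
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    t, n = 1.0, 100  # total evolution time, number of Trotter steps

    exact = expm(-1j * (X + Z) * t)
    step = expm(-1j * X * t / n) @ expm(-1j * Z * t / n)
    trotter = np.linalg.matrix_power(step, n)

    # First-order splitting error shrinks as O(t^2 / n).
    print(np.linalg.norm(exact - trotter))  # on the order of 0.01 here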

May 14, 2023

IBM announces end-to-end solution for quantum-safe cryptography

Posted in categories: computing, encryption, information science, quantum physics, security

During its ongoing Think 2023 conference, IBM today announced an end-to-end solution to prepare organisations to adopt quantum-safe cryptography. Called Quantum Safe technology, it is a set of tools and capabilities that integrates IBM’s deep security expertise. Quantum-safe cryptography refers to cryptographic algorithms that are resistant to attacks by both classical and quantum computers.

Under Quantum Safe technology, IBM is offering three capabilities. First is the Quantum Safe Explorer, which locates cryptographic assets, dependencies, and vulnerabilities and aggregates all potential risks in one central location. Next is the Quantum Safe Advisor, which allows the creation of a cryptographic inventory to prioritise risks. Lastly, the Quantum Safe Remediator lets organisations test quantum-safe remediation patterns and deploy quantum-safe solutions.
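
To give a flavor of what assembling a cryptographic inventory involves, here is a deliberately simplified, hypothetical scanner that flags source files mentioning quantum-vulnerable primitives; the real Quantum Safe Explorer performs far deeper analysis than a pattern match:

    import re
    from pathlib import Path

    # Primitives known to be breakable by a large quantum computer.
    QUANTUM_VULNERABLE = {
        "RSA": re.compile(r"\bRSA\b"),
        "ECDSA": re.compile(r"\bECDSA\b"),
        "Diffie-Hellman": re.compile(r"\bECDH\b|\bDiffie[- ]?Hellman\b"),
    }

    def inventory(root: str) -> dict[str, list[str]]:
        """Map each vulnerable primitive to source files that mention it."""
        hits = {name: [] for name in QUANTUM_VULNERABLE}
        for path in Path(root).rglob("*.py"):
            text = path.read_text(errors="ignore")
            for name, pattern in QUANTUM_VULNERABLE.items():
                if pattern.search(text):
                    hits[name].append(str(path))
        return hits

    for algo, files in inventory(".").items():
        print(f"{algo}: {len(files)} file(s) to review")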

In addition, the company announced the IBM Quantum Safe Roadmap, which will serve as a guide for industries adopting quantum-safe technology. It is the company’s first blueprint to help companies deal with anticipated cryptographic standards and requirements and protect their systems from vulnerabilities.
