Blog

Archive for the ‘supercomputing’ category

May 17, 2019

HP Enterprise is acquiring supercomputing giant Cray

Posted in categories: business, government, supercomputing

Hewlett Packard Enterprise (HPE) has a shiny new toy. The information technology firm announced today that it is spending $1.3 billion to acquire supercomputer manufacturer Cray. HPE, a business-facing spin-off of the Hewlett-Packard company, will instantly become a bigger presence in the world of academia and the federal government, where Cray has a number of significant contracts. The deal will also enable HPE to start selling supercomputer components to corporate clients and others.

Read more

May 17, 2019

High School Student Uses AI to Detect Gravitational Waves

Posted in categories: cosmology, education, employment, physics, robotics/AI, supercomputing

Before he could legally drive, high school student Adam Rebei was already submitting jobs to the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign to run complex simulations of black holes.

“My first time using Blue Waters, we did a tour first and got to see the computer, which is a very amazing thing because it’s a very powerful machine,” Rebei told the NCSA, “and I just remember thinking, ‘All of the GPUs!’ It’s an insane amount of GPUs, and I’ve never seen anything like it.”

Rebei's path began with an astronomy class that led him to the NCSA, where he teamed up with research scientist Eliu Huerta, who leads the center's Gravity Group.

Continue reading “High School Student Uses AI to Detect Gravitational Waves” »

May 16, 2019

Researchers discover an unexpected phase transition in the high explosive TATB

Posted in categories: evolution, particle physics, supercomputing

Lawrence Livermore National Laboratory (LLNL) scientists, in collaboration with the University of Nevada, Las Vegas (UNLV), have discovered a previously unknown pressure-induced phase transition in TATB that can help predict the detonation performance and safety of the explosive. The research appears in the May 13 online edition of Applied Physics Letters, where it is highlighted as a cover and featured article.

1,3,5-Triamino-2,4,6-trinitrobenzene (TATB), the industry standard for an insensitive high explosive, stands out as the optimum choice when safety (insensitivity) is of utmost importance. Among similar materials with comparable explosive energy release, TATB is remarkably difficult to shock-initiate and has a low friction sensitivity. The causes of this unusual behavior are hidden in the high-pressure structural evolution of TATB. Supercomputer simulations of detonating explosives, running on the world's most powerful machines at LLNL, depend on knowing the exact locations of the atoms in the crystal structure of an explosive. Accurate knowledge of atomic arrangement under pressure is the cornerstone for predicting the detonation performance and safety of an explosive.

The team performed experiments using a diamond anvil cell, which compressed TATB single crystals to pressures of more than 25 GPa (250,000 times atmospheric pressure). All previous experimental and theoretical studies had suggested that the atomic arrangement in the crystal structure of TATB remains the same under pressure. The project team challenged this consensus, aiming to clarify the high-pressure structural behavior of TATB.
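As a quick sanity check on the figure quoted above, the GPa-to-atmosphere conversion can be done in a couple of lines of Python (our arithmetic, not anything from the study itself):

```python
# Convert the 25 GPa quoted above into standard atmospheres
# (1 atm = 101,325 Pa), confirming the "250,000 times" figure.
pressure_pa = 25e9            # 25 GPa expressed in pascals
atm_pa = 101_325              # one standard atmosphere in pascals
print(f"{pressure_pa / atm_pa:,.0f} atm")   # ~246,731, i.e. roughly 250,000 atm
```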

Continue reading “Researchers discover an unexpected phase transition in the high explosive TATB” »

May 6, 2019

Razer Is Building a Toaster, Possibly With LED Support

Posted in category: supercomputing

Razer is building a toaster. Mattel is probably building an exascale supercomputer. I hear Raytheon just got into baby toys. Dogs and cats, living together! Chaos reigns.

Read more

May 6, 2019

MIT Cryptographers Are No Match For A Determined Belgian

Posted in categories: robotics/AI, supercomputing

Twenty years ago, a cryptographic puzzle was included in the construction of a building on the MIT campus. The structure that houses what is now MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) includes a time capsule designed by the building’s architect, [Frank Gehry]. It contains artifacts related to the history of computing, and was meant to be opened whenever someone solved a cryptographic puzzle, or after 35 years had elapsed.

The puzzle was not expected to be solved early, but [Bernard Fabrot], a developer in Belgium, has managed it using not a supercomputer but a run-of-the-mill Intel i7 processor. The capsule will be opened later in May.

The famous cryptographer [Ronald Rivest] put together what we now know is a deceptively simple challenge. It involves a successive squaring operation, and since it is inherently sequential, there is no possibility of using parallel computing techniques to take shortcuts. [Fabrot] used the GNU Multiple Precision Arithmetic Library in his code and spent over three years of computing time to solve it. Meanwhile, another team is using an FPGA and expects a solution within months, though it has been pipped to the post by the Belgian.
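To make the sequential-squaring idea concrete, here is a minimal Python sketch of an LCS35-style time-lock computation. This is not [Fabrot]'s actual GMP code; the modulus and iteration count below are toy placeholders, whereas the real puzzle used a 2048-bit modulus and nearly 8 × 10¹³ squarings.

```python
# Sketch of a time-lock puzzle: compute base^(2^t) mod n by t successive
# modular squarings. Without the factorization of n there is no known
# shortcut, and each squaring depends on the previous one, so the work
# cannot be spread across parallel processors.
def time_lock(base: int, t: int, n: int) -> int:
    w = base % n
    for _ in range(t):
        w = (w * w) % n
    return w

# Toy parameters only (NOT the real LCS35 modulus or iteration count).
n = 1_000_000_007 * 998_244_353   # product of two primes, standing in for the RSA modulus
print(time_lock(2, 1_000_000, n))
```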

Continue reading “MIT Cryptographers Are No Match For A Determined Belgian” »

Apr 25, 2019

A breakthrough in the study of laser/plasma interactions

Posted in categories: biotech/medical, supercomputing

A new 3D particle-in-cell (PIC) simulation tool developed by researchers from Lawrence Berkeley National Laboratory and CEA Saclay is enabling cutting-edge simulations of laser/plasma coupling mechanisms that were previously out of reach of standard PIC codes used in plasma research. A more detailed understanding of these mechanisms is critical to the development of ultra-compact particle accelerators and light sources that could solve long-standing challenges in medicine, industry, and fundamental science more efficiently and cost-effectively.

In laser-plasma experiments such as those at the Berkeley Lab Laser Accelerator (BELLA) Center and at CEA Saclay—an international research facility in France that is part of the French Atomic Energy Commission—very large electric fields within plasmas accelerate particle beams to high energies over much shorter distances than existing accelerator technologies require. The long-term goal of these laser-plasma accelerators (LPAs) is to one day build colliders for high-energy research, but many spin-offs are being developed already. For instance, LPAs can quickly deposit large amounts of energy into solid materials, creating dense plasmas and subjecting this matter to extreme temperatures and pressures. They also hold the potential for driving free-electron lasers that generate light pulses lasting just attoseconds. Such extremely short pulses could enable researchers to observe the interactions of molecules, atoms, and even subatomic particles on extremely short timescales.

Supercomputer simulations have become increasingly critical to this research, and Berkeley Lab's National Energy Research Scientific Computing Center (NERSC) has emerged as an important resource in this effort. By giving researchers access to physical observables, such as particle orbits and radiated fields, that are hard to capture in experiments at extremely small time and length scales, PIC simulations have played a major role in understanding, modeling, and guiding high-intensity physics experiments. But a lack of PIC codes with enough computational accuracy to model laser-matter interaction at ultra-high intensities has hindered the development of novel particle and light sources produced by this interaction.
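The article doesn't include code, but the basic cycle every PIC code shares can be sketched briefly: deposit particle charge onto a grid, solve for the fields on that grid, then push the particles through the updated fields. Below is a toy 1D electrostatic version in normalized units, purely illustrative and far removed from the 3D production tool described above:

```python
import numpy as np

# Toy 1D electrostatic particle-in-cell (PIC) loop in normalized units:
# electrons move against a fixed neutralizing ion background.
ng, Np, L, dt = 64, 10_000, 2 * np.pi, 0.1
dx = L / ng
x = np.random.uniform(0, L, Np)        # particle positions
v = np.random.normal(0, 0.1, Np)       # particle velocities
qm = -1.0                              # electron charge-to-mass ratio

for step in range(100):
    # 1) Deposit charge: ion background (+1) minus electron density.
    rho = 1.0 - np.histogram(x, bins=ng, range=(0, L))[0] / (Np / ng)
    # 2) Solve Poisson's equation phi'' = -rho spectrally; then E = -phi'.
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    k[0] = 1.0                          # dodge division by zero at k = 0
    phi_hat = np.fft.fft(rho) / k**2
    phi_hat[0] = 0.0                    # remove the (neutral) mean mode
    E = np.real(np.fft.ifft(-1j * k * phi_hat))
    # 3) Push particles: sample E at each particle's cell, then advance.
    Ep = E[(x / dx).astype(int) % ng]
    v += qm * Ep * dt
    x = (x + v * dt) % L                # periodic boundaries
```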

Continue reading “A breakthrough in the study of laser/plasma interactions” »

Apr 16, 2019

Optimizing network software to advance scientific discovery

Posted in categories: mathematics, particle physics, supercomputing

High-performance computing (HPC)—the use of supercomputers and parallel processing techniques to solve large computational problems—is of great use in the scientific community. For example, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory rely on HPC to analyze the data they collect at the large-scale experimental facilities on site and to model complex processes that would be too expensive or impossible to demonstrate experimentally.

Modern science applications often require a combination of aggregated computing power, high-speed networks for data transfer, large amounts of memory, and high-capacity storage capabilities. Advances in HPC hardware and software are needed to meet these requirements. Computer and computational scientists and mathematicians in Brookhaven Lab's Computational Science Initiative (CSI) are collaborating with physicists, biologists, and other domain scientists to understand their data analysis needs and provide solutions to accelerate the scientific discovery process.

Read more

Apr 15, 2019

Even more frightening than military AI: an AI President of the Republic?

Posted in categories: government, military, robotics/AI, supercomputing

A recent survey by IE University in Madrid reveals that one in four Europeans would be ready to put an artificial intelligence in power. Should we be concerned for democracy or, on the contrary, welcome Europeans' confidence in technology?

Europeans ready to elect an AI?

According to the study in question, about one in four of the 25,000 Europeans surveyed would be prepared to be governed by an AI. It is worth noting that there are significant variations between countries: where the European average is around 30%, respondents in the Netherlands are much more open to having a government run by a supercomputer (43%) than those in France (25%). "The idea of a pragmatic machine, impervious to fraud and corruption" is one of the reasons the interviewees found most compelling. Added to this are the possibilities that machine learning would open up: the AI described would be able to improve itself by studying and selecting the best political decisions in the world, and would thus be able to make better decisions than existing politicians.

Continue reading “Even more frightening than military AI: an AI President of the Republic?” »

Apr 10, 2019

Human Brain/Cloud Interface

Posted in categories: biotech/medical, education, internet, nanotechnology, Ray Kurzweil, robotics/AI, supercomputing

The Internet comprises a decentralized global system that serves humanity's collective effort to generate, process, and store data, most of which is handled by the rapidly expanding cloud. A stable, secure, real-time system may allow for interfacing the cloud with the human brain. One promising strategy for enabling such a system, denoted here as a "human brain/cloud interface" ("B/CI"), would be based on technologies referred to here as "neuralnanorobotics." Future neuralnanorobotics technologies are anticipated to facilitate accurate diagnoses and eventual cures for the ∼400 conditions that affect the human brain. Neuralnanorobotics may also enable a B/CI with controlled connectivity between neural activity and external data storage and processing, via the direct monitoring of the brain's ∼86 × 10⁹ neurons and ∼2 × 10¹⁴ synapses. Subsequent to navigating the human vasculature, three species of neuralnanorobots (endoneurobots, gliabots, and synaptobots) could traverse the blood–brain barrier (BBB), enter the brain parenchyma, ingress into individual human brain cells, and autoposition themselves at the axon initial segments of neurons (endoneurobots), within glial cells (gliabots), and in intimate proximity to synapses (synaptobots). They would then wirelessly transmit up to ∼6 × 10¹⁶ bits per second of synaptically processed and encoded human–brain electrical information via auxiliary nanorobotic fiber optics (30 cm³) with the capacity to handle up to 10¹⁸ bits/sec and provide rapid data transfer to a cloud-based supercomputer for real-time brain-state monitoring and data extraction. A neuralnanorobotically enabled human B/CI might serve as a personalized conduit, allowing persons to obtain direct, instantaneous access to virtually any facet of cumulative human knowledge. Other anticipated applications include myriad opportunities to improve education, intelligence, entertainment, traveling, and other interactive experiences. A specialized application might be the capacity to engage in fully immersive experiential/sensory experiences, including what is referred to here as "transparent shadowing" (TS). Through TS, individuals might experience episodic segments of the lives of other willing participants (locally or remotely) to, hopefully, encourage and inspire improved understanding and tolerance among all members of the human family.
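Taking the abstract's figures at face value, a quick back-of-envelope calculation (ours, not the authors') gives the implied average per-synapse data rate and the headroom in the proposed fiber-optic link:

```python
# Back-of-envelope arithmetic using the numbers quoted in the abstract.
synapses  = 2e14     # ~2 x 10^14 synapses
link_rate = 6e16     # ~6 x 10^16 bits/s of synaptic information
capacity  = 1e18     # ~10^18 bits/s claimed fiber-optic capacity

print(f"{link_rate / synapses:.0f} bits/s per synapse on average")   # ~300
print(f"{capacity / link_rate:.1f}x headroom in the proposed link")  # ~16.7
```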

“We’ll have nanobots that… connect our neocortex to a synthetic neocortex in the cloud… Our thinking will be a… biological and non-biological hybrid.”

— Ray Kurzweil, TED 2014

Continue reading “Human Brain/Cloud Interface” »

Apr 5, 2019

Getting a big look at tiny particles

Posted in categories: biotech/medical, nuclear energy, quantum physics, supercomputing

At the turn of the 20th century, scientists discovered that atoms were composed of smaller particles. They found that inside each atom, negatively charged electrons orbit a nucleus made of positively charged protons and neutral particles called neutrons. This discovery led to research into atomic nuclei and subatomic particles.

An understanding of these particles' structures provides crucial insights about the forces that hold matter together and enables researchers to apply this knowledge to other scientific problems. Although electrons have been relatively straightforward to study, protons and neutrons have proved more challenging. Protons are used in medical treatments, scattering experiments, and fusion energy, but nuclear scientists have struggled to precisely measure their underlying structure—until now.

In a recent paper, a team led by Constantia Alexandrou at the University of Cyprus modeled the location of one of the subatomic particles inside a proton, using only the basic theory of the strong interactions that hold matter together rather than assuming these particles would act as they had in experiments. The researchers employed the 27-petaflop Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) and a method called lattice quantum chromodynamics (QCD). The combination allowed them to map these particles on a grid and calculate their interactions with high accuracy and precision.
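Lattice QCD itself evolves SU(3) gauge fields and is well beyond a short snippet, but the core idea, mapping continuous fields onto a discrete spacetime grid and summing local interactions, can be illustrated with a far simpler stand-in. The sketch below computes the action of a free scalar field on a small 4D lattice; it is an analogue of the grid discretization, not the method used in the paper:

```python
import numpy as np

# Free scalar field on a tiny 4D lattice with periodic boundaries:
# S = sum over sites of 0.5 * sum_mu (phi(x + mu) - phi(x))^2
#     + 0.5 * m^2 * phi(x)^2   (all in lattice units).
shape = (8, 8, 8, 8)                  # the discrete spacetime grid
phi = np.random.normal(size=shape)    # a random field configuration
m = 0.5                               # bare mass in lattice units

action = 0.5 * m**2 * np.sum(phi**2)
for mu in range(4):                   # nearest-neighbor differences in each direction
    action += 0.5 * np.sum((np.roll(phi, -1, axis=mu) - phi) ** 2)

print(f"Lattice action of this configuration: {action:.2f}")
```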

Continue reading “Getting a big look at tiny particles” »

Page 1 of 29