Archive for the ‘information science’ category: Page 229

Oct 26, 2019

Using Quantum Computers to Test the Fundamentals of Physics

Posted in categories: computing, information science, quantum physics

A newly developed algorithm opens a window into understanding the transition from quantum to classical objects.

Oct 23, 2019

We’re Stuck Inside the Universe. Lee Smolin Has an Idea for How to Study It Anyway

Posted in categories: cosmology, education, information science, mathematics, quantum physics

The universe is kind of an impossible object. It has an inside but no outside; it’s a one-sided coin. This Möbius architecture presents a unique challenge for cosmologists, who find themselves in the awkward position of being stuck inside the very system they’re trying to comprehend.

It’s a situation that Lee Smolin has been thinking about for most of his career. A physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada, Smolin works at the knotty intersection of quantum mechanics, relativity and cosmology. Don’t let his soft voice and quiet demeanor fool you — he’s known as a rebellious thinker and has always followed his own path. In the 1960s Smolin dropped out of high school, played in a rock band called Ideoplastos, and published an underground newspaper. Wanting to build geodesic domes like R. Buckminster Fuller, Smolin taught himself advanced mathematics — the same kind of math, it turned out, that you need to play with Einstein’s equations of general relativity. The moment he realized this was the moment he became a physicist. He studied at Harvard University and took a position at the Institute for Advanced Study in Princeton, New Jersey, eventually becoming a founding faculty member at the Perimeter Institute.

Oct 12, 2019

New compiler makes quantum computers two times faster

Posted in categories: information science, quantum physics, robotics/AI

A new paper from researchers at the University of Chicago introduces a technique for compiling highly optimized quantum instructions that can be executed on near-term hardware. This technique is particularly well suited to a new class of variational quantum algorithms, which are promising candidates for demonstrating useful quantum speedups. The new work was enabled by uniting ideas across the stack, spanning quantum algorithms, machine learning, compilers, and device physics. The interdisciplinary research was carried out by members of the EPiQC (Enabling Practical-scale Quantum Computation) collaboration, an NSF Expedition in Computing.
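
The paper's details are not reproduced here, but the general pattern of a variational quantum algorithm is easy to illustrate: a small parameterized circuit is run (or simulated), its output is scored, and a classical optimizer adjusts the parameters. The Python sketch below simulates a single-qubit ansatz with numpy and scipy; the Hamiltonian, ansatz, and optimizer choice are illustrative assumptions, not the EPiQC compiler or its benchmarks.

import numpy as np
from scipy.optimize import minimize

# Pauli operators for a single qubit
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy Hamiltonian whose ground-state energy the variational loop estimates
H = 0.5 * Z + 0.3 * X

def ansatz_state(theta):
    # Parameterized "circuit": a single Ry(theta) rotation applied to |0>
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(params):
    # Cost function: expectation value <psi|H|psi> of the trial state
    psi = ansatz_state(params[0])
    return float(np.real(np.conj(psi) @ H @ psi))

# A classical optimizer closes the loop by tuning the circuit parameter
result = minimize(energy, x0=[0.1], method="Nelder-Mead")
print("estimated ground energy:", result.fun)
print("exact ground energy:   ", np.linalg.eigvalsh(H)[0])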

Adapting to a New Paradigm for Quantum Algorithms

The original vision for quantum computing dates to the early 1980s, when physicist Richard Feynman proposed performing molecular simulations using just thousands of noiseless qubits (quantum bits), a task that is practically impossible for traditional computers. Other algorithms developed in the 1990s and 2000s demonstrated that thousands of noiseless qubits would also offer dramatic speedups for problems such as database search, integer factoring, and matrix algebra. However, despite recent advances in quantum hardware, these algorithms are still decades away from scalable realizations, because current hardware features noisy qubits.

Oct 11, 2019

Engineers solve 50-year-old puzzle in signal processing

Posted in categories: computing, information science, mobile phones, virtual reality

Something called the fast Fourier transform is running on your cell phone right now. The FFT, as it is known, is a signal-processing algorithm that you use more than you realize. It is, according to the title of one research paper, “an algorithm the whole family can use.”

Alexander Stoytchev—an associate professor of electrical and computer engineering at Iowa State University who’s also affiliated with the university’s Virtual Reality Applications Center, its Human Computer Interaction graduate program and the department of computer science—says the FFT and its inverse (known as the IFFT) are at the heart of signal processing.

And, as such, “These are algorithms that made the digital revolution possible,” he said.
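
For readers who want to see the FFT and its inverse in action, the short Python sketch below uses numpy's built-in implementations to move a signal into the frequency domain and back. It illustrates the transforms the article describes, not the new inverse chirp z-transform; the sample rate and test signal are arbitrary choices.

import numpy as np

fs = 1000                               # sample rate in Hz (arbitrary)
t = np.arange(fs) / fs                  # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Forward transform: time domain -> frequency domain
spectrum = np.fft.fft(signal)
freqs = np.fft.fftfreq(len(signal), d=1 / fs)

# The two sinusoids appear as spectral peaks at 50 Hz and 120 Hz
peaks = freqs[np.argsort(np.abs(spectrum))[-4:]]
print("dominant frequencies (Hz):", sorted(abs(f) for f in peaks))

# Inverse transform (IFFT): frequency domain -> original samples
recovered = np.fft.ifft(spectrum).real
print("round-trip error:", np.max(np.abs(recovered - signal)))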

Oct 11, 2019

Biologically-inspired skin improves robots’ sensory abilities

Posted in categories: cyborgs, information science, robotics/AI

Sensitive synthetic skin enables robots to sense their own bodies and surroundings—a crucial capability if they are to be in close contact with people. Inspired by human skin, a team at the Technical University of Munich (TUM) has developed a system combining artificial skin with control algorithms and used it to create the first autonomous humanoid robot with full-body artificial skin.

The artificial skin developed by Prof. Gordon Cheng and his team consists of hexagonal cells about the size of a two-euro coin (i.e., about one inch in diameter). Each cell is equipped with a microprocessor and sensors to detect contact, acceleration, proximity and temperature. Such artificial skin enables robots to perceive their surroundings in much greater detail and with more sensitivity. This not only helps them to move safely; it also makes them safer when operating near people and gives them the ability to anticipate and actively avoid accidents.
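
As a rough illustration of how per-cell readings like these might be handled on the control side, the Python sketch below filters a set of cell readings so that only cells with a noticeable change are reported, an event-driven style of readout. The class, field names, and thresholds are hypothetical; this is not Prof. Cheng's actual system.

from dataclasses import dataclass

@dataclass
class SkinCellReading:
    cell_id: int
    contact: float        # contact force, arbitrary units
    acceleration: float   # m/s^2
    proximity: float      # normalized 0..1
    temperature: float    # degrees Celsius

def detect_events(previous, current, threshold=0.2):
    # Report only cells whose contact or proximity changed noticeably,
    # so the robot is not flooded with redundant sensor data
    events = []
    for prev, curr in zip(previous, current):
        if (abs(curr.contact - prev.contact) > threshold
                or abs(curr.proximity - prev.proximity) > threshold):
            events.append(curr)
    return events

previous = [SkinCellReading(i, 0.0, 0.0, 0.1, 25.0) for i in range(3)]
current = [SkinCellReading(0, 0.0, 0.0, 0.1, 25.0),
           SkinCellReading(1, 0.9, 0.0, 0.8, 25.5),   # this cell was touched
           SkinCellReading(2, 0.0, 0.0, 0.1, 25.0)]
print([e.cell_id for e in detect_events(previous, current)])   # -> [1]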

Oct 9, 2019

Bio-Mimetic Real-Time Cortex Project — Whole Brain Emulation — Dr. Alice Parker — University of Southern California — ideaXme — Ira Pastor

Posted in categories: big data, bioengineering, complex systems, driverless cars, drones, electronics, engineering, information science, neuroscience, robotics/AI

Oct 8, 2019

An AI Pioneer Wants His Algorithms to Understand the ‘Why’

Posted in categories: information science, robotics/AI

Deep learning is good at finding patterns in reams of data, but can’t explain how they’re connected. Turing Award winner Yoshua Bengio wants to change that.

Oct 8, 2019

Are Black Holes Made of Dark Energy? Error Made When Applying Einstein’s Equations to Model Growth of the Universe?

Posted in categories: cosmology, information science, physics

Two University of Hawaiʻi at Mānoa researchers have identified and corrected a subtle error that was made when applying Einstein’s equations to model the growth of the universe.

Physicists usually assume that a cosmologically large system, such as the universe, is insensitive to details of the small systems contained within it. Kevin Croker, a postdoctoral research fellow in the Department of Physics and Astronomy, and Joel Weiner, a faculty member in the Department of Mathematics, have shown that this assumption can fail for the compact objects that remain after the collapse and explosion of very large stars.

“For 80 years, we’ve generally operated under the assumption that the universe, in broad strokes, was not affected by the particular details of any small region,” said Croker. “It is now clear that general relativity can observably connect collapsed stars—regions the size of Honolulu—to the behavior of the universe as a whole, over a thousand billion billion times larger.”

Oct 5, 2019

How Will We Store Three Septillion Bits of Data? Your Metabolome May Have the Answer

Posted in categories: biological, computing, information science, neuroscience

For the “big data” revolution to continue, we need to radically rethink our hard drives. Thanks to evolution, we already have a clue.

Our bodies are jam-packed with data, tightly compacted inside microscopic structures within every cell. Take DNA: with just four letters we’re able to generate every single molecular process that keeps us running. That sort of combinatorial complexity is still unheard of in silicon-based data storage in computer chips.

Add this to the fact that DNA can be dehydrated and kept intact for eons—500,000 years and counting—and it’s no surprise that scientists have been exploiting its properties to encode information. To famed synthetic biologist Dr. George Church, looking to biology is a no-brainer: even the simple bacterium E. coli has a data storage density of 10¹⁹ bits per cubic centimeter. Translation? A single cube of DNA measuring one meter on each side could meet all of the world’s current data storage needs.
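
The underlying idea is simple to sketch: with four bases, each nucleotide can carry two bits, so any binary data can be mapped onto a DNA-like string. The toy Python mapping below shows the round trip; real DNA storage schemes add error correction and avoid hard-to-synthesize sequences, so this is an illustration of the principle only.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    # Two bits per nucleotide: every byte becomes four bases
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"data")
print(strand)           # -> CGCACGACCTCACGAC
print(decode(strand))   # -> b'data'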