
Archive for the ‘information science’ category: Page 92

Feb 19, 2023

3 Reasons Technosignatures Detected by AI-Trained Algorithm Can Be Extraterrestrial Activities

Posted in categories: information science, robotics/AI, space

Astronomers picked up candidate signals, previously missed, in an area they had thought was devoid of potential ET activity. If confirmed as technosignatures, it could be the first hint that humans are not alone in the universe.

Mysterious Signals Detected


Feb 18, 2023

Why are small black holes more dangerous than big ones?

Posted in categories: cosmology, information science

Why would someone falling into a stellar-mass black hole be spaghettified, but someone crossing the event horizon of a supermassive black hole would not feel much discomfort?

As it turns out, a relatively simple equation describes the tidal acceleration felt by a body of length d at distance R from the center of an object of mass M: a = 2GMd/R³, where a is the tidal acceleration and G is the gravitational constant.
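Plugging in numbers makes the contrast concrete. A minimal sketch (the masses are illustrative assumptions: a 10-solar-mass stellar black hole versus a roughly Sgr A*-sized 4-million-solar-mass one), evaluating the tidal acceleration at the Schwarzschild radius R_s = 2GM/c²:

```python
# Tidal ("spaghettification") acceleration a = 2GMd/R^3 evaluated at the
# event horizon R_s = 2GM/c^2, so a scales as 1/M^2: bigger holes pull more gently.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg

def tidal_accel_at_horizon(mass_kg, body_length_m=2.0):
    r_s = 2 * G * mass_kg / c ** 2              # Schwarzschild radius
    return 2 * G * mass_kg * body_length_m / r_s ** 3

stellar = tidal_accel_at_horizon(10 * M_sun)        # ~10-solar-mass hole
supermassive = tidal_accel_at_horizon(4e6 * M_sun)  # roughly Sgr A*-sized

print(f"stellar: {stellar:.1e} m/s^2, supermassive: {supermassive:.1e} m/s^2")
```

The stellar-mass case comes out around 10⁸ m/s² across a human body, utterly lethal, while the supermassive case is a small fraction of Earth's surface gravity.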

Feb 18, 2023

Reinforcement Learning Course — Full Machine Learning Tutorial

Posted in categories: information science, policy, robotics/AI, space

This is NOT about ChatGPT; it's the AI technique used to beat Go, chess, Dota, etc. In other words, not just generating the next best word from billions of sentences read, but planning out actions to beat real game opponents (and winning). And it's free.


Reinforcement learning is an area of machine learning concerned with taking the right actions to maximize reward in a given situation. This full tutorial course gives you a solid foundation in the core topics of reinforcement learning.
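As a taste of what the course builds up to, here is a minimal tabular Q-learning sketch. The corridor environment and hyperparameters below are illustrative assumptions, not taken from the course itself:

```python
import random

# Toy 5-state corridor: start at state 0; the goal (reward 1) is state 4.
# Actions: 0 = step left, 1 = step right.
N_STATES = 5
ACTIONS = [0, 1]
alpha, gamma, eps = 0.5, 0.9, 0.1       # learning rate, discount, exploration

Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, action):
    nxt = min(max(state + (1 if action == 1 else -1), 0), N_STATES - 1)
    done = nxt == N_STATES - 1
    return nxt, (1.0 if done else 0.0), done

random.seed(0)
for _ in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda x: Q[s][x])
        nxt, r, done = step(s, a)
        # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a')
        Q[s][a] += alpha * (r + gamma * max(Q[nxt]) - Q[s][a])
        s = nxt

policy = [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_STATES)]
print(policy[:4])  # the learned greedy policy steps right toward the goal
```

The same loop — act, observe reward, update a value estimate — underlies the far larger systems that beat humans at Go and Dota; the course covers those extensions (deep networks, policy gradients) in depth.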


Feb 17, 2023

6 Quantum Algorithms That Will Change Computing Forever

Posted in categories: computing, information science, quantum physics, security

Here is a list of some of the most popular quantum algorithms highlighting the significant impact quantum can have on the classical world:

Shor’s Algorithm

Our data security systems rest on the assumption that factoring integers with a thousand or more digits is practically impossible. That held until 1994, when Peter Shor showed that quantum mechanics allows factorisation to be performed in polynomial time, rather than the super-polynomial time required by the best known classical algorithms.
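The quantum speedup lives in one subroutine: finding the multiplicative order r of a random base a modulo N. A sketch of the classical reduction around it — the brute-force `order()` below is exactly the step Shor's quantum circuit replaces with polynomial-time period finding:

```python
from math import gcd

# Classical skeleton of Shor's reduction: factor N via the order r of a mod N.
def order(a, N):
    r, x = 1, a % N
    while x != 1:                # brute force -- exponential, and the whole point:
        x = (x * a) % N          # this is what the quantum computer does fast
        r += 1
    return r

def shor_classical(N, a):
    g = gcd(a, N)
    if g != 1:
        return g                 # lucky guess: a already shares a factor with N
    r = order(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None              # unlucky base; retry with another a
    return gcd(pow(a, r // 2) - 1, N)

print(shor_classical(15, 7))  # → 3, since 7 has order 4 mod 15
```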

Feb 16, 2023

Grid of atoms is both a quantum computer and an optimization solver

Posted in categories: computing, information science, mathematics, particle physics, quantum physics

Quantum computing has entered a bit of an awkward period. There have been clear demonstrations that we can successfully run quantum algorithms, but the qubit counts and error rates of existing hardware mean that we can’t solve any commercially useful problems at the moment. So, while many companies are interested in quantum computing and have developed software for existing hardware (and have paid for access to that hardware), the efforts have been focused on preparation. They want the expertise and capability needed to develop useful software once the computers are ready to run it.

For the moment, that leaves them waiting for hardware companies to produce sufficiently robust machines—machines that don’t currently have a clear delivery date. It could be years; it could be decades. Beyond learning how to develop quantum computing software, there’s nothing obvious to do with the hardware in the meantime.

But a company called QuEra may have found a way to do something that’s not as obvious. The technology it is developing could ultimately provide a route to quantum computing. But until then, it’s possible to solve a class of mathematical problems on the same hardware, and any improvements to that hardware will benefit both types of computation. And in a new paper, the company’s researchers have expanded the types of computations that can be run on their machine.
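The class of optimization problems that maps most naturally onto Rydberg-atom hardware like QuEra's is maximum independent set (MIS): the Rydberg blockade forbids two nearby atoms from being excited simultaneously, just as an independent set forbids choosing two adjacent vertices. A brute-force classical sketch for comparison (the example graph is assumed, not from the paper):

```python
from itertools import combinations

# Brute-force maximum independent set: the largest vertex subset with no edge
# between any two chosen vertices. Exponential classically, hence the interest
# in analog quantum hardware that encodes the constraint physically.
def max_independent_set(n, edges):
    edge_set = {frozenset(e) for e in edges}
    for k in range(n, 0, -1):                      # try largest sets first
        for subset in combinations(range(n), k):
            if not any(frozenset(p) in edge_set for p in combinations(subset, 2)):
                return set(subset)
    return set()

# A 5-cycle: no independent set larger than 2 exists.
print(max_independent_set(5, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]))  # → {0, 2}
```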

Feb 16, 2023

Google AI generates musical backing tracks to accompany singers

Posted in categories: information science, media & arts, robotics/AI

Google has trained an artificial intelligence, named SingSong, that can generate a musical backing track to accompany people’s recorded singing.

To develop it, Jesse Engel and his colleagues at Google Research used an algorithm to separate the instrumental and vocal parts from 46,000 hours of music and then fine-tuned an existing AI model – also created by Google Research, but for generating speech and piano music – on those pairs of recordings.

Feb 14, 2023

1950s Fighter Jet Air Computer Shows What Analog Could Do

Posted in categories: computing, information science, military

Imagine you’re a young engineer whose boss drops by one morning with a sheaf of complicated fluid dynamics equations. “We need you to design a system to solve these equations for the latest fighter jet,” bossman intones, and although you groan as you recall the hell of your fluid dynamics courses, you realize that it should be easy enough to whip up a program to do the job. But then you remember that it’s like 1950, and that digital computers — at least ones that can fit in an airplane — haven’t been invented yet, and that you’re going to have to do this the hard way.

The scenario is obviously contrived, but this peek inside the Bendix MG-1 Central Air Data Computer reveals the engineer’s nightmare fuel that was needed to accomplish some pretty complex computations in a severely resource-constrained environment. As [Ken Shirriff] explains, this particular device was used aboard USAF fighter aircraft in the mid-50s, when the complexities of supersonic flight were beginning to outpace the instrumentation needed to safely fly in that regime. Thanks to the way air behaves near the speed of sound, a simple pitot tube system for measuring airspeed was no longer enough; analog computers like the MG-1 were designed to deal with these changes and integrate them into a host of other measurements critical to the pilot.


Feb 14, 2023

How One of the Most Important Algorithms in Math Made Color TV Possible

Posted in categories: information science, mathematics

A key algorithm that quietly empowers and simplifies our electronics is the Fourier transform, which turns the graph of a signal varying in time into a graph that describes it in terms of its frequencies.

Packaging signals that represent sounds or images in terms of their frequencies allows us to analyze and adjust sound and image files, Richard Stern, professor of electrical and computer engineering at Carnegie Mellon University, tells Popular Mechanics. This mathematical operation also makes it possible for us to store data efficiently.
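A minimal sketch of that time-to-frequency view, using a synthetic two-tone signal (the frequencies and sample rate are assumptions for illustration):

```python
import numpy as np

# The discrete Fourier transform turns a time-domain signal into its frequency
# content. Here: one second of a 50 Hz tone mixed with a quieter 120 Hz tone.
fs = 1000                                 # sample rate, Hz
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.abs(np.fft.rfft(signal))    # magnitude spectrum
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

peaks = freqs[spectrum > 100]             # the dominant frequencies stand out
print(peaks)  # → [ 50. 120.]
```

A jumble of samples in time becomes two clean spikes in frequency — which is exactly what makes frequency-domain analysis, filtering, and compact storage possible.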

The invention of color TV is a great example of this, Stern explains. In the 1950s, television was just black and white. Engineers at RCA developed color television, and used Fourier transforms to simplify the data transmission so that the industry could introduce color without tripling the demands on the channels by adding data for red, green, and blue light. Viewers with black-and-white TVs could continue to see the same images as they saw before, while viewers with color TVs could now see the images in color.

Feb 14, 2023

Is quantum machine learning ready for primetime?

Posted in categories: information science, quantum physics, robotics/AI

Experts are divided on whether practical applications are just around the corner.

Feb 14, 2023

Future computer You WON’T See Coming…(analog computing)

Posted in categories: education, information science, robotics/AI


An emerging technology called analogue AI accelerators has the potential to reshape the AI sector. These accelerators execute computations using analogue circuits rather than digital ones, which gives them advantages in speed, energy efficiency, and the handling of certain kinds of AI algorithms. The video examines the potential of this technology, its present constraints, and the future role of analogue computing in AI.


Page 92 of 322