
Archive for the ‘information science’ category: Page 51

Oct 30, 2023

Research claims novel algorithm can exactly compute information rate for any system

Posted by in category: information science

75 years ago, Claude Shannon, the “father of information theory,” showed how information transmission can be quantified mathematically, via the so-called information transmission rate.

Yet, until now this quantity could only be computed approximately. AMOLF researchers Manuel Reinhardt and Pieter Rein ten Wolde, together with a collaborator from Vienna, have now developed a simulation technique that—for the first time—makes it possible to compute the information rate exactly for any system. The researchers have published their results in the journal Physical Review X.

To calculate the information rate exactly, the AMOLF researchers developed a novel simulation algorithm. It works by representing a complex physical system as an interconnected network that transmits the information via connections between its nodes. The researchers hypothesized that by looking at all the different paths the information can take through this network, it should be possible to obtain the information rate exactly.
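Shannon's quantity can be made concrete with a textbook case where the rate has a closed form: the binary symmetric channel. The sketch below is an illustrative example of the information rate itself, not the AMOLF path-based algorithm; it computes the mutual information per symbol, I(X;Y) = H(Y) − H(Y|X):

```python
import math

def h2(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_information_rate(p_input_one, p_flip):
    """Mutual information I(X;Y) in bits per symbol for a binary
    symmetric channel: input X ~ Bernoulli(p_input_one), and each
    transmitted symbol is flipped with probability p_flip.
    I(X;Y) = H(Y) - H(Y|X) = H(q) - H(p_flip), where q = P(Y=1)."""
    q = p_input_one * (1 - p_flip) + (1 - p_input_one) * p_flip
    return h2(q) - h2(p_flip)

# A noiseless channel with uniform input carries 1 bit/symbol;
# noise reduces the rate.
print(bsc_information_rate(0.5, 0.0))   # 1.0
print(bsc_information_rate(0.5, 0.11))  # ~0.5
```

For channels with memory or feedback — the setting the AMOLF work targets — no such closed form exists, which is why exact computation previously seemed out of reach.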

Oct 28, 2023

Memes, Genes, and Brain Viruses

Posted by in categories: biotech/medical, information science, robotics/AI

Go to https://brilliant.org/EmergentGarden to get a 30-day free trial + the first 200 people will get 20% off their annual subscription.


Oct 27, 2023

New AI Model Counters Bias In Data With A DEI Lens

Posted by in categories: information science, robotics/AI

AI has exploded onto the scene in recent years, bringing both promise and peril. Systems like ChatGPT and Stable Diffusion showcase the tremendous potential of AI to enhance productivity and creativity. Yet they also reveal a dark reality: the algorithms often reflect the same systemic prejudices and societal biases present in their training data.

While the corporate world has quickly capitalized on integrating generative AI systems, many experts urge caution, considering the critical flaws in how AI represents diversity. Whether it’s text generators reinforcing stereotypes or facial recognition exhibiting racial bias, the ethical challenges cannot be ignored.


Oct 27, 2023

AI-ready architecture doubles power with FeFETs

Posted by in categories: drones, information science, robotics/AI

Hussam Amrouch has developed an AI-ready architecture that is twice as powerful as comparable in-memory computing approaches. As reported in the journal Nature Communications (“First demonstration of in-memory computing crossbar using multi-level Cell FeFET”), the professor at the Technical University of Munich (TUM) applies a new computational paradigm using special circuits known as ferroelectric field effect transistors (FeFETs). Within a few years, this could prove useful for generative AI, deep learning algorithms and robotic applications.

  • The new architecture enables both data storage and calculations to be carried out on the same transistors, boosting efficiency and reducing heat.
  • The chip performs at 885 TOPS/W, significantly outperforming current CMOS chips which operate in the range of 10–20 TOPS/W, making it ideal for applications like real-time drone calculations, generative AI, and deep learning algorithms.
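The first bullet can be illustrated with the standard crossbar picture: weights stored as cell conductances, and a matrix-vector product read out as summed column currents via Ohm's and Kirchhoff's laws. This is a generic sketch of the in-memory computing idea, with 4-level (2-bit) cells as an assumed parameter, not TUM's actual FeFET design:

```python
def quantize(w, levels=4):
    """Model a multi-level cell: clamp a weight to [0, 1] and
    round it to one of `levels` discrete conductance values."""
    w = min(max(w, 0.0), 1.0)
    return round(w * (levels - 1)) / (levels - 1)

def crossbar_mvm(conductances, voltages):
    """In-memory matrix-vector multiply: applying input voltages to
    the rows of a crossbar gives column currents
    I_j = sum_i G[i][j] * V[i] (Ohm's law + Kirchhoff's current law),
    so the same cells both store the matrix and do the computation."""
    cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i]
                for i in range(len(voltages))) for j in range(cols)]

# A 2x2 weight matrix stored in 2-bit (4-level) cells:
G = [[quantize(w) for w in row] for row in [[0.1, 0.4], [0.9, 0.66]]]
print(G)                            # weights snapped to 4 levels
print(crossbar_mvm(G, [1.0, 2.0]))  # column currents = matrix-vector product
```

The efficiency gain comes from never moving the matrix: in a conventional CMOS design the weights would be fetched from memory for every multiply, whereas here they sit in place as conductances.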
Oct 25, 2023

Atom Computing is the first to announce a 1,000+ qubit quantum computer

Posted by in categories: computing, information science, particle physics, quantum physics

How many qubits do we need in a quantum computer, accessible to a wide market, to truly have something sci-fi worthy?


Today, a startup called Atom Computing announced that it has been doing internal testing of a 1,180 qubit quantum computer and will be making it available to customers next year. The system represents a major step forward for the company, which had only built one prior system based on neutral atom qubits—a system that operated using only 100 qubits.

The error rate for individual qubit operations is high enough that it won’t be possible to run an algorithm that relies on the full qubit count without it failing due to an error. But it does back up the company’s claims that its technology can scale rapidly and provides a testbed for work on quantum error correction. And, for smaller algorithms, the company says it’ll simply run multiple instances in parallel to boost the chance of returning the right answer.
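The parallel-repetition strategy has simple mathematics behind it: if one run of an algorithm with a verifiable answer succeeds with probability p, then at least one of k independent runs succeeds with probability 1 − (1 − p)^k. A minimal sketch, with illustrative numbers rather than Atom Computing's figures:

```python
def at_least_one_success(p_single, k_runs):
    """Probability that at least one of k_runs independent runs
    returns the right answer, if each succeeds with p_single."""
    return 1.0 - (1.0 - p_single) ** k_runs

# Illustrative numbers: a noisy algorithm that succeeds 30% of the
# time, repeated in parallel.
for k in (1, 5, 10):
    print(k, round(at_least_one_success(0.3, k), 3))
```

This works directly when the answer can be checked cheaply (as with many search-type problems); for sampling problems one would instead combine runs, e.g. by majority vote.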


Oct 24, 2023

Eureka: With GPT-4 overseeing training, robots can learn much faster

Posted by in categories: information science, robotics/AI, space

On Friday, researchers from Nvidia, UPenn, Caltech, and the University of Texas at Austin announced Eureka, an algorithm that uses OpenAI’s GPT-4 language model for designing training goals (called “reward functions”) to enhance robot dexterity. The work aims to bridge the gap between high-level reasoning and low-level motor control, allowing robots to learn complex tasks rapidly using massively parallel simulations that run through trials simultaneously. According to the team, Eureka outperforms human-written reward functions by a substantial margin.

“Leveraging state-of-the-art GPU-accelerated simulation in Nvidia Isaac Gym,” writes Nvidia on its demonstration page, “Eureka is able to quickly evaluate the quality of a large batch of reward candidates, enabling scalable search in the reward function space.”
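The loop Eureka automates (generate candidate reward functions, score each one in simulation, keep the best) can be sketched in miniature. Everything below is hypothetical: a toy 1-D task stands in for Isaac Gym, and hand-written lambdas stand in for GPT-4's generated reward code:

```python
import random

def rollout(reward_fn, steps=50, seed=0):
    """Toy stand-in for a physics simulation: an agent on a 1-D line
    accepts a proposed move only if the candidate reward says it
    helps. The task's true (hidden) goal is to reach position 10.
    The same seed is used for every candidate, so comparisons are fair."""
    rng = random.Random(seed)
    pos = 0.0
    for _ in range(steps):
        step = rng.choice([-1.0, 1.0])
        if reward_fn(pos + step) >= reward_fn(pos):
            pos += step
    return -abs(pos - 10.0)  # task score: closeness to the goal

# Candidate reward functions (stand-ins for LLM-generated code):
candidates = {
    "distance": lambda pos: -abs(pos - 10.0),
    "constant": lambda pos: 0.0,
    "wrong_goal": lambda pos: -abs(pos + 10.0),
}

# Evaluate the whole batch and keep the best-scoring candidate.
scores = {name: rollout(fn) for name, fn in candidates.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

The point of the massively parallel simulation is that each candidate's rollout is independent, so a large batch of reward functions can be scored simultaneously rather than one at a time.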

Oct 24, 2023

Finding flows of a Navier–Stokes fluid through quantum computing

Posted by in categories: computing, information science, quantum physics

This looks awesome :3.

There is great interest in using quantum computers to efficiently simulate a quantum system’s dynamics, as existing classical computers cannot do this. Little attention, however, has been given to quantum simulation of a classical nonlinear continuum system such as a viscous fluid, even though this too is hard for classical computers. Such fluids obey the Navier–Stokes nonlinear partial differential equations, whose solution is essential to the aerospace industry, weather forecasting, plasma magneto-hydrodynamics, and astrophysics. Here we present a quantum algorithm for solving the Navier–Stokes equations. We test the algorithm by using it to find the steady-state inviscid, compressible flow through a convergent-divergent nozzle when a shockwave is (is not) present.
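For contrast with the quantum approach, the kind of nonlinearity and shock formation involved can be illustrated classically with the inviscid Burgers equation, a standard 1-D model problem for compressible flow. The sketch below is a generic upwind finite-difference solver, not the paper's quantum algorithm, and the grid size and time step are arbitrary illustrative choices:

```python
def burgers_step(u, dx, dt):
    """One upwind finite-difference step of the inviscid Burgers
    equation u_t + u * u_x = 0 (for u > 0, the backward difference
    is the stable upwind choice). The left boundary value is held
    fixed as an inflow condition."""
    new = u[:]
    for i in range(1, len(u)):
        new[i] = u[i] - u[i] * dt / dx * (u[i] - u[i - 1])
    return new

# Fast fluid (u = 1.0) behind slow fluid (u = 0.5): the jump
# propagates to the right as a shock.
n, dx, dt = 50, 1.0 / 50, 0.005  # CFL number = u*dt/dx <= 0.25
u = [1.0 if i < n // 2 else 0.5 for i in range(n)]
for _ in range(100):
    u = burgers_step(u, dx, dt)
```

Even this toy problem hints at why fluids are hard for classical machines: realistic 3-D Navier–Stokes simulation multiplies such grids by many orders of magnitude, which is the scaling the quantum algorithm aims to sidestep.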

Oct 24, 2023

Artificial intelligence predicts the future of artificial intelligence research

Posted by in categories: information science, robotics/AI

It has become nearly impossible for human researchers to keep track of the overwhelming abundance of scientific publications in the field of artificial intelligence and to stay up-to-date with advances.

Scientists in an international team led by Mario Krenn from the Max Planck Institute for the Science of Light have now developed an AI algorithm that not only assists researchers in orienting themselves systematically but also predictively guides them in the direction in which their own research field is likely to evolve. The work was published in Nature Machine Intelligence.

In the field of artificial intelligence (AI) and machine learning (ML), the number of publications is growing exponentially, approximately doubling every 23 months. For human researchers, it is nearly impossible to keep up with progress and maintain a comprehensive overview.
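The 23-month doubling time quoted above translates directly into a growth model, N(t) = N0 · 2^(t/23) with t in months. A quick sketch of the implied rates (arithmetic only, not the paper's data):

```python
DOUBLING_MONTHS = 23  # doubling time quoted for AI/ML publications

def growth_factor(months):
    """Multiplicative growth over `months`, given the doubling time:
    N(t) = N0 * 2 ** (t / DOUBLING_MONTHS)."""
    return 2 ** (months / DOUBLING_MONTHS)

# Implied annual growth rate:
annual = growth_factor(12)
print(f"x{annual:.2f} per year (+{(annual - 1) * 100:.0f}%)")

# Over a decade, the literature multiplies roughly 37-fold:
print(f"x{growth_factor(120):.0f} per decade")
```

A roughly 44% year-over-year increase compounds quickly, which is why a field can outrun any individual reader within a few years.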

Oct 23, 2023

Redefining the Fabric of Reality: The Growing Evidence for a Simulated Universe

Posted by in categories: alien life, computing, information science

New research on information entropy may offer evidence for the theory that our universe is a sophisticated simulation, with deep implications for various fields, from biology to cosmology.

The simulated universe theory implies that our universe, with all its galaxies, planets and life forms, is a meticulously programmed computer simulation. In this scenario, the physical laws governing our reality are simply algorithms. The experiences we have are generated by the computational processes of an immensely advanced system.

While inherently speculative, the simulated universe theory has gained attention from scientists and philosophers due to its intriguing implications. The idea has made its mark in popular culture, across movies, TV shows, and books – including the 1999 film The Matrix.

Oct 22, 2023

Do we live in a computer simulation like in The Matrix? Proposed new law of physics backs up the idea

Posted by in categories: alien life, computing, information science, physics

The simulated universe theory implies that our universe, with all its galaxies, planets and life forms, is a meticulously programmed computer simulation. In this scenario, the physical laws governing our reality are simply algorithms. The experiences we have are generated by the computational processes of an immensely advanced system.

While inherently speculative, the simulated universe theory has gained attention from scientists and philosophers due to its intriguing implications. The idea has made its mark in popular culture, across movies, TV shows and books—including the 1999 film “The Matrix.”

The earliest records of the concept that reality is an illusion are from ancient Greece. There, the question “What is the nature of our reality?”, posed by Plato (427 BC) and others, gave birth to idealism. Idealist ancient thinkers such as Plato considered mind and spirit as the abiding reality. Matter, they argued, was just a manifestation or illusion.

Page 51 of 319