
In this video, Dr. Ardavan (Ahmad) Borzou discusses an emerging technology for building bio-computers for AI tasks: Brainoware, a system made of brain organoids interfaced with multielectrode arrays.

Need help with your data science or math modeling project?
https://compu-flair.com/solution/

🚀 Join the CompuFlair Community! 🚀
📈 Sign up on our website to access exclusive Data Science Roadmap pages — a step-by-step guide to mastering the essential skills for a successful career.
đŸ’Ș As a member, you’ll receive emails on expert-engineered ChatGPT prompts to boost your data science tasks, be notified of our private problem-solving sessions, and get early access to news and updates.
👉 https://compu-flair.com/user/register

Comprehensive Python Checklist (machine learning and more advanced libraries will be covered on a different page):
https://compu-flair.com/blogs/program

00:00 — Introduction
02:16 — Von Neumann Bottleneck
03:54 — What is a brain organoid
05:09 — Brainoware: reservoir computing for AI
06:29 — Computing properties of Brainoware: Nonlinearity & Short-Memory
09:27 — Speech recognition by Brainoware
12:25 — Predicting chaotic motion by Brainoware
13:39 — Summary of Brainoware research
14:35 — Can brain organoids surpass the human brain?
15:51 — Will humans evolve to a body-less stage in their evolution?
16:30 — What is the mathematical model of Brainoware?
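The reservoir computing mentioned in the chapters above can be sketched with a conventional echo state network, in which a fixed random recurrent network plays the role that the organoid plays in Brainoware: it maps an input stream into a high-dimensional nonlinear state with short-term memory, and only a linear readout is trained. This is a generic illustration under standard echo-state assumptions, not the Brainoware pipeline; the function name, sizes, and the sine-wave toy task are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                      # reservoir size
W_in = rng.uniform(-0.5, 0.5, (N, 1))        # fixed (untrained) input weights
W = rng.uniform(-0.5, 0.5, (N, N))           # fixed recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # spectral radius < 1: echo-state property

def run_reservoir(u):
    """Collect reservoir states for a 1-D input sequence u."""
    x = np.zeros(N)
    states = []
    for sample in u:
        # tanh supplies the nonlinearity; the recurrence supplies short memory
        x = np.tanh(W_in[:, 0] * sample + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave.
t = np.arange(0, 20, 0.1)
u = np.sin(t)
X = run_reservoir(u[:-1])                    # states, shape (199, N)
y = u[1:]                                    # next-step targets

# Train only the linear readout, via ridge regression.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = X @ W_out
print(float(np.mean((pred - y) ** 2)))       # training mean-squared error
```

In the Brainoware experiments, the trained-readout idea is the same; the random recurrent matrix is replaced by the organoid's own dynamics, stimulated and recorded through the electrode array.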

A team of researchers at quantum computer maker D-Wave, working with an international team of physicists and engineers, claims that its latest quantum processor has run a quantum simulation faster than a classical computer could.

In their paper published in the journal Science, the group describes how they ran a quantum version of a mathematical approximation of how matter behaves when it changes state, such as from a gas to a liquid, in a way that they claim would be nearly impossible on a traditional computer.

Over the past several years, D-Wave has been working on developing quantum annealers, which are a subtype of quantum computer created to solve very specific types of problems. Notably, landmark claims made by researchers at the company have at times been met with skepticism by others in the field.

In research inspired by the principles of quantum mechanics, researchers from Pompeu Fabra University (UPF) and the University of Oxford present new findings that help explain why the human brain can make decisions faster than the world’s most powerful computers in critical risk situations. The brain has this capacity even though neurons transmit information far more slowly than microchips, a puzzle that raises many open questions in neuroscience.

The research is published in the journal Physical Review E.

It should be borne in mind that in many other circumstances, the human brain is not quicker than technological devices. For example, a computer or calculator can perform mathematical operations far faster than a person. So why is it that in critical situations—for example, when having to make an urgent decision at the wheel of a car—the human brain can surpass machines?

Quantum systems hold the promise of tackling some complex problems faster and more efficiently than classical computers. Despite their potential, so far only a limited number of studies have conclusively demonstrated that quantum computers can outperform classical computers on specific tasks. Most of these studies focused on tasks that involve advanced computations, simulations or optimization, which can be difficult for non-experts to grasp.

Researchers at the University of Oxford and the University of Sevilla recently demonstrated a quantum advantage over classical strategies in a cooperation task called the odd-cycle game. Their paper, published in Physical Review Letters, shows that a team sharing quantum entanglement can win this game more often than a team without it.

“There is a lot of talk about quantum advantage and how quantum computers will revolutionize entire industries, but if you look closely, in many cases, there is no mathematical proof that classical methods definitely cannot find solutions as efficiently as quantum algorithms,” Peter Drmota, first author of the paper, told Phys.org.

Mathematicians from New York University and the University of British Columbia have resolved a decades-old geometric problem, the Kakeya conjecture in 3D, which concerns the region swept out by a needle turned through every direction.

The research is published on the arXiv preprint server.

The Kakeya conjecture was inspired by a problem asked in 1917 by Japanese mathematician Sƍichi Kakeya: What is the region of smallest possible area in which it is possible to rotate a needle 180 degrees in the plane? Such regions are called Kakeya needle sets.
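For precision, the modern form of the conjecture can be stated as follows; this is the standard formulation, and the three-dimensional case is the one resolved in the paper above.

```latex
% Besicovitch (1919): Kakeya needle sets can have arbitrarily small
% Lebesgue measure, so "smallest area" has no positive minimum.
% The Kakeya (set) conjecture therefore asks how small such sets can be
% in a finer sense: if $E \subset \mathbb{R}^n$ contains a unit line
% segment in every direction, then its Hausdorff dimension is maximal,
\dim_H(E) = n.
% The cases n = 1, 2 were known classically; the new work establishes n = 3.
```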

Scientists at the Okinawa Institute of Science and Technology (OIST), the National Institute of Information and Communications Technology, and the University of Tokyo have found a mathematical connection between spatial navigation and language processing, creating a model called “Disentangled Successor Information” (DSI).

This model generates patterns that closely resemble the activity of actual brain cells involved in both spatial awareness (place cells and grid cells) and concept recognition (concept cells).

The DSI model shows that the hippocampus and entorhinal cortex—previously known primarily for their roles in spatial navigation and memory—likely use comparable computational processes to handle both physical spaces and meaningful ideas or words. Within this shared framework, both types of information can be processed through similar mathematical computations, which the brain could achieve by partially activating specific groups of neurons.
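The DSI model builds on successor-representation-style quantities from reinforcement learning. As a hedged illustration of the "shared computation" idea—not the DSI model itself—here is a minimal successor representation for a random walk on a ring of places; the ring, the discount factor, and all variable names are invented for the example.

```python
import numpy as np

# Successor representation (SR): for transition matrix T and discount
# gamma, M = sum_k gamma^k T^k = (I - gamma T)^{-1}. Each row of M is the
# discounted expected future occupancy of every state, starting from one
# state. Grid-cell-like codes have been linked to eigendecompositions of
# such matrices; DSI uses related (disentangled) quantities.
n, gamma = 8, 0.9
T = np.zeros((n, n))
for i in range(n):                    # ring world: step left or right, prob 1/2 each
    T[i, (i - 1) % n] = 0.5
    T[i, (i + 1) % n] = 0.5

M = np.linalg.inv(np.eye(n) - gamma * T)

print(np.round(M[0], 2))              # occupancy profile seen from place 0
```

The same matrix machinery applies whether the "states" are physical places or words in a co-occurrence graph, which is the sense in which one framework can serve both spatial and conceptual processing.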

In today’s AI news, ByteDance cofounder Zhang Yiming has become China’s richest man as investors bet on companies with AI potential. Zhang’s fortune has grown to $65.5 billion, ahead of beverage giant Nongfu Spring founder Zhong Shanshan’s $56.5 billion, according to Forbes estimates. Zhang, 41, derives his net worth from a 21% stake in the privately held tech behemoth ByteDance.


And, OpenAI, Google, Meta, Microsoft, and smaller firms like Anthropic are losing massive amounts of money by giving away their AI products or selling them at a loss. “We are in the era of $5 Uber rides anywhere across San Francisco but for LLMs,” wrote early OpenAI engineer Andrej Karpathy. Chatbots are free, programming assistance is cheap, and attention-grabbing, money-losing AI toys are everywhere. AI is in its free(ish) trial era.

Meanwhile, the world’s largest contract electronics maker, Foxconn, said it has built its own large language model with reasoning capabilities, developed in-house and trained in four weeks. Initially designed for internal use within the company, the artificial intelligence model, called FoxBrain, is capable of data analysis, mathematics, reasoning, and code generation. Foxconn said Nvidia provided support.


Then, once upon a time, software ate the world. Now, AI is here to digest what’s left. The old model of computing, where apps ruled, marketplaces controlled access and platforms took their cut, is unraveling. What’s emerging is an AI-first world where software functions aren’t trapped inside apps but exist as dynamic, on-demand services accessible through AI-native interfaces.

In videos, learn how to integrate the ElevenLabs Conversational AI platform with Cal.com for automated meeting scheduling. Angelo takes you through the process with step-by-step instructions, and you can view and use the complete guide in ElevenLabs’ full documentation.

In other advancements, Anton Osika is the co-founder and CEO of Lovable, which is building what they call “the last piece of software”—an AI-powered tool that turns descriptions into working products without requiring any coding knowledge. Since launching three months ago, Lovable hit $4 million ARR in the first four weeks and $10 million ARR in two months with a team of just 15 people.

We close out with Jason Calacanis sitting down with Harrison Chase, CEO of LangChain, to explore how AI-powered agents are transforming the way startups operate. They discuss the shift from traditional entry-level roles to AI-driven automation, the importance of human-in-the-loop systems, and the future of AI-powered assistants in business. Harrison shares insights on how companies like Replit, Klarna, and GitLab are leveraging AI agents.