Mar 16, 2023

Sentience: The Invention of Consciousness

Posted in categories: innovation, neuroscience

Author Nick Humphrey shares 5 key insights from his new book, Sentience: The Invention of Consciousness.

Mar 16, 2023

A Growing Number of Scientists Are Convinced the Future Influences the Past

Posted in categories: futurism, physics

“Our instincts of time and causation are our deepest, strongest instincts that physicists and philosophers—and humans—are loath to give up,” said one scientist.

Mar 16, 2023

The Reality Of Deep Sea Mining Is Getting Closer, As Are The Consequences

Posted in category: futurism

A two-week meeting starting in Kingston, Jamaica, today could lead to the beginning of deep-sea mining of the ocean floor this year.


At the meeting, the International Seabed Authority's 167 member nations plan to finalize regulations for seafloor mining.

Mar 16, 2023

Substitution or Silent or Neutral Mutations

Posted in category: futurism

This video explains substitution mutations that are silent or neutral: substitutions that change the DNA sequence without changing the encoded protein or the organism's fitness.
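
As a quick illustration of the concept (a minimal Python sketch, not from the video): because the genetic code is redundant, a single-nucleotide substitution can leave the encoded amino acid unchanged, which is exactly what makes a mutation silent.

```python
# Minimal illustration of a silent (synonymous) substitution.
# GAA and GAG are different mRNA codons, but both encode glutamic acid (Glu),
# so the protein sequence is unchanged by the A -> G substitution.
CODON_TABLE = {"GAA": "Glu", "GAG": "Glu", "GAU": "Asp", "GAC": "Asp"}

original = "GAA"
mutated = "GAG"  # third-position substitution: A -> G

assert CODON_TABLE[original] == CODON_TABLE[mutated]
print(f"{original} -> {mutated}: both encode {CODON_TABLE[original]} (silent mutation)")
```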


Mar 16, 2023

AI Image Generation Using DALL-E 2 Has Promising Future in Radiology

Posted in categories: biotech/medical, health, internet, robotics/AI

Summary: Text-to-image generation deep learning models like OpenAI’s DALL-E 2 can be a promising new tool for image augmentation, generation, and manipulation in a healthcare setting.

Source: JMIR Publications

A new paper published in the Journal of Medical Internet Research describes how generative models such as DALL-E 2, a novel deep learning model for text-to-image generation, could represent a promising future tool for image generation, augmentation, and manipulation in health care.
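
The paper itself is a review rather than a code release, but as a rough sketch of the workflow it discusses, here is how one might request a synthetic image from DALL-E 2 through the OpenAI Python client (the prompt and image size are illustrative assumptions, not from the paper):

```python
# Sketch: generating a synthetic medical-style image with DALL-E 2
# via the OpenAI Python client (pip install openai). Requires an API key.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.images.generate(
    model="dall-e-2",
    prompt="a chest x-ray",  # illustrative prompt, not from the paper
    n=1,                     # number of images to generate
    size="512x512",
)
print(response.data[0].url)  # URL of the generated image
```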

Mar 16, 2023

Meet Petals: An Open-Source Artificial Intelligence (AI) System That Can Run 100B+ Language Models At Home, BitTorrent-Style

Posted in category: robotics/AI

The NLP community has recently discovered that pretrained language models can accomplish a wide range of real-world tasks with minor fine-tuning or direct prompting, and that performance typically improves with scale. Modern language models often contain hundreds of billions of parameters, continuing this trend, and several research groups have released pretrained LLMs with more than 100B parameters. Most recently, the BigScience project made BLOOM available, a 176-billion-parameter model that supports 46 natural languages and 13 programming languages. Public availability makes 100B+ parameter models more accessible, yet memory and computational costs mean most academics and practitioners still find them hard to use: for inference alone, OPT-175B and BLOOM-176B require more than 350 GB of accelerator memory, and even more for fine-tuning.
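
That memory figure follows directly from the parameter count. As a back-of-the-envelope sketch (not from the paper): at 16-bit precision each parameter occupies 2 bytes, so the weights alone take roughly 352 GB before counting activations.

```python
# Back-of-the-envelope memory estimate for BLOOM-176B weights.
params = 176e9       # 176 billion parameters
bytes_per_param = 2  # 16-bit (fp16/bf16) precision
print(f"{params * bytes_per_param / 1e9:.0f} GB")  # -> 352 GB, weights only
```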

As a result, running these LLMs typically requires several powerful GPUs or a multi-node cluster. Both options are expensive, restricting the research topics and applications these models can serve. Several recent efforts seek to democratize LLMs by "offloading" model parameters to slower but more affordable memory and executing the model on the accelerator layer by layer. By loading parameters from RAM just in time for each forward pass, this technique allows LLMs to run on a single low-end accelerator. Offloading has high latency, although it can process several tokens in parallel: producing one token with BLOOM-176B takes at least 5.5 seconds with the fastest RAM offloading setup and 22 seconds with the fastest SSD offloading setup.
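
This layer-by-layer offloading pattern is available off the shelf. As a sketch (assuming the Hugging Face transformers and accelerate libraries; the checkpoint name is just an example), Accelerate can spill weights to CPU RAM and disk automatically:

```python
# Sketch: running a large model with parameter offloading via
# Hugging Face Accelerate (pip install transformers accelerate).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom"  # example checkpoint; any causal LM works

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",           # place layers on GPU, CPU, and disk as needed
    offload_folder="./offload",  # spill weights that don't fit in RAM to disk
    torch_dtype=torch.bfloat16,
)

inputs = tokenizer("Offloading lets a single GPU", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```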

Additionally, many machines lack sufficient RAM to offload 175B parameters. LLMs can also be made more widely available through public inference APIs, where one party hosts the model and lets others query it over the internet. This is a fairly user-friendly option, since the API owner handles most of the engineering work. However, APIs are often too rigid for research use: they allow no changes to the model's control flow and no access to its internal states. Moreover, current API pricing can make some research projects prohibitively expensive. In this study, the authors investigate a different approach, motivated by crowdsourced distributed training of neural networks from scratch.
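
In practice, Petals exposes this crowdsourced setup through a transformers-style interface. The sketch below follows the project's published usage pattern; class and checkpoint names may differ across Petals versions:

```python
# Sketch: distributed inference with Petals (pip install petals),
# following the project's documented usage; exact names may vary by version.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "bigscience/bloom-petals"  # public BLOOM-176B swarm checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
# Each forward pass is served by volunteer nodes hosting different layers.
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```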

Mar 16, 2023

Scientists develop new lithium niobate laser technology

Posted in category: materials

Scientists at EPFL and IBM have developed a new type of laser that could have a significant impact on optical ranging technology. The laser is based on lithium niobate, a material often used in optical modulators, devices that control the frequency or intensity of light transmitted through them.

Mar 16, 2023

The rise of AI automation in the workplace is bad news for the ‘brilliant jerks’ of tech

Posted in categories: futurism, robotics/AI

Brilliant jerks whom nobody actually likes working with will have a tougher time once AI automates much of their work. The future belongs to those with people skills.

Mar 16, 2023

Book Review: The Mountain In The Sea

Posted in category: robotics/AI

Harry’s Review of The Mountain In The Sea by Ray Nayler

Ray Nayler’s recent contemplative sci-fi thriller The Mountain In The Sea offers a first contact story which holds a mirror to our notions of intelligence and responsibility.

Following ecologist Dr Ha Nguyen, The Mountain In The Sea centres on the recent discovery of an octopus species in the remote Vietnamese islands of the Con Dao archipelago. Questions abound: just how intelligent are these octopuses? Corporations and activists alike take an interest. DIANIMA, a vast organisation with interests in automation and artificial intelligence, hires Dr Nguyen to determine how valuable the octopus species is to their research. Accompanying her on this job is DIANIMA's own previous attempt at creating a being that can pass the Turing test, a silicon lifeform known as Evrim.

Mar 16, 2023

Uploading your consciousness will never work, a neuroscientist explains

Posted in categories: materials, neuroscience

1. The mind, brain, and body are inextricably linked

The idea that the mind and brain are separate is usually attributed to the 17th-century French mathematician and philosopher René Descartes, who was what philosophers now call a substance dualist. Descartes believed that the mind and body are made of different substances: the body of a physical substance, and the mind of some mysterious, nonphysical material.

Today, most neuroscientists reject this idea. Modern brain research suggests that the mind is made of matter and emerges from brain activity. Even so, most still study the brain in isolation, without taking the body into consideration.