
Moore’s Law Will Soon End, but Progress Doesn’t Have to

In 1965, Intel co-founder Gordon Moore published a remarkably prescient paper observing that the number of transistors on an integrated circuit was doubling every year, a pace he later revised to every two years, and predicting that this trajectory would lead to computers becoming embedded in homes, cars, and communication systems.

That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing performance has become exponentially cheaper and more robust, we have been able to do a lot more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations.
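That compounding is easy to underestimate. As a rough back-of-the-envelope sketch (the starting count and fifty-year horizon are illustrative choices, not figures from Moore's paper), a fixed two-year doubling period multiplies transistor counts roughly 33-million-fold over five decades:

```python
# Back-of-the-envelope Moore's Law projection; the inputs are illustrative.
def transistors(n0: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return n0 * 2 ** (years / doubling_period_years)

# Fifty years of two-year doublings: 2**25, about a 33-million-fold increase.
print(f"{transistors(1, 50):,.0f}x")  # 33,554,432x
```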

Yet the law has been fraying for years, and experts predict it will soon reach its limits. When I spoke to Bernie Meyerson, IBM's Chief Innovation Officer, however, he argued strongly that the end of Moore's Law doesn't mean the end of progress. Not by a long shot. What we'll see instead is a shift in emphasis from the microchip to the system as a whole.

Apple Manufacturer Foxconn to Fully Replace Humans With Robots

In Brief

  • The Taiwanese company that manufactures Apple's iPhone has announced a three-phase plan to fully automate its factories, aiming for 30% automation by 2020.
  • The move could put as many as a million people out of work, another example of automation’s major implications for the global workforce.

Foxconn Electronics, the Taiwanese manufacturing company behind some of the biggest electronic brands’ devices, including Apple’s iPhone, has announced that it will ramp up automation processes at its Chinese factories. The goal is to eventually achieve full automation.

In an article published in Digitimes, General Manager Dai Jia-peng of Foxconn’s Automation Technology Development Committee explains that the process will unfold in three phases.

Apple’s first AI paper focuses on creating ‘superrealistic’ image recognition

Apple’s first paper on artificial intelligence, published Dec. 22 on arXiv (open access), describes a method for improving the ability of a deep neural network to recognize images.

To train neural networks to recognize images, AI researchers have typically labeled (identified or described) each image in a dataset. For example, last year, Georgia Institute of Technology researchers developed a deep-learning method to recognize images taken at regular intervals on a person’s wearable smartphone camera.
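To make the labeling step concrete, here is a minimal, hedged sketch of supervised training in PyTorch. The images, labels, and network below are random stand-ins chosen for illustration, not the Georgia Tech dataset or model:

```python
import torch
from torch import nn

# Stand-in labeled dataset: 64 random "images", each paired with one of
# 10 made-up label indices (e.g., 0 = "driving", 1 = "cooking").
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 10, (64,))

# Tiny illustrative classifier mapping an image tensor to 10 label scores.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):  # a few passes over the toy batch
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)  # penalize mismatched labels
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```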

Example images from dataset of 40,000 egocentric images with their respective labels (credit: Daniel Castro et al./Georgia Institute of Technology)

How machine learning is revolutionizing the diagnosis of rare diseases

Well before the family came in to the Batson Children's Specialty Clinic in Jackson, Mississippi, they knew something was wrong. Their child had been born with multiple birth defects and didn't look like anyone else in the family. A couple of tests for genetic syndromes came back negative, but Omar Abdul-Rahman, Chief of Medical Genetics at the University of Mississippi, had a strong hunch that the child had Mowat-Wilson syndrome, a rare disease associated with challenging lifelong symptoms like speech impediments and seizures.

So he pulled out one of his most prized physicians’ tools: his cell phone.

Using an app called Face2Gene, Abdul-Rahman snapped a quick photo of the child’s face. Within a matter of seconds, the app generated a list of potential diagnoses — and corroborated his hunch. “Sure enough, Mowat-Wilson syndrome came up on the list,” Abdul-Rahman recalls.
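Face2Gene's underlying model is proprietary, so the following is only a generic sketch of the pattern the article describes: a classifier scores a photo against known syndromes and returns the top matches. The syndrome list and the scores here are hypothetical:

```python
import numpy as np

# Hypothetical syndrome labels; a real system covers hundreds of conditions.
SYNDROMES = ["Mowat-Wilson", "Angelman", "Williams", "Noonan", "Kabuki"]

def rank_diagnoses(logits: np.ndarray, top_k: int = 3) -> list[tuple[str, float]]:
    """Turn raw classifier scores into probabilities and return the best matches."""
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()  # softmax over the syndrome scores
    order = np.argsort(probs)[::-1][:top_k]
    return [(SYNDROMES[i], float(probs[i])) for i in order]

# Made-up output of a facial-analysis network for a single photo:
print(rank_diagnoses(np.array([2.9, 0.4, 1.1, 0.2, 0.8])))
```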

Now You Can Make Movies of Living Cells With Your Smartphone!

Very cool; I look forward to seeing where we land on mobile imaging systems in the next five years.

Years ago, I developed software for a mobile blood gas analyzer to help researchers and doctors in some of the world's most remote locations, and that technology went on to improve survival rates for many. I see advances like this one doing just as much for people who don't have access to centralized labs or hospitals.


Democratizing Cellular Time-Lapses with a Cell-Phone!

A group of researchers from Uppsala University have recently developed an affordable system capable of capturing time-lapse videos of living cells under various conditions. Dubbed the affordable time-lapse imaging and incubation system (ATLIS), it can be constructed from off-the-shelf electronic components and 3D-printed parts while using a standard smartphone for imaging.

While there have been other microscope adapters that make it easy to capture images with a smartphone, the ATLIS is much more than an adapter. It is optimised to convert the old microscopes found in abundance in universities and hospitals into full-fledged time-lapse systems for imaging cell dynamics. Such a system requires strict environmental control of temperature, pH, osmolarity, and light exposure to maintain normal cell behaviour.
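The paper's actual control electronics aren't reproduced here, but the heart of such environmental control is a feedback loop that reads sensors and switches actuators to hold each variable in range. In this hedged sketch, the temperature sensor and heater functions are hypothetical stand-ins for real hardware drivers:

```python
import random
import time

TARGET_C, TOLERANCE_C = 37.0, 0.5  # typical mammalian cell-culture band

def read_temperature() -> float:
    """Hypothetical stand-in for a thermistor read inside the enclosure."""
    return 37.0 + random.uniform(-1.0, 1.0)

def set_heater(on: bool) -> None:
    """Hypothetical stand-in for switching a heating element via GPIO."""
    print("heater", "ON" if on else "OFF")

def control_loop(cycles: int = 5, period_s: float = 1.0) -> None:
    """Simple bang-bang control: heat when below the band, idle otherwise."""
    for _ in range(cycles):
        temp = read_temperature()
        set_heater(temp < TARGET_C - TOLERANCE_C)
        print(f"temperature: {temp:.2f} C")
        time.sleep(period_s)

control_loop()
```

The same loop structure extends to pH, osmolarity, and light exposure, each with its own sensor, actuator, and tolerance band.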

Just a SmidgION: Oxford Nanopore announce iPhone-powered sequencing

CTO Clive Brown announces new Oxford Nanopore sequencing and library prep devices during his keynote address to the company’s user group conference

Stop the presses! Not something we call on a regular basis at FLG towers because, well, our work is largely digital. But when the latest news from Oxford Nanopore landed on our desks this afternoon, this old print-journalism adage felt rather apt.

News in brief: Groupon grief; Apple encryption delay; post-quantum crypto

Your daily round-up of some of the other security stories in the news

Groupon grief – was it password reuse?

The Telegraph reports that crooks have hijacked a number of Groupon accounts and used them to purchase expensive items like games consoles, iPhones and holidays. Some victims have suffered thousands of pounds of losses.

Flaunt Magazine

A column on #transhumanism I did for Flaunt:


Are you ready for the future? A Transhumanist future in which everyone around you—friends, family, and neighbors—has dipped into the cybernetic punch bowl? This is a future of contact lenses that see in the dark, endoskeleton artificial limbs that lift a half-ton, and brain chip implants that read your thoughts and instantly communicate them to others. Sound crazy? Indeed, it does. Nevertheless, it’s coming soon. Very soon. In fact, much of the technology already exists. It’s being sold commercially at your local superstore or being tested in laboratories right now around the world.

We’ve all heard about driverless test cars on the roads and how doctors in France are replacing people’s hearts with permanent robotic ones, but did you know there’s already a multi-billion dollar market for brainwave-reading headsets? Using electroencephalography (EEG) sensors that pick up and monitor brain activity, NeuroSky’s MindWave can attach to Google Glass and allow you to take a picture and post it to Facebook and Twitter just by thinking about it. Other headsets allow you to play video games on your iPhone with only your thoughts as well. In fact, a few months ago, the first mind-to-mind communication took place. A researcher in India projected a thought to a colleague in France, and using their headsets, they understood each other. Telepathy went from science fiction to reality, just like that.
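Under the hood, consumer headsets like the MindWave typically reduce raw EEG to power in a few frequency bands (alpha, beta, and so on) and map those levels to controls. Here is a rough sketch of that reduction step, with a synthetic 10 Hz signal standing in for a real electrode stream:

```python
import numpy as np

FS = 256                         # sampling rate in Hz
t = np.arange(0, 2.0, 1 / FS)    # two seconds of samples
# Synthetic stand-in for an electrode read: a 10 Hz "alpha" tone plus noise.
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal)) ** 2   # power spectrum
freqs = np.fft.rfftfreq(signal.size, 1 / FS)

def band_power(lo_hz: float, hi_hz: float) -> float:
    """Total spectral power inside one EEG frequency band."""
    return float(spectrum[(freqs >= lo_hz) & (freqs < hi_hz)].sum())

print("alpha (8-13 Hz):", band_power(8, 13))  # dominates, given the tone
print("beta (13-30 Hz):", band_power(13, 30))
```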

Cybernetics, a term sometimes used to describe robotic implants, prosthetics, and cyborg-like enhancements to the human body and experience, has come a long way since scientists began throwing it around in the 1950s. What a difference a generation or two makes. Today a thriving pro-cyborg medical industry is setting the stage for trillion-dollar markets that will remake the human experience. Five million people in America suffer from Alzheimer's, but a new surgery that involves installing brain implants is showing promise in restoring memory and improving lives. The use of medical and microchip implants, whether in the brain or not, is expected to surge in the coming years. NBC News recently reported that many Americans will likely have chip implants within a decade. It's truly a new age for humans.

Standing proud at the forefront of all this change is the fascinating biohacker culture, where extreme inventors and innovators are leading the way by sticking RFID tracking chips in their bodies, permanent wireless headphones near their eardrums, and magnets in their fingers.
