Archive for the ‘computing’ category: Page 808
Feb 21, 2016
Robot chores: Machines tipped to take 15m Brit jobs in the next ten years
Posted by Karen Hurst in categories: computing, employment, habitats, robotics/AI
“No offense; but your robots are ugly”
Robots today (especially for home and caregiver use) will need to improve drastically. We're still designing robots as if they were a CPU for the home, which frankly freaks some kids out, scares some of the elderly population who find them too fragile to operate, and my own cat will not come near one. If home robots are ever going to be adopted by the mass of the population, they will need to look less like a part off a manufacturer's assembly line. They will need a softer, low-noise voice with volume controls for the hard of hearing; modifications for the deaf and blind; the versatility to do two or more types of work inside the home (vacuum, dust, cook, wash dishes, wash clothes, etc.); simple setup and operation; reliability (not needing constant repairs, not overheating); less bulk; and better sensors to detect and climb stairs.
From mowing the lawn to cooking dinner, experts say automatons are set to take over some of our most tedious tasks.
Feb 20, 2016
United Nations CITO: Artificial intelligence will be humanity’s final innovation
Posted by Karen Hurst in categories: computing, internet, quantum physics, robotics/AI, security
I hate to break the news to the UN's CITO, but has she ever heard of quantum technology? After AI floods onto the scene, the next innovation that I and others are working on is quantum computing, which will make AI, the Internet, cybersecurity, devices, platforms, and medical technology more advanced, with incredible performance.
The United Nations Chief Information Technology Officer spoke with TechRepublic about the future of cybersecurity, social media, and how to fix the internet and build global technology for social good.
Artificial intelligence, said United Nations chief information technology officer Atefeh Riazi, might be the last innovation humans create.
Feb 20, 2016
Gaming Chip Is Helping Raise Your Computer’s IQ
Posted by Karen Hurst in categories: computing, entertainment, mobile phones, robotics/AI
Using gaming chips to process people's images and the like definitely makes sense, especially as we move more and more into the AI-connected experience.
Facebook, Google and Microsoft are tapping the power of a vintage computer gaming chip to raise your smartphone’s IQ with artificially intelligent programs that recognize faces and voices, translate conversations on the fly and make searches faster and more accurate.
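The reason a graphics chip helps here is that recognizing faces and voices with neural networks boils down to large batched matrix multiplications, exactly the data-parallel arithmetic gaming chips were built for. A minimal sketch of one such layer (plain NumPy standing in for the GPU kernels; the shapes are arbitrary illustration values, not taken from any of these companies' systems):

```python
import numpy as np

# One fully connected neural-network layer. Face- and voice-recognition
# programs stack many of these; each is a dense matrix multiply, which
# is why massively parallel graphics hardware accelerates them so well.
rng = np.random.default_rng(0)
batch = rng.standard_normal((32, 512))     # 32 inputs, 512 features each
weights = rng.standard_normal((512, 256))  # learned layer parameters
bias = np.zeros(256)

# ReLU(x W + b): one matrix multiply plus an elementwise nonlinearity.
activations = np.maximum(batch @ weights + bias, 0.0)
print(activations.shape)  # (32, 256)
```

On a GPU the `@` multiply is split across thousands of cores at once, which is the whole speedup story.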
Feb 20, 2016
Basic income may be needed to combat robot-induced unemployment, leading AI expert says
Posted by Karen Hurst in categories: computing, economics, employment, robotics/AI
I do believe there will be some expansion of social services to help employees retrain for the new positions that are coming, and to assist lower-skill workers in retraining as well. The larger question, however, is who should pay. Some people say tech should assist governments in retooling, since AI technology created the situation; others say it's a government issue alone. It will be interesting, to say the least, to see how the retraining programs and other services are covered.
A leading artificial intelligence (AI) expert believes that societies may have to consider issuing a basic income to all citizens, in order to combat the threat to jobs posed by increased automation in the workplace.
Dr Moshe Vardi, a computer science professor at Rice University in Texas, believes that a basic income may be needed in the future as advances in automation and AI put human workers out of jobs.
Feb 20, 2016
Infographic: Combining Electronics and Photonics Opens Way for Next-Generation Microprocessors
Posted by Shailesh Prasad in categories: computing, electronics, engineering
Integrated circuits traditionally have been a domain reserved for electrons, which course through exquisitely tiny transistors, wires and other microscopic structures where the digital calculations and data processing that underlie so much of modern technology unfold. Increasingly, however, chip designers have been acting on a long-ripening vision of enlisting photons instead of, or in tandem with, electrons in the operation of microprocessors. Photons, for one, can serve as fast-as-light carriers of information between chips, overcoming digital traffic jams that at times put the brakes on electrons. Recently, DARPA-funded scientists designed and crafted a breakthrough microprocessor that combines many of the best traits of electrons and photons on a single chip. The result is a remarkable and elegant hybrid microtechnology that boggles the mind for the intricate complexity of its sub-Lilliputian architecture. To appreciate the engineering acumen involved in the development of this chip and its tens of millions of resident electronic and photonic components, DARPA has produced an annotated, graphical tour of the new chip’s innards. Check it out, and lose yourself in a world of highways, toll gates and traffic circles populated by some of the physical world’s smallest commuters.
Feb 19, 2016
Scientists say all the world’s data can fit on a DNA hard drive the size of a teaspoon
Posted by Shailesh Prasad in categories: biotech/medical, computing, genetics
Even though it’s looking increasingly likely that humanity will find a way to wipe itself off the face of the Earth, there’s a chance that our creative output may live on. Servers, hard drives, flash drives, and disks will degrade (as will our libraries of paper books, of course), but a group of researchers at the Swiss Federal Institute of Technology have found a way to encode data onto DNA—the very same stuff that all living beings’ genetic information is stored on—that could survive for millennia.
One gram of DNA can potentially hold up to 455 exabytes of data, according to the New Scientist. For reference: There are one billion gigabytes in an exabyte, and 1,000 exabytes in a zettabyte. The cloud computing company EMC estimated that there were 1.8 zettabytes of data in the world in 2011, which means we would need only about 4 grams (about a teaspoon) of DNA to hold everything from Plato through the complete works of Shakespeare to Beyonce’s latest album (not to mention every brunch photo ever posted on Instagram).
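The teaspoon claim checks out with simple arithmetic; a quick sketch using the figures quoted above:

```python
# Back-of-the-envelope check of the storage figures in the article.
# Capacity and data-volume numbers come from the text; the unit
# conversion is standard (1 zettabyte = 1,000 exabytes).
capacity_per_gram_eb = 455        # exabytes of data per gram of DNA
world_data_2011_zb = 1.8          # EMC's 2011 estimate, in zettabytes

world_data_eb = world_data_2011_zb * 1_000     # ZB -> EB
grams_needed = world_data_eb / capacity_per_gram_eb

print(round(grams_needed, 2))  # 3.96, i.e. about 4 grams
```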
DNA is made up of four types of molecules, which form pairs. To encode information on DNA, scientists program the pairs into 1s and 0s, the same binary language that encodes digital data. This is not a new concept (scientists at Harvard University encoded a book onto DNA in 2012), but until now it had been difficult to retrieve the information stored on the DNA.
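The basic idea of mapping bits onto bases can be sketched in a few lines. Note this naive two-bits-per-base mapping is an illustration only; the published schemes (including the Harvard and ETH work mentioned above) use more elaborate codes with error correction and avoid long runs of the same base:

```python
# Naive illustration: map each pair of bits to one of DNA's four bases.
BIT_PAIRS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BIT_PAIRS = {b: p for p, b in BIT_PAIRS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand, 2 bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_PAIRS_TO_BASE[bits[i:i + 2]]
                   for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Read the strand back into the original bytes."""
    bits = "".join(BASE_TO_BIT_PAIRS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)          # CAGACGGC
print(decode(strand))  # b'Hi'
```

The hard part in practice, as the article notes, has been the retrieval side: sequencing the strands and reading the data back without errors.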
Feb 19, 2016
Artificial Kidney Made of Nanofilters and Living Cells to Replace Dialysis
Posted by Shailesh Prasad in categories: biotech/medical, computing, health, nanotechnology
At Vanderbilt University, scientists are building an artificial kidney that they envision will one day be a standard of care over dialysis. The device consists of a silicon nanotechnology filter chip and embedded living kidney cells that would work together to mimic the functionality of a healthy kidney. The end result is expected to be about the size of a natural kidney, small enough to be implantable and powered by the body's own blood flow.
The filter component has tiny pores that can be individually shaped to perform a specific task. These filters would sit in a series, each one performing a different filtration step. Between the filter slices would be living kidney cells that perform tasks the man-made components are not very good at, including reabsorption of nutrients and disposal of accumulated waste.
Here’s video with Vanderbilt University Medical Center’s Dr. William Fissell, the lead scientist on the research:
Feb 19, 2016
Google Preparing A New Smartphone With Virtual Reality Support
Posted by Shailesh Prasad in categories: computing, electronics, mobile phones, virtual reality
This time there is very serious news about virtual reality: Google Inc. is said to be getting ready to unveil a new smartphone headset.
According to The Financial Times, the new headset will succeed Cardboard and will feature much better sensors and lenses and a more solid plastic shell.
The product is said to be similar to Samsung's Gear VR, since it will use a smartphone as its display and for most of its processing power. The difference is that the current Cardboard is, as its name suggests, just a cardboard headset with an inserted smartphone, while the new one will come with extra motion sensors to supplement what the phone provides.
MIT has developed a quantum computer design featuring an array of superconducting islands on the surface of a topological insulator, which they are experimenting with to process 0s and 1s. If they are successful, this could put us within a five-year window for QC platforms.
Massachusetts Institute of Technology (MIT) researchers have developed a quantum computer design featuring an array of superconducting islands on the surface of a topological insulator.
The researchers propose basing both quantum computation and error correction on the peculiar behavior of electrons at neighboring corners of these islands and their ability to interact across islands at a distance.