How is it to be a cyborg? AI experiments by Google. Google Translate develops its own internal language. The state of AI. Who controls the robots and more!
More Than A Human
Real-life cyborgs talked at Google about what it is like to be a cyborg. Neil Harbisson hears in colours through an antenna attached to his skull, and Moon Ribas choreographs earthquakes with a sensor in her elbow.
Meet Moon Ribas, an artist and a cyborg. She has a sensor implanted in her elbow that vibrates whenever there is an earthquake anywhere in the world.
Here’s a video showing a man operating a robotic arm using only brain waves. In the first experiment, he performs a set of simple actions: he selects a bottle to grab, grabs it, and moves it. The second experiment is similar – he selects the bottle, grabs it, and helps a human drink from it. It looks more like a proof of concept than a real product, but it’s interesting to watch nevertheless.
SuitX, a spin-off of the University of California, Berkeley, that makes exoskeletons for people with disabilities, has launched a trio of devices that use robotic technologies to enhance the abilities of able-bodied workers and prevent common workplace injuries. The devices can be worn together or separately and were designed to reduce the forces on different joints and muscles. They might find their place in factories, preventing injuries caused by lifting heavy objects.
Oh, these hackers and their hackathons. They always come up with something interesting.
Google published a website full of fun experiments with AI and machine learning. The most popular is the one in which you draw a doodle and the AI has to guess what you have drawn, but there are others, like a real-time translator or a visualisation of what a neural network really sees.
Interesting thing – it looks like the AI behind Google Translate has developed an internal language through which it translates between different human languages. This article explains the concept in an easy-to-understand way, but if you want more technical details, here’s a link to the paper published by the Google Translate team.
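The trick behind that shared internal language is surprisingly simple, as the paper describes it: a single model serves all language pairs, and the desired target language is signalled by a token prepended to the input sentence. Here’s a minimal sketch of just that token-prefixing step (the function name and the tiny demo are mine, not from the paper):

```python
def prepare_input(sentence: str, target_lang: str) -> str:
    """Prepend a target-language token, as in Google's multilingual
    NMT setup, where one model handles many language pairs and a
    token like <2es> tells it which language to translate into."""
    return f"<2{target_lang}> {sentence}"

# The same English sentence, tagged for two different target languages:
print(prepare_input("Hello, world", "es"))  # <2es> Hello, world
print(prepare_input("Hello, world", "ja"))  # <2ja> Hello, world
```

Because every language pair flows through the same encoder, the model ends up mapping sentences with the same meaning to nearby internal representations regardless of language – the “interlingua” the article talks about.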
Tech giants like Google, Apple, Facebook, Intel and others don’t hesitate to spend money on buying smaller AI companies and assimilating them. This leads to a smaller pool of talent for other, mostly smaller, players.
O’Reilly has published a new map of who’s who and who’s doing what in artificial intelligence. While exploring the map, you can play a game called “How many of these companies do I know?”.
Here’s a legal riddle for you – if an AI system invents something new and a patent for that invention is filed, who should be credited? For me, this question fits into the bigger problem of responsibility for and ownership of intelligent machines. You can see a similar problem when it comes to assigning responsibility for accidents involving self-driving cars. I wonder if the same question asked in different fields will give the same answer.
Military? Police? Hackers? Google? Amazon? A robot? It’s an interesting read showing how much we depend on robots and algorithms, and on those who control them, deliberately or by accident.
Scientists at Japan’s Hokkaido University have created a miniature robot made from organic compounds. After exposing the compound to blue light and observing it under a microscope, the researchers found that the crystals performed an oscillatory bending and unbending motion. Robots like these could one day be used to deliver medicine to targeted areas of our bodies.
Imagine this – you say “give me the mug” and a swarm of robots finds the mug and brings it to you. Something similar (though without voice control) was presented by researchers at the Association for Computing Machinery’s UIST conference. Maybe one day these simple swarm robots will evolve into the machines we saw in Big Hero 6.