This week – Google’s AI made a better AI than humans; a security robot was used to kick homeless people out; AI makes fake porn; Google open-sources its gene sequencing tool; and more!
More than a human
Aubrey de Grey led a Reddit Ask Me Anything on Dec. 7, and one topic that came up multiple times was the resources needed to progress anti-ageing research. The science behind human longevity is solid, de Grey argued, but the funding is not.
Here’s a video that checks how close we are to building a true android – a robot that looks and behaves like a real human. It goes through the challenges facing engineers, both mechanical and in AI. But do we need a human-like robot? And if we build androids, what will their tasks be, and how are we going to communicate with them?
A new ultrasonic sensor allows amputees to control each of their prosthetic fingers individually, researchers report. The device enables fine motor hand gestures that aren’t possible with current commercially available devices. Jason Barnes, a musician who lost part of his right arm five years ago, was the first amputee to use it, and it allowed him to play the piano for the first time since his accident.
Someone used an algorithm to paste the face of ‘Wonder Woman’ star Gal Gadot onto a porn video, and the implications are terrifying.
Google published the results of its AutoML project, which aims to build an AI that designs other AIs. According to the researchers, NASNet was 82.7 percent accurate at classifying images in ImageNet’s validation set. That is 1.2 percent better than any previously published result, and the system is also 4 percent more efficient, with a 43.1 percent mean average precision (mAP). Additionally, a less computationally demanding version of NASNet outperformed the best similarly sized models for mobile platforms by 3.1 percent.
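The core loop behind systems like AutoML is neural architecture search: a controller proposes candidate architectures, each candidate is trained and scored, and the scores guide the next proposals. Here’s a toy sketch of that loop, with two big simplifications – the “controller” is plain random search rather than a learned policy, and `proxy_score` is a hypothetical stand-in for actually training a child network and measuring its validation accuracy:

```python
import random

# Hypothetical search space: each candidate architecture picks one option
# per dimension. Real NAS spaces describe whole cell structures.
SEARCH_SPACE = {
    "layers": [4, 8, 12],
    "filters": [32, 64, 128],
    "kernel": [3, 5, 7],
}

def sample_architecture(rng):
    """Controller step: propose a candidate by sampling the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Hypothetical stand-in for 'train the child network, return accuracy'."""
    return 0.5 + 0.01 * arch["layers"] + 0.001 * arch["filters"] - 0.005 * arch["kernel"]

def search(trials=20, seed=0):
    """Evaluate candidates and keep the best-scoring architecture."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

In the real system the expensive part is that every call to the scoring function means training a full network, which is why NAS research focuses on cheaper proxies and smarter controllers.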
Machine learning for language translation is nothing new. But what about applying machine learning to understand what the chicken said? Researchers recorded chickens and fed the audio into a machine-learning program, training it to recognize the difference between the sounds of contented and distressed birds. The results will be used in poultry farming to better understand how chickens feel, but I can see it as a first step towards building an AI-powered human-animal translator.
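The pipeline the researchers describe boils down to: turn each recording into a feature vector, then learn a boundary between “contented” and “distressed” examples. A minimal sketch, with two labeled assumptions – the 2-D features (e.g. mean pitch, call rate) are synthetic rather than extracted from real audio, and the model is a simple nearest-centroid classifier rather than whatever the researchers actually used:

```python
import math
import random

rng = random.Random(42)

def fake_features(mean, n=50):
    """Hypothetical stand-in for feature extraction from a recording."""
    return [(rng.gauss(mean, 0.3), rng.gauss(mean, 0.3)) for _ in range(n)]

# Assumption: distressed calls sit higher on both feature axes.
training = {"contented": fake_features(1.0), "distressed": fake_features(3.0)}

def centroid(points):
    """Mean position of a set of feature vectors."""
    return tuple(sum(axis) / len(points) for axis in zip(*points))

centroids = {label: centroid(pts) for label, pts in training.items()}

def classify(features):
    """Assign a recording to the class with the nearest centroid."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))
```

Real work on animal vocalisations would compute spectral features such as MFCCs from the audio, but the train-on-labeled-calls, classify-new-calls structure is the same.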
Here’s a talk by Juergen Schmidhuber, one of the AI pioneers, where he explains how long short-term memory (LSTM), an AI method he pioneered, works. He then embarks on a lesson on the history of artificial intelligence and where it can go in the future.
Here’s a story about how and why Baidu, the Chinese Google, invests heavily in AI and AI-powered services like self-driving cars, robotics, and personal assistants.
Welcome to our dystopian future. An animal shelter in San Francisco has been criticized for using a K5 security guard robot built by Knightscope to scare off homeless people.
Last year, the Dutch police went viral with their idea of using eagles to take rogue drones out of the sky. One year later, the Dutch police are retiring the project due to low demand, high costs, and unexpected drawbacks.
Biorobotics is a new way of building robots, combining tissue engineering with robotics. And as with every newly emerging field of research, it needs to get its nomenclature sorted out. In her recent paper, Vickie Webster-Wood calls for a comprehensive organization of the field, which she proposes to call “organismal engineering” (I prefer biorobotics – it sounds better).
There are companies that imagine a world where your burger or pizza is made by robots, with no humans involved.
Creating a walking robot is no easy task. Despite numerous efforts over the years, people are still trying to create robots that can truly walk around our environments like we do. Motherboard met with Agility Robotics, a small startup in Oregon that is one of many trying to crack the code of the perfect bipedal robot.
Some researchers believe that future robots will not look like robots. They challenge a widespread basic assumption: that robots are either “machines that run bits of code” or “software ‘bots’ interacting with the world through a physical instrument.” “We take a third path: one that imbues intelligence into the very matter of a robot,” they said. On that path, materials scientists are developing new bulk materials with the inherent multifunctionality required for robotic applications, while roboticists are working on new material systems with tightly integrated components that disappear into the background of everyday life.
Wandelbots is using wearables to train robots. The trainer wears a sensor-laden suit and trains the robot by showing it how to do the job. No programming skills required. Wandelbots is another example of a company that tries to make training robots feel like training humans, either by showing or telling what to do.
The big-ticket acquisition of genetic design company Cell Design Labs signals a coming wave of precision cures. A two-year-old company developing molecular “logic” for cancer treatment has been snapped up for $175 million by Gilead Sciences amid a surge of interest in ways to battle disease using engineered immune cells.
Google released a tool called DeepVariant that uses deep learning—the machine learning technique that now dominates AI—to identify all the mutations that an individual inherits from their parents. DeepVariant is more accurate than all the existing methods out there. Last year, it took first prize in an FDA contest promoting improvements in genetic sequencing. The open source version the Google Brain/Verily team introduced to the world Monday reduced the error rates even further—by more than 50 percent.
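The core task DeepVariant solves is variant calling: given the reads aligned over a reference position, decide whether the sample truly differs there or the mismatches are just sequencing noise. DeepVariant’s trick is to render read pileups as images and classify them with a deep network; the classical baselines it beats are closer in spirit to this frequency-threshold sketch (the function and its thresholds are illustrative, not any real tool’s logic):

```python
from collections import Counter

def call_variant(ref_base, pileup_bases, min_fraction=0.25, min_depth=4):
    """Call an alternate allele if enough reads disagree with the reference.

    ref_base     -- the reference genome's base at this position, e.g. "A"
    pileup_bases -- the base each aligned read reports at this position
    Returns the alternate base, or None if no confident call can be made.
    """
    if len(pileup_bases) < min_depth:
        return None                           # too little coverage to decide
    counts = Counter(b for b in pileup_bases if b != ref_base)
    if not counts:
        return None                           # every read matches the reference
    alt, n = counts.most_common(1)[0]
    # Only call the variant if the disagreeing reads are frequent enough
    # to be unlikely to all be sequencing errors.
    return alt if n / len(pileup_bases) >= min_fraction else None
```

For example, `call_variant("A", "AAAGGGGG")` calls a `G` because five of eight reads disagree with the reference, while a single mismatching read in `"AAAAAAAG"` is dismissed as noise. The hard cases – low coverage, messy alignments, error-prone regions – are exactly where learned models like DeepVariant pull ahead of fixed thresholds like these.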