This week – self-destructing GMOs; robotic swans in Singapore; how AI coded its AI child; AI recreates Leia’s scene from Rogue One; and more!
More than a human
Last month’s big news was a successful cloning of monkeys, paving the way for cloning humans. Here’s what the experts have to say about it.
This article about “life extension” and solving “the problem of death” takes a different approach from most. It notes that the people publicly championing life extension are mainly men and examines the topic from that perspective.
This video does a good job of explaining what AutoML is – one of the first successful automated machine learning projects, made by the team at Google Brain.
Someone on Reddit used open-source AI to recreate young Princess Leia as seen in Rogue One. The end result is quite good, and it was achieved with only a fraction of the resources Disney and Industrial Light & Magic used to make the original scene.
Since its discovery over a hundred years ago, the 240-page Voynich manuscript, filled with seemingly coded language and inscrutable illustrations, has confounded linguists and cryptographers. Using artificial intelligence, Canadian researchers have taken a huge step forward in unravelling the document’s hidden meaning.
Laura Gilmore is a roboticist at a cheese factory. In this article for Scientific American, she presents different kinds of robots fully dedicated to cutting, shredding, sorting and boxing up cheese.
When you are in Singapore and you spot a swan, chances are it isn’t a swan. Researchers at the National University of Singapore have created a clever self-driving drone called the Smart Water Assessment Network – the SWAN. These swan-shaped robots swim in Singapore’s waters and assess pollution, drinkability, and temperature, allowing researchers to gather data without scaring people with dangerous-looking traditional water drones.
University of Zurich researchers built an AI that rides on a drone and tells it how to navigate city streets, trained on data gleaned from self-driving cars and GoPro-toting bicycles. The AI-equipped drone takes that data, builds a map, locates itself on the map, and plans a route, using camera images to issue speed and steering commands, follow road markings, and avoid collisions.
We already live among robots: tools and machines like dishwashers and thermostats so integrated into our lives that we’d never think to call them that. What will a future with even more robots look like? Social scientist Leila Takayama shares some unique challenges of designing for human-robot interactions – and how experimenting with robotic futures actually leads us to a better understanding of ourselves.
Researchers in Germany have developed a robot that is about a seventh of an inch (roughly 3.6mm) long and looks at first like no more than a tiny strip of something rubbery. Then it starts moving. The robot walks, jumps, crawls, rolls and swims. It even climbs out of the pool, moving from a watery environment into a dry one. The researchers hope to one day see this tiny robot delivering drugs inside the human body.
When you release genetically altered organisms into the wild, how do you prevent them from breeding with untweaked organisms in their natural environment and producing hybrid offspring that scientists can’t control or regulate? Simple, said one scientist. Make the offspring self-destruct.