Is this the future of language?
Robots are already inventing spoken languages, and it’s fascinating to watch unfold in real time.
The BBC recently uploaded a YouTube clip profiling Dr. Luc Steels of the VUB Artificial Intelligence Lab. Steels' robotics research focuses on language, which may be an important missing ingredient in conventional robotics research. Some academics suggest that high-level artificial intelligence will only come about when robots can "grow up" in a manner similar to human children becoming adults. Language may be a valuable tool for robots to acquire in order to facilitate this, and Steels' robots have been coming up with their own languages since 2011.
In the clip, Steels’ robots examine themselves in a mirror, studying their own reflections while moving to refine how they control themselves. This is a robotic version of a toddler learning to walk on his or her own. “In order to move in the world [and to] recognize the movements of another, you need to have some sort of model of your own body,” Steels tells the BBC presenter.
Then, using an invented language of nonsense syllables, the robots play a Simon-Says-style game with each other (and even with the human presenter). Each “word” corresponds to a pose for the other to strike. If a posing robot guesses the position correctly, the speaking robot nods its head in confirmation. If it guesses incorrectly, the robot shakes its head and demonstrates the proper position.
The robots come up with these words from scratch and use different ones every time. “They are starting from absolute zero: no prior words, no prior concepts, and then they co-evolve a shared lexicon,” Steels told the Daily Dot via email. “Each time the experiment restarts another language will come out.”
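The guess-and-correct dynamic described above resembles what's known in the literature as a "naming game." Here is a minimal sketch of that idea in Python; this is an illustrative simulation, not Steels' actual implementation, and the pose names, syllable inventory, and fixed turn schedule are all simplifying assumptions.

```python
import random

# Hypothetical poses and syllables for illustration only.
POSES = ["arms_up", "crouch", "lean_left", "lean_right"]
SYLLABLES = ["ba", "ku", "mi", "to", "re", "zo"]

def invent_word(rng):
    """Coin a random two-syllable nonsense word."""
    return "".join(rng.sample(SYLLABLES, 2))

def play_round(speaker, hearer, pose, rng):
    """One exchange: speaker names a pose; hearer guesses it from the word.

    On success the speaker "nods" (no change); on failure the speaker
    "demonstrates" the pose and the hearer adopts the speaker's word.
    """
    if pose not in speaker:
        speaker[pose] = invent_word(rng)  # start from zero: no prior words
    word = speaker[pose]
    # Hearer inverts its own lexicon to guess which pose the word means.
    guess = next((p for p, w in hearer.items() if w == word), None)
    if guess == pose:
        return True          # nod: lexicons already agree on this pose
    hearer[pose] = word      # shake head + demonstrate: hearer adopts the word
    return False

def run_game(rounds=5, seed=42):
    rng = random.Random(seed)
    a, b = {}, {}            # each agent's lexicon: pose -> word
    for _ in range(rounds):
        for pose in POSES:   # simplified schedule: cycle poses, alternate roles
            play_round(a, b, pose, rng)
            play_round(b, a, pose, rng)
    return a, b

a, b = run_game()
print(a == b)  # → True: the two lexicons have converged on a shared vocabulary
```

Because the hearer always adopts the speaker's word on a failed round, disagreement can only shrink, and the agents settle on a shared lexicon. Rerunning with a different seed yields a different vocabulary each time, mirroring Steels' observation that every restart produces a new language.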
Gazing purposefully into a mirror calls to mind the mirror test, a psychological measurement used since 1970 to gauge self-awareness in non-human animals. Steels tells us that "self-awareness" is "one of those words like 'consciousness' that is hard to define." Some monkeys and higher mammals might display significant self-awareness, using a mirror to groom themselves more thoroughly, but Steels' robots engage on a far more basic level. They have "learned" that there is a relationship between their sensors, their motors, and the position of their bodies, and their understanding doesn't go beyond that.
Dylan Love is an editorial consultant and journalist whose reporting interests include emergent technology, digital media, and Russian language and culture. He is a former staff writer for the Daily Dot, and his work has been published by Business Insider, International Business Times, Men's Journal, and the Next Web.