Watch two robots invent their own spoken language
Is this the future of language?
Robots are already inventing spoken languages, and it’s fascinating to watch unfold in real time.
The BBC recently uploaded a YouTube clip profiling Dr. Luc Steels of the VUB Artificial Intelligence Lab. Steels’ robotics research focuses on language, which may be an important missing ingredient in conventional robotics research. Some academics suggest that high-level artificial intelligence will only come about when robots can “grow up” in a manner similar to human children becoming adults. Language may be a valuable tool for robots to acquire in facilitating this, and Steels’ robots have been coming up with their own languages since 2011.
In the clip, Steels’ robots examine themselves in a mirror, studying their own reflections while moving to refine how they control themselves. This is a robotic version of a toddler learning to walk on his or her own. “In order to move in the world [and to] recognize the movements of another, you need to have some sort of model of your own body,” Steels tells the BBC presenter.
Then, using an invented language of nonsense syllables, the robots play a Simon-Says-style game with each other (and even with the human presenter). Each “word” corresponds to a pose for the other to strike. If a posing robot guesses the position correctly, the speaking robot nods its head in confirmation. If it guesses incorrectly, the robot shakes its head and demonstrates the proper position.
The robots come up with these words from scratch and use different ones every time. “They are starting from absolute zero: no prior words, no prior concepts, and then they co-evolve a shared lexicon,” Steels told the Daily Dot via email. “Each time the experiment restarts another language will come out.”
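The dynamic Steels describes resembles his well-known “naming game” model from the language-games literature. Below is a toy Python sketch of that idea, not the lab’s actual code: the pose names, word-building alphabet, and update rules are all illustrative assumptions. Agents start with empty lexicons, invent nonsense syllables as needed, adopt words they hear on failure, and prune competitors on success, so a shared vocabulary emerges from absolute zero.

```python
import random

POSES = ["arms_up", "lean_left", "bow"]  # hypothetical pose labels

def invent_word():
    """Build a nonsense word from consonant-vowel syllables."""
    return "".join(random.choice("bdgkmnpst") + random.choice("aeiou")
                   for _ in range(2))

class Agent:
    def __init__(self):
        # Each pose maps to a list of candidate words the agent knows.
        self.lexicon = {pose: [] for pose in POSES}

    def word_for(self, pose):
        # Speaker role: invent a word if none exists yet.
        if not self.lexicon[pose]:
            self.lexicon[pose].append(invent_word())
        return self.lexicon[pose][0]

    def interpret(self, word):
        # Hearer role: which pose, if any, does this word name?
        for pose, words in self.lexicon.items():
            if word in words:
                return pose
        return None

def play_round(speaker, hearer):
    pose = random.choice(POSES)
    word = speaker.word_for(pose)
    if hearer.interpret(word) == pose:
        # Success ("nod"): both drop competing words for this pose.
        speaker.lexicon[pose] = [word]
        hearer.lexicon[pose] = [word]
        return True
    # Failure ("head shake" plus demonstration): hearer adopts the word.
    if word not in hearer.lexicon[pose]:
        hearer.lexicon[pose].append(word)
    return False

random.seed(0)
agents = [Agent() for _ in range(4)]
for _ in range(300):
    speaker, hearer = random.sample(agents, 2)
    play_round(speaker, hearer)

for pose in POSES:
    print(pose, sorted({a.word_for(pose) for a in agents}))
```

Rerunning without the fixed seed produces a different vocabulary each time, mirroring Steels’ observation that every restart of the experiment yields another language.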
Gazing purposefully into a mirror calls to mind the mirror test, a psychological measurement used since 1970 to gauge self-awareness in non-human animals. Steels tells us that “self-awareness” is “one of those words like ‘consciousness’ that is hard to define.” Some monkeys and higher mammals might display significant self-awareness, using a mirror to groom themselves more thoroughly, but Steels’ robots engage on a far more basic level. They have “learned” that there is a relationship between their sensors, their motors, and the position of their bodies, and their understanding doesn’t go beyond that.
Dylan Love is an editorial consultant and journalist whose reporting interests include emergent technology, digital media, and Russian language and culture. He is a former staff writer for the Daily Dot, and his work has been published by Business Insider, International Business Times, Men's Journal, and the Next Web.