Is this the future of language?
Robots are already inventing spoken languages, and it’s fascinating to watch unfold in real time.
The BBC recently uploaded a YouTube clip profiling Dr. Luc Steels of the VUB Artificial Intelligence Lab. Steels’ robotics research focuses on language, which may be an important missing ingredient in conventional robotics research. Some academics suggest that high-level artificial intelligence will only come about when robots can “grow up” in a manner similar to human children becoming adults. Language may be a valuable tool for robots in facilitating this process, and Steels’ robots have been coming up with their own languages since 2011.
In the clip, Steels’ robots examine themselves in a mirror, studying their own reflections while moving to refine how they control themselves. This is a robotic version of a toddler learning to walk on his or her own. “In order to move in the world [and to] recognize the movements of another, you need to have some sort of model of your own body,” Steels tells the BBC presenter.
Then, using an invented language of nonsense syllables, the robots play a Simon-Says-style game with each other (and even with the human presenter). Each “word” corresponds to a pose for the other to strike. If a posing robot guesses the position correctly, the speaking robot nods its head in confirmation. If it guesses incorrectly, the robot shakes its head and demonstrates the proper position.
The robots come up with these words from scratch and use different ones every time. “They are starting from absolute zero: no prior words, no prior concepts, and then they co-evolve a shared lexicon,” Steels told the Daily Dot via email. “Each time the experiment restarts another language will come out.”
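The guessing game Steels describes can be illustrated with a toy simulation. This is a minimal sketch of a naming-game-style exchange, not Steels’ actual software: the pose names, syllable inventory, and adoption rule are all invented here for illustration. Two agents start with empty lexicons, invent nonsense words as needed, and converge on a shared vocabulary through repeated rounds of success and failure.

```python
import random

# Hypothetical poses the "robots" can strike (illustrative only)
POSES = ["arms_up", "arms_out", "crouch", "lean_left", "lean_right"]

SYLLABLES = ["ba", "bo", "ka", "ki", "lu", "mo", "ta", "ve"]

def invent_word(rng):
    """Build a nonsense word from random syllables, e.g. 'bo-ka'."""
    return "-".join(rng.choice(SYLLABLES) for _ in range(2))

class Agent:
    def __init__(self):
        self.lexicon = {}  # word -> pose

    def word_for(self, pose, rng):
        """Use an existing word for this pose, or invent a new one."""
        for word, p in self.lexicon.items():
            if p == pose:
                return word
        word = invent_word(rng)
        self.lexicon[word] = pose
        return word

    def guess(self, word):
        return self.lexicon.get(word)

def play_round(speaker, hearer, rng):
    """One exchange: speaker names a pose, hearer tries to strike it."""
    pose = rng.choice(POSES)
    word = speaker.word_for(pose, rng)
    if hearer.guess(word) == pose:
        return True  # speaker "nods": success
    # Speaker "shakes its head" and demonstrates; hearer adopts the word.
    hearer.lexicon[word] = pose
    return False

def run(seed=0, rounds=300):
    rng = random.Random(seed)
    a, b = Agent(), Agent()
    successes = 0
    for i in range(rounds):
        speaker, hearer = (a, b) if i % 2 == 0 else (b, a)
        if play_round(speaker, hearer, rng):
            successes += 1
    return successes / rounds, a.lexicon, b.lexicon
```

Because a failed round teaches the hearer the speaker’s word, each agent–pose pairing can fail at most once, so the success rate climbs toward 100% and the two lexicons come to share a vocabulary, even though the words themselves differ on every fresh run, echoing Steels’ point that each restart yields a different language.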
Gazing purposefully into a mirror calls to mind the mirror test, a psychological measurement used since 1970 to gauge self-awareness in non-human animals. Steels tells us that “self-awareness” is “one of those words like ‘consciousness’ that is hard to define.” Some monkeys and higher mammals might display significant self-awareness, using a mirror to groom themselves more thoroughly, but Steels’ robots engage on a far more basic level. They have “learned” that there is a relationship between their sensors, their motors, and the position of their bodies, and their understanding doesn’t go beyond that.
Dylan Love is an editorial consultant and journalist whose reporting interests include emergent technology, digital media, and Russian language and culture. He is a former staff writer for the Daily Dot, and his work has been published by Business Insider, International Business Times, Men's Journal, and the Next Web.