Watch two robots invent their own spoken language
Is this the future of language?
Robots are already inventing spoken languages, and it’s fascinating to watch unfold in real time.
The BBC recently uploaded a YouTube clip profiling Dr. Luc Steels of the VUB Artificial Intelligence Lab. Steels’ robotics research is focused on language, which may be an important missing ingredient in conventional robotics research. Some academics suggest that high-level artificial intelligence will only come about when robots can “grow up” in a manner similar to human children becoming adults. Language may be a valuable tool for facilitating this, and Steels’ robots have been coming up with their own languages since 2011.
In the clip, Steels’ robots examine themselves in a mirror, studying their own reflections while moving to refine how they control themselves. This is a robotic version of a toddler learning to walk on his or her own. “In order to move in the world [and to] recognize the movements of another, you need to have some sort of model of your own body,” Steels tells the BBC presenter.
Then, using an invented language of nonsense syllables, the robots play a Simon-Says-style game with each other (and even with the human presenter). Each “word” corresponds to a pose for the other to strike. If a posing robot guesses the position correctly, the speaking robot nods its head in confirmation. If it guesses incorrectly, the robot shakes its head and demonstrates the proper position.
The robots come up with these words from scratch and use different ones every time. “They are starting from absolute zero: no prior words, no prior concepts, and then they co-evolve a shared lexicon,” Steels told the Daily Dot via email. “Each time the experiment restarts another language will come out.”
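The game the robots play resembles what the research literature calls a “naming game.” As a rough illustration only (the names, poses, and learning rule below are invented for this sketch, not taken from Steels’ actual software), here is a minimal simulation of two agents co-evolving a shared lexicon from scratch: a speaker invents a nonsense word for a pose if it has none, the hearer guesses, and a failed guess is corrected by “demonstration,” just as the robots shake their heads and show the proper position.

```python
import random

random.seed(0)  # reproducible run for this sketch

# Hypothetical pose names and syllables; the real robots use body positions
POSES = ["arms_up", "lean_left", "lean_right", "crouch"]
SYLLABLES = ["ba", "ku", "ti", "mo", "re", "fa"]

def invent_word():
    # Nonsense word built from random syllables, like the robots' utterances
    return "".join(random.choices(SYLLABLES, k=2))

class Agent:
    def __init__(self):
        self.lexicon = {}  # word -> pose

    def word_for(self, pose):
        # Speaker: reuse a known word for this pose, or invent a new one
        for word, p in self.lexicon.items():
            if p == pose:
                return word
        word = invent_word()
        self.lexicon[word] = pose
        return word

    def guess(self, word):
        # Hearer: look the word up, or guess a pose at random
        return self.lexicon.get(word, random.choice(POSES))

def play_round(speaker, hearer):
    pose = random.choice(POSES)
    word = speaker.word_for(pose)
    if hearer.guess(word) == pose:
        return True  # the speaker "nods": success
    # The speaker "shakes its head" and demonstrates; the hearer adopts the word
    hearer.lexicon[word] = pose
    return False

a, b = Agent(), Agent()
successes = []
for round_no in range(200):
    speaker, hearer = (a, b) if round_no % 2 == 0 else (b, a)
    successes.append(play_round(speaker, hearer))

# After many rounds the pair converges on a shared lexicon
print("late-round success rate:", sum(successes[-50:]) / 50)
```

Because the words are random syllables, each fresh run of this toy model converges on a different vocabulary, mirroring Steels’ observation that “each time the experiment restarts another language will come out.”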
Gazing purposefully into a mirror calls to mind the mirror test, a psychological measurement used since 1970 to gauge self-awareness in non-human animals. Steels tells us that “self-awareness” is “one of those words like ‘consciousness’ that is hard to define.” Some monkeys and higher mammals may display significant self-awareness, using a mirror to groom themselves more thoroughly, but Steels’ robots engage on a far more basic level. They have “learned” that there is a relationship between their sensors, their motors, and the position of their bodies, and their understanding doesn’t go beyond that.
Dylan Love is an editorial consultant and journalist whose reporting interests include emergent technology, digital media, and Russian language and culture. He is a former staff writer for the Daily Dot, and his work has been published by Business Insider, International Business Times, Men's Journal, and the Next Web.