The audio clip Google demonstrated during its I/O conference keynote sounded like an ordinary phone call of a woman setting up a haircut appointment with a salon employee. But there was something peculiar about the exchange that audience members probably wouldn’t have noticed if Google CEO Sundar Pichai hadn’t prefaced the feature he was showing off. Incredibly, one of the women on the call was actually an enhanced version of Google’s AI Assistant.
The fascinating demonstration at the company’s developer conference illustrated how the Assistant can make a call on behalf of a user to help them book an appointment. It does this using a frighteningly convincing voice, realistic enough that people on the other end have no idea who (or what) they are speaking to.
Pichai played back audio clips from two tests Google conducted using the in-development feature it’s calling Google Duplex. In the first, the voice assistant calls a salon to book a haircut appointment. Not only are the tone and pitch of the voice surprisingly lifelike, but its ability to adjust its responses on the fly to create a fluid, intelligible conversation walks a fine line between fascinating and terrifying.
Here is a sample of the conversation.
Duplex AI: “Hi. I’m calling to book a women’s haircut for a client. Um, I’m looking for something on May 3rd.”
Salon: “Sure, give me one second.”
Duplex AI: “Mhm.”
Salon: “Sure, what time are you looking for, around?”
Duplex AI: “At 12pm.”
Salon: “We do not have a 12pm available. The closest we have to that is a 1:15.”
Duplex AI: “Do you have anything between 10am and, uhh, 12pm?”
Salon: “It depends on what service she would like. What service is she looking for?”
Duplex AI: “Just a women’s haircut for now.”
The AI goes on to book the appointment and send a calendar event as confirmation to the user who requested it.
In the second example, Google explained how its AI can handle difficult situations when booking an appointment or reservation isn’t straightforward. In this demo, a male voice calls up a restaurant to book a reservation for four but language barriers create some misunderstanding. The AI effectively simplifies things to reach its desired outcome.
“The technology is directed towards completing specific tasks, such as scheduling certain types of appointments,” the company wrote in its blog post that further describes the system. “For such tasks, the system makes the conversational experience as natural as possible, allowing people to speak normally, like they would to another person, without having to adapt to a machine.”
Pichai said Google has been working on the technology for years, and it’s still in development. A stripped-down version will be tested in the “coming weeks” that calls up restaurants during holidays to determine what their hours are. Once Google has that data, it can make adjustments to Google Search and Maps so individuals don’t flood businesses with phone calls.
Duplex is easily one of the most impressive displays of AI we’ve ever seen. Yes, it was Google’s own DeepMind that created an AI capable of beating the world’s best Go players, but that system had an extremely limited set of abilities. Duplex, which was built with natural language understanding, deep learning, and text-to-speech capabilities, has the potential to have a significant impact on people’s day-to-day lives.
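Google hasn’t published Duplex’s internals, but task-oriented dialogue agents of this kind are commonly built around a slot-filling loop: the system extracts details (service, date, time) from each utterance and asks follow-up questions until it has everything it needs to book. The sketch below is a deliberately minimal, hypothetical illustration of that loop; the slot names, extraction rules, and policy are illustrative assumptions, not Google’s actual implementation.

```python
# Minimal, hypothetical sketch of a slot-filling dialogue manager, the
# kind of task-oriented loop systems like Duplex are built on. The slot
# names and regex rules here are illustrative, not Google's real NLU.
import re

SLOTS = ("service", "date", "time")  # what the agent must fill before booking

def extract_slots(utterance, state):
    """Naive keyword/regex extraction standing in for real language understanding."""
    state = dict(state)
    if "haircut" in utterance.lower():
        state["service"] = "women's haircut"
    m = re.search(r"\b(May \d+(?:st|nd|rd|th)?)\b", utterance)
    if m:
        state["date"] = m.group(1)
    m = re.search(r"\b(\d{1,2}(?::\d{2})?\s*[ap]m)\b", utterance, re.I)
    if m:
        state["time"] = m.group(1)
    return state

def next_action(state):
    """Dialogue policy: ask for the first missing slot, else confirm the booking."""
    for slot in SLOTS:
        if slot not in state:
            return f"ask_{slot}"
    return "confirm"

state = {}
state = extract_slots("I'm calling to book a women's haircut for May 3rd", state)
print(next_action(state))   # time still missing -> "ask_time"
state = extract_slots("At 12pm", state)
print(next_action(state))   # all slots filled -> "confirm"
```

Real systems replace the regexes with learned models and add speech recognition on the input side and text-to-speech on the output side, but the ask-until-filled control flow is the same basic idea.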
As impressive as it may be, there is something inherently unnerving about a robot voice being nearly indistinguishable from a human. Google is aware of the moral implications of its technology and is working on a solution.
“It’s important to us that users and businesses have a good experience with this service, and transparency is a key part of that,” the company wrote. “We want to be clear about the intent of the call so businesses understand the context. We’ll be experimenting with the right approach over the coming months.”
Google Duplex isn’t perfect. There were moments in the demo where you could pick out anomalies in its tone, or responses that weren’t exactly in line with what was asked. Google acknowledged that while Duplex is “capable of carrying out sophisticated conversations” and “completes the majority of its tasks fully autonomously, without human involvement,” there are tasks it can’t complete, which are handed off to a human operator.
Regardless, if Google releases the AI with the level of accuracy it showed off today, it would mark a significant leap in the complexity of AI-based features available to average smartphone users.
Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.