The demo was impressive, I’ll give them that. A machine - Google’s voice assistant - booked an appointment at a hair salon. And then a table at a restaurant. Not online, or through some automated system. But by talking to a human. Over the phone. Blimey.

Google Duplex, unveiled at the firm’s annual developers’ conference, has incredible potential - albeit one that’s laced with a hint of terror over AI’s continued march into our lives.
Duplex is a system that can take a user’s request - like wanting a haircut on a specific day at 12pm - and relay it using an automated voice to a human being, reacting to questions and the irregularities of a typical person’s speech.
The voice is designed to sound entirely natural, complete with the ums and uh-huhs of everyday speech. The recipient of the call, ideally, is none the wiser.
You should take a minute to listen to the examples on Google’s blog post about the idea, published on Tuesday.
'Experimenting'
Now. Let’s start with the obvious question. Does it work? We don’t know.
Frustratingly, Google was unable to show us this technology in action. We have no idea whether the calls shared today were the successful ones out of many, many attempts - nor do we know if the recipient was prepped beforehand. We don’t know how easy the system is to fool, or to confuse just enough to render it useless. Anyone who uses Google’s Assistant today knows how often it stumbles over basic requests.
So, while others in attendance have referred to today’s demo as “stunning”, I’ll retreat to something more sensible: it’s promising.
As is increasingly the case, the key hurdle here may not be technological, but societal.
Start here: will the recipient know they are talking to a machine? Do they deserve to be told? Will Google monitor each call and learn from its contents? If so, how would the recipient consent to that, as the law in many places demands?
"It’s important to us that users and businesses have a good experience with this service, and transparency is a key part of that,” Google said in its blog.
"We want to be clear about the intent of the call so businesses understand the context. We’ll be experimenting with the right approach over the coming months.”
Deceptive
So far that approach, according to the recordings, isn’t to tell the recipient they are talking to a bot - indeed, the “mhmm” noises from the AI are arguably pretty deceptive.
Assuming that gets figured out, think of it from the other side. As a human being in a service job, how would you feel about customers who couldn’t be bothered to call you themselves?
Just as a shop worker might recognise an irritating customer’s voice, staff will surely start to notice the telltale signs of an AI speaking to them. I’m not sure I’d give it the time of day.
Then again - if it brings in a lot of business, workers might be told to just get over it.
Or maybe, the salon could get an AI assistant of its own. And then it's AI talking to AI. Booking appointments. Planning. Plotting. Waiting.