Stop the Chitchat. Bots Don’t Need to Sound Like Us

The makers of conversational bots like Alexa and Siri pad their scripts with unnecessary conversational fillips: "OK, here's what I found …"

Bert Brautigam is sick of having conversations with his devices. Like many of us, Brautigam, who works for the design firm Ziba, uses voice assistants like Google’s phone AI or Amazon’s Alexa. The theory is that voice commands make life more convenient.

But these assistants are scripted to emulate everyday conversation. And everyday conversation is filled with little pauses and filler words, the "phatic" spackle of social interactions. That's why Alexa says things like "Sorry, I'm not sure about that," or Siri says "OK, here's what I found …" when it delivers search results. It's how humans talk. But when a bot does it, the chitchat clogs up the flow of command-and-action.

It is gradually driving Brautigam nuts: not just the bots' tics, but the circumlocutions he has to go through to make them do, well, anything. "If I want to turn on the flashlight on my phone," he says, "I say, 'Turn on the flashlight.' It's four words, but all I should need is one word, right? 'Flashlight!'"

For years, sci-fi promised that one day we’d interact with machines as if they were people. But what if conversation turns out to be a lousy idea? We’ve been down this road before. It’s the problem of so-called skeuomorphic design: In the early days of a new technology, designers mimic the look and feel of older media. Apple’s first iPad calendar app resembled a paper day planner, including “pages” that you’d rip away as time passed.

Sometimes designers use skeuomorphs because they’re imprisoned by the past, unable to imagine the demands of the new. (Early cars had buggy-whip holsters.) And sometimes they do it on purpose, to ease future shock. Either way, skeuomorphs slow things down by adding functionally useless interactions. It’s only when designers finally abandon them that they’re free to create zippier interfaces.

"Conversational AI" is suffering through precisely these growing pains. Our bots talk like 19th-century butlers, clotting their replies with ponderous conversational fillips. When I ask Alexa "What's the weather," I get a needlessly verbose answer ("Today, you can look for …"). It's not a big deal the first time, but after months of this I get impatient. I'm looking for a few quick stats; Alexa is auditioning for a role in an Oscar Wilde play.

The designers of these garrulous interactions say we need bots to act like humans. “Most people feel uncomfortable talking to a machine,” says David Contreras, an Alexa designer. “Adding these ‘chat patterns’ or some kind of personality helps overcome this feeling.” Sensely, a firm that makes a virtual-nurse app, has found that many patients appreciate it when a bot does things like let out a cheer when a blood-pressure reading is low. “They develop a relationship,” says Cathy Pearl, Sensely’s VP of user experience.

I predict that many people will eventually want to move beyond all that. They'll crave a more fluid, allegro pace of voice interaction, the same way power users of desktop software eventually adopt keyboard shortcuts.

The question is how non-conversational bots ought to behave. A post-skeuomorphic world should allow more creativity—and weirdness. Some of the cutesier servant bots already mainly coo and purr. As a remedy for his own annoyance, Brautigam has suggested that bots abandon words entirely and speak in musical tones or f/x sounds, like R2-D2. (He's silent on whether we'd do the same.) That kind of post-human design could unlock weirder, wilder AI. (The artist and author Joanne McNeil points out that nonverbal bots could also kill the icky convention of perky, subservient feminine-voiced interfaces.)

We’re bothered by AIs that try to sound human. Let’s see how we like it when they try to sound like robots.

Write to clive@clivethompson.net.


This article appears in the October issue.