Siri, who is the most comprehensive digital personal assistant right now?
Sorry, I don’t know what you mean.
A team at Stanford University has developed a chatbot that stands apart from other chatbot personal assistants: an agent that draws on human conversational strategies to combine commands, allowing it to perform complex tasks it has not been explicitly designed to support.
Iris turns language commands into blocks of text that can be flexibly combined with one another. This design allows every user command (such as “make a reservation”) to be tagged with instructions that tell Iris how it can be stitched together with further commands. By handling these more complex forms of conversation, the chatbot could pave the way for personal assistants that understand how people really speak to one another.
To enable this complexity, the team introduced a domain-specific language that transforms commands into automata that Iris can compose, sequence and execute dynamically by interacting with a user through natural language, as well as a conversational type system that manages which kinds of commands can be combined. Iris is designed to help users with data science tasks, a domain that requires support for command combination.
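To see roughly what a conversational type system means, here is a minimal sketch in Python. The names (`Command`, `can_compose`) and the example commands are illustrative assumptions, not Iris's actual code: each command declares the type it consumes and the type it produces, and two commands may be chained only when those types line up.

```python
# Hypothetical sketch, not Iris's implementation: commands carry
# input/output types so a dialogue manager can check what combines.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Command:
    name: str
    in_type: type                  # type the command consumes
    out_type: type                 # type the command produces
    run: Callable[[Any], Any]

def can_compose(first: Command, second: Command) -> bool:
    # A conversational type system in miniature: "second" may consume
    # "first"'s result only if the output type matches the input type.
    return issubclass(first.out_type, second.in_type)

# Two toy data science commands (illustrative placeholders).
load = Command("load column", str, list, lambda name: [1.0, 2.5, 4.0])
mean = Command("compute mean", list, float, lambda xs: sum(xs) / len(xs))

if can_compose(load, mean):
    result = mean.run(load.run("price"))
    print(result)  # 2.5
```

In this sketch, asking Iris to “compute the mean of the price column” would compose `load` with `mean`; asking it to compose them the other way round would be rejected by the type check.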
Iris also handles a conversational quirk called anaphora: a phrase whose meaning depends on earlier conversation, such as saying “she” after you have mentioned your sister. Today’s top digital assistants have this ability too, but only when it is hard-coded.
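A toy version of anaphora resolution can be sketched in a few lines. This is an assumption about the general technique, not how Iris does it: scan the conversation history backwards for the most recent entity that a pronoun could refer to.

```python
# Hypothetical sketch of anaphora resolution: map a pronoun to the
# most recently mentioned entity it is compatible with.
history = [("my sister", "she"), ("the report", "it")]

def resolve(pronoun):
    # Walk the conversation history from newest to oldest.
    for entity, compatible_pronoun in reversed(history):
        if compatible_pronoun == pronoun:
            return entity
    return None  # no referent found in the conversation so far

print(resolve("she"))  # my sister
```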
Humans use all sorts of linguistic tricks to make ourselves understood. One of the most common is nesting sub-conversations within an overarching discussion. You do this, for example, when you answer the question “when shall we meet at the pub?” with a further question about when the other person finishes work.
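One natural way to model such nesting, sketched here under my own assumptions rather than as Iris's design, is a stack of open questions: a sub-question is pushed on top of the current one, answered first, and then the conversation pops back to where it left off.

```python
# Hypothetical sketch: nested sub-conversations as a stack of topics.
class Dialogue:
    def __init__(self):
        self.stack = []

    def push(self, topic):
        # Open a new (sub-)conversation on top of the current one.
        self.stack.append(topic)

    def resolve(self, answer):
        # Answer and close the innermost open question.
        topic = self.stack.pop()
        return f"{topic} -> {answer}"

    def current(self):
        return self.stack[-1]

d = Dialogue()
d.push("when shall we meet at the pub?")
d.push("when do you finish work?")  # nested sub-question
print(d.resolve("at six"))          # inner question answered first
print(d.current())                  # back to the outer question
```

The pub example from above plays out exactly this way: the inner “when do you finish work?” question is resolved first, leaving the original question active again.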
Alexa and Siri struggle with such nested conversations unless they have been pre-programmed – or hard-coded – to react to specific examples.
Iris is still limited: for now it is only being used as a modified data science tool, and it lacks the broad natural language ability that Apple, Google and Amazon have coded into their chatbots. But in the future, those companies could integrate Iris’s underlying architecture, providing “a scaffolding of context” for a future generation of digital assistant chatbots, says Ethan Fast, part of the team behind Iris. “We hope we can learn much higher-level stuff about how conversations flow,” he says.

Fast will launch a standalone web app in the next few months to allow more users to interact with Iris and improve its understanding – and ours.
Iris, see you soon.