To see how limited Google's Voice Search app is, try saying into it, "How much snow did Binghamton, N.Y., get yesterday?" Google delivers 116,000 results, and none answer the question.
A key to Watson's success is what IBM calls natural language processing – the ability to tease out the meaning of a sentence. Search apps today just recognize individual words and look for matches. Watson, by contrast, had to essentially diagram every sentence like a third-grade teacher and work out the intent of the whole collection of words and punctuation.
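To make the distinction concrete, here is a toy sketch in Python – not IBM's actual pipeline, just an illustration of the gap between bag-of-words matching and pulling the intent out of a question. The question string and the intent fields are invented for the example.

```python
# Toy illustration (NOT Watson's method): keyword matching vs. intent extraction.

QUESTION = "How much snow did Binghamton, N.Y., get yesterday?"

def keyword_match(question, document):
    """Bag-of-words search: score a document by how many question words it shares."""
    q_words = set(question.lower().replace(",", "").replace("?", "").split())
    d_words = set(document.lower().split())
    return len(q_words & d_words)

def extract_intent(question):
    """Crude intent extraction: identify what is asked, about what, and when."""
    q = question.lower()
    intent = {}
    if q.startswith("how much"):
        intent["answer_type"] = "quantity"   # the question wants a number
    if "snow" in q:
        intent["topic"] = "snowfall"
    if "binghamton" in q:
        intent["location"] = "Binghamton, N.Y."
    if "yesterday" in q:
        intent["time"] = "yesterday"
    return intent
```

A document like "Snow tires on sale in Binghamton" scores well on keyword overlap even though it has nothing to do with the question – exactly the failure mode the search results above show. The intent extractor, crude as it is, at least knows the answer should be a quantity about snowfall in a particular place on a particular day.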
“When we hear language, we bring so much context to interpreting the question that we come up with sensible and reasonable answers,” David Ferrucci, the IBM scientist who led the Watson effort, told NPR. “The computer struggles with that.”
It helps when the field is narrowed, so the computer starts with at least some context. Smartphone users can already see how satisfying that can be when using speech recognition in a navigation app. The system knows to look for an address or place, so when you say “Dulles airport,” most times the app gets it right.
Apps that will understand intent
What we need are apps that, like Watson, understand the intent of a question, and then comb through tons of information to supply the precise answer. And most likely, the first such apps will focus on a specific kind of information.
Just after the Jeopardy matches aired, IBM said it was working with the medical schools at Columbia University and the University of Maryland to develop a physician's assistant system based on Watson technology. "There's a tremendous amount of medical information for physicians to search, process and sort, and Watson provides a powerful way to do all that automatically," said Peter Durlach, a vice president at Nuance Communications, which will also work on the project.
And while the system at this point would be for high-end professional use, there's clearly a trend line that leads to a medical app on a smartphone that can answer medical questions a whole lot better than typing symptoms into WebMD. We want an app that will let a mom pick up her phone, just talk about her child's symptoms, and get a first-pass diagnosis and a little advice on what to do. As Watson technology develops, it should allow that to happen.
The possibilities are endless
It’s easy to imagine a travel agent app. While the web revolutionized travel information, putting together a vacation takes a lot of work. Watson could evolve into an app that lets you say, “I want to go to Acapulco in May and stay at a small hotel on the beach that gets good reviews by travelers” – and will know exactly what you’re looking for.
Some legal professionals see Watson as a way to create a tool for legal research, finding facts or precedents buried in heaps of documents or deep in law books. As with medicine, this would at first be a big, expensive system used by law firms, but inevitably it could become a legal app. Tell it your situation and ask, for instance, whether you have the right to sue, and it could tell you.
The possibilities, really, are endless. A shopping app based on Watson could find the product you want – like a personal shopper boxed in a mobile device. A dating app could help you find a match by listening to you talk about your dream date.
How long until some of this happens? Watson today is a science project running on 90 servers. Futurist Ray Kurzweil has said that in seven years, a much-improved Watson could run on a single server – and on a personal device within a decade.
Along the way, IBM will have some competition in natural language processing. Google may not have developed a Jeopardy winner, but it isn’t standing still. The company is plowing ahead developing language technology based on the vast amount of data that flows through Google’s servers.
Even before Watson-level software can run on a PC or iPad, it could run in data centers and be available through the network – which is how Google voice search and navigation apps work now. So Watson-like question-answering apps could start arriving in just a few years.