Think far back to the early days of web search—after the days when queries were recorded with rock and chisel, but not by much. Search engines like Ask Jeeves strove to understand natural language queries, such as “what will the weather be like this weekend?” or “when is ‘Titanic’ playing?”. It was an ambitious goal given the processing and indexing limitations of the time.
Image via The Official Google Blog.
15 years later, natural language processing has improved considerably, and the challenge is no longer a lack of data but a profusion of it. With Siri and Voice Search, Apple and Google are going head-to-head to create search products that behave more like personal assistants than gateways to what was once called the information superhighway. Yesterday, Google integrated into its search results the “Google Knowledge Graph,” a database of over 500 million “real-world people, places and things with 3.5 billion attributes and connections among them.”
By tapping this massive database of real-world and semantic user data, Google aims to offer web search that is more personalized, better at understanding natural language queries, and capable of making smart connections between information on the web and the data stored in users’ accounts. As anyone who has tried to demo Siri or Google Voice Search in front of friends can probably attest, we still haven’t quite realized the goal of a personalized digital butler in our pockets. But as data proliferates and the tools of analysis rapidly improve, we’re much closer than we might have imagined 15 years ago.