There’s a phrase used in software to describe a utility so essential, so brilliant, that it simply sells the hardware around it: the Killer Application.
Not as deadly as it sounds, it should be pointed out. Windows, in its time, was a killer app. So was the original Netscape browser, and so (arguably) was id Software’s Doom. Killer applications don’t even have to launch a new idea; they can simply refine an existing one, and if you want proof of that, look at Google. When Google launched, there was a plethora of general-purpose search engines on offer. These days, if Microsoft weren’t pouring buckets of money into Bing, there’d only be Google.
One of the big buzzwords labelled a killer application in recent years has been Augmented Reality: taking a device with an inbuilt camera (typically a smartphone), often paired with location awareness (usually GPS), and using the two together so the smartphone’s screen can display additional information about your surroundings. To date, it’s largely been used for simple games and navigation-style applications, such as pointing out where nearby restaurants are, or for interacting with Wikipedia entries for local points of interest. Reasonable stuff, although the number of people actually willing to wander around unfamiliar environments holding an expensive smartphone up to their faces is, not surprisingly, rather low.
I’ve recently been testing out something that could well be the next great killer augmented reality application, even though it’s on a very well-established platform: the iPhone. As with all things iPhone, it’s an app, and in this case it’s a translation application called Word Lens. And as with most killer applications, its brilliance isn’t really in what it does (essentially just crude machine-based single-word translation) but in how it melds existing technologies with new ones to achieve its purpose.
The use of software to aid language translation goes back decades, but until relatively recently it was largely limited to users in fixed positions. What Word Lens does is use the iPhone’s camera to capture text, translate it on the fly, and superimpose the translation over the onscreen display. The end results can be a little shaky, depending on how well you’re focusing the camera and whether the font used on the text is easily readable, but for basic translations, as long as you’re aware of the essential context of what you’re looking at, it’s surprisingly good. For those with a sense of the impish, the free demo version will also reverse words or blank them out altogether. For now, it only supports English to Spanish and Spanish to English, but apparently other languages are in the works.
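The "crude machine-based single word translation" at the heart of this is simpler than it might sound: once the text has been recognised, each word is looked up individually rather than parsed as a sentence. A minimal sketch of that idea in Python, with a hypothetical four-entry dictionary standing in for the tens of thousands of entries a real app would ship:

```python
# Word-for-word translation, Word Lens style: no grammar, no context.
# Each recognised word is looked up on its own; unknown words pass through.

# Hypothetical mini-dictionary for illustration only.
ES_TO_EN = {
    "salida": "exit",
    "peligro": "danger",
    "cerrado": "closed",
    "empuje": "push",
}

def translate_sign(ocr_text: str) -> str:
    """Translate each word independently; leave unrecognised words as-is."""
    words = ocr_text.lower().split()
    return " ".join(ES_TO_EN.get(word, word) for word in words)

print(translate_sign("SALIDA cerrado"))   # exit closed
print(translate_sign("peligro perros"))   # danger perros
```

It's easy to see from this why the approach works well on signs, where single words carry most of the meaning, and why it needs the reader to supply the surrounding context.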
Word Lens isn’t the only augmented reality application on the market with utility at its core, though it’s one with immediate impact for any traveller, especially as it works without an active data connection. Google has been slowly improving its Google Goggles application (available for Android and iPhone), which uses the same kind of image recognition for instant searching as well as simple translation via optical character recognition, although its text-handling capabilities are nowhere near as good as those offered by Word Lens.
Where augmented reality applications like Word Lens or Google Goggles get it right is in reducing the interaction with the application to a few seconds, rather than demanding constant engagement. You’re much more likely to pull out a phone and take a quick snapshot than you are to wander around with your phone on prominent display. Equally, by providing a genuinely useful service, such as translation or a quick search to help you understand something, they have a real ability to provide value.