Report ID
2010-12
Report Authors
Victor Fragoso, Steffen Gauglitz, Shane Zamora, Jim Kleban, Matthew Turk
Report Date
Abstract

Researchers have long been interested in the synergy between portability and computing power, but have been limited by unwieldy, uncommonly used devices. The latest generation of mobile phones, i.e., 'smartphones', is equipped with hardware powerful enough to support novel, interesting applications that allow users to interact directly with the world around them. This paper describes a multimodal, augmented reality translator developed using a smartphone's (Nokia N900) camera and touchscreen combined with OCR (Tesseract) and an online translation service (Google Translation API). We describe our methods for tracking, text detection, OCR, and translation, and provide results quantifying OCR accuracy on a set of signs collected around the UCSB campus.
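
The abstract describes an OCR-plus-translation pipeline (Tesseract for recognition, Google Translation API for translation). The following is a minimal sketch of that pipeline, not the report's actual N900 implementation: the use of the pytesseract wrapper, the `requests` library, and the `translate_text` helper with a v2 REST endpoint and API key are assumptions made for illustration.

```python
# Illustrative sketch: OCR a cropped sign image with Tesseract, then translate
# the recognized text with an online translation service.
# Assumptions: pytesseract/Pillow/requests installed; Google Translation API v2
# endpoint and key; "sign_crop.png" is a hypothetical example image.
import pytesseract
import requests
from PIL import Image


def ocr_region(image_path: str, lang: str = "eng") -> str:
    """Run Tesseract on a cropped sign image and return the recognized text."""
    return pytesseract.image_to_string(Image.open(image_path), lang=lang).strip()


def translate_text(text: str, target: str, api_key: str) -> str:
    """Translate text via the Google Translation REST API (v2)."""
    resp = requests.get(
        "https://translation.googleapis.com/language/translate/v2",
        params={"q": text, "target": target, "key": api_key},
    )
    resp.raise_for_status()
    return resp.json()["data"]["translations"][0]["translatedText"]


if __name__ == "__main__":
    recognized = ocr_region("sign_crop.png")              # e.g. "EXIT"
    print(translate_text(recognized, "es", "YOUR_API_KEY"))  # e.g. "SALIDA"
```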
