This paper introduces fast and robust computer vision methods for hand gesture-based mobile user interfaces. In combination with other algorithms, these methods achieve usability and interactivity even when both the camera and the object of interest are in motion, such as in mobile and wearable computing settings. By means of a head-worn camera, a set of applications can be controlled entirely with gestures of non-instrumented hands. We describe a set of general gesture-based interaction techniques and explore their characteristics in terms of task suitability and the computer vision algorithms required for their recognition. In doing so, we present an arsenal of mostly generic interaction methods that can be used to facilitate input to mobile applications. We developed a prototype application testbed to evaluate our gesture-based interfaces. We chose three basic tasks from an infrastructure maintenance and repair scenario to illustrate the applicability of our interface techniques.