Holographic Raman tweezers controlled by multi-modal natural user interface

Authors

Tomori Z., Kesa P., Nikorovic M., Kanka J., Jákl Petr, Sery M., Bernatová Silvie, Valusova E., Antalík Marián, Zemánek Pavel

Year of publication 2016
Type Article in Periodical
Magazine / Source Journal of Optics
Web http://dx.doi.org/10.1088/2040-8978/18/1/015602
Doi 10.1088/2040-8978/18/1/015602
Keywords holographic optical tweezers; Raman microspectroscopy; human-computer interface
Description Holographic optical tweezers provide a contactless way to trap and manipulate several microobjects independently in space using focused laser beams. Although methods for fast and efficient generation of optical traps are well developed, their user-friendly control still lags behind. Even though several attempts have recently appeared to exploit touch tablets, 2D cameras, or Kinect game consoles, they have not yet reached the level of a natural human interface. Here we demonstrate a multi-modal 'natural user interface' approach that combines finger and gaze tracking with gesture and speech recognition. This allows us to select objects with the operator's gaze and voice, to trap the objects and control their positions by tracking finger movement in space, and to run semi-automatic procedures such as acquisition of Raman spectra from preselected objects. This approach takes advantage of the power of human image processing together with the smooth control of human fingertips, and downscales these skills to remotely control the motion of microobjects at the microscale in a way that is natural for the human operator.
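To make the control flow concrete, the sketch below outlines one possible multi-modal control loop of the kind described in the abstract: a gaze point selects a trap, downscaled fingertip displacements move it, and a voice command triggers a semi-automatic Raman acquisition. This is a conceptual illustration only, not the authors' implementation; the class and method names (Trap, TweezersController, select_by_gaze, move_selected, handle_voice) and the scaling factor are hypothetical placeholders standing in for the real hardware and SDK interfaces.

```python
# Conceptual sketch (not the paper's actual code): merging gaze, finger, and
# voice input to drive holographic optical traps. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Trap:
    """One holographic optical trap in sample-plane coordinates (micrometres)."""
    x: float
    y: float
    z: float
    selected: bool = False

@dataclass
class TweezersController:
    traps: list = field(default_factory=list)
    # Assumed scale factor: fingertip motion in millimetres is downscaled
    # to trap motion in micrometres in the sample plane.
    scale_um_per_mm: float = 10.0

    def select_by_gaze(self, gaze_xy, tolerance=2.0):
        """Mark traps near the operator's gaze point as selected."""
        for t in self.traps:
            t.selected = (abs(t.x - gaze_xy[0]) < tolerance
                          and abs(t.y - gaze_xy[1]) < tolerance)

    def move_selected(self, finger_delta_mm):
        """Translate selected traps by a downscaled fingertip displacement."""
        dx, dy, dz = (d * self.scale_um_per_mm for d in finger_delta_mm)
        for t in self.traps:
            if t.selected:
                t.x += dx
                t.y += dy
                t.z += dz

    def handle_voice(self, command):
        """Dispatch a recognized voice command to a semi-automatic procedure."""
        if command == "acquire raman":
            # Queue a Raman measurement at each selected trap position.
            return [("raman_spectrum", (t.x, t.y, t.z))
                    for t in self.traps if t.selected]
        return []

# Usage: select a trap by gaze, nudge it with a fingertip, trigger acquisition.
ctl = TweezersController(traps=[Trap(10.0, 5.0, 0.0), Trap(-4.0, 2.0, 0.0)])
ctl.select_by_gaze((10.5, 5.2))
ctl.move_selected((0.5, -0.3, 0.0))        # fingertip moved 0.5 mm right, 0.3 mm down
print(ctl.handle_voice("acquire raman"))   # queued Raman measurement positions
```

In practice the finger, gaze, and speech streams would come from their respective device SDKs and the trap coordinates would be passed to the spatial light modulator driver; the loop above only illustrates how the modalities could be fused.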
