InteliTouch is an engine designed to use and augment the Intel Perceptual Computing SDK, using its depth camera to turn a regular work surface, such as a table, into a touch surface. It does this by detecting finger touches close to the table's surface with a robust algorithm we developed that handles the depth distortion and noise found in inaccurate and often low-quality depth maps. The video on the left demonstrates some simple applications built on the InteliTouch engine: detecting keystrokes on a printed keyboard, a virtual drum kit, a piano, and an extended emoticon set.

The engine works by detecting finger touches within a predefined area of a work surface and then uses a customised mapping image to interpret what each touch means. Detecting the finger touches is non-trivial and cannot be done with background subtraction or pure depth thresholding alone. Instead, we build a height graph of the underlying surface (which may not be perfectly flat) and use SVD to determine the best-fit plane spanning the domain of the graph. Each new input depth image is converted to a point cloud; for every point we measure its distance to the plane and subtract the surface height stored in the graph. Distance thresholding is then performed on this compensated height field. Points within 1 cm of the surface (those most likely to be finger touches) are projected down onto the plane, mapped back to the image, and perspective-warped onto the final map. After clustering the larger groups of thresholded points and discarding noisy detections, we are left with the locations of the touches on the mapped image. The video on the right demonstrates the underlying information exposed by the InteliTouch engine.
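The core of the pipeline above (SVD plane fitting followed by height-compensated distance thresholding) can be sketched as follows. This is a minimal illustration using NumPy, not the engine's actual code: the function names, the fixed 1 cm threshold, and the flat `surface_height` array standing in for the precomputed surface graph are all assumptions for the sake of the example.

```python
import numpy as np

def fit_plane_svd(points):
    """Fit a best-fit plane to an (N, 3) point cloud via SVD.

    Returns (centroid, unit_normal). The right singular vector with the
    smallest singular value of the centred points is the plane normal.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    # SVD leaves the normal's sign ambiguous; orient it toward the camera
    # (assumed here to look down the +z axis) so heights come out positive.
    if normal[2] < 0:
        normal = -normal
    return centroid, normal

def detect_touches(points, centroid, normal, surface_height, max_height=0.01):
    """Height-compensated thresholding over a point cloud.

    For each point: distance to the best-fit plane, minus the local surface
    height from the precomputed graph, then a threshold keeping points
    hovering within `max_height` metres (1 cm) above the surface.
    """
    h = (points - centroid) @ normal - surface_height
    return (h > 0) & (h < max_height)

# Calibration: sample the (here perfectly flat) work surface.
xs, ys = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
surface = np.stack([xs.ravel(), ys.ravel(), np.zeros(100)], axis=1)
centroid, normal = fit_plane_svd(surface)

# Runtime: three query points -- a fingertip 5 mm above the table,
# a hand 5 cm above it, and a point lying on the table itself.
query = np.array([[0.5, 0.5, 0.005],
                  [0.5, 0.5, 0.05],
                  [0.5, 0.5, 0.0]])
mask = detect_touches(query, centroid, normal, np.zeros(3))
print(mask)  # [ True False False]
```

In the real engine the thresholded points would then be projected onto the plane, clustered, and perspective-warped onto the mapping image; here only the plane fit and the 1 cm threshold are shown.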