Smart watch expands touch display to the skin

In a paper titled "LumiWatch: On-Arm Projected Graphics and Touch Input" presented at the 2018 Conference on Human Factors in Computing Systems (CHI 2018), researchers from Carnegie Mellon University unveiled a smartwatch-applicable pico-projector proof-of-concept, using the wearer's skin to expand the smartwatch's display area and offer more intuitive touch interactivity.
By eeNews Europe


In their paper, the authors argue that the skin of our forearm or hand is inherently more intuitive to touch and interact with than a small artificial display lacking tactile feedback. They designed and integrated a fully self-contained on-skin projection wristwatch, dubbed LumiWatch, featuring a custom-made 15-lumen scanned-laser projector combined with a 10-element 1D ToF depth-sensing array to determine a user’s finger position on the skin when interacting with the projected display.

The self-contained LumiWatch delivers rectified graphics calibrated onto the forearm’s projection area during a simple swipe-to-unlock action.

Designed around a Bluetooth 4.0 and WiFi-capable Qualcomm APQ8026 system-on-chip sporting a 1.2GHz quad-core CPU and a 450MHz GPU, the 50×41×17mm prototype ran on Android 5.1 and could operate one hour in continuous projection mode (drawing 2.7W at maximum brightness) on a 740mAh, 3.8V lithium-ion battery incorporated in the casing. Also integrated in the LumiWatch were 768MB of RAM, 4GB of flash memory, an inertial measurement unit and an ambient light sensor.
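The quoted runtime is consistent with the battery and power figures; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the quoted battery life.
capacity_wh = 0.740 * 3.8      # 740 mAh at 3.8 V ≈ 2.81 Wh
runtime_h = capacity_wh / 2.7  # continuous projection draws 2.7 W
print(f"{runtime_h:.2f} h")    # ≈ 1.04 h, matching the quoted one hour
```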

The researchers designed the 25.8×16.6×5.2mm projector module from three lasers (RGB) with a pair of MEMS mirrors operating in a raster-scan mode to project a 1024×600 image at 60Hz across a 39º×22.5º field of view.

A tricky part of this research was to calibrate and rectify the projected graphics onto the forearm as well as correctly perform continuous 2D finger touch tracking on the skin with coordinated interactive graphics.

Finger tracking was performed using the 1D depth-sensing array, a 7×38×3mm device consisting of ten STMicro VL6180X time-of-flight sensors arranged in a line and operated sequentially to determine finger position (at a 27.5Hz frame rate).
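The paper does not spell out the exact estimator here, but fusing the ten range readings into a single touch point can be sketched as a distance-weighted centroid. The sensor pitch (ten sensors across the 38mm module) and the range cutoff below are illustrative assumptions, not values from the paper:

```python
def estimate_touch(readings, pitch_mm=3.8, max_range_mm=200.0):
    """Sketch: estimate a 2D touch point from a line of ToF range readings.

    readings: list of 10 distances in mm (None when a sensor sees no return).
    Returns (x, y) in mm — x along the sensor line, y along the arm axis —
    or None when no sensor detects the finger. Weighting closer returns
    more heavily is an assumption, not the authors' published method.
    """
    hits = [(i, d) for i, d in enumerate(readings)
            if d is not None and d < max_range_mm]
    if not hits:
        return None
    weights = [1.0 / d for _, d in hits]     # nearer returns dominate
    total = sum(weights)
    x = sum(i * pitch_mm * w for (i, _), w in zip(hits, weights)) / total
    y = sum(d * w for (_, d), w in zip(hits, weights)) / total
    return (x, y)
```

Running the sensors sequentially (as the paper describes) avoids optical crosstalk between adjacent ToF emitters, at the cost of the modest 27.5Hz frame rate.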

The researchers analyse the ToF sensors’ raw data (red dots) and estimate the touch point, in green. The resulting touch paths are shown at the bottom of the screen. On the arm, the current path is also projected for debugging. © 2018 Association for Computing Machinery.

But associating touch coordinates with the interactive graphics first requires arm calibration. For simplicity, rather than using a real 3D model of the user’s arm, the researchers devised a simple, generic arm model built by interpolating between two ellipses (representing the wrist and forearm cross-sections). Luminance correction was also necessary to compensate for differences in pixel brightness based on distance from the projector. Although a map of the approximate distance from the projector to each pixel on the arm could be pre-computed for this purpose, the researchers also needed to correctly estimate the wrist’s angle relative to the forearm, a parameter that drastically affects the projected pixels’ distance from the projector and hence the location of the display area.
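The two-ellipse model can be sketched in a few lines: the arm cross-section at any axial position is an ellipse whose semi-axes are linearly interpolated between the wrist and forearm ends. The dimensions below are illustrative placeholders, not measurements from the paper:

```python
import math

def arm_cross_section(t, wrist=(25.0, 17.0), forearm=(35.0, 25.0)):
    """Semi-axes (mm) of the arm cross-section ellipse at fraction t
    (0 = wrist, 1 = forearm), linearly interpolated between the two
    end ellipses. Dimensions are illustrative, not from the paper."""
    a = wrist[0] + t * (forearm[0] - wrist[0])
    b = wrist[1] + t * (forearm[1] - wrist[1])
    return a, b

def surface_point(t, theta, length_mm=120.0):
    """3D point on the generic arm surface at axial fraction t and
    circumferential angle theta (radians)."""
    a, b = arm_cross_section(t)
    return (t * length_mm, a * math.cos(theta), b * math.sin(theta))
```

Pre-computing `surface_point` over the projected image grid yields the per-pixel distance map the article mentions for luminance correction.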

They leveraged a simple “swipe-to-unlock” gesture which, when performed on a fixed projected slider, allows the ToF sensor array to record the finger’s initial axial position and infer the true arm angle (applying simple trigonometry). As the paper reveals, the touch points generated by the swipe can also be used to dynamically calibrate the algorithm’s world coordinate transform, correctly determining the mapping between the projector’s image coordinates and real-world 3D coordinates.
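One plausible formulation of that trigonometry (a simplified sketch, not the paper’s exact geometry): the projector sits a known height above the skin at the watch and casts the slider along a known ray angle, so the measured axial distance to the finger pins down the arm tilt. The height and ray angle below are assumed values for illustration:

```python
import math

def infer_arm_angle(d_measured, h_mm=17.0, phi_deg=30.0):
    """Sketch: infer the arm tilt (degrees) from the measured axial
    distance d_measured (mm) to a finger on the projected slider.
    Assumes the projector is h_mm above the skin at the watch and the
    slider lies along a ray phi_deg from vertical; both are illustrative
    assumptions, not values from the paper. On a flat arm the touch
    lands at h*tan(phi); a tilted arm shifts it, and inverting the
    ray/surface intersection recovers the tilt angle alpha."""
    phi = math.radians(phi_deg)
    tan_alpha = (h_mm * math.sin(phi) / d_measured - math.cos(phi)) / math.sin(phi)
    return math.degrees(math.atan(tan_alpha))
```

With a flat arm (`d_measured = h*tan(phi)`) the function returns zero tilt, as expected.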

Hence, the paper concludes that a simple and intuitive unlock gesture is sufficient to calibrate the projector and touch sensor, opening up about 40cm² of interactive display area, five times that of a typical smartwatch.

On that projected display, the authors demonstrated common touchscreen operations, including single-finger taps and swipes for continuous positional input, letting users draw or perform stroked input for text entry, map panning or list scrolling. Co-authors of the paper were several engineers from Chinese startup ASU Tech Co. Ltd, whose website hints at the imminent commercialization of such a pico-projector smartwatch.

CHI 2018 was organized by the Association for Computing Machinery.


