Virtualization of Touch

Mohamed Eid, an assistant professor of Electrical Engineering at NYUAD, works with associates on projects in computer haptics.

Mohamed Eid lights up with enthusiasm as he expounds on the future of "tangible interfaces." No wonder: from computer gaming to wheelchair control, his work promises to expand users' computer experiences beyond the keyboard and screen, beyond just sight and sound, to touch, gesture, and more.

Eid, an assistant professor of Electrical Engineering at NYU Abu Dhabi, is working with associates on a sheaf of projects in computer haptics (from the Greek word for touch), a field that offers vast commercial potential while also promising enormous social utility. Consider the "HugMe" jacket: vibro-tactile actuators are inserted into the garment and are controlled by computer; the wearer can feel a remotely applied touch, hug, or even "gunshot" — an example of "affective haptics."
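
As a rough illustration of the idea (the actuator grid, coordinates, and intensity falloff below are assumptions for the sketch, not details of the HugMe design), driving an array of vibro-tactile actuators from a remote touch event could look something like this:

```python
# Illustrative sketch (not the HugMe implementation): mapping a remote
# touch event to drive levels for a small grid of vibro-tactile actuators.
from dataclasses import dataclass
import math

@dataclass
class Touch:
    x: float         # touch position on the garment, normalized 0..1
    y: float
    strength: float  # 0 (light brush) .. 1 (firm press)

# Hypothetical 4 x 8 grid of actuators laid out across the jacket (32 in all).
ACTUATOR_GRID = [(col / 7.0, row / 3.0) for row in range(4) for col in range(8)]

def actuator_intensities(touch: Touch, falloff: float = 0.2) -> list[float]:
    """Return a drive level (0..1) for each actuator.

    Actuators near the touch point vibrate hardest; intensity decays with
    distance using a simple Gaussian falloff (an assumed rendering choice).
    """
    levels = []
    for ax, ay in ACTUATOR_GRID:
        dist = math.hypot(ax - touch.x, ay - touch.y)
        levels.append(touch.strength * math.exp(-(dist / falloff) ** 2))
    return levels

# Example: a firm touch near the left shoulder.
print(max(actuator_intensities(Touch(x=0.1, y=0.9, strength=0.8))))
```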

The Lebanese-born Eid's interest in haptics began during his Ph.D. studies and early teaching days at the University of Ottawa, Canada. Aware that engineers must solve problems in many different fields, he took an interest in, among other things, physical therapy for stroke victims at Ottawa General Hospital.

Many such patients lose their sense of kinesthetic interaction: in trying to pick up a soft plastic cup, for example, they will often either drop it or crush it. That sparked Eid's curiosity about the sense of touch, which he noted is "very rich in terms of how much information and energy we exchange," usually unconsciously. For an electrical engineer, the next stop was naturally the new field of haptics.

Today, sight and sound are the "dominant modalities" in users' computer experience, but soon, Eid said, touch will be dominant in some uses. In the slingshot game he has developed, for example, you grasp a ball-like device and move it in three dimensions to control a virtual slingshot. As you "pull back" your slingshot before "firing" the virtual payload, the commercially available controller device gives you a realistic sensation of elasticity. And the "target" person — using a separate computer — must move, or he will feel the hit through the HugMe jacket. Shooters using the slingshot have been found to aim more precisely than those using a standard point-and-click interface.
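
To give a flavor of how that elastic sensation might be rendered (this is a generic Hooke's-law sketch, not Eid's code; the stiffness and payload mass are assumed values), the controller can be made to pull back toward a virtual anchor point with a force that grows as the band is stretched:

```python
# Illustrative sketch: rendering the "elastic" feel of a virtual slingshot
# as a restoring force on a 3-D haptic controller.
import numpy as np

ANCHOR = np.zeros(3)      # resting position of the virtual pouch
STIFFNESS = 120.0         # N/m, an assumed spring constant

def slingshot_force(handle_position: np.ndarray) -> np.ndarray:
    """Hooke's-law force pulling the hand back toward the anchor.

    The further the user stretches the virtual band, the harder the device
    pulls back, which is what produces the sensation of elasticity.
    """
    stretch = handle_position - ANCHOR
    return -STIFFNESS * stretch

def launch_velocity(handle_position: np.ndarray, payload_mass: float = 0.05) -> np.ndarray:
    """Convert the stored elastic energy into an idealized release velocity."""
    stretch = handle_position - ANCHOR
    energy = 0.5 * STIFFNESS * float(np.dot(stretch, stretch))
    direction = -stretch / (np.linalg.norm(stretch) + 1e-9)
    speed = (2 * energy / payload_mass) ** 0.5
    return speed * direction

# Example: the user has pulled the pouch 10 cm back and 5 cm down.
pos = np.array([-0.10, -0.05, 0.0])
print(slingshot_force(pos), launch_velocity(pos))
```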

The potential in the big business of computer games is obvious, but the possibilities go much further. Eid's team has modified a commercially made wheelchair so that it can be activated by the user's blinks and eye motions, to help people with diseases such as Amyotrophic Lateral Sclerosis (ALS). These diseases can devastate motor skills, but eye-muscle control often endures. "We found commercial glasses with infra-red sensors to measure pupil movement," Eid said, "and developed a novel graphical user interface paradigm, so that the wheelchair can be controlled by the eyes."
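
In outline, an eye-based control loop of this kind might look like the following sketch; the gaze thresholds, blink timing, and command set are illustrative assumptions, not the team's published interface:

```python
# Illustrative sketch: turning gaze direction and deliberate blinks
# into wheelchair drive commands.
from enum import Enum

class Command(Enum):
    STOP = "stop"
    FORWARD = "forward"
    LEFT = "left"
    RIGHT = "right"

def command_from_gaze(gaze_x: float, gaze_y: float,
                      blink_duration_s: float) -> Command:
    """Map one gaze sample to a drive command.

    gaze_x / gaze_y are normalized coordinates in [-1, 1] from an eye
    tracker; a long deliberate blink acts as an emergency stop.
    The thresholds here are assumptions, not measured values.
    """
    if blink_duration_s > 0.5:   # long blink = stop
        return Command.STOP
    if gaze_y > 0.4:             # looking up = forward
        return Command.FORWARD
    if gaze_x < -0.4:            # looking left = turn left
        return Command.LEFT
    if gaze_x > 0.4:             # looking right = turn right
        return Command.RIGHT
    return Command.STOP          # gaze near center = hold still

print(command_from_gaze(0.6, 0.0, blink_duration_s=0.1))  # -> Command.RIGHT
```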

In a first test, an ALS patient in the US mastered eye control of a modified motorized wheelchair in just 15 minutes.

The team is developing a related "virtual keyboard" to allow typing by eye gaze and blinks. Add a computerized audio playback device and patients will be able to "speak" even after control of vocal and finger muscles is lost.
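
One common way to build such a keyboard is to treat a sustained gaze as a key press. The sketch below illustrates that dwell-time idea; the threshold, class names, and the hand-off to a speech synthesizer are assumptions for illustration, not the team's design:

```python
# Illustrative sketch: selecting keys by gaze dwell time, then handing
# the finished text to a speech synthesizer.
import time

DWELL_SECONDS = 1.0   # assumed: stare at a key this long to "press" it

class GazeKeyboard:
    def __init__(self):
        self.text = []
        self._current_key = None
        self._dwell_start = 0.0

    def on_gaze(self, key: str, now: float | None = None) -> None:
        """Feed the key currently under the user's gaze; commits after a dwell."""
        now = time.monotonic() if now is None else now
        if key != self._current_key:
            self._current_key, self._dwell_start = key, now
        elif now - self._dwell_start >= DWELL_SECONDS:
            self.text.append(key)
            self._dwell_start = now   # require a fresh dwell for repeats

    def speak(self) -> str:
        """Return the typed message; a real system would pass this to TTS."""
        return "".join(self.text)

kb = GazeKeyboard()
kb.on_gaze("h", now=0.0); kb.on_gaze("h", now=1.1)   # held long enough to commit
kb.on_gaze("i", now=2.0); kb.on_gaze("i", now=3.2)
print(kb.speak())  # -> "hi"
```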

Further possibilities come tumbling out of Eid. One of local significance but global promise involves the learning of physical skills such as handwriting: a haptic interface could allow a remote or computerized teacher to physically guide a child's hand in forming Arabic letters; learning through muscle memory appears to be more effective for this purpose than visual learning. On this project, Eid said he is close to launching a start-up commercial venture. And he is encouraging team members to work toward more start-up opportunities.

There is still a lot of practical work to be done in haptics. The HugMe jacket, for example, started with 32 actuators, but now Eid's team is trying to optimize that number, balancing transmitted sensations with wearer comfort while keeping costs down.

Challenges abound. Hardware must minimize user fatigue — a growing issue as motion increases. Software must provide multi-modal synchronization; if a user's haptic experience is not coordinated with visual and aural cues, the result is worse than a movie with out-of-sync audio.

Then there's "multi-modal perception." In our lives away from the computer, the importance of each sense shifts from moment to moment, depending on what we're doing. Reproducing that switching among "dominant modalities" in the virtual world will not be easy.

These are early days for the study of haptics, but the manifold possibilities fuel Eid's idealism. After all, he asks, "If we engineers are not helping people, then what are we good for?"

This article originally appeared in NYUAD's 2013-14 Research Report (13MB PDF).