Applied Interactive Multimedia Laboratory

The Applied Interactive Multimedia (AIM) research group at New York University Abu Dhabi investigates diverse facets of interactive multimedia and immersive multimodal systems. Our interests include the conception, engineering, and use of novel haptic interfaces as a new medium in Human-Computer Interaction (HCI). AIM investigates the acquisition, communication, and display of spatial, temporal, and physical knowledge of perceived reality through the human sense of touch, and the integration and coordination of this knowledge with other sensory displays (such as audio, video, text, and smell) in a general multimedia system. We are interested in several applications, including multimodal learning, gaming, and interpersonal communication.

The AIM lab pursues research in the area of affective haptics. Affective Haptics is an emerging area of research into systems that relate to, arise from, or influence emotion and enable affective interaction by means of the sense of touch. Our goal is to explore the following areas:

  •  acquisition of human emotions via the haptic modality;
  •  display of human emotions using haptic display devices;
  •  enhancement of the user’s affective state and reproduction of the feeling of social touch by means of haptic stimulation.

Current Projects

Neurohaptics

We define Neurohaptics as “the science and technology that investigate the neural representation and cognitive modulations associated with tactile and/or kinesthetic haptic interactions”. The ultimate goal of this line of research is to develop a neural model that can quantify the human haptic experience from brain activity and simulate human haptic responses during haptic interaction.
Read more about Neurohaptics

KATIB

This project develops an assistive platform, KATIB, to help children with learning difficulties acquire handwriting skills. The KATIB platform combines software and hardware components that have the potential to revolutionize how handwriting is learned and taught in schools and special education centers. It uses a haptic (touch-based) device to teach children how to write Arabic words correctly by physically guiding the hand along the stroke sequence of reference handwriting.
Read more about KATIB
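
To make the guidance idea concrete, here is a minimal sketch (not the actual KATIB implementation) of how a haptic device might pull a learner's pen toward a reference stroke; the stroke samples, stiffness gain, and force limit below are illustrative assumptions.

```python
import numpy as np

# Hypothetical reference stroke: 2D points sampled from an expert's
# handwriting of one letter (placeholder values, in metres).
reference_stroke = np.array([[0.00, 0.00],
                             [0.01, 0.02],
                             [0.02, 0.05],
                             [0.04, 0.07],
                             [0.06, 0.08]])

K_SPRING = 200.0   # N/m, assumed guidance stiffness
F_MAX = 3.0        # N, assumed force limit for comfort and safety

def guidance_force(pen_position):
    """Spring force pulling the pen toward the closest point on the stroke."""
    deltas = reference_stroke - pen_position        # vectors to every stroke sample
    nearest = deltas[np.argmin(np.linalg.norm(deltas, axis=1))]
    force = K_SPRING * nearest                      # Hooke's-law attraction
    magnitude = np.linalg.norm(force)
    if magnitude > F_MAX:                           # clamp the commanded force
        force = force * (F_MAX / magnitude)
    return force

# Example: the pen has drifted slightly off the stroke.
print(guidance_force(np.array([0.03, 0.02])))
```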

Haptogram

Haptogram is a system that provides 3D tactile feedback via focused ultrasound, without physical contact with the skin. Tactile sensations are produced by acoustic radiation forces: a phased array of ultrasonic transducers focuses sound to exert force on a target point in 3D space. Moving the point of tactile stimulation at very high speed along a 3D model creates a 3D tactile experience. The ultimate goal is to design an immersive multimodal system that provides a combined visual and haptic hologram display.
Read more about Haptogram
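
The focusing step behind such a display can be illustrated with a short, generic phased-array calculation: each transducer is delayed so that all waves arrive at the focal point in phase. This is a sketch under assumed values (a 10 x 10 array with 10 mm pitch driven at 40 kHz), not the Haptogram's actual driving code.

```python
import numpy as np

SPEED_OF_SOUND = 346.0   # m/s in air (assumed room temperature)
FREQUENCY = 40_000.0     # Hz, a typical frequency for ultrasonic haptics
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

# Assumed 10 x 10 grid of transducers with 10 mm pitch, centred at the origin.
pitch = 0.01
coords = (np.arange(10) - 4.5) * pitch
tx, ty = np.meshgrid(coords, coords)
transducers = np.stack([tx.ravel(), ty.ravel(), np.zeros(tx.size)], axis=1)

def focusing_phases(focal_point):
    """Per-element phase delays so that all waves arrive in phase at focal_point."""
    distances = np.linalg.norm(transducers - focal_point, axis=1)
    # Delay each element by the extra path length of the farthest element.
    path_difference = distances.max() - distances
    return 2 * np.pi * path_difference / WAVELENGTH   # radians

# Focal point 15 cm above the array centre; sweeping it rapidly along a path
# traces out a 3D tactile shape, as described above.
phases = focusing_phases(np.array([0.0, 0.0, 0.15]))
print(phases.reshape(10, 10).round(2))
```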

Affective Haptics

In this project, we study the relationship between touch and emotions. To that end, we developed a haptic jacket capable of displaying vibrotactile sensations in three modes of interaction: discrete sensation, continuous sensation, and continuous sensation at varying speeds. Tactile stimulation is delivered by an array of vibration motors that exploit the funneling illusion.
Read more about Affective Haptics

Haptic jacket used to study the relationship between touch and emotions.
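
As a minimal sketch of how the funneling illusion can be driven (not the jacket's actual control scheme), the code below maps a desired "phantom" sensation position between two adjacent motors to their drive amplitudes using the commonly used energy-based weighting; the amplitude scale and step count are assumptions.

```python
import numpy as np

def phantom_amplitudes(position, total_amplitude=1.0):
    """
    Map a desired phantom position (0.0 = at motor A, 1.0 = at motor B)
    to drive amplitudes for two adjacent vibration motors, using the
    energy-based model of the funneling illusion.
    """
    position = np.clip(position, 0.0, 1.0)
    amp_a = total_amplitude * np.sqrt(1.0 - position)
    amp_b = total_amplitude * np.sqrt(position)
    return amp_a, amp_b

# Sweep the phantom sensation from motor A to motor B in ten steps,
# e.g. to render a continuous stroking sensation across the jacket.
for p in np.linspace(0.0, 1.0, 10):
    a, b = phantom_amplitudes(p)
    print(f"position {p:.2f}: motor A = {a:.2f}, motor B = {b:.2f}")
```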

Haptic-Based Simulator

The haptics-based virtual reality periodontal training simulator project focuses on the research, development, and evaluation of a simulator for training dental students and clinicians in periodontal procedures. Using virtual reality and haptics technologies, the periodontal simulator allows trainees to learn to perform diagnosis and treatment of periodontal diseases by visualizing a 3D virtual human mouth and feeling physical tactile sensations as they touch the surfaces of teeth, gingiva, bone, and calculi with virtual dental instruments.
Read more about Haptic-Based Simulator
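
Force feedback in simulators of this kind is often rendered with a penalty-based scheme: the reaction force grows with how far the virtual instrument tip penetrates a surface, with a stiffness chosen per tissue. The sketch below illustrates that generic technique with made-up stiffness values and a flat surface; it is not the simulator's actual implementation.

```python
import numpy as np

# Assumed per-tissue stiffness values (N/m); real values would be tuned
# against clinical feel and the haptic device's capabilities.
STIFFNESS = {
    "tooth":    2000.0,
    "calculus": 1500.0,
    "gingiva":   300.0,
}

def contact_force(tip_position, surface_height, tissue,
                  normal=np.array([0.0, 0.0, 1.0])):
    """Penalty-based force: push the instrument tip back out along the surface normal."""
    penetration = surface_height - tip_position[2]    # depth of tip below the surface
    if penetration <= 0.0:                            # no contact, no force
        return np.zeros(3)
    return STIFFNESS[tissue] * penetration * normal   # Hooke's-law reaction force

# Tip 1 mm below a tooth surface located at z = 0 -> about 2 N pushing back.
print(contact_force(np.array([0.0, 0.0, -0.001]), 0.0, "tooth"))
```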

Haptic Eye

Haptic Eye aims to measure the thermal properties of materials from their response to thermal excitation. Such a procedure allows for material characterization and identification; haptic models can then be created for the identified materials, with appropriate values assigned to their haptic properties. The process being modeled is contactless laser-excitation step thermography with a known, finite but not infinitesimal excitation length and radius. Simulation and experimental results consistently demonstrate that the proposed approach can classify different materials based on their thermal properties.
Read more about Haptic Eye
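
One way to see how a thermal response can identify a material: for a semi-infinite solid heated by a constant surface flux q, the surface temperature rise follows ΔT(t) = (2q/e)·√(t/π), where e is the thermal effusivity. The sketch below fits that relation to a synthetic response and picks the closest known material; the flux value, effusivity table, and noise model are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np

HEAT_FLUX = 5000.0  # W/m^2, assumed absorbed laser flux during the heating step

# Approximate thermal effusivities (W*s^0.5 / m^2 / K) for a few materials.
EFFUSIVITY = {"wood": 400.0, "glass": 1500.0, "steel": 14000.0}

def surface_rise(t, effusivity):
    """Surface temperature rise of a semi-infinite solid under constant flux."""
    return 2.0 * HEAT_FLUX / effusivity * np.sqrt(t / np.pi)

def identify(t, measured_rise):
    """Fit dT = slope * sqrt(t), recover effusivity, return the closest material."""
    slope = np.linalg.lstsq(np.sqrt(t)[:, None], measured_rise, rcond=None)[0][0]
    estimated_e = 2.0 * HEAT_FLUX / (slope * np.sqrt(np.pi))
    return min(EFFUSIVITY, key=lambda m: abs(EFFUSIVITY[m] - estimated_e)), estimated_e

# Synthetic noisy measurement of a glass-like sample over a 2-second step.
t = np.linspace(0.05, 2.0, 40)
measured = surface_rise(t, EFFUSIVITY["glass"]) + np.random.normal(0, 0.05, t.size)
print(identify(t, measured))
```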

Projects for Social Impact

  • Wheelchair Project
  • Skydiving Project
  • Braille Communication System

Contact

Principal Investigator

Mohamad Eid
Assistant Professor of Electrical and Computer Engineering
Email: mohamad.eid@nyu.edu
Phone: +971 2-628-4182

Research Engineers

George Korres
Senior Research Engineer

Muhammad Hassan Jamil
Associate Instructor of Engineering

Postdoctoral Fellows

Wanjoo Park
Postdoctoral Associate

PhD Students

Haneen Hisham Alsuradi