MRI: Multisensory Human Interaction Measurement and Synthesis for Computer Graphics and Interactive Virtual Environments

Project Details

Description

EIA-0215887

Pai, Dinesh

DeCarlo, Douglas M.

Metaxas, Dimitris N.

Nguyen, Thu D.

Rutgers University - New Brunswick

This project develops an integrated facility for measuring multisensory human interaction with everyday objects and with other humans, in which synchronized measurements of visual, auditory, and haptic behavior can be acquired with low latency. The work includes measuring motion at up to 250 Hz with a marker-based motion capture system, acquiring dense range images, recording speech and contact sounds, and measuring the forces and pressure distributions produced by human contact. The facility will make a broad range of research activity possible. Multisensory models of objects will be developed: it will then be possible not only to see images of a virtual object, but also to feel its stiffness and surface texture through a force-feedback haptic device and to hear its sound when it is struck (see the sketch after the list of requested items below). Multisensory models of human conversational behavior will be developed by tracking lip and arm movements while simultaneously recording voice. 'Interaction capture' will extend motion capture, the current state of the art for realistic computer animation; it will then be possible to transfer and transform not only the motion of an animated character, but also the forces and sounds produced when the character interacts with the world. Moreover, the computing infrastructure needed to support interaction will be investigated. Multisensory interaction imposes new constraints on system behavior, particularly latency, and could lead to new designs for computer operating systems and communication networks. The instrumentation will enable more students to be educated about multisensory simulation and interaction, and to use multisensory environments in new ways to stimulate learning and discovery. Funds are requested for:

1. Sensor systems for interaction and measurement,

2. The acoustical environment for experiments,

3. Audiovisual displays, and

4. Computing.
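As a rough illustration of the multisensory object models described above, the sketch below pairs a penalty-style stiffness/damping force response (the kind of computation a haptic rendering loop would perform) with a damped-sinusoid modal sound triggered by an impact. This is a hypothetical Python example written for this summary, not software from the project; the class name, parameter values, and mode frequencies are all illustrative assumptions.

```python
# Minimal sketch of a "multisensory object model": the same virtual object
# answers a haptic query (contact force from stiffness and damping) and an
# auditory query (a damped modal sound when struck). All values are
# illustrative; this is not the project's actual software.

import math


class MultisensoryObject:
    def __init__(self, stiffness=800.0, damping=2.0,
                 modes=((440.0, 6.0), (1250.0, 9.0))):
        self.stiffness = stiffness  # N/m, felt through a force-feedback device
        self.damping = damping      # N*s/m
        self.modes = modes          # (frequency in Hz, decay rate in 1/s) pairs

    def contact_force(self, penetration, velocity):
        """Penalty-style force for a haptic rendering loop (typically ~1 kHz)."""
        if penetration <= 0.0:
            return 0.0
        return self.stiffness * penetration - self.damping * velocity

    def contact_sound(self, impact_speed, duration=0.5, rate=44100):
        """Synthesize a contact sound as a sum of decaying sinusoidal modes."""
        samples = []
        for n in range(int(duration * rate)):
            t = n / rate
            s = sum(impact_speed * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                    for f, d in self.modes)
            samples.append(s)
        return samples


if __name__ == "__main__":
    obj = MultisensoryObject()
    print("force at 2 mm penetration:", obj.contact_force(0.002, 0.0), "N")
    print("sound samples synthesized:", len(obj.contact_sound(impact_speed=0.3)))
```

The point of the sketch is only that a single object description serves both the haptic and auditory channels, which is what allows the measured interaction data (forces, sounds) to be transferred and reused in the way the abstract describes.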

Status: Finished
Effective start/end date: 7/1/02 to 6/30/05

Funding

  • National Science Foundation: $259,598.00
