Carnegie Mellon University

Perception of 3-D location based on vision, touch, and extended touch.

journal contribution
posted on 2013-01-01, authored by Nicholas A. Giudice, Roberta Klatzky, Christopher R. Bennett, Jack M. Loomis

Perception of the near environment gives rise to spatial images in working memory that continue to represent the spatial layout even after cessation of sensory input. As the observer moves, these spatial images are continuously updated. This research is concerned with (1) whether spatial images of targets are formed when they are sensed using extended touch (i.e., using a probe to extend the reach of the arm) and (2) the accuracy with which such targets are perceived. In Experiment 1, participants perceived the 3-D locations of individual targets from a fixed origin and were then tested with an updating task involving blindfolded walking followed by placement of the hand at the remembered target location. Twenty-four target locations, representing all combinations of two distances, two heights, and six azimuths, were perceived by vision or by blindfolded exploration with the bare hand, a 1-m probe, or a 2-m probe. Systematic errors in azimuth were observed for all targets, reflecting errors in representing the target locations and updating. Overall, updating after visual perception was best, but the quantitative differences between conditions were small. Experiment 2 demonstrated that auditory information signifying contact with the target was not a factor. Overall, the results indicate that 3-D spatial images can be formed of targets sensed by extended touch and that perception by extended touch, even out to 1.75 m, is surprisingly accurate.

History

Date

2013-01-01
