Towards a Rhetoric of Tactile Pictures

Carol Wiest


A Brief History of Tactile Pictures

Before moving ahead with my analysis of tactile pictures, let me sketch out the current status of tactile research and then explain the critical differences between tactile and visual perception. These two issues form the background history from which a tactile rhetoric will emerge.

Tactile pictures are relatively new. Raised-line pictures have been around since the 1960s but were rare as late as the mid-1980s. Researchers, particularly perceptual psychologists, have been studying tactile pictures since the 1970s, but high production costs and technological problems kept tactile pictures from becoming widely available. The earliest research looked at ways of producing maps and tactile graphs to help teach mathematics to blind students. Today, tactile pictures appear in a wide range of educational and recreational contexts, including books for preschool children.

However, many questions about how tactile pictures "work" remain unanswered. Further, problems of production and reproduction continue to restrict the content and availability of tactile pictures. Aside from matters of technology, current research emphasizes the following questions:

  • How does tactile perception work and how does it differ from visual perception? (Heller and Kennedy, Kennedy, Pring)

  • What qualities make a tactile picture the most legible and meaningful? (Pring, Barth, Lederman and Campbell, Hinton and Ayres)

  • What skills and strategies do blind readers need to work effectively with tactile pictures? (Hinton 1988, Lederman and Campbell, Mack)

Researchers approach these questions from the perspective of psychology or of the discipline for which they wish to create tactile pictures. With the exception of John M. Kennedy, none of these researchers approaches tactile pictures from a rhetorical perspective.

Current rhetorical theory, however, has a great deal to offer. It also has a lot to gain. The questions that tactile researchers have asked deal fundamentally with the question of semiosis. Tactile pictures constitute a different kind of semiotic code, one which intersects with language and vision. Kennedy, based on work with tactile pictures, has suggested that certain aspects of perception are amodal, that is, the same for touch and vision (1). Although this provides a starting point, many gaps remain in our understanding of tactile perception. What we do know about touch is that it is local and sequential rather than holistic and simultaneous, and that touch uses a completely different set of "stimulus parameters" than vision (Barth 270).

While visual images use stimulus parameters such as colour and shadow, tactile pictures use texture, boundaries, and elevation (Barth 270-72, Hinton and Ayres 24, Hinton 11). One of the greatest problems facing tactile researchers is how to use these stimulus parameters to create meaningful impressions, that is, how to combine stimulus parameters so that tactile pictures express the greatest amount of useful information.

When we view a picture, we quickly identify the shapes of forms and the distribution of forms within a space (Foulke). Touch, on the other hand, provides information about specific areas one at a time as the hand moves across the page. The "field of view" for touch is much more limited than for vision. While vision can perceive general layout, touch must work from specific impressions to create a sense of the general layout. As Lederman and Campbell note, "the exceptionally slow, sequential nature of haptic input may prevent or hinder the reader from achieving a holistic impression of the graphic information" (108).

In the past, some researchers have called touch a "primitive" sense because of its local, sequential nature (see for example Nöth 407). These researchers held that touch, because it can perceive only what comes in contact with the body, is inferior to vision. Touch is proxemic, limited to local space, while vision and hearing are distal, able to perceive objects distant from the body. However, more recent studies show that touch can perceive objects distant from the body (Kennedy 1). Tactile pictures can represent objects at varying distances, including objects, such as the sun and moon, which cannot be touched. Tactile pictures can also represent mythical, imaginary creatures such as dragons. Thus, despite the local and sequential nature of tactile perception, our sense of touch can comprehend and interpret a wide range of representations.

Another distinct feature of touch is its physical, active nature. Viewing a tactile picture involves a process of systematic, physical exploration. Readers use their fingertips to explore the tactile picture and to discover relationships between picture elements. Fingertips, hands, wrists, arms, shoulders, and torso adjust and shift position as the reader views the image. Kennedy, Mack, and Edman point out that, perhaps above all other factors, readers of tactile images need to be active, systematic explorers. Unfortunately, active touch is generally discouraged in Western culture. Young, visual readers quickly learn that, although tracing words is acceptable for a time, "real" or "good" readers do not use their hands. Our valuation of visual perception combined with this view of touch as primitive or childish may make it difficult for young tactile readers to value their own forms of literacy.

The issue of tactile literacy is particularly crucial given the current trend toward graphical, screen-based communication. In the early days of computing, for example, user interfaces were text-based. This made it possible for screen reading software to read aloud, sequentially, the information on the screen. Today, however, user interfaces rely heavily on graphical information, and these graphics are almost impossible for screen readers to interpret. Refreshable Braille displays provide a tactile representation of a screen, but they present only the text, a few lines at a time. Researchers are investigating new technologies for presenting graphical user interfaces, such as tactile graphic input/output tablets (Fricke, Fricke and Baehring). Such a tablet would allow the user to read and respond to both text and graphics on the screen. As these technologies become available, tactile literacy skills will become increasingly important, as will an understanding of the rhetoric of tactile pictures.


Copyright © Enculturation 2001
