by Dan Roberts, Editor-in-Chief
Living Well With Low Vision
I recently wrote about the challenges people with visual impairment face when listening to speech (“I Can’t See What You’re Saying”, October 4, 2013). Beyond the audible confusion discussed in that article, our inability to see facial expressions can also make implicit meaning difficult to grasp. How, then, are people with blindness or low vision supposed to interpret speech successfully in the absence of visual cues? The answer is actually fairly simple. We listen.
People can communicate their meaning solely through the words they choose and the context of those words. How they utter them, however, is just as important. Consider how the meaning of this sentence changes depending upon which words receive emphasis (indicated here by uppercase letters).
“Set your glass on the COASTER.” (meaning not on the TABLE)
“Set your glass ON the coaster.” (meaning not NEXT to it)
“Set your GLASS on the coaster.” (meaning not your PLATE)
“Set YOUR glass on the coaster.” (meaning not MY glass)
“SET your glass on the coaster.” (meaning don’t SLAM it on the coaster)
Theatrical directors expect their actors to enhance the meaning of their speech by employing both physical and vocal devices, with the latter usually carrying the greater weight. Physical expression, especially facial expression, can be lost on the balcony audience, and it is entirely inapplicable to reader’s theater, radio, audio books, and other nonvisual performance media. Proper inflection, on the other hand, leaves little room for misinterpretation.
Fortunately, we humans have learned, or have evolved, to use inflection for meaning. Inflections may differ among cultures, but they are tightly and uniquely tied to every language. So when I listen to someone, my wife for example, I rarely need to see her face. Which words does she emphasize? Where does her pitch rise or her volume fall? Why did she pause just then? Chris and I have little problem understanding one another’s meaning, not necessarily because we’ve been married for many years, but because we share the linguistic style of a common language. When she speaks, I don’t have a problem understanding what she means. If she gently says, “Place your glass on the COASTER”, I know she is just trying to help me find where to set my glass. But if her tone is stronger and reprimanding, I not only know where to place the glass, but I also know Chris is tired of wiping water rings off her vintage oak coffee table.
Okay, so if I listen to tone and inflections, I’ll be able to catch meanings most of the time. And if I can’t make the call, I can always politely ask for clarification. But what about when it’s my turn to talk? It may be easy to catch Chris’s meaning when she speaks to me, but how do I read her reactions when she’s listening? Is she nodding (understanding)? Is she closing her eyes and shaking her head (disagreeing)? Or maybe she’s not even looking at me, just wondering when she can return to her magazine.
Since we frequently display our thoughts nonverbally during conversation, that visual feedback is an important part of mutual understanding. We smile, roll our eyes, smirk, and drop our jaws in dismay. It is important that we are aware of these kinds of reactions so we know how to shape our side of the conversation. But how? Again, by listening.
If I listen carefully, I might hear Chris making subtle sounds. Common sounds such as “hmm”, “uh-huh”, and “humph” cue me in to her thoughts. If, however, she makes no sounds, I could invite her response with questions like, “Do you agree?” or “Right?” I might even break her silence by saying, “Are you still there?” Normally, that would sound sarcastic, but it is perfectly reasonable coming from someone who can’t see. It might even be an effective way of reminding the listener that we are using our ears, not our eyes.
When at least one dialogue partner is visually impaired, the conversation should sound much like one held over the phone. We have all talked on the phone, and we have learned that a little extra effort is required when vision isn’t possible. All we have to do is apply those same telephone techniques to in-person communication.
Nothing can replace the benefits of vision in conversation. The ability to see the thousands of combinations of macro- and micro-expressions passing over the face is a great advantage in successful interaction. We can, however, do very well if we let our ears attend not just to the words, but to the tones and timbres that accompany them.