The effect of background noise on the gestures, gaze and speech of hearing-impaired interlocutors
Holding a conversation in noise is a remarkably complex task, in which multiple individuals must coordinate speaking and listening turns toward a shared communicative goal. To best support communication for hearing-impaired individuals, it is crucial to understand the interplay between individuals’ gestures, gaze and speech in real-world scenarios, yielding data that can be compared with multi-modal strategies for optimal communication (e.g., to see whether head movements and speech levels adjust appropriately to noise levels). We therefore measured the behaviour of 16 hearing-impaired dyads and 11 triads holding conversations in background noise. Participants held semi-structured conversations while head movement, eye movement and speech were recorded. All groupings were mixed-gender, with participants matched on speech-in-noise perception, hearing asymmetry, and age. Dyadic interactions involved three conversations in speech-shaped noise; triadic interactions involved two conversations in speech-shaped noise and two in eight-speaker babble. In each conversation, the noise level varied between 54 and 78 dB in 15-25 s segments. Conversation topics (and, in triads, noise types) were counterbalanced.
Results showed an effect of noise level on head-movement behaviour. In both experiments (dyad and triad conversations), participants leant closest together at the loudest noise level and sat progressively further apart at each successively quieter level. Participants also oriented their heads more directly toward their partner at louder noise levels. Furthermore, there was an effect of noise type in triadic conversations, with participants leaning closer together in the babble conditions. The acoustic changes due to such movements, however, were small (<1 dB) relative to changes in speaking level (1-2 dB), and neither accommodated the noise-level changes (6 dB steps). Despite reporting difficulty in the noisier conditions, participants made few requests for clarification during conversation. Participants’ eye position (with respect to their partner) and speech level will also be discussed. Overall, the data demonstrate clearly identifiable conversational strategies for ameliorating hearing difficulty that could be exploited in current hearing rehabilitation and future multi-modal hearing prostheses.
[Work supported by the Medical Research Council (grant number U135097131) and the Chief Scientist Office of the Scottish Government.]