People apply rules of their spoken language to sign language: study
When our brain is "doing language," it projects abstract structure.
When our brain is \"doing language,\" it projects abstract structure.
Washington D.C.:
Contrary to the popular belief that language is limited to speech, a recent study reveals that people also apply the rules of their spoken language to sign language.
According to researchers at Northeastern University, language is not simply about hearing sounds or moving our mouths. When our brain is "doing language," it projects abstract structure. The modality (speech or sign) is secondary.
\"There is a misconception in the general public that sign language is not really a language,\" said researcher Iris Berent. \"Part of our mandate, through the support of the NSF, is to reveal the complex structure of sign language, and in so doing, disabuse the public of this notion.\"
To come to this conclusion, Berent's lab studied words (and signs) that shared the same general structure. She found that people reacted to this structure in the same way, irrespective of whether they were presented with speech or signs.
In the study, Berent examined words and signs with doubling, that is, forms that show full or partial repetition. She found that responses to these forms shift depending on their linguistic context.
When a word was presented by itself (or as a name for just one object), people avoided doubling: they rated slaflaf (with doubling) worse than slafmak (with no doubling). But when doubling signaled a systematic change in meaning, participants preferred it.
Next, Berent asked what happens when people see doubling in signs (signs with two identical syllables). The subjects were English speakers who had no knowledge of a sign language. To Berent's surprise, these subjects responded to signs in the same way they responded to the words. They disliked doubling for singular objects, but they systematically preferred it if (and only if) doubling signaled plurality. Hebrew speakers showed this preference when doubling signaled a diminutive, in line with the structure of their language.
\"It's not about the stimulus, it's really about the mind, and specifically about the language system,\" said Berent. \"These results suggest that our knowledge of language is abstract and amodal. Human brains can grasp the structure of language regardless of whether it is presented in speech or in sign.\"
Currently, there is debate about what role sign language has played in language evolution, and whether the structure of sign language shares similarities with spoken language. Berent's lab shows that our brain detects some deep similarities between speech and sign language. This allows English speakers, for example, to extend their knowledge of language to sign language.
\"Sign language has a structure, and even if you examine it at the phonological level, where you would expect it to be completely different from spoken language, you can still find similarities. What's even more remarkable is that our brain can extract some of this structure even when we have no knowledge of sign language. We can apply some of the rules of our spoken language phonology to signs,\" said Berent.
Berent says these findings show that our brains are built to handle very different types of linguistic inputs. The results from this paper confirm what some scientists have long thought but what hasn't truly been grasped by the general public: language is language, no matter what form it takes.
\"This is a significant finding for the deaf community because sign language is their legacy. It defines their identity, and we should all recognize its value. It's also significant to our human identity, generally, because language is what defines us as a species,\" he added.
To further support these findings, Berent and her lab intend to examine how these rules apply to other languages; the present study focused on English and Hebrew. The study has been published in Proceedings of the National Academy of Sciences.