Cognition, Jul 2022
If language has evolved for communication, languages should be structured such that they maximize the efficiency of processing. What is efficient for communication in the visual-gestural modality is different from the auditory-oral modality, and we ask here whether sign languages have adapted to the affordances and constraints of the signed modality. During sign perception, perceivers look almost exclusively at the lower face, rarely looking down at the hands. This means that signs articulated far from the lower face must be perceived through peripheral vision, which has less acuity than central vision. We tested the hypothesis that signs that are more predictable (high frequency signs, signs with common handshapes) can be produced further from the face because precise visual resolution is not necessary for recognition. Using pose estimation algorithms, we examined the structure of over 2000 American Sign Language lexical signs to identify whether lexical frequency and handshape probability affect the position of the wrist in 2D space. We found that frequent signs with rare handshapes tended to occur closer to the signer's face than frequent signs with common handshapes, and that frequent signs are generally more likely to be articulated further from the face than infrequent signs. Together these results provide empirical support for anecdotal assertions that the phonological structure of sign language is shaped by the properties of the human visual and motor systems.
Topics: Gestures; Humans; Language; Recognition, Psychology; Sign Language; Visual Perception
PubMed: 35192994
DOI: 10.1016/j.cognition.2022.105040
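As a rough illustration of the measurement underlying the study above, the sketch below computes a wrist-to-face distance from 2D pose keypoints. All keypoint names and coordinates are hypothetical, and this is only a minimal sketch of the distance computation, not the authors' actual pose-estimation pipeline.

```python
import numpy as np

def wrist_face_distance(wrist_xy, nose_xy):
    # Euclidean distance in the image plane between the wrist keypoint
    # and a lower-face anchor (here, the nose keypoint).
    return float(np.linalg.norm(np.asarray(wrist_xy) - np.asarray(nose_xy)))

# Hypothetical 2D keypoints (pixel coordinates) for two signs.
high_freq_sign = {"wrist": (320, 420), "nose": (300, 180)}  # articulated low
low_freq_sign = {"wrist": (305, 230), "nose": (300, 180)}   # near the face

d_high = wrist_face_distance(high_freq_sign["wrist"], high_freq_sign["nose"])
d_low = wrist_face_distance(low_freq_sign["wrist"], low_freq_sign["nose"])

# The study's prediction: more predictable (frequent) signs can afford to
# be articulated further from the face, where visual acuity is lower.
print(d_high > d_low)
```

Aggregating such distances over many signs, and regressing them against lexical frequency and handshape probability, is the general shape of the analysis the abstract describes.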
Journal of Deaf Studies and Deaf..., Sep 2022
Sign language users are at a disadvantage in terms of health literacy due to the lack of health education materials in sign languages. Deaf and hard of hearing (DHH) individuals are also excluded from health literacy research because no measurement tools exist in the language in which they are most fluent. This study aims to provide the literature with a tool for measuring health literacy among DHH individuals. The Turkish Health Literacy Scale (THLS)-32 was translated into Turkish Sign Language (TSL) in video format and then tested for validity and reliability. The translated scale was administered to participants from a DHH association in Turkey who are fluent in TSL; in total, 207 DHH individuals took part. Based on mean index scores, the study group was classified as having "limited health literacy" according to the THLS-32 classification. We conclude that the THLS-32 in TSL is suitable for measuring health literacy in DHH individuals and for assessing the impact of the health education system.
Topics: Health Literacy; Hearing Loss; Humans; Language; Reproducibility of Results; Sign Language; Turkey
PubMed: 35914243
DOI: 10.1093/deafed/enac025
Brain & Development, Apr 2004
Review
Forms of sign language have developed in a number of countries. American Sign Language, which originated from French signing, has been the most extensively researched. As sign language is based on gestures executed in space and perceived visually, it might be thought that it would mainly be a function of the non-dominant (typically right) cerebral hemisphere. A number of studies are reviewed showing that sign language is a language in its own right and that, as with spoken language, its primary site of organization is the dominant hemisphere. This does not mean that there is no significant contribution from the other hemisphere, with an interplay between the two. Each research project usually contributes some facet of knowledge beyond its main conclusions. These include the importance of distinguishing signs from gestures; the localization of different types of signing within the left dominant cerebral hemisphere; the fact that lesions of the right non-dominant hemisphere, although not causing a loss of signing, will result in dyspraxia; and the finding that aphasic symptoms of signing and speech are not modality dependent but reflect a disruption of language processes common to all languages. Examples are given of discoveries made using newer neuroradiological techniques such as functional magnetic resonance imaging and positron emission tomography, and no doubt these will lead to further advances in knowledge. The use of sign language in the treatment of patients with verbal aphasia is considered, especially in children with Landau-Kleffner syndrome, but therapy of this kind can also be used in children with delayed language development and in other types of acquired aphasia at any age. Treatments other than signing, such as cochlear implants, may be used increasingly in the future, but it seems likely that sign language will remain a dominant feature of Deaf culture.
Topics: Aphasia; Deafness; Humans; Nervous System Physiological Phenomena; Sign Language
PubMed: 15030901
DOI: 10.1016/S0387-7604(03)00128-1
Neuropsychologia, Mar 2022
The present study explored the influence of iconicity on sign lexical retrieval and whether it is modulated by the task at hand. Lexical frequency was also manipulated to have an index of lexical processing during sign production. Behavioural and electrophysiological measures (ERPs) were collected from 22 Deaf bimodal bilinguals while performing a picture naming task in Catalan Sign Language (Llengua de Signes Catalana, LSC) and a word-to-sign translation task (Spanish written-words to LSC). Iconicity effects were observed in the picture naming task, but not in the word-to-sign translation task, both behaviourally and at the ERP level. In contrast, frequency effects were observed in the two tasks, with ERP effects appearing earlier in the word-to-sign translation than in the picture naming task. These results support the idea that iconicity in sign language is not pervasive but modulated by task demands. As discussed, iconicity effects in sign language would be emphasised when naming pictures because sign lexical representations in this task are retrieved via semantic-to-phonological links. Conversely, attenuated iconicity effects when translating words might result from sign lexical representations being directly accessed from the lexical representations of the word.
Topics: Humans; Language; Linguistics; Semantics; Sign Language
PubMed: 35114219
DOI: 10.1016/j.neuropsychologia.2022.108166
Professioni Infermieristiche, 2019
Review
BACKGROUND
The lack of verbal communication due to tracheostomy is a great challenge for nurses, because communication is an essential aspect of caring and, in pediatric patients, also involves parental figures.
OBJECTIVES
To qualitatively synthesize the evidence supporting the learning and use of sign language with tracheostomised children, in order to enhance the therapeutic relationship and communication between nurse and pediatric patient.
METHOD
We conducted a narrative review, searching the following databases: PubMed (Medline), CINAHL, Scopus, Cochrane and Google Scholar. Studies related to the research question were included, without temporal limitation.
RESULTS
Forty-three articles were selected and subsequently grouped by type of study, description of specific teaching programs, and recommendations.
CONCLUSIONS
The use of alternative communication techniques is a priority for nurses caring for tracheostomised children. Among these techniques, sign language clearly demonstrates its effectiveness.
Topics: Child; Communication; Humans; Nurse-Patient Relations; Sign Language; Tracheostomy
PubMed: 31883570
DOI: 10.7429/pi.2019.723192
Scientific Reports, Sep 2023
Sensory and language experience can affect brain organization and domain-general abilities. For example, D/deaf individuals show superior visual perception compared to hearing controls in several domains, including the perception of faces and peripheral motion. While these enhancements may result from sensory loss and subsequent neural plasticity, they may also reflect experience using a visual-manual language, like American Sign Language (ASL), where signers must process moving hand signs and facial cues simultaneously. In an effort to disentangle these concurrent sensory experiences, we examined how learning sign language influences visual abilities by comparing bimodal bilinguals (i.e., sign language users with typical hearing) and hearing non-signers. Bimodal bilinguals and hearing non-signers completed online psychophysical measures of face matching and biological motion discrimination. No significant group differences were observed across these two tasks, suggesting that sign language experience is insufficient to induce perceptual advantages in typical-hearing adults. However, ASL proficiency (but not years of experience or age of acquisition) was found to predict performance on the motion perception task among bimodal bilinguals. Overall, the results presented here highlight a need for more nuanced study of how linguistic environments, sensory experience, and cognitive functions impact broad perceptual processes and underlying neural correlates.
Topics: Adult; Humans; Sign Language; Language; Hearing; Brain; Motion Perception
PubMed: 37714887
DOI: 10.1038/s41598-023-41636-x
Sensors (Basel, Switzerland), Sep 2022
Communication between people is a basic social skill used to exchange information. It is often used for self-expression and to meet basic human needs, such as the need for closeness, belonging, and security. This process takes place at different levels, using different means, with specific effects. It generally means a two-way flow of information in the immediate area of contact with another person. When people communicate using the same language, the flow of information is much easier than when two people use different languages from different language families. The process of social communication with deaf people is similarly difficult. It is therefore essential to use modern technologies to facilitate communication with deaf and non-speaking people. This article presents the results of work on a prototype glove using textronic elements produced by physical vacuum deposition. The signal from the sensors, in the form of resistance changes, is read by a microcontroller, processed, and displayed on a smartphone screen as single letters. During the experiment, 520 letters were signed by each author. Signs were interpreted correctly 86.5% of the time, and each letter was recognized within approximately 3 s. Another key result of the article was the selection of an appropriate material (Velostat, membrane) for use as a sensor in the proposed application. The proposed solution can enable communication with deaf people using the finger alphabet, which can be used to spell single words or the most important keywords.
Topics: Communication; Humans; Poland; Sign Language; Translating
PubMed: 36146138
DOI: 10.3390/s22186788
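A minimal sketch of how resistance readings from a sensor glove might be mapped to fingerspelled letters, using a simple nearest-template classifier. The calibration values and letter templates below are entirely hypothetical; the article does not specify the microcontroller's actual processing, so this only illustrates the general idea of matching a five-sensor reading to a letter.

```python
import numpy as np

# Hypothetical calibration table: mean flex-sensor resistance (kilo-ohms)
# per finger (thumb..pinky) for three fingerspelled letters.
TEMPLATES = {
    "A": np.array([12.0, 45.0, 46.0, 44.0, 43.0]),  # four fingers curled
    "B": np.array([40.0, 10.0, 11.0, 10.0, 12.0]),  # fingers extended
    "C": np.array([25.0, 25.0, 26.0, 24.0, 25.0]),  # half-curled hand
}

def classify(reading):
    # Return the letter whose template is nearest (Euclidean distance)
    # to the current five-sensor resistance reading.
    return min(TEMPLATES, key=lambda k: float(np.linalg.norm(TEMPLATES[k] - reading)))

letter = classify(np.array([13.0, 44.0, 45.0, 45.0, 42.0]))
print(letter)  # this reading is closest to the "A" template
```

In a real device, the templates would come from a per-user calibration phase, and some smoothing over consecutive readings would be needed before emitting a letter.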
Sensors (Basel, Switzerland), Jun 2020
Review
This review analyses different gesture recognition systems along a timeline, showing the different types of technology and specifying their most important features and achieved recognition rates. At the end of the review, the possibilities of the Leap Motion sensor are described in detail in order to consider its application to the field of sign language. This device has many positive characteristics that make it a good option for sign language. One of the most important conclusions is the ability of the Leap Motion sensor to provide 3D information from the hands for reliable identification.
Topics: Gestures; Hand; Humans; Pattern Recognition, Automated; Sign Language
PubMed: 32599793
DOI: 10.3390/s20123571
PloS One, 2022
Currently there are around 466 million hard of hearing people, and this number is expected to grow in the coming years. Despite the efforts that have been made, a communication barrier remains between deaf and hard of hearing signers and non-signers in environments without an interpreter. Different approaches have been developed recently to address this issue. In this work, we present an Argentinian Sign Language (LSA) recognition system which uses hand landmarks extracted from videos of the LSA64 dataset in order to distinguish between different signs. Different features are extracted from the signals created from the hand-landmark values, which are first transformed by the Common Spatial Patterns (CSP) algorithm. CSP is a dimensionality reduction algorithm that has been widely used in EEG systems. The features extracted from the transformed signals were then used to feed different classifiers, such as Random Forest (RF), K-Nearest Neighbors (KNN) or Multilayer Perceptron (MLP). Several experiments were performed, with promising results: accuracy values between 0.90 and 0.95 on a set of 42 signs.
Topics: Humans; Sign Language; Deafness; Recognition, Psychology
PubMed: 36315481
DOI: 10.1371/journal.pone.0276941
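A minimal sketch of the two-class case of the CSP-plus-classifier pipeline described above. The data here are synthetic multichannel signals (not hand landmarks from LSA64), and a nearest-class-mean decision stands in for the RF/KNN/MLP classifiers used in the paper; everything below is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cov(trials):
    # Average trace-normalized spatial covariance over trials of shape
    # (n_trials, n_channels, n_samples).
    cs = [t @ t.T / np.trace(t @ t.T) for t in trials]
    return np.mean(cs, axis=0)

def csp_filters(class_a, class_b, n_pairs=1):
    # Two-class CSP via the generalized eigenvalue problem
    # Ca w = lambda (Ca + Cb) w; keep the filters at both spectral extremes,
    # which maximize variance for one class while minimizing it for the other.
    ca, cb = cov(class_a), cov(class_b)
    evals, evecs = np.linalg.eig(np.linalg.inv(ca + cb) @ ca)
    order = np.argsort(evals.real)
    keep = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return evecs.real[:, keep]

def features(trials, w):
    # Log-variance of the CSP-projected signals, one row per trial.
    return np.array([np.log(np.var(w.T @ t, axis=1)) for t in trials])

def make(ch, n=20, T=200):
    # Synthetic 4-channel trials with extra variance on channel `ch`.
    x = rng.standard_normal((n, 4, T))
    x[:, ch, :] *= 3.0
    return x

train_a, train_b = make(0), make(3)          # class A loud on ch 0, B on ch 3
w = csp_filters(train_a, train_b)
fa, fb = features(train_a, w), features(train_b, w)

# Classify unseen class-A trials by nearest class mean in CSP feature space.
test = features(make(0, n=5), w)
pred_a = (np.linalg.norm(test - fa.mean(0), axis=1)
          < np.linalg.norm(test - fb.mean(0), axis=1))
print(pred_a)
```

The paper's multi-class setting (42 signs) would require a one-vs-rest or pairwise extension of this two-class CSP, followed by the stronger classifiers it evaluates.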
Philosophical Transactions of the Royal..., May 2021
Several Upper Palaeolithic archaeological sites from the Gravettian period display hand stencils with missing fingers. On the basis of the stencils that Leroi-Gourhan identified in the cave of Gargas (France) in the late 1960s, we explore the hypothesis that those stencils represent hand signs with deliberate folding of fingers, intentionally projected as a negative figure onto the wall. Through a study of the biomechanics of handshapes, we analyse the articulatory effort required for producing the handshapes under the stencils in the Gargas cave, and show that only handshapes that are articulable in the air can be found among the existing stencils. In other words, handshape configurations that would have required using the cave wall as a support for the fingers are not attested. We argue that the stencils correspond to the type of handshape that one ordinarily finds in sign language phonology. More concretely, we claim that they correspond to signs of an 'alternate' or 'non-primary' sign language, such as those still employed by a number of bimodal (speaking and signing) hunter-gatherer groups, such as Australian First Nations peoples or the Plains Indians. In those groups, signing is used for hunting and for a rich array of ritual purposes, including mourning and traditional storytelling. We discuss further evidence, based on typological generalizations about the phonology of non-primary sign languages and comparative ethnographic work, that points to such a parallelism. This evidence includes the fact that for some of those groups, stencil and petroglyph art has independently been linked to their sign language expressions. This article is part of the theme issue 'Reconstructing prehistoric languages'.
Topics: Archaeology; Caves; Cultural Evolution; France; Gestures; Humans; Language; Linguistics; Sign Language
PubMed: 33745310
DOI: 10.1098/rstb.2020.0205