Induction Pack for New Staff and Students - Mental Health and Deafness

Information on Deaf Issues

Types and levels of deafness

Types of language and communication

Communication Tips

Communication Support

Sign Language Interpreting Service

Interpreting Policy

Signing Policy and BSL Classes

Deaf Community

Deaf Organisations

Equipment and Technology, On-line BSL interpreter

Mental Health by Jim Cromwell

Mental Health and Deafness Tips

Deaf Ethnicity in Mental Health

Towards Equity and Access report 2005

References

Mental Health and Deafness Contact

Types of Deafness

Some children are born deaf and others become deaf later (are deafened), due to illness or medication they may have been given. Few children are profoundly deaf. Most have some hearing on some frequencies at certain volumes.

There are three types of deafness; these are described below:

Conductive deafness

This is the most common type of deafness. It occurs when sounds cannot pass through the outer and middle ear to the inner ear (which includes the cochlea). This is often caused by blockages such as wax in the ear canal or fluid in the middle ear.

Sensori-neural deafness (or ‘nerve deafness’)

As sound passes through the outer and middle ear, tiny hair cells in the cochlea convert sound waves into electrical signals. These signals travel along the auditory nerve to the brain. Most cases of sensori-neural deafness are caused by loss of, or damage to the hair cells in the cochlea.

This damage can be caused by infectious diseases such as rubella, mumps, measles or meningitis. Children may also be deaf because of a shortage of oxygen in the bloodstream at birth, or as a result of another trauma during pregnancy or childbirth.

Mixed Deafness

This is a mixture of conductive and sensori-neural deafness.

Levels of Deafness

There are different levels of deafness and these are most often classified as mild, moderate, severe or profound.

Mild Deafness

A child with mild deafness may be able to hear sounds ranging between 25 and 40 decibels (dB) on average in their better ear. Children with a mild level of deafness may find it difficult to follow speech in situations where there is a lot of background noise.

Moderate Deafness

A child with a moderate level of deafness may hear sounds between 40 and 70 decibels on average in their better ear. Children with moderate deafness find it difficult to follow speech without a hearing aid or other technology to amplify the sound.

Severe Deafness

A child with a severe level of deafness may hear sounds between 70 and 95 decibels on average in their better ear. Children with a severe level of deafness may rely heavily on lip-reading or may use sign language as a communication method. They may also use technology such as a text phone.

Profound Deafness

A child with a profound level of deafness can hear sounds of around 95 decibels or more on average in their better ear. Profoundly deaf children may lip-read and/or use sign language and use a text phone.

The commonest cause of acquired hearing loss is ageing. Some illnesses, such as mumps, measles and meningitis, and severe head injuries may also cause deafness. Exposure to extreme noise, for example, explosions or repeated exposure to loud music or machinery, may also cause hearing loss.

There are other reasons for deafness. One or two children per thousand are born with significant, permanent deafness. Of these, an estimated 50% have a moderate hearing loss, and 50% are severely or profoundly deaf. There are many reasons why a child may be born with a hearing loss – over 90% of deaf children are born into families where both parents are hearing.

Hearing Aids and Cochlear Implants

Hearing Aids

Most hearing aids have a common purpose: to amplify sound signals. They come in various shapes and types and may be worn on the body, behind the ear, or in the ear. Some, like cochlear implants, have parts that are surgically implanted. Most have audiological settings so they can be adjusted to suit the user's specific needs, and a range of digital hearing aids offer much more precise control of these settings. Hearing aids enable people to utilise their residual hearing. It is important to remember that all noise, including background noise, is amplified, making communication difficult in noisy environments.

Cochlear Implants

Most sensori-neural deafness is caused by loss of, or damage to, the tiny hair cells in the cochlea. Where enough functioning hair cells remain, conventional hearing aids may help. If a child has severe to profound deafness, there may not be sufficient functioning hair cells for hearing aids to be effective. For these children a cochlear implant may help.

The implant is a sophisticated hearing aid, which works by stimulating the auditory nerve and bypassing the damaged hair cells in the cochlea to provide a sensation of hearing. Like hearing aids, cochlear implants do not restore typical hearing levels. The implant system has two parts: the external part consists of the speech processor, a lead, transmitter coil and microphone. The internal part is surgically implanted under the skin behind the ear. It includes a number of electrodes that directly stimulate the auditory nerve.

Types of Language and Communication

How deaf people communicate depends on their hearing loss and their preference.

People with mild hearing loss (25-40dBHL) may have some difficulty in following what is said, mainly in groups or noisy situations. Some wear hearing aids and find lip-reading helpful in certain situations.

A moderate hearing loss (40-70dBHL) means people have difficulty in following what is said without a hearing aid, particularly somewhere noisy. They will probably use a voice telephone if it has an adjustable volume or is designed to work with hearing aids.

People with a severe hearing loss (70-95dBHL) may have difficulty following what is being said even with use of a hearing aid. Many rely on lip-reading and some use British Sign Language (BSL). Most deaf people find it hard to use a voice telephone, even if it is amplified, and the majority choose to use a text phone.

Hearing aids may be of little or no benefit to people with profound hearing loss (95+dBHL). They may use British Sign Language (BSL) or lip-read, or both. They will probably use a text phone. Some may have a cochlear implant.

British Sign Language (BSL)

BSL is the language used by 50,000 - 70,000 people within the British Deaf community (RNID). It is a complex visual-spatial language with its own vocabulary, structure and grammar, which differ from those of spoken English. BSL uses both hand shapes and non-manual features, including facial expressions, lip shapes and body movement. Just as spoken languages have different dialects, sign language has regional variations, and, as with spoken languages, sign language differs from country to country. BSL is a language in its own right and was officially recognised as such by the UK government in March 2003.

Finger spelling is a way of spelling out words, usually names and places, using the hands to show each letter. In Britain most people use a two-handed alphabet; some countries (for example, the United States) use a one-handed alphabet.

Auralism/Oralism

An umbrella term covering aural/oral approaches to communication and education that concentrate on developing listening skills and spoken language.

Lip-reading/speech reading

Lip-reading/speech reading is used by some deaf people to follow speech. When people speak, their lips make patterns, and lip-reading is the ability to read these patterns. It is not possible to distinguish all the parts of speech from lip-reading alone, as only about a third of words can be understood this way. Knowledge of spoken language is extremely important for successful lip-reading.

Sign Supported English (SSE)

Sign Supported English uses BSL (signs and finger spelling) and follows English word order but it does not require every word to be signed.

Signed (Exact) English (SEE)

Signed (exact) English uses BSL (signs and finger spelling) and other specifically developed signs to give an exact manual representation of spoken English. Each spoken word is represented with a sign and it is designed to be used at the same time as spoken English.

Total Communication

Total communication is a philosophy that involves selecting the communication method that is most appropriate for the individual at any given time. Total communication may involve the use of aural/oral support and/or the use of a sign system.

Bilingualism

Bilingualism is the ability to use two languages fluently. Usually for deaf people in England these are English and British Sign Language.

Cued Speech

Some words which sound different when verbalised can look very similar when they are lip-read by deaf people (e.g. pat and bat). Cued speech uses one hand placed near the mouth and a variety of hand shapes to highlight the differences between spoken words.

Makaton

Makaton is a basic system of a few hundred signs, which is mainly used by children and adults who have learning disabilities. Although it is separate from BSL, it is used by some deaf children and adults with additional needs. It consists of a vocabulary (influenced by BSL) which allows the child to express basic needs.

Deaf blindness

About 23,000 people in the UK have a combined sight and hearing loss (RNID). They need additional support for communication, accessing information and mobility.

Degrees of deafness

Hearing loss is measured in decibels, as dBHL (hearing level). It is often greater at some pitches than others. Many people have less hearing at high pitches than at low pitches. They may be able to hear you speak but not make out the words because they cannot tell the difference between some consonants, particularly the higher pitched ones like ‘s’, ‘sh’, ‘f’, ‘p’, ‘t’ and ‘k’.

Statistics on the incidence of deafness within the UK population

Description of   Average dBHL   Number of    % of total UK
hearing loss     (better ear)   people       population

Mild             25-40          4,645,000    7.9%
Moderate         40-70          3,335,000    5.7%
Severe           70-95          537,000      0.9%
Profound         95+            146,000      0.2%

Total                           8,663,000    14.7%

Estimates are based on the National Study of Hearing (see A Davis, Hearing in Adults, Whurr 1995) and current general population estimates.
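The mild, moderate, severe and profound bands above are simple threshold ranges on the average better-ear level, so they can be expressed as a short lookup. The sketch below is illustrative only: the function name and the label for levels below 25 dBHL are my own, not terms from this pack.

```python
# Illustrative sketch: map an average better-ear threshold (dBHL)
# to the descriptive bands used above.
# Band boundaries: mild 25-40, moderate 40-70, severe 70-95, profound 95+.

def classify_hearing_loss(dbhl: float) -> str:
    """Return the descriptive band for an average dBHL in the better ear."""
    if dbhl < 25:
        return "no significant loss"  # below the mild band
    if dbhl < 40:
        return "mild"
    if dbhl < 70:
        return "moderate"
    if dbhl < 95:
        return "severe"
    return "profound"

print(classify_hearing_loss(30))   # mild
print(classify_hearing_loss(100))  # profound
```

Note that the boundary values (40, 70 and 95 dBHL) are assigned to the higher band here; the source ranges overlap at the boundaries and do not specify which band such a value belongs to.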

Statistics relating to deafness are regularly published on the RNID website.

RNID Tinnitus

Tinnitus helpline

0808 808 6666 voice

0808 808 0007 text

020 7296 8199 fax

For further details on oral education please refer to the relevant websites.

History of Sign Language


The written history of sign language began in the 17th century in Spain. In 1620, Juan Pablo Bonet published Reducción de las letras y arte para enseñar a hablar a los mudos ('Reduction of letters and art for teaching mute people to speak') in Madrid. It is considered the first modern treatise on phonetics and speech therapy, setting out a method of oral education for deaf people using manual signs, in the form of a manual alphabet, to improve their communication.

Drawing on Bonet's system of signs, Charles-Michel de l'Épée published his alphabet in the 18th century; it has survived basically unchanged to the present day.

In 1755, Abbé de l'Épée founded the first public school for deaf children in Paris; Laurent Clerc was arguably its most famous graduate. Clerc went to the United States with Thomas Hopkins Gallaudet to found the American School for the Deaf in Hartford, Connecticut.[1] Gallaudet's son, Edward Miner Gallaudet, founded a school for the deaf in 1857 which, in 1864, became the first college for the deaf, now Gallaudet University in Washington, DC, the only liberal arts university for the deaf in the world.

Generally, each spoken language has a sign language counterpart, inasmuch as each linguistic population will contain Deaf members who will generate a sign language. In much the same way that geographical or cultural forces will isolate populations and lead to the generation of different and distinct spoken languages, the same forces operate on signed languages, so they tend to maintain their identities through time in roughly the same areas of influence as the local spoken languages. This occurs even though sign languages have no relation to the spoken languages of the lands in which they arise. There are notable exceptions to this pattern, however, as some geographic regions sharing a spoken language have multiple, unrelated signed languages. Variations within a 'national' sign language can usually be correlated to the geographic location of residential schools for the deaf.

International Sign, formerly known as Gestuno

It is used mainly at international Deaf events such as the Deaflympics and meetings of the World Federation of the Deaf. Recent studies claim that while International Sign is a kind of pidgin, it is more complex than a typical pidgin and is indeed more like a full signed language.

Engravings from Reducción de las letras y arte para enseñar a hablar a los mudos (Bonet, 1620): plates showing the manual alphabet for the letters A, B, C, D, E, F, G, H, I, L, M, N, O, P, Q, R, S, T, V, X, Y and Z.

Linguistics of Sign

In linguistic terms, sign languages are as rich and complex as any oral language, despite the common misconception that they are not "real languages". Professional linguists have studied many sign languages and found them to have every linguistic component required to be classed as true languages.

Sign languages are not pantomime - in other words, signs are largely arbitrary and have no necessary visual relationship to their referent, much as most spoken language is not onomatopoetic. Nor are they a visual rendition of an oral language. They have complex grammars of their own, and can be used to discuss any topic, from the simple and concrete to the lofty and abstract.

Sign languages, like oral languages, organise elementary, meaningless units (phonemes; once called cheremes in the case of sign languages) into meaningful semantic units. The elements of a sign are Handshape (or Handform), Orientation (or Palm Orientation), Location (or Place of Articulation), Movement, and Non-manual markers (or Facial Expression), summarised in the acronym HOLME.
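The five HOLME parameters can be pictured as the fields of a simple record. The sketch below is purely illustrative; the field values are made up for the example and do not describe any real sign.

```python
# Illustrative sketch: a sign decomposed into the five HOLME parameters.
from dataclasses import dataclass


@dataclass
class Sign:
    handshape: str    # H - configuration of the hand(s)
    orientation: str  # O - direction the palm faces
    location: str     # L - place of articulation on or near the body
    movement: str     # M - path or internal movement of the hands
    expression: str   # E - non-manual markers such as facial expression


# Hypothetical parameter values, for illustration only.
example = Sign(
    handshape="flat hand",
    orientation="palm down",
    location="chest",
    movement="outward arc",
    expression="neutral",
)
print(example.handshape)  # flat hand
```

Changing any one field while holding the others fixed is, roughly, what distinguishes a minimal pair of signs, just as swapping one phoneme distinguishes a minimal pair of spoken words.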

Common linguistic features of deaf sign languages are extensive use of classifiers, a high degree of inflection, and a topic-comment syntax. Many unique linguistic features emerge from sign languages' ability to produce meaning in different parts of the visual field simultaneously. For example, the recipient of a signed message can read meanings carried by the hands, the facial expression and the body posture in the same moment. This is in contrast to oral languages, where the sounds that comprise words are mostly sequential (tone being an exception).

Sign languages' relationships with oral languages

A common misconception is that sign languages are somehow dependent on oral languages, that is, that they are oral language spelled out in gesture, or that they were invented by hearing people. Hearing teachers of deaf schools, such as Thomas Hopkins Gallaudet, are often incorrectly referred to as inventors of sign language.

The manual alphabet is used in sign languages, mostly for proper names and technical or specialised vocabulary. The use of fingerspelling was once taken as evidence that sign languages are simplified versions of oral languages, but in fact it is merely one tool among many. Fingerspelling can sometimes be a source of new signs, which are called lexicalised signs.

On the whole, deaf sign languages are independent of oral languages and follow their own paths of development. For example, British Sign Language and American Sign Language are quite different and mutually unintelligible, even though the hearing people of Britain and America share the same oral language.

Similarly, countries which use a single oral language throughout may have two or more sign languages, whereas an area that contains more than one oral language might use only one sign language. South Africa, which has 11 official oral languages and a similar number of other widely used oral languages, is a good example of this. It has only one sign language with two variants, due to two major educational institutions for the deaf which serve different geographic areas of the country.

Use of Signs in Hearing Communities

Gesture is a typical component of spoken languages. More elaborate systems of manual communication have developed in situations where speech is not practical or permitted, such as cloistered religious communities, scuba diving, television recording studios, loud workplaces, stock exchanges, baseball, hunting (by groups such as the Kalahari bushmen), and the game of charades. In rugby union the referee uses a limited but defined set of signs to communicate his or her decisions to the spectators. Recently, there has been a movement to teach and encourage the use of sign language with toddlers before they learn to talk, and with non-deaf or hard-of-hearing children with other causes of speech impairment or delay. This is typically referred to as Baby Sign.

On occasion, where the prevalence of deaf people is high enough, a deaf sign language has been taken up by an entire local community. Famous examples of this include Martha's Vineyard Sign Language in the USA, Kata Kolok in a village in Bali, Adamorobe Sign Language in Ghana and Yucatec Maya sign language in Mexico. In such communities deaf people are not socially disadvantaged.