
It's good to talk

Learning to speak is a complex process that begins with hearing. How can you help your child become word perfect?


It’s one of the most exciting milestones for new parents – that precious moment when a baby opens their mouth and utters, not just a coo or a gurgle, but their very first fully formed word. Although this may seem like the start of a new phase – your baby’s very first communication using language – it’s actually just another step in a process that began some time ago in the womb.

First sounds

By the 20th week of pregnancy, the cochlea – the part of the inner ear that transmits sound signals to the auditory nerve – is fully formed. By 25 weeks, the whole auditory system is complete. Some research suggests that babies react to their mothers’ voices in the womb as early as 16 weeks, while one study showed that when babies heard music in the womb, their heart rates increased and they moved around in time to the music. They even remember these sounds: in a 2013 Finnish study, EEG sensors revealed that babies’ brains displayed memories of made-up words that had been played to them when they were in the womb.

Why hearing is important

So what does all this have to do with talking? “Children learn to speak by copying what they hear, so hearing is key to the development of speech and language,” explains Dr Kaukab Rajput, Consultant Audiological Physician at Great Ormond Street Hospital for Children in London, UK. “A child will hear a sound or word, reproduce their own version, then correct themselves as their listening skills gradually improve. This is also the way in which a child learns to use language.” It’s important to note that speech is defined as the ability to make sounds, while language is the ability to understand and use these sounds in order to communicate.

5 ways to help your child progress

The following strategies can help children who hear normally, as well as those with cochlear implants, to learn to talk:

  1. Try baby talk – “In the first few weeks, repeating the sounds your baby makes will encourage them to continue to make these sounds. This reinforcement is a vital part of the learning process for listening and spoken communication. Taking it in turns to ‘speak’ helps teach babies the art of conversation, and provides an opportunity for them to listen to you, listen to themselves, and gradually shape their own productions to match yours more closely,” says Ingrid Steyns.
  2. Chat to them – Even though your baby can’t understand what you’re saying, talk through routines, describing what you’re doing and making eye contact with them.
  3. Limit technology – “Avoid letting toddlers spend too long watching TV or playing on tablets, as children learn better through interaction that is personally meaningful to them,” says Dr Rajput. Talking, reading and singing will all help teach them new words.
  4. Pronounce words properly – Once your child is beginning to speak, it is important to set an example by using the correct pronunciation.
  5. Encourage imaginative play – Pretending to work in a shop, for instance, is a great way to expand their vocabulary.
Image: Reading, talking and singing with your child are great ways to help them learn new words. © Getty Images

The process

Ingrid Steyns specialises in listening, speech and language rehabilitation at the Innsbruck-based head office of MED-EL, an Austrian company that develops and produces hearing implants. She explains: “At four to six months, babies begin to babble and imitate sounds. In babbling and vocal play, babies gradually learn to coordinate and combine early sounds together. At seven to 11 months, they start to understand a small range of frequently heard words such as their names, parents’ names or common words such as ‘bye bye’.

“We see the emergence of first words at approximately 12 months, which is when words carry a consistent meaning as opposed to just copying a sound – for instance, the child will say ‘Mama’ and look at their mother. They will also understand common phrases like ‘sit down’ or ‘where’s Daddy?’, and we see a rapid expansion in vocabulary.”

Spotting problems

However, problems arise if the child can’t hear. Dr Rajput explains: “If a child hears no sound at all, the auditory pathways aren’t stimulated and the hearing part of the brain won’t develop. After three or four years of no stimulation, the auditory part of the brain makes way for other senses, such as vision, which become more dominant.”

While a profound or total hearing loss will become apparent reasonably quickly, either through routine tests or parental observation, a partial hearing problem is harder to spot. “For instance, if a child can only hear low frequencies, they will only hear parts of some words, so ‘house’ will become ‘how’ and ‘shoe’ will become ‘oo’. But this may not be obvious until perhaps the age of three, delaying their speech and language development,” says Dr Rajput.

“If you think your child has a hearing problem, it’s important to get an assessment as soon as possible. A delay in hearing, speech and language development may delay communication and affect their educational attainment.”

What about tonal languages?

Cochlear implants (CIs) were developed in western countries, so most of the research was naturally carried out using European languages such as English and German.

However, in more recent years CIs have found their way to China and other non-western countries, where languages have very different characteristics. One of the main features of Chinese languages such as Mandarin is that they are tonal. This means that the same word can have totally different meanings when said at a constant, falling or rising pitch. This can be particularly challenging for people who are learning to hear with CIs.

Peter Nopp, Director of Research – Signal Processing at MED-EL in Innsbruck, Austria, explains: “In order to enable a child to learn to speak accurately, it’s important to provide as many of the details contained in the sound as possible through the CI. This is achieved using a sound-coding strategy, which analyses sound signals and then stimulates the cochlea via the electrode contacts at the correct rate and volume, and in the correct place.”

When a sound enters the ear naturally, the whole cochlea is stimulated, but each sound is perceived at a specific region of the cochlea and at a specific rate. The sound-coding strategy essentially imitates this natural process.
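To make the idea of a sound-coding strategy a little more concrete, here is a purely illustrative toy sketch in Python (not MED-EL’s actual algorithm): it splits a short audio frame into frequency bands, measures the energy in each band, and assigns each band to one electrode contact along the cochlea, so that low frequencies drive the apical contacts and high frequencies the basal ones. The sample rate, frequency range and number of electrodes are all assumed values.

# Illustrative toy only (not MED-EL's actual sound-coding strategy):
# split an audio frame into frequency bands, measure each band's
# energy, and map each band to one electrode contact along the cochlea.
# Sample rate, frequency range and electrode count are assumptions.

import numpy as np

SAMPLE_RATE = 16_000        # samples per second (assumed)
NUM_ELECTRODES = 12         # number of electrode contacts (assumed)
FREQ_RANGE = (100, 8_000)   # analysed frequency range in Hz (assumed)

def encode_frame(frame):
    """Return one stimulation level per electrode for a single audio frame."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    # Logarithmically spaced band edges: band 0 (lowest frequencies) maps
    # to the most apical electrode, the last band to the most basal one.
    edges = np.geomspace(FREQ_RANGE[0], FREQ_RANGE[1], NUM_ELECTRODES + 1)
    levels = np.zeros(NUM_ELECTRODES)
    for i in range(NUM_ELECTRODES):
        in_band = (freqs >= edges[i]) & (freqs < edges[i + 1])
        # The band's average energy sets how strongly its electrode fires.
        levels[i] = spectrum[in_band].mean() if in_band.any() else 0.0
    return levels

# A 200 Hz tone mostly activates the low-frequency (apical) channels.
t = np.arange(0, 0.02, 1.0 / SAMPLE_RATE)
print(np.round(encode_frame(np.sin(2 * np.pi * 200 * t)), 2))

A real coding strategy is of course far more sophisticated, but the basic idea of mapping each part of the sound to the right place, rate and level in the cochlea is the one described above.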

“Pitch coding through a CI improves with greater coverage of the cochlea. The greater the coverage, the broader the range of frequencies provided,” adds Peter Nopp.

Also, practice makes perfect when it comes to (re)habilitation. Ingrid Steyns, Rehabilitation Manager at MED-EL in Innsbruck, Austria, explains: “Children who are using CIs to learn tonal languages would need to focus on distinguishing and reproducing these unique tones more than, say, their European counterparts. As these tones carry a change in meaning, accurately recognising and using them is crucial for successful communication.”
