The Brain's Functions for Language Processing


Learning and expressing a language is an amazing capability that begins in the early stages of a person’s life. Several brain regions have been pinpointed as distinct language areas, such as Broca’s and Wernicke’s areas, but additional brain regions contribute to language processes as well. In addition to understanding spoken language, humans have the ability to comprehend music and to learn sign language if needed.
Language acquisition starts early in life, particularly the learning of phonetics, as detailed in Kuhl's review article. Numerous studies have been carried out on infants using a wide variety of neuroimaging techniques, including EEG/ERP, MEG, fMRI, and NIRS. These studies mostly focus on which brain mechanisms are responsible for understanding phonetic units of speech, the consonants and vowels that form words. It is first important to note that language shows a critical period: the ability to acquire a second language declines after about age seven. Furthermore, a study that tested American and Japanese infants on their ability to discriminate the English /ra-la/ phonetic contrast showed that before the critical period both groups performed similarly, but afterward the American infants performed much better than the Japanese infants. This provides evidence that infants can detect which phonemes, the sound units that can alter the meaning of words, are meaningful in their own language. Infants also carry out phonetic learning through statistical learning: between six and twelve months of age they become sensitive to the distributional frequencies of the sounds in their everyday language. Social interaction is another mechanism that aids in language acquisitio...

... middle of paper ...

...igners result in aphasia just as they do in people who speak. Another finding is that sign language processing differs from gesture processing in general. One study showed greater activation in the left perisylvian regions and the left frontoparietal network for ASL signs than for transitive and grooming gestures; the left perisylvian regions are also involved in processing spoken language. One difference between the processing of sign language and spoken language is that sign language activates the superior and inferior parietal lobules. The superior parietal lobule is thought to be important for proprioceptive monitoring during signing, while the inferior parietal lobule is needed for phonological processing and is employed during the production and imitation of hand movements. Sign language and spoken language thus show both similarities and differences in how the brain processes them.
