Language-Affect Interface in Parent-Infant Communication
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 798658.
Liquan Liu (photo: Mona Ødegård, UiO)
Language, gesture, and affect are the three musketeers of parent-child interaction. Although affect, our experience of emotions, plays a key role in communication, its developmental trajectory at the beginning of life remains unclear. How do infants develop affect recognition that corresponds to their native environment? How do they perceive affect from non-native cultures? How does language interact with affect perception, and what role does infants' (multi-)linguistic and cultural experience play? To address these questions, I propose to experimentally examine infants' affectual development and its interaction with language in the first year after birth. Specifically, I will adopt a preferential looking paradigm in which infants watch videos of happy/angry expressions from various cultures, paired with languages that match or mismatch the corresponding culture. This exciting project bridges psycholinguistic and sociolinguistic approaches.
Europe is changing more than ever, developing into a complex social environment in which languages and cultures interact. I hope to enhance European scientific excellence by contributing to a better understanding of infants' culture-specific affectual development in various linguistic contexts. As a direct consequence, the outcome of the project may change parental attitudes towards a diverse linguistic and cultural environment.
About the project
Infant affectual development: The learning of world knowledge occurs in socio-cultural environments, to which infants quickly adapt. Infants' correct understanding of caretakers' expressions is crucial for their well-being. Although the issue remains debated, studies investigating parent-infant interaction suggest that display norms for happy, surprised, and sad expressions tend to be universal, whereas those for anger, disgust, and fear differ across cultures. Parents communicate with their children in a culture-specific manner: American and Japanese mothers, for instance, interact with their 5-month-old infants following their own cultural conventions. In the domain of language, perceptual attunement and neural commitment theories pinpoint infants' transition, within the first year after birth, from universal to language-specific perception that fits their ambient environment. This fine-tuning process appears to be domain-general, extending across visual and auditory modalities. Although earlier research on children aged 2-5 years shows a gradual categorization pattern for affectual expressions, the potential experience-induced shift from culture-general to culture-specific affectual perception in infancy remains unclear. To address this issue, I will compare infants' perception of native and non-native affect representations at two developmental stages in the first year after birth.
Language-affect interaction: Language, gesture, and facial expressions do not work alone but interact in human communication, yet few studies have examined their interdisciplinary relationship in infancy. The interaction between linguistic and affectual development runs in both directions. On the one hand, emotion socialization models address the impact of language on emotion regulation and development, as caretakers' language use influences infants' emotional development. On the other hand, affect assists child language learning: children are drawn to the positive affect embedded in speech directed towards them. Studies demonstrate rapid language development co-occurring with enhanced (negative) emotion at 12 months, yet no study has experimentally addressed the language-affect interaction in the first half of the first year. In addition, studies examining the language-affect interface typically code observations of parent-infant interaction or adopt a unimodal (auditory-only) design with speakers' emotions embedded in spoken language. I intend to investigate infants' multi-sensory perception, in which language (auditory) and affect (visual) interact.
Language, affect, and bilingualism: Children's experience shapes their development. Regarding development in a multilingual setting, researchers at MultiLing (the host institute) have shown that affective processing differs between sequential bilinguals' first and second language, the latter being more “disembodied” than the former. The effect of language-affect “embodiment” among young simultaneous bilinguals, and especially the relationship between degree of exposure and early embodiment, remains unclear. Additionally, I have proposed a multilingual enhanced acoustic sensitivity hypothesis across auditory (speech, music) domains, and such multilingual enhancement appears to extend to the visual domain: bilingual infants at 8 and 12 months look more at the mouth area than at the eyes, compared with their monolingual peers. Whether multilingual enhanced sensitivity also applies to affectual perception and the language-affect interface remains to be examined. Linking MultiLing's line of research with my own, this project will investigate the effect of infants' multilingual (and multicultural) experiences on their perception and development of affect and the language-affect interaction.
- To measure infants' preference for native and non-native positive/happy and negative/angry affectual representations in a multi-sensory setting, in which language (auditory) and affect (visual) interact, at two developmental stages in the first year after birth, in Norwegian monolingual and bilingual contexts.
- To investigate the influence of social and linguistic factors on affectual perception, examining whether multilingual heightened sensitivity extends to the affectual domain.
Testing affect and the language-affect interaction with a preferential looking paradigm as the general method is novel. Similar paradigms have been adopted to examine infant linguistic and cognitive development, such as face or motor perception, and have provided reliable, measurable data. Infant looking time will be recorded as the dependent variable; the independent variables are affect, culture, ethnicity, and condition, in addition to infant background factors. Experimental programming and data analysis will involve UU (secondment, short visit).
Participants: Norwegian families with young children living in Oslo will be tested. Infants will be assigned to a 2 (age: 5-6 months vs. 11-12 months, cross-sectional) × 2 (linguistic background: monolingual vs. bilingual) design, with a minimum of 16 participants per condition (minimum total N = 64).
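The 2 × 2 between-subjects design above can be laid out as a simple grid of cells. The following sketch (illustrative only; the variable names are my own, not part of the project materials) enumerates the four cells and derives the minimum sample size:

```python
from itertools import product

# Factors of the cross-sectional 2 x 2 design described above
ages = ["5-6 months", "11-12 months"]
backgrounds = ["monolingual", "bilingual"]
min_per_cell = 16  # minimum participants per condition

cells = list(product(ages, backgrounds))
for age, bg in cells:
    print(f"{age} / {bg}: n >= {min_per_cell}")

total_min = len(cells) * min_per_cell
print(f"Minimum total N = {total_min}")  # 4 cells x 16 = 64
```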
Stimuli: Audio stimuli will consist of recordings of Norwegian and Japanese mothers' positive/happy and negative/angry infant-directed speech. Visual stimuli of Norwegian and Japanese mothers' culture-general (happiness) and culture-specific (anger) facial expressions directed at their infants will be recorded in three conditions (no sound, Norwegian, Japanese).
Procedure: Caretakers will complete relevant questionnaires prior to testing. A preferential looking paradigm will be adopted: during the experiment, infants' looking times will be recorded through behavioural/eye-tracking measures and coded as the dependent variable. Each experiment consists of three blocks with paired video comparisons.
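In a paired preferential-looking trial, a common way to quantify preference is the proportion of total looking time directed at one of the two videos. The sketch below (a hypothetical illustration with made-up numbers, not the project's actual coding scheme) computes such a score, where 0.5 indicates no preference:

```python
def preference_score(look_match: float, look_mismatch: float) -> float:
    """Proportion of looking time to the matching video (0.5 = no preference)."""
    total = look_match + look_mismatch
    if total == 0:
        raise ValueError("no looking time recorded in this trial")
    return look_match / total

# Hypothetical trial: 6.3 s looking at the culturally matching video,
# 3.7 s at the mismatching video
score = preference_score(6.3, 3.7)
print(round(score, 2))  # 0.63
```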
Utrecht Institute of Linguistics OTS, Utrecht University
School of Social Sciences and Psychology, Western Sydney University
The MARCS Institute for Brain, Behaviour and Development, Western Sydney University