Super Linguistics Colloquium Series

Spring 2020 Schedule

Note: Unless indicated otherwise, talks take place on Zoom. If you would like to attend a talk, please send an email to Pritty Patel-Grosz.

 
Friday June 19th, 4.15pm – 6pm
Dorothy Ahn (Rutgers)
Title: The Point of Pointing
Pointing occurs frequently in both spoken and signed languages, though the discussion and the analysis of it in the two language modalities have developed rather separately. In this talk, I point out the similarities of the co-speech pointing gesture in spoken languages and the indexical handshape (IX) used for referent tracking in signed languages. I propose a unified analysis of pointing, where both the co-speech gesture and IX are analyzed as a modifier that provides a locational restriction. I discuss the main implications of this analysis and how it relates to other recent studies on exophoric demonstratives, co-speech gestures, and loci use in sign languages.
 
Friday June 5th, 4.15pm – 6pm
Yosef Prat (Institut de Biologie, Université de Neuchâtel)
Animals Have No Language, and Humans Are Animals Too

Do nonhuman animals have language? In humans, language is prominently manifested in vocal communication (i.e., speech). However, while vocal communication is ubiquitous across the animal kingdom, studies to date have found only elementary parallels to speech in nonhuman animals. These modest linguistic capacities of other species have fortified our belief that language is uniquely human. But have we really tested this uniqueness claim? Drawing on methods commonly used in bioacoustics, I demonstrate that, surprisingly, a truly impartial comparison between human speech and other animal vocalizations has not yet been conducted. Oddly, studying human speech with the same methods used to study other species' vocalizations is actually expected to provide us with no evidence for human uniqueness.

 

Wednesday May 6th, 5.15pm – 7pm

Patrick Georg Grosz (UiO), Elsi Heilala Kaiser (USC), and Francesco Pierini (ENS)

Emoji Resolution: Indexicality and Anaphoricity 🤔

Abstract: Emojis are an emerging object of study in linguistics and beyond (Bai et al. 2019), and it has been suggested that they are digital counterparts of speech-accompanying gestures in computer-mediated communication (Gawne & McCulloch 2019, Pierini 2019). In this talk, we focus on two subsets of emojis: non-face emojis that denote activities (such as ‘basketball’ or ‘soccer ball’; henceforth ‘activity emojis’), and affective emojis, which include face emojis (such as ‘grinning face’ and ‘angry face’) as well as a set of affective non-face emojis (such as ‘thumbs up’ and ‘heart’). We argue that both activity emojis and affective emojis are typically anchored to an individual with a role such as Agent or Experiencer. Moreover, we provide evidence for a view on which activity emojis are anaphoric (and often exhibit properties similar to 3rd person pronouns), while affective emojis exhibit 1st-person indexicality. The central paradigm is given in (1ab)–(2ab), where (1ab) exhibit 3rd-person anaphoricity, whereas (2ab) exhibit 1st-person indexicality. We propose a formal semantic analysis on which activity emojis denote separate discourse units, connected to the accompanying text via salient discourse relations, whereas affective emojis are expressive modifiers (similar to adverbs like ‘damn’ and interjections like ‘oh my’).

 

(1a) kate said sue impressed ann 🏀 [basketball-emoji]

–> agent of basketball-event = Sue

(1b) kate said sue admired ann 🏀 [basketball-emoji]

–> agent of basketball-event = Ann

(2a) kate said sue impressed ann 😲 [astonished-face-emoji]

–> experiencer of astonished-state = author (speaker)

(2b) kate said sue admired ann 😲 [astonished-face-emoji]

–> experiencer of astonished-state = author (speaker)

 

Friday January 24th, 2.15pm – 4pm

Jean-Julien Aucouturier (CNRS/IRCAM)

Henrik Wergelands hus, Room 536. (This presentation will be given via video conferencing.)

Is Schumann a Scam? How Music Tricks Our Brain into Thinking It Is Worthy of Emotions

Music holds tremendous power over our emotions. Through a particularly touching phrase, a forceful chord, or even a single note, musical sounds trigger powerful subjective reactions. For scientists, these strong reactions are vexing facts, because such emotional reactions are typically understood as survival reflexes: our increased heart rate, suddenly sweaty hands, or deeper breathing are responses preparing our organism to fight or flee if, say, we stumble upon a bear in the woods. Stumbling upon music, be it a violin or a flute, a C or a C#, hardly seems a similar matter of life or death. This talk will review recent scientific experiments from the fields of musicology, psychology, and neuroscience that dissect musical sounds to see what exactly makes our brains deem them worthy of such strong reactions – perhaps because they mimic the dissonant roar of a predator, or reproduce the accents and prosody of emotional speech, or the spectral patterns of certain environmental sounds.
Bio: Jean-Julien Aucouturier is a CNRS researcher in cognitive science at IRCAM (Institut de Recherche et Coordination Acoustique/Musique) in Paris, where he leads the CREAM music neuroscience lab (http://cream.ircam.fr).
 

Archive

Autumn 2019 Schedule

Spring 2019 Schedule

Published Oct. 23, 2018 1:44 PM - Last modified June 21, 2020 4:40 PM