Super Linguistics Colloquium Series
Spring 2020 Schedule
Note: Unless indicated otherwise, talks will take place on Zoom. If you would like to attend a talk, please send an email to Pritty Patel-Grosz.
Abstract: Do nonhuman animals have language? In humans, language is prominently manifested through vocal communication (i.e., speech). However, while vocal communication is ubiquitous across the animal kingdom, studies to date have found only elementary parallels to speech in nonhuman animals. These modest linguistic capacities of other species have fortified our belief that language is uniquely human. But have we really tested this uniqueness claim? By adopting methods that are commonly used in bioacoustics, I demonstrate that, surprisingly, a truly impartial comparison between human speech and other animal vocalizations has not yet been conducted. Oddly, studying human speech using the same methods used to study other species' vocalizations is actually expected to provide us with no evidence for human uniqueness.
Wednesday 6th May, at 5.15pm-7pm
Emoji Resolution: Indexicality and Anaphoricity 🤔
Abstract: Emojis are an emerging object of study in linguistics and beyond (Bai et al. 2019), and it has been suggested that they are digital counterparts of speech-accompanying gestures in computer-mediated communication (Gawne & McCulloch 2019, Pierini 2019). In this talk, we focus on two subsets of emojis, namely non-face emojis that denote activities (such as the ‘basketball’ or ‘soccer ball’ emojis; henceforth ‘activity emojis’), and affective emojis, which include face emojis (such as the ‘grinning face’ and the ‘angry face’) as well as a set of affective non-face emojis (such as ‘thumbs up’ and ‘heart’). We argue that both the activity emojis and the affective emojis are typically anchored to an individual with a role such as Agent or Experiencer. Moreover, we provide evidence for a view where activity emojis are anaphoric (and often exhibit properties similar to 3rd person pronouns), while affective emojis exhibit 1st-person indexicality. The central paradigm is given in (1ab)-(2ab), where (1ab) exhibit 3rd-person anaphoricity, whereas (2ab) exhibit 1st-person indexicality. We propose a formal semantic analysis, where activity emojis denote separate discourse units, connected to the accompanying text via salient discourse relations, whereas affective emojis are expressive modifiers (similar to adverbs like ‘damn’ and interjections like ‘oh my’).
(1a) kate said sue impressed ann 🏀 [basketball-emoji]
–> agent of basketball-event = Sue
(1b) kate said sue admired ann 🏀 [basketball-emoji]
–> agent of basketball-event = Ann
(2a) kate said sue impressed ann 😲 [astonished-face-emoji]
–> experiencer of astonished-state = author (speaker)
(2b) kate said sue admired ann 😲 [astonished-face-emoji]
–> experiencer of astonished-state = author (speaker)
Friday 24th January, at 2.15pm-4pm
Jean-Julien Aucouturier (CNRS/IRCAM)
Henrik Wergelands hus, Room 536. (This presentation will be given via video conferencing.)
Is Schumann a Scam? How Music Tricks Our Brain into Thinking It's Worthy of Emotions