Listen to or read the following collection of Scott’s interviews with leading experts and learn from their studies.
Dr. Alexander Francis
Dr. Alexander Francis is Associate Professor of Speech, Language and Hearing Sciences at Purdue University.
What is your background?
I started getting interested in languages when I had a chance to spend a year in Germany after high school. I had always been intrigued by accents and was good at mimicking them. In college, I loved learning new languages, but I was always more interested in learning a new one than in following the typical college progression of language majors and delving into literature and criticism. For example, while studying abroad in Munich during college I took courses in Hungarian and Georgian. Although I started college at the University of Illinois at Urbana-Champaign as a biochemistry major, I eventually graduated in linguistics, and I’m very grateful to the professors and instructors I had there who got me started in the field. I went to the University of Chicago intending to do a Ph.D. in historical linguistics focusing on the Caucasus. However, I was first required to take a course in phonetics and became fascinated by that field. I was lucky enough to earn a summer graduate research assistantship at Los Alamos National Labs, and ended up studying speech perception by double-majoring in linguistics and psychology. That opened the door to a postdoctoral position in speech and hearing sciences and eventually the job I have now.
I’ve never worked on literacy per se, but I’ve always been interested in what the sounds of a language tell us about how its speakers think about their language. Obviously, with historical languages, the written record is all we have to tell us about their sounds, but even with modern spoken languages the two (writing and speech) interact. I’ve also been interested in phonetic learning — that is, the very first stage in learning the sounds of new languages, the stage in which the brain first learns to recognize that, for example, the [y] sound in German Mütter (English “mothers”) is distinct from the [u] sound in Mutter (English “mother”) — and in fact I wrote my dissertation on phonetic learning. Through working in that subfield I’ve also been introduced to some aspects of first language learning. For example, many researchers in second language speech learning are also interested in questions of how learning to write affects knowledge of language and vice versa, or whether adult learners of new languages use the same mental processes as infants do when learning a new language.
What is your profession?
I started my research career investigating how listeners learn to hear unfamiliar speech sound contrasts in a foreign language. A lot of that work was framed in terms of, “What do listeners pay attention to in the speech signal?” Over time I became interested in studying attention in its own right (it helps that my Ph.D. supervisor is a cognitive psychologist who was primarily interested in attention at the time I started my degree).
In recent years, my research has shifted, and I am currently studying how people cope with interference and distraction from irrelevant sounds (aka “noise”). I’m particularly interested in understanding individual differences: why does one person find it incredibly frustrating to try to work in a noisy office, while another person has no trouble at all (or even likes to work in coffee shops where there’s a constant hubbub in the background)?
The problem of noise is huge for a wide range of people, from those who wear hearing aids or cochlear implants, to adults with brain injuries, to children with learning disabilities. For example, children with autism are often profoundly disturbed by sounds emanating from everyday devices like light fixtures or air conditioning systems that neurotypical individuals learn quickly to ignore. So architects and interior designers are increasingly considering noise abatement and minimization strategies when designing schools for children with special needs.
But, in fact, the connection between noise and language development is much more pervasive than that. Two early studies from the 1970s (Cohen, Glass & Singer, 1973; Bronzaft & McCarthy, 1975) demonstrated that children living on the lower floors of an apartment building (closer to the expressway and exposed to higher levels of noise), and children in classrooms that faced an elevated train track, scored lower on reading tests than children who lived on upper (quieter) floors or whose classrooms were on the quiet side of the building.
What is your experience with language within your profession?
I study how people understand spoken language, so I use linguistic stimuli in my experiments. Also, I write a lot: papers, grant proposals, lectures…
Where can we find out more about your work?
My lab web page, which I try to keep relatively up-to-date, is at https://www.purdue.edu/hhs/slhs/spacelab/. I call my lab the SPACELab, for “Speech Perception and Cognitive Effort” Lab, and also because I am interested in how the spatial relationship between speech and noise affects listeners. So I appreciate a good acronym, too.