Eye-tracking for Linguistic Research

Course time: Monday/Thursday, 11:00 AM-12:50 PM
Location: JSB 213
Description:

This course is an introduction to eye tracking as a tool for understanding how language works. The instructor's background is in phonetics and speech perception, so there will necessarily be an emphasis on visual-world eye-tracking research in spoken word recognition. However, attention (pun intended) will also be given to sentence processing and reading research using eye tracking. Students completing this course will have the background needed to read, understand, and evaluate the eye-tracking literature, as well as the foundation needed to design, implement, and analyze original eye-tracking research of their own. We will read and discuss foundational literature applying eye tracking to linguistic questions, along with a range of more recent papers covering both methodological and theoretical concerns. Finally, we will implement, collect data for, and analyze a visual-world eye-tracking experiment. The class will be most accessible if you have taken an introduction to psycholinguistics, have had some exposure to phonetics, and have at least some familiarity with R for statistical data analysis (e.g., having completed the equivalent of this tutorial: http://www.cyclismo.org/tutorial/R/).
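
To give a concrete sense of the R familiarity the course assumes, here is a minimal sketch of how visual-world fixation data are commonly summarized as proportions of looks over time. It uses simulated data and invented column names (subject, trial, time_ms, object) rather than anything from a real experiment, and assumes the dplyr and ggplot2 packages are available:

# Sketch only: simulated data standing in for exported fixation samples,
# one row per participant x trial x 20 ms time bin, coding the fixated object.
library(dplyr)
library(ggplot2)

fixations <- expand.grid(
  subject = paste0("S", 1:4),
  trial   = 1:10,
  time_ms = seq(0, 1000, by = 20)
)
fixations$object <- sample(
  c("target", "competitor", "distractor"),
  size = nrow(fixations), replace = TRUE,
  prob = c(0.5, 0.3, 0.2)
)

# Proportion of looks to each object type within each time bin
# (40 = 4 subjects x 10 trials, each contributing one sample per bin).
look_props <- fixations %>%
  group_by(time_ms, object) %>%
  summarise(prop = n() / 40, .groups = "drop")

# The classic visual-world plot: fixation proportions over time.
ggplot(look_props, aes(time_ms, prop, colour = object)) +
  geom_line() +
  labs(x = "Time from word onset (ms)", y = "Proportion of fixations")

Summaries like this are the starting point for the analysis approaches discussed in the readings (e.g., Barr, 2008; Mirman, Dixon, & Magnuson, 2008).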

Readings:

Allopenna, P. D., Magnuson, J. S., & Tanenhaus, M. K. (1998). Tracking the time course of spoken word recognition using eye movements: Evidence for continuous mapping models. Journal of Memory and Language, 38, 419-439.

Altmann, G. T. M. (2011). Language can mediate eye movement control within 100 milliseconds, regardless of whether there is anything to move the eyes to. Acta Psychologica, 137, 190-200.

Altmann, G. T. M. (2011). The mediation of eye movements by spoken language. In S. P. Liversedge, I. D. Gilchrist, & S. Everling (Eds.), The Oxford Handbook of Eye Movements (pp. 979-1003). Oxford: Oxford University Press.

Barr, D. J. (2008). Analyzing "visual world" eyetracking data using multilevel logistic regression. Journal of Memory and Language, 59(4), 457-474.

Barr, D. J., Gann, T. M., & Pierce, R. S. (2011). Anticipatory baseline effects and information integration in visual world studies. Acta Psychologica, 137, 201-207.

Beddor, P. S., McGowan, K. B., Boland, J. E., Coetzee, A. W., & Brasher, A. (2013). The perceptual time course of coarticulation. Journal of the Acoustical Society of America, 133, 2350-2366.

Dahan, D., Drucker, S. J., & Scarborough, R. A. (2008). Talker adaptation in speech perception: Adjusting the signal or the representations? Cognition, 108(3), 710-718.

Dahan, D., & Gaskell, G. M. (2007). The temporal dynamics of ambiguity resolution: Evidence from spoken-word recognition. Journal of Memory and Language, 57, 483-501.

Dahan, D., Magnuson, J. S., & Tanenhaus, M. K. (2001). Time course of frequency effects in spoken-word recognition: Evidence from eye movements. Cognitive Psychology, 42, 317-367.

Dahan, D., Magnuson, J. S., Tanenhaus, M. K., & Hogan, E. M. (2001). Subcategorical mismatches and the time course of lexical access: Evidence for lexical competition. Language and Cognitive Processes, 16, 507-534.

Dahan, D., & Tanenhaus, M. K. (2005). Looking at the rope when looking for the snake: Conceptually mediated eye movements during spoken-word recognition. Psychonomic Bulletin & Review, 12, 453-459.

Degen, J., & Tanenhaus, M. K. (2016). Availability of alternatives and the processing of scalar implicatures: A visual world eye-tracking study. Cognitive Science, 40(1), 172-201.

Eberhard, K. M., Spivey-Knowlton, M. J., Sedivy, J. C., & Tanenhaus, M. K. (1995). Eye movements as a window into real-time spoken language comprehension in natural contexts. Journal of Psycholinguistic Research, 24(6), 409-436.

Huettig, F., & Altmann, G. T. (2011). Looking at anything that is green when hearing “frog”: how object surface colour and stored object colour knowledge influence language-mediated overt attention. Quarterly Journal of Experimental Psychology, 64, 122-145.

Huettig, F., & McQueen, J. M. (2007). The tug of war between phonological, semantic and shape information in language-mediated visual search. Journal of Memory and Language, 57, 460-482.

Huettig, F., Rommers, J., & Meyer, A. S. (2011). Using the visual world paradigm to study language processing: A review and critical evaluation. Acta Psychologica, 137, 151-171.

Magnuson, J. S., Dixon, J. A., Tanenhaus, M. K., & Aslin, R. N. (2007). The dynamics of lexical competition during spoken word recognition. Cognitive Science, 31, 133-156.

McMurray, B., Clayards, M. A., Tanenhaus, M. K., & Aslin, R. N. (2008). Tracking the time course of phonetic cue integration during spoken word recognition. Psychonomic Bulletin & Review, 15, 1064-1071.

McMurray, B., Tanenhaus, M. K., & Aslin, R. N. (2009). Within-category VOT affects recovery from "lexical" garden paths: Evidence against phoneme-level inhibition. Journal of Memory and Language, 60, 65-91.

McQueen, J. M., & Viebahn, M. C. (2007). Tracking recognition of spoken words by tracking looks to printed words. Quarterly Journal of Experimental Psychology, 60, 661-671.

Mirman, D., Dixon, J. A., & Magnuson, J. S. (2008). Statistical and computational models of the visual world paradigm: Growth curves and individual differences. Journal of Memory and Language, 59, 475-494.

Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372-422.

Reinisch, E., Jesse, A., & McQueen, J. M. (2010). Early use of phonetic information in spoken word recognition: Lexical stress drives eye movements immediately. Quarterly Journal of Experimental Psychology, 63, 772-783.

Salverda, A. P., Brown, M., & Tanenhaus, M. K. (2011). A goal-based perspective on eye movements in visual world studies. Acta Psychologica, 137, 172-180.

Salverda, A. P., Dahan, D., Tanenhaus, M. K., Crosswhite, K., Masharov, M., & McDonough, J. (2007). Effects of prosodically modulated sub-phonetic variation on lexical competition. Cognition, 105, 466-476.

Shatzman, K. B., & McQueen, J. M. (2006). Segment duration as a cue to word boundaries in spoken-word recognition. Perception & Psychophysics, 68, 1-16.

Shen, J., Deutsch, D., & Rayner, K. (2013). On-line perception of Mandarin Tones 2 and 3: Evidence from eye movements. Journal of the Acoustical Society of America, 133, 3016-3029.

Tanenhaus, M. K. (2007). Spoken language comprehension: Insights from eye movements.

Tanenhaus, M. K. (2007). Eye movements and spoken language processing. Eye movements: A window on mind and brain, 309-326.