Computational Phonology

Course time: Monday/Thursday 11:00 AM-12:50 PM
Location: JSB 203
Description:

This course offers an overview of learnability in constraint-based phonology (classical and stochastic Optimality Theory, classical and stochastic Harmonic Grammar, and Maximum Entropy Grammars). The language learning task is broken down into a number of specific, explicitly stated learning problems. For each problem, we investigate and compare learning algorithms drawn from both the discrete/combinatorial and the numerical/probabilistic approaches pursued in the current literature. The focus is on analytical guarantees rather than on simulation results alone.
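
To fix ideas about the frameworks just listed, the sketch below (an illustration only: the constraint names, violation counts, weights, and ranking are invented and not drawn from the lecture notes) shows how classical OT, Harmonic Grammar, and a Maximum Entropy grammar evaluate the same two candidates. OT compares violation vectors lexicographically under a strict ranking, HG maximizes a weighted harmony score, and MaxEnt converts harmony scores into a probability distribution over candidates.

```python
import math

# Hypothetical constraints and candidates for a single input; the names,
# violation counts, and weights below are invented for illustration.
constraints = ["Markedness", "Faithfulness"]
violations = {            # candidate -> violation counts, one per constraint
    "candidate_a": [0, 1],
    "candidate_b": [2, 0],
}

def ot_winner(ranking):
    """Classical OT: compare violation vectors lexicographically,
    with columns ordered by the strict ranking of constraints."""
    order = [constraints.index(c) for c in ranking]
    return min(violations, key=lambda cand: [violations[cand][i] for i in order])

def harmony(cand, weights):
    """Harmonic Grammar: harmony is the negated weighted sum of violations."""
    return -sum(w * v for w, v in zip(weights, violations[cand]))

def hg_winner(weights):
    return max(violations, key=lambda cand: harmony(cand, weights))

def maxent_distribution(weights):
    """Maximum Entropy grammar: probabilities proportional to exp(harmony)."""
    scores = {cand: math.exp(harmony(cand, weights)) for cand in violations}
    z = sum(scores.values())
    return {cand: s / z for cand, s in scores.items()}

print(ot_winner(["Markedness", "Faithfulness"]))  # candidate_a
print(hg_winner([2.0, 1.0]))                      # candidate_a (harmony -1 vs -4)
print(maxent_distribution([2.0, 1.0]))            # most probability mass on candidate_a
```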

Week 1 will focus on the problem of efficiently learning a grammar consistent with a set of linguistic data. We will look at batch and error-driven algorithms for ranking and weighting, exploring the learnability implications of different modes of constraint interaction. Week 2 will focus on the problem of learning a restrictive grammar. We will establish the intractability of the problem and explore various ways to cope with it. Week 3 will focus on the problem of learning underlying forms. We will address the efficient exploration of lexicons and discuss the classical inconsistency detection method. Week 4 will focus on probabilistic approaches to learning variation and gradience. We will compare stochastic OT/HG and MaxEnt and explore their implications for the problem of learning hidden structure.
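
For concreteness, here is a minimal sketch (in Python, with a hypothetical constraint set, violation counts, and learning rate that are not taken from the course materials) of one error-driven weighting scheme of the kind covered in Week 1: a perceptron-style update for Harmonic Grammar that, on each error, raises the weights of constraints favoring the observed winner and lowers the weights of constraints favoring the grammar's wrongly preferred candidate.

```python
def hg_update(weights, winner_violations, loser_violations, rate=0.1):
    """One perceptron-style error-driven update for Harmonic Grammar weights.

    `winner_violations` are the violation counts of the observed (correct)
    form; `loser_violations` are those of the form the current grammar
    wrongly prefers. Constraints that favor the winner (fewer violations
    on it) have their weights raised; constraints favoring the loser are
    lowered.
    """
    return [
        w + rate * (lose - win)
        for w, win, lose in zip(weights, winner_violations, loser_violations)
    ]

# Hypothetical example: two constraints, current weights [1.0, 1.0].
# The observed winner violates constraint 2 once; the grammar's wrong
# output violates constraint 1 twice.
weights = [1.0, 1.0]
weights = hg_update(weights, winner_violations=[0, 1], loser_violations=[2, 0])
print(weights)  # [1.2, 0.9]: constraint 1 strengthened, constraint 2 weakened
```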

The course is accompanied by draft lecture notes (which will be extended and updated as we proceed), available at https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsdGRvbWFpbnxtY...