CLASP
The Centre for Linguistic Theory and Studies in Probability

Latent-Variable Grammars and Natural Language Semantics

Probabilistic grammars are an important model family in natural language processing. They are used to model many problems, most prominently in syntax and semantics. Latent-variable grammars extend vanilla probabilistic grammars with latent variables that inject additional information into the grammar; because these variables are unobserved, such grammars are estimated with learning algorithms for the incomplete-data setting.
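As a rough illustration (not taken from the talk), a minimal sketch of the idea: a vanilla PCFG rule is split into a family of latent-annotated rules, and the observable rule probability is recovered by marginalising out the latent indices. The nonterminal names, split count, and probabilities below are hypothetical toy values, chosen only to show why estimation happens with incomplete data.

```python
# Toy sketch: the rule NP -> DT NN is split into latent-annotated rules
# NP_i -> DT_j NN_k. The latent indices i, j, k are never observed in a
# treebank, which is why learning uses incomplete-data algorithms (e.g. EM).

from itertools import product

N_SPLITS = 2  # assumed number of latent subcategories per nonterminal

# Hypothetical latent-rule probabilities P(NP_i -> DT_j NN_k | NP_i),
# initialised uniformly here.
latent_rule_prob = {
    (i, j, k): 1.0 / (N_SPLITS ** 2)
    for i, j, k in product(range(N_SPLITS), repeat=3)
}

def observed_rule_prob(parent_posterior):
    """Marginalise out the latent annotations to get the probability of the
    observable rule NP -> DT NN, given a posterior over the parent's latent
    subcategory (the kind of quantity an E-step would supply)."""
    return sum(
        parent_posterior[i] * latent_rule_prob[(i, j, k)]
        for i, j, k in product(range(N_SPLITS), repeat=3)
    )

# With a uniform posterior over the parent's latent states, the toy grammar
# assigns the observable rule probability 1.0, as expected.
print(observed_rule_prob({0: 0.5, 1: 0.5}))
```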

In this talk, I will discuss work aimed at the development of four theoretically motivated algorithms for the estimation of latent-variable grammars. I will discuss how we applied them to syntactic parsing and to more semantically oriented problems such as machine translation, conversation modeling in online forums, and question answering.

Lecturer: Shay Cohen is a Chancellor’s Fellow at the Institute for Language, Cognition and Computation, University of Edinburgh.