TTR at the SPA: Relating type-theoretical semantics to neural semantic pointers

Presented by: Staffan Larsson, University of Gothenburg
Date: November 15, 2023

Abstract: The goal of the work presented here is to provide a hybrid of formal and neural semantics for natural language. To this end, we consider how the kind of formal semantic objects used in TTR (a theory of types with records, Cooper, 2023) might be related to the vector representations used in Eliasmith (2013). An advantage of doing this is that it would immediately give us a neural representation for TTR objects, as Eliasmith relates vectors to neural activity in his semantic pointer architecture (SPA). This would provide an alternative, based on convolution, to the suggestions made by Cooper (2019a), which rely on the phasing of neural activity. The project seems promising, since all complex TTR objects are constructed from labelled sets (essentially sets of ordered pairs consisting of labels and values), which might be seen as corresponding to the representation of structured objects that Eliasmith achieves using superposition and circular convolution.
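As a rough illustration of the binding mechanism the abstract refers to, the sketch below encodes a labelled set as a superposition of label-value pairs bound by circular convolution, in the style of the holographic reduced representations underlying the SPA. The labels and values (e.g. "x", "a_dog"), the vector dimension, and the helper names are illustrative assumptions, not taken from the talk or from Eliasmith's implementation.

```python
import numpy as np

def bind(a, b):
    # Circular convolution, computed in the Fourier domain.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(a, b):
    # Circular correlation: approximate unbinding with the inverse of a.
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

dim = 512
rng = np.random.default_rng(0)
unit = lambda v: v / np.linalg.norm(v)

# Random unit vectors standing in for labels and values of a record-like object.
labels = {name: unit(rng.standard_normal(dim)) for name in ("x", "e")}
values = {name: unit(rng.standard_normal(dim)) for name in ("a_dog", "run(x)")}

# A labelled set {(x, a_dog), (e, run(x))} as a superposition of bound pairs.
record = bind(labels["x"], values["a_dog"]) + bind(labels["e"], values["run(x)"])

# Query the record at label "x": the result is noisily similar to a_dog.
probe = unit(unbind(labels["x"], record))
print({name: round(float(np.dot(probe, v)), 3) for name, v in values.items()})
```

Running the sketch, the cosine similarity between the probe and the vector for "a_dog" should be markedly higher than for the other value, which is the sense in which convolution-based binding recovers the structure of the labelled set.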

Location: Attend in person at C250 or via Zoom, https://gu-se.zoom.us/j/66299274809?pwd=Yjc2ejc2VVhraXVJMmhWeWtOQ2NuUT09

Time: 13:15-15:00