CLASP
The Centre for Linguistic Theory and Studies in Probability

Language: The Tool for Interaction -- Surfing Uncertainty Together

With established recognition of the endemic context-relativity of language, it is now generally accepted that both parsing and production involve incremental context-relative decisions, requiring the concepts of both evolving contents and evolving contexts. Researchers across semantics, pragmatics, psycholinguistics, and computational linguistics are duly turning to the challenge of modelling language in terms compatible with such incrementality. Yet formal models of language remain largely grounded in the static terms of licensing pairings of sentential strings with interpretations, grounded in concepts such as compositionality, with little or no reflection of a time-linear process of information growth.

In this talk, I start by showing why linguists cannot avoid the challenge of defining grammar formalisms that reflect the dynamics of conversational dialogue, and how, in order to achieve this, every aspect of linguistic knowledge needs to be recast as procedures for on-line, incremental, and predictive word-by-word understanding/production. I shall then briefly sketch the action-based Dynamic Syntax (DS) system to demonstrate its explanatory potential, by modelling what have been taken as canonical exemplars of semantics-independent syntactic processes, which in DS are all expressed in terms of incremental parsing/generation actions. I will show in passing how the resulting system, despite lacking any conventional notion of syntax, nonetheless has the power to express both universal structural constraints and cross-language variability. Part of this will include the Directed Acyclic Graph characterisation of context as developing in lockstep with the evolving yet revisable content, demonstrating the system-internal potential for self/other-correction. The dynamics of conversational dialogue interactions will then emerge as the immediate consequence of this perspective on language; and I will briefly illustrate how this potential for interaction underpins all types of language-internal licensing constraint: syntactic, semantic, morphosyntactic, and phonological.

I shall then turn to setting this perspective within the Predictive Processing (PP) model of cognition (Clark 2016), whose architectural properties the DS concept of language matches almost point by point. Like perception in the PP model, the DS grammar is a “fundamentally action-oriented” set of procedures, grounded in predictive processing resources shared by speakers (action) and hearers (perception) alike and “executed using the same basic computational strategy”, leading to effects of interactive coordination without any need to invoke mind-reading or propositional inference. The result is that linguistic processing, perception, action, and thought are predicted to be “continuously intermingled”, yielding representational updates “tailored to good enough online controls rather than aiming for rich mirroring”. Such updates are accomplished via a strong version of affordance competition, since the brain “continuously computes multiple probabilistically inflected possibilities for action” in a cost–benefit balancing dynamic, with possibilities progressively winnowed down, allowing for possible revision, to yield at least one output in any successful outcome. To this set of characteristics (Clark 2016, p. 251), we have only to add the potential for interaction which such a language system predicts as default, and a wholly different perspective on language evolution opens up. Language can now be seen as an emergent and evolving system with manifest potential for consolidating cross-individual interactions, hence group effects, without ever having to invoke high-level inferences as external, “designer”-imposed motivation for such consolidation, a dynamic for which language change already provides robust motivation.