3rd Conference
The Evolution of Language
April 3rd-6th, 2000

Abstracts


The evolution of communicative interaction systems:
A formal semantics perspective

Jonathan Ginzburg

Computational Linguistics Programme
Dept. of Philosophy
King's College, London
The Strand, London WC2R 2LS, UK
jonathan.ginzburg@kcl.ac.uk

Linguistics Programme
Dept. of English
Hebrew University of Jerusalem
Mount Scopus Jerusalem 91905, Israel

Introduction

Despite significant advances over the last 25 years in the study and understanding of the semantics and pragmatics of natural languages (NLs) using tools from formal logic and theoretical computer science (for a survey see e.g. Lappin 1996), such work has to date had little impact on the study of the evolution of language. This is not surprising, given that such work has focused on classifying semantic properties in terms designed for disembodied formal languages, in which notions of interaction and context dependence had little place. However, in recent years, in the wake of the development of the dynamic perspective on meaning, there has been significant interest in applying formal semantic techniques to the study of human dialogue. As a result, various linguistic phenomena have emerged whose description and analysis require a perspective on language which, in a nutshell, views its use in interaction as basic. These include collaborative utterances (Clark 1996), the Turn Taking Puzzle (Ginzburg 1997), and cross-speaker anaphora (Dekker 1997).

In this paper, I will argue that this perspective, and the techniques emerging from the formal semantic study of the structure of human cognitive states utilized in communicative interaction, shed new light on the issue of discontinuity between the human communicative interaction system (CIS) and the CISs of other species. I will propose two measures of complexity for a CIS. One is based on attitudinal complexity: the complexity indicated by the type of cognitive attitude predicates structuring the information states apparently utilized, inter alia, in communicative interaction. The other is based on the complexity of the messages employed in the CIS, a measure I will dub contextual complexity. I will suggest that both measures provide at least prima facie evidence for continuity between the CISs of primates and those of humans. More importantly, this way of looking at CISs offers prospects for locating criterial dividing points between the currently existing adult NL-based CIS and simpler human-neonate, proto-human, cetacean, and primate systems.

Cognitive states and attitudinal complexity

What is the structure of the cognitive states used in human conversational interaction? Simplifying somewhat, a tentative answer to this question which has been emerging in recent formal semantic and pragmatic work on dialogue (see e.g. Ginzburg 1996, 1997, 1998; Traum et al. 1999) is to model these as feature structures of the form:

(1)  [ facts:        set of facts
       qud:          partially ordered set of questions
       latest-move:  dialogue move ]

Here facts represents the knowledge that accumulates in the context during a conversation; qud is a partially ordered set of questions, which represents the issues under discussion at a particular point in a dialogue; and latest-move represents the most recent conversational move undertaken in the dialogue. This cognitive architecture allows one to view conversational interaction dynamically, as a sequence of cognitive states which are updated as a consequence of dialogue moves. These include queries, assertions, and various moves such as acknowledgements, corrections, and clarifications that relate to "metalinguistic" interaction about the grounding process of utterances: the feedback which conversationalists provide each other about whether an utterance has been understood or requires clarification (Clark 1996, Traum 1994, Ginzburg 1998). States of this kind can also be used to explain linguistic phenomena such as anaphora and ellipsis resolution possibilities in dialogue.
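The update dynamics just described can be sketched as a small program. This is a hypothetical illustration of the general idea, not Ginzburg's formalism: the class names, the encoding of moves as tuples, and the pairing of an assertion with the question it addresses are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class InfoState:
    """A participant's cognitive state, loosely after the feature structure in (1)."""
    facts: set = field(default_factory=set)   # knowledge accumulated in the context
    qud: list = field(default_factory=list)   # questions under discussion, topmost first
    latest_move: tuple = None                 # most recent conversational move

    def update(self, move):
        """Update the state as a consequence of a dialogue move."""
        kind, content = move
        if kind == "query":
            self.qud.insert(0, content)       # querying raises an issue for discussion
        elif kind == "assert":
            prop, about = content             # a proposition and the question it addresses
            self.facts.add(prop)              # the asserted fact enters the context
            if about in self.qud:
                self.qud.remove(about)        # the addressed issue is resolved
        self.latest_move = move

# A two-move exchange: a query raises an issue, an assertion resolves it.
s = InfoState()
s.update(("query", "who left?"))
s.update(("assert", ("Jo left", "who left?")))
```

After the second move, "Jo left" is among the facts and the qud is empty again, mirroring the way dialogue moves drive the state sequence described above.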

What underlies cognitive states of the form sketched in (1) is the fact that human cognitive states can be structured in terms of a number of distinct attitudes to the external environment, including minimally belief, wonder, and plan. Here belief is the familiar attitude predicate relating an agent to a proposition, which, if true, provides some descriptive condition on an external situation. wonder is the attitude which relates an agent to a question, that semantic unit which encapsulates in a consistent way mutually exclusive ways of describing an external situation (see refs in footnote 3), and plan is the attitude predicate relating an agent to a sequence of actions (identifiable for current purposes with a temporally ordered sequence of (propositional) state descriptions). I dub a CIS which exploits states as in (1) a Discursive Informing System (DIS): agents who communicate within a DIS can inform each other of facts but also discuss questions and dispute claims.

How might one explain the evolutionary chain that has led to the emergence of DISs? One possible answer is to view the complexity of CISs as correlating with the abstractness or defeasibility of the information processed by agents in a given CIS. At the lowest end one would locate Action Registration Systems (ARSs). In such a system information is entirely encapsulated in the act performed by the utterer, such as an act of greeting or threatening. Such systems are well known even among non-primate animals such as geese or wolves, as documented already in the early ethological literature (e.g. Lorenz). The basic architecture required for communication using such states is the ability to correlate message tokens with a discrimination of situations into various distinct types.

A level above ARSs are Pure Informing Systems (PUSs). In such a system the acts available involve messages which classify the situation in which utterer and addressee find themselves, most prototypically classifying it as dangerous in some way. Such a system involves one level of abstractness, in that the reliability of the information depends, of course, on the agent providing it. This introduces the potential for providing incorrect information and, consequently, for inconsistency between the information arising in the message and information arriving from a distinct source. For an agent in a PUS it seems apposite to attribute a cognitive attitude of confusion: if a situation is encountered which is apparently classifiable in incompatible ways, the animal behaves as if something is very wrong (its cognitive system (temporarily) "crashes", various symptoms of panic are evinced, etc.).

A significant increase in the complexity of an information state is one that has evolved from merely encoding confusion to one that manifests wondering: the ability to consistently represent the existence of incompatible ways of categorizing a situation, which, as noted above, is what a question represents. Wondering is the key to the evolution of a notion of reliability of a signal: suppose a signal arrives on the basis of which the hearer is supposed to classify the context situation with type T1, but the hearer also has information requiring her to classify the context with type T2, incompatible with T1. If her information state is advanced enough to encode wondering, the hearer can react rationally and weigh whether to accept T1 or to reject/ignore it. From the information provided by e.g. Cheney and Seyfarth (Cheney & Seyfarth 1988), we can conclude, for instance, that vervets' information states encode some notion of wondering. Similarly for chimpanzees, given evidence for their ability to reason about deception (e.g. de Waal 1986). A CIS where the agent can wonder will be called a Pondered Informing System (PIS).
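The contrast between confusion and wondering can be made concrete in a toy simulation. This is a sketch under stated assumptions: the agent classes, the signal labels, and the encoding of incompatibility as a set of pairs are all invented for illustration, and a raised exception merely stands in for the behavioural "crash" described above.

```python
# Pairs of situation types that cannot both hold (illustrative labels).
INCOMPATIBLE = {("safe", "eagle-danger"), ("eagle-danger", "safe")}

class PUSAgent:
    """Pure informing: classifies the situation but cannot represent a conflict."""
    def __init__(self):
        self.classification = None

    def receive(self, signal):
        if self.classification and (self.classification, signal) in INCOMPATIBLE:
            # Incompatible classifications cannot be held consistently: "confusion".
            raise RuntimeError("confusion: cognitive system crashes")
        self.classification = signal

class PISAgent(PUSAgent):
    """Pondered informing: incompatible options are held consistently as a question."""
    def __init__(self):
        super().__init__()
        self.wondering = None

    def receive(self, signal):
        if self.classification and (self.classification, signal) in INCOMPATIBLE:
            # Instead of crashing, represent the alternatives as a question
            # and leave room to weigh whether to accept or reject the signal.
            self.wondering = {self.classification, signal}
        else:
            self.classification = signal

# The same conflicting signals: the PUS agent crashes, the PIS agent wonders.
pus = PUSAgent()
pus.receive("safe")
try:
    pus.receive("eagle-danger")
    pus_crashed = False
except RuntimeError:
    pus_crashed = True

pis = PISAgent()
pis.receive("safe")
pis.receive("eagle-danger")
```

The design choice of representing wondering as a set of mutually exclusive types follows the text's characterization of a question as a consistent encapsulation of incompatible ways of describing a situation.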

Qualitatively, the difference between a PIS and a DIS is that in the latter the agent can go further than simply wondering: they can actually externalize their wondering and engage in discussion of a given question, e.g. of whether the information provided by an act of informing is correct or not. A parameter which can distinguish the complexity of two DISs is the cardinality of the qud (questions under discussion) attribute they carry: it is straightforward to demonstrate from conversations that the cardinality of qud for adult humans can be higher than 3. Agents with simpler DISs, such as human neonates, seem to be limited to a qud of cardinality at most 2.

Evaluating a CIS by message complexity

Attitudinal complexity is, hopefully, a measure which can provide some help in characterizing the complexity of CISs. The increase in complexity in the hierarchy sketched above (ARS < PUS < PIS < DIS) requires an increase in cognitive complexity and equally involves enhancements that are clearly adaptive. Attitudinal complexity is, nonetheless, only one component in evaluating a CIS. An additional component, which is at least partly independent, is a measure of the complexity of the messages communicated within a given CIS, taking as a starting point the increasingly detailed descriptions achieved by semantic analysis of the human CIS.

The measure of message complexity I develop in the extended version of this paper is one I call contextual complexity. It is motivated by the need to (a) define message complexity in a way which does not presuppose syntactic complexity (cf. the suggestive evidence for syntactic complexity in relatively `simple' CISs such as that of gibbons (Ujhelyi 1998)), and (b) integrate contextual factors into the calculation of message import. In linguistic semantics an influential way of thinking of meanings, deriving originally from the work of David Kaplan and from situation semantics (e.g. Kaplan 1989, Barwise and Perry 1983), is as relations between an utterance situation and potentially other situations, in which certain parameters (the contextually dependent parameters) get instantiated and the content communicated. With this view of meanings, we can define a measure of complexity on meanings, correlating increased complexity with an increase in the number of situations used in calculating the meaning. A homo-situational message system involves merely contextual relativization to the utterance situation. This is exemplified in the following messages:

(2) hi: given spkr, hearer, spkr-greets-hearer
    danger: given situation, danger(situation)
    eagle: given situation, danger-eagle(situation)
    what's up: given situation, λP.P(situation)

In a bi-situational message system there is one additional situation relative to which contextual relativization is defined:

(3) Pogo is absent: given an utterance situation u and a reference to p in a second situation s, absent(p) is true of u

In a dynamic-situational message system, relativization can crucially involve the previous utterance situation. In other words, calculating the values of a meaning requires a buffer which contains the previous utterance situation. Such a system is needed, for instance, to handle linguistic phenomena such as anaphora and such feedback moves as clarification and correction.
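The three levels of contextual complexity can be sketched by treating meanings as functions whose parameters are the situations they consult, so that complexity is simply the number of situation arguments. This is a minimal illustration, not the measure developed in the extended paper; the function names, the dictionary encoding of situations, and the argument-counting trick are all assumptions of the sketch.

```python
# Meanings as relations between an utterance situation and, potentially,
# other situations (after Kaplan and situation semantics).

def hi(utt):
    """Homo-situational: relativized only to the utterance situation, as in (2)."""
    return f"{utt['speaker']} greets {utt['hearer']}"

def pogo_is_absent(utt, resource):
    """Bi-situational: one additional situation supplies the referent, as in (3)."""
    pogo = resource["referents"]["Pogo"]
    return f"absent({pogo}) holds in {utt['location']}"

def clarify(utt, prev_utt):
    """Dynamic-situational: needs a buffer holding the previous utterance situation."""
    return f"clarification request about: {prev_utt['content']}"

def contextual_complexity(meaning):
    """Complexity = number of situation parameters the meaning depends on."""
    return meaning.__code__.co_argcount
```

On this toy measure, homo-situational messages score 1 while bi- and dynamic-situational ones score 2; what distinguishes the dynamic case is that its second situation must be the previous utterance situation, retained in a buffer.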

References

Barwise, J. and Perry, J.: 1983, Situations and Attitudes, MIT Press, Cambridge

Cheney, D. and Seyfarth, R.: 1988, "Assessment of meaning and the detection of unreliable signals by vervet monkeys", Animal Behaviour, 36, 477-486

Clark, H.: 1996, Using Language, Cambridge University Press, Cambridge

de Waal, F.: 1986, "Deception in natural communication of chimpanzees" In: R. Mitchell and N. Thompson (eds.), Deception: Perspectives on Human and Nonhuman Deceit, SUNY Press, Albany

Dekker, P.: 1997, "First Order Information Exchange", In: G. Jaeger and A. Benz (eds.), Proceedings of MunDial 97 (Technical Report 97-106), Universitaet Muenchen Centrum fuer Informations- und Sprachverarbeitung, Muenchen

Ginzburg, J.: 1996, "Interrogatives: Questions, Facts, and Dialogue", In: S. Lappin (ed.), Handbook of Contemporary Semantic Theory, Blackwell, Oxford

Ginzburg, J.: 1997, "Structural Mismatch in Dialogue", In: G. Jaeger and A. Benz (eds.), Proceedings of MunDial 97 (Technical Report 97-106), pp 59-80, Universitaet Muenchen Centrum fuer Informations- und Sprachverarbeitung, Muenchen

Ginzburg, J.: 1998, "Clarifying Utterances", In: J. Hulstijn and A. Nijholt (eds.), Proceedings of TwenDial 98, 13th Twente Workshop on Language Technology, pp 11-30, Twente University, Twente

Groenendijk, J. and Stokhof, M.: 1997, "Questions", In: J. van Benthem and A. ter Meulen (eds.), Handbook of Logic and Language, North Holland, Amsterdam

Kaplan, D.: 1989, "Demonstratives: An Essay on the Semantics, Logic, Metaphysics, and Epistemology of Demonstratives and Other Indexicals", In: J. Almog et al (eds.), Themes from Kaplan, Oxford University Press, New York

Lappin, S. (ed.): 1996, Handbook of Contemporary Semantic Theory, Blackwell, Oxford

Traum, D.: 1994, A Computational Theory of Grounding in Natural Language Conversations, Unpublished PhD thesis, University of Rochester

Traum, D., Bos, J., Cooper, R., Larsson, S., Lewin, I., Matheson, C. and Poesio, M.: 1999, "A model of dialogue moves and information state revision", TRINDI Deliverable 2.1, University of Gothenburg. Available from http://www.ling.gu.se/research/projects/trindi

Ujhelyi, M.: 1998, "Long-call structure in apes as a precursor for language", In: J. Hurford, M. Studdert-Kennedy, and C. Knight (eds.), Approaches to the Evolution of Language, Cambridge University Press, Cambridge

Worden, R.: 1998, "The Evolution of Language from Social Intelligence", In: J. Hurford, M. Studdert-Kennedy, and C. Knight (eds.), Approaches to the Evolution of Language, Cambridge University Press, Cambridge


 Conference site: http://www.infres.enst.fr/confs/evolang/