Abstracts of the Invited Speakers
Irene Heim
Not yet available
Department of Linguistics and Philosophy
MIT, Massachusetts
Predicate Restriction and Saturation
Bill Ladusaw
At the heart of predicate-argument combination is a calculus of saturation in which
the semantically incomplete predicate is made semantically complete. Definite nominal phrases
fully saturate the predicate. By contrast indefinite determiner phrases and nominal phrases, which
have property contents, can combine with predicates to semantically restrict their denotations
without semantically saturating them. In this paper I will discuss cases in which a restricting
mode of argument composition seems appropriate.
Many languages present constructions in which the nominal head of an argument is morphologically
incorporated into the predicate. Interpreting the incorporated element as restricting the
predicate does not eliminate the possibility of further specifying an individual to saturate the
predicate. Incorporating languages differ from each other in whether such further specification is
allowed. I will argue that the contrast between these languages is due to properties of their
syntax and not enforced by the semantics.
In Maori, a Polynesian language spoken in New Zealand, there are two indefinite articles: ``he''
and ``tetahi''. Though they show similar profiles as indefinites and appear to be in free
variation in many cases, they each have limitations on their distribution that reflect an underlying
semantic distinction between them. ``Tetahi'' may not be used in phrases in the pivot of an
existential sentence and ``he'' may not occur as the external argument of a verb. I argue that the
major features of their distribution follow if we assume that they contrast not in their contents
but in the mode in which they compose with predicates. ``Tetahi'' reflects true saturation, with
the indefinite representing a choice function. ``He'' combines by predicate restriction. Though
the resulting interpretations are generally equivalent, the analysis provides another view of a
specific/nonspecific contrast.
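A minimal sketch of the two modes of composition, with $P$ the predicate, $Q$ the property contributed by the indefinite, and $f$ a choice function (the notation is illustrative, not the paper's own):
\[
\mbox{Restrict}(P,Q) \;=\; \lambda x\,[P(x) \wedge Q(x)]
\qquad
\mbox{Saturate}(P,Q) \;=\; P(f(Q)).
\]
On this sketch, ``he'' composes by restriction, leaving the argument position open, while ``tetahi'' composes by choice-function saturation, closing it off.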
This paper is based upon joint work with Sandra Chung.
Department of Linguistics
UCSC, Santa Cruz
The Average American has 2.3 Children
Francis Jeffry Pelletier
(joint work with Greg Carlson and Thomas Hofweber)
Chomsky and certain of his students and followers think that, by considering certain features of
some NPs, it can be demonstrated that the field of semantics, especially the subfield of formal
semantics, is bankrupt. Hornstein (1984: 58), for example, says:
No one wishes to claim that there are objects that are average men in any meaningful sense. ...
If this is the case, it would appear that several currently influential approaches to the semantics
of natural language are misconceived. For example, programs of the Davidsonian variety ... will
inevitably construe quantification in natural language objectually, and reference as a relation
between a term and an actual object. ... If the above observations are correct, then for many cases,
how meaning is determined has nothing at all to do with notions like ``object'', ``truth'', or
``model''. The key notions are not semantic, but syntactic or pragmatic. The semantic notions turn
out to have no role in generalizations that explain how meaning functions in natural language.
And Chomsky (1995: 29) puts the point:
If I say that one of the things that concerns me is the average man and his foibles, or Joe
Sixpack's priorities, ... does it follow that I believe that the actual world, or some mental model
of mine, is constituted of such entities as the average man, Joe Sixpack, priorities, ...?
and (1986: 45)
One can speak of ``reference'' or ``co-reference'' with some intelligibility if one postulates a
domain of mental objects associated with formal entities of language by a relation with many of the
properties of reference, but all of this is internal to the theory of mental representations; it is
a form of syntax.
Although some researchers have tried to counter this sort of argument by postulating some new
``logical forms'' for sentences of the relevant sort, most semanticists have reacted to such charges
simply by ignoring them ... other than to say that they ``demonstrate a lack of understanding of
formal semantics'' on the part of Chomsky and his followers.
In fact, though, I believe there is a real issue here. Firstly, I think that the few attempts to
give a direct account of the phenomena do not succeed even on their own terms. And secondly, I
think that if the Chomskean charge is not refuted then in fact formal semantics is not a viable
enterprise.
I think formal semantics can be saved, but that it will have to give up one of its background
assumptions: that the semantic role of NPs is either to denote (designate, etc.) something or to be
``quantificational'' in nature. I will try to make this proposal clearer, and will focus on the
case of ``average NPs'' in doing so.
Department of Philosophy
University of Alberta
Syntactic Constraints on Quantifier Scope Alternation
Mark Steedman
Ambiguities arising from alternations of scope in interpretations for
multiply quantified sentences have led to various complications which
have compromised the strong assumptions of syntactic/semantic
transparency and monotonicity underlying the Frege-Montague approach
to the theory of grammar. These include movement at logical form,
related abstraction or storage mechanisms, and proliferating
type-changing operations. The paper examines some interactions of
scope alternation with syntactic phenomena including coordination,
binding, and word-order in Germanic languages. Starting from the
assumption that many expressions that have been treated as generalized
quantifiers are in fact referential expressions, and using Combinatory
Categorial Grammar (CCG) as a grammatical framework, the paper
presents an account of quantifier scope ambiguities according to which
the available readings follow directly from the combinatorics of the
syntactic derivation, without any independent manipulation of logical
form and without recourse to type-changing operations other than
those with independent syntactic motivation.
Division of Informatics
University of Edinburgh
A Logicist Program for Lexical Semantics
Richmond Thomason
I will begin by trying to clear logicism of the bad name that
Frege and Russell gave it. I will situate certain problems in natural
language semantics with respect to larger trends in logicism, and will
claim that the essence of a good logicist program is a match between a
suitable logical formalism and a target domain.
I will propose the following logicist program with respect to
lexical semantics.
To formalize the dependencies of the meanings of
complex words (where the complexity at stake is that of
derivational morphology) on the meanings of their
parts using a nonmonotonic extension of Intensional
Logic.
I will illustrate the program with case studies from English
derivational morphology. The resulting project is complementary to
attempts by Artificial Intelligence logicists to formalize aspects of
commonsense reasoning; and it appears to have a philosophical as well
as a linguistic dimension.
Department of Philosophy
University of Michigan
Modal Disjunction
Ede Zimmermann
This talk is about a construal of disjunctions "A or B" as lists of
epistemic possibilities "It might be that A; it might be that B."
Various phenomena can be explained in a natural way if we adopt this
non-boolean, modal view of disjunction:
- an apparent de re/de dicto distinction in disjunctions ("Tom bought a
book, or he stole it" vs. "Tom bought or stole a book");
- an alleged illogical (i.e. conjunctive) use of "or" between modal
statements ("He could be here, or he could be there");
- the use of "or" as conjoining alternatives in questions ("Would you
like tea or coffee?");
- disjunctive permission statements ("You may do P or Q") that can
express conjunctive permissions ("You may do P and you may do Q"),
especially when used performatively.
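A minimal rendering of this construal, writing $\Diamond$ for epistemic possibility (the notation is assumed here, not quoted from the talk):
\[
[\![ A \mbox{ or } B ]\!] \;=\; \Diamond A \wedge \Diamond B .
\]
On this view the apparently conjunctive behaviour of "or" between modal and permission statements is expected rather than illogical.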
Institut für Deutsche Sprache und Literatur
Johann Wolfgang Goethe-Universität, Frankfurt
Evening Lecture.
Toward a Logic of Perceptions
Lotfi A. Zadeh
Our approach to the logic of perceptions (LP) is inspired by the
remarkable human capability to perform a wide variety of physical and
mental tasks without any measurements and any computations. Everyday
examples of such tasks are parking a car, driving in city traffic,
cooking a meal, deciphering sloppy handwriting, summarizing a story,
and engaging in discourse.
Underlying this capability is the brain's crucial ability to reason
with perceptions -- perceptions of time, distance, force, direction,
shape, intent, likelihood and truth, among others. In essence,
perceptions are summaries of impressions and as such are intrinsically
imprecise.
Perceptions have been studied extensively in a wide variety of
contexts. But what does not exist is a theory in which
perceptions are objects of computation. The logic of perceptions (LP)
which is outlined in my talk is focused on the development of what is
referred to as the computational theory of perceptions (CTP) -- a
theory which comprises a conceptual framework for computing and
reasoning with perceptions. The base for CTP is the methodology of
computing with words (CW). In CW, the objects of computation are
words and propositions drawn from a natural language. A typical
problem in CW is the following. Assume that a function f, Y=f(X), is
described in words as: if X is small then Y is small; if X is medium
then Y is large; if X is large then Y is small, where small, medium
and large are labels of fuzzy sets. The question is: What are the
maximum and maximizing values of Y and X, respectively?
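One way to picture the verbally described function is as a fuzzy graph, a disjunction of Cartesian granules (a sketch, not necessarily the formulation used in the talk):
\[
f \;\approx\; small \times small \;+\; medium \times large \;+\; large \times small ,
\]
where $+$ denotes disjunction and $\times$ the Cartesian product of fuzzy sets. The maximum and maximizing values of Y and X are then themselves fuzzy sets, read off this graph rather than obtained by numerical optimization.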
The point of departure in the computational theory of perceptions is
the assumption that perceptions are described as propositions in a
natural language, e.g., "Michelle is slim," "Mary is telling the
truth," "it is likely to rain tomorrow," "economy is improving," "it
is very unlikely that there will be a significant increase in the
price of oil in the near future." In this perspective, natural
languages may be viewed as systems for describing perceptions. In
effect, in the computational theory of perceptions computing and
reasoning with perceptions is reduced to computing and reasoning with
words. Interesting but unrelated approaches to a theory of
perceptions have been described by H. Rasiowa and R. Vallee.
To be able to compute with perceptions it is necessary to have a means
of representing their meaning in a way that lends itself to
computation. Conventional approaches to meaning representation cannot
serve this purpose because the intrinsic imprecision of perceptions
puts them well beyond the expressive power of predicate logic and
related systems. In the computational theory of perceptions, meaning
representation is based on what is referred to as constraint-centered
semantics of natural languages (CSNL).
A concept which plays a central role in CSNL is that of a generalized
constraint. Conventional constraints are crisp and are expressed as X
is C, where X is a variable and C is a crisp set. In a generic form,
a generalized unconditional constraint is expressed as X isr R, where
X is the constrained variable; R is the constraining (fuzzy) relation
which is called the generalized value of X; and isr, pronounced as
ezar, is a variable copula in which the value of the discrete variable
r defines the way in which R constrains X. Among the basic types of
constraints are the following: equality constraints (r:=);
possibilistic constraints (r:blank); veristic constraints (r:v);
probabilistic constraints (r:p); random set constraints (r:rs);
usuality constraints (r:u); fuzzy graph constraints (r:fg); and Pawlak
set constraints (r:ps).
In constraint-centered semantics, a proposition, p, is viewed as an
answer to a question, q, which is implicit in p. The meanings of p
and q are represented as generalized constraints, which play the roles
of canonical forms of p and q, CF(p) and CF(q), respectively. CF(q)
is expressed as: X isr ?R, read as "What is the generalized value of
X?" Correspondingly, CF(p) is expressed as: X isr R, read as "The
generalized value of X isr R." The process of expressing p and q in
their canonical forms plays a central role in constraint-centered
semantics and is referred to as explicitation. Explicitation may be
viewed as translation of p and q into expressions in GCL -- the
Generalized Constraint Language.
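As a worked illustration of explicitation, take p = "Michelle is slim" and assume, purely for illustration, that the constrained variable is X = Weight(Michelle):
\[
CF(q):\; Weight(Michelle) \ \mbox{isr} \ ?R
\qquad
CF(p):\; Weight(Michelle) \ \mbox{is} \ slim ,
\]
with r blank, i.e.\ a possibilistic constraint in which the fuzzy set slim serves as the generalized value R.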
In the logic of perceptions, representation of meaning is a
preliminary to reasoning with perceptions -- a process which starts
with a collection of perceptions which constitute the initial data set
(IDS) and terminates in a proposition or a collection of propositions
which play the role of an answer to a query, that is, the terminal
data set (TDS). Canonical forms of propositions in IDS constitute the
initial constraint set (ICS). The key part of the reasoning process
is goal-directed propagation of generalized constraints from ICS to a
terminal constraint set (TCS) which plays the role of the canonical
form of TDS. The rules governing generalized constraint propagation
in the logic of perceptions coincide with the rules of inference in
fuzzy logic. The principal generic rules are: conjunctive rule;
disjunctive rule; projective rule; surjective rule; inversive rule;
compositional rule; and the extension principle. The generic rules
are specialized by assigning specific values to the copula variable,
r, in X isr R.
The principal aim of the logic of perceptions is the development of an
automated capability to reason with perception-based information.
Existing theories do not have this capability and rely instead on
conversion of perceptions into measurements -- a process which in many
cases is infeasible, unrealistic or counterproductive. In this
perspective, addition of the machinery of the computational theory of
perceptions to existing theories may eventually lead to theories which
have a superior capability to function in an environment of
imprecision, uncertainty and partial truth.
Department of EECS
University of California
Abstracts Accepted for Presentation at the Twelfth Amsterdam Colloquium.
Focus and topic sensitive operators
Maria Aloni, David Beaver and Brady Zack Clark
This paper concerns so-called focus sensitive particles (FSPs), which appear to encode
meanings sensitive to intonation. It will be argued that the wide array of particles
previously described as focus sensitive falls into two natural subclasses cross-linguistically.
We use a range of original data, especially extraction and presupposition phenomena, to show
that only one of the resulting classes is genuinely focus sensitive. It will further be argued
that no current theory of focus can adequately account for both sub-classes.
ILLC/Department of Philosophy
University of Amsterdam and
Department of Linguistics
Stanford University
Two place probabilities, full belief and belief revision
Horacio Arlo-Costa and Rohit Parikh
Making use of van Fraassen's framework for defining beliefs from conditional probabilities, we
provide a discussion of the issues and present some new technical results. We provide a
complete characterization of van Fraassen's probabilities for countable spaces, show how belief
revision can be defined naturally, even by propositions of probability 0, and give some axioms
which are sound for the framework.
Computer Science, Mathematics and Philosophy
City University of New York and
Philosophy Department
Carnegie Mellon University
The logic of anaphora resolution
David Beaver
This paper concerns the semantics/pragmatics interface for natural language, and in particular
the question of how anaphora resolution should be orchestrated in a dynamic semantics. Previous
dynamic systems such as DPL have relied on preindexation of anaphors and antecedents. It is
argued that this represents a serious inadequacy. A dynamic semantic system, RPL, is proposed
which eliminates the need for preindexation, by combining an "Amsterdam-style" dynamic
semantics with a pragmatic module based on the Centering algorithm. The semantics uses a novel
extension of DMPL information states, in which ambiguity of anaphors is represented using
multiple referent systems. The pragmatic component also uses a novel approach, reformulating
Centering declarations as a preference order over dynamic transitions. It is argued that the
resulting system not only provides a marked empirical improvement over dynamic predecessors,
but also provides a quite general approach to the semantics/pragmatics interface.
Department of Linguistics
Stanford University
Plural predication and partitional discourses
Sigrid Beck
This paper argues against Schwarzschild's (1996) proposal that the interpretation of
relational plural sentences is constrained by a contextually determined relation. Instead, I
suggest that the data that seem to indicate the use of such relations are to be reanalysed as
a discourse phenomenon related to telescoping, which I term partitional discourses. This
captures the fact that the only relations that seem to be available are equivalence relations.
If there is no restricting salient relation, Schwarzschild's uniform analysis of relational
plurals in terms of double universal quantification becomes untenable. We come back to Krifka's
(1989) proposal of a polyadic plural operator to handle cumulative readings of relational
plurals. This theory of plural predication has to be embedded in a semantic framework capable
of accounting for discourse effects.
Department of Linguistics
University of Connecticut
Incorporation as unification
Agnes Bende-Farkas
This paper proposes an analysis of a particular Definiteness Effect construction, that
involving the light verb {\em have} as found in e.g.\ {\em John has a sister}, or {\em Mary has
a good salary}. The analysis is based on the notion of term unification, which has played an
important role in computer science and computational linguistics but, to my knowledge, hardly
within formal semantics. According to term unification, the verb {\em have} and its object
phrase both introduce a predicational ``term'' consisting of a predicate and its arguments.
Combining verb and object phrase involves unifying these two terms. When, for some reason or
another, unification fails, the combination is unacceptable. The analysis seems capable of
extension to other Definiteness Effect constructions, including those involving ``light''
verbs in Hungarian.
IMS
Universit\"at Stuttgart
Questions as first class citizens
Martin van den Berg
In this paper we use the Linguistic Discourse Model and Dynamic Quantifier Logic (van den Berg
1996, van den Berg and Polanyi 1999) to give a formal treatment of questions and answers in
dynamic semantics. Following the ideas developed in van den Berg 1996, we give a treatment of
wh-phrases as a form of generalized dynamic quantifier that explains how questions can contain
anaphoric references and introduce new ones. Our approach, which stays close to the account of
questions and answers in (Groenendijk and Stokhof 1984), implements incremental interpretation
following the Linguistic Discourse Model.
FX Palo Alto Laboratory
A new probability model for data oriented parsing
Remko Bonnema, Paul Buying and Remko Scha
{\em Data oriented parsing systems} employ redundant Stochastic Tree Substitution Grammars to
analyse natural language utterances on the basis of an annotated corpus (a {\em tree-bank}). A
fundamental component of such systems is the way in which the substitution-probability of a
tree is estimated from its occurrences in the treebank. In the standard method for doing this,
the probability of a tree is directly correlated with its occurrence frequency in the bag of
all fragments of all corpus trees. We show that this results in undesirable statistical biases.
We therefore propose an alternative method, which estimates the substitution-probability of a
fragment as the probability that it has been involved in the derivation of a corpus tree. We
show that this method has more plausible properties.
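For orientation, the standard estimation method criticized above can be sketched as follows (a simplification; the paper's alternative model is not reproduced here): the substitution probability of a fragment $t$ is taken to be proportional to its frequency among all fragments sharing its root label,
\[
P(t) \;=\; \frac{|t|}{\sum_{t'\,:\; root(t') = root(t)} |t'|},
\]
where $|t|$ counts the occurrences of $t$ in the bag of all fragments of all corpus trees.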
Institute for Logic, Language and Computation
University of Amsterdam
Consequences from Quine
Robin Clark and Natasha Kurtonina
We reconstruct Quine's (1960) combinatory logic (QCL), providing both a semantics and a proof
theory for it. We, furthermore, derive the surprising result that QCL can be used as a
framework for dynamic semantics, in particular allowing for a compositional account of
reference tracking mechanisms in natural language, including switch reference. Finally, we
demonstrate that QCL can express the dynamic modalities of arrow logic.
Department of Linguistics
University of Pennsylvania and
Institute for Research in Cognitive Science
University of Pennsylvania
Reciprocal interpretation with functional pronouns
Alexis Dimitriadis
Reciprocal constructions in which the antecedent of the reciprocal is a dependent pronoun have
been analyzed as instances of ``wide-scope'' reciprocals. Because the binder of the reciprocal
determines its range, this analysis cannot handle sentences in which the antecedent of the
reciprocal is not bound by a co-referring, c-commanding antecedent. I propose to derive the
correct range for the reciprocal by translating pronouns as functions (in the fashion of
Jacobson (1999)). The range of the reciprocal antecedent is computable by applying a maximality
operator to the restricted function representing its antecedent.
Department of Linguistics
University of Pennsylvania
The semantics of transitivity alternations
Edit Doron
In theoretical linguistics, causative and middle verbs are usually derived by independent
operations. But cross-linguistically, both mark the same transitivity alternations. This paper
proposes a unified syntactic system for the derivation of both types of verbs, which, moreover,
sheds new light on problems in the interface of semantics and morphology. One problem is the
impossibility, mostly ignored in linguistic theory, of deriving the semantics of middle verbs
from that of the corresponding transitive verbs. The second is explaining the identity found
cross linguistically between middle and reflexive morphology. The third is providing an
alternative to the ``event-decomposition'' account of causative verbs.
Department of English Languages
Hebrew University of Jerusalem
Deriving and resolving ambiguities in {\em wieder\/}-sentences
Markus Egg
The paper discusses two challenges posed by sentences with German {\em wieder} `again', viz.,
the {\em derivation} and the {\em resolution} of the `repetitive'/`restitutive' ambiguity that
is characteristic for them. It is shown that the framework of underspecification is adequate
for both these tasks. Underspecification allows for an easy derivation of these ambiguities
that takes into account syntactic restrictions for semantic ambiguities. The resulting semantic
representations are suitable input for resolution processes that resolve the ambiguity of {\em
wieder}-sentences.
Computerlinguistik, Geb. 17.2
Universit\"at des Saarlandes
Non-monotonicity from constructing semantic representations
Tim Fernando
A standard approach to non-monotonicity locates the phenomenon in preferences between models,
against which certain well-formed formulas (or semantic representations, SRs) are interpreted.
Such an approach skips over the prior step of constructing suitable SRs for natural language
discourse --- arguably the main challenge from the perspective of formal linguistics
(computational or otherwise). The present work focusses on this step, tracing complications of
presupposition and ambiguity to it. A family of modal logics is outlined, supporting an
analysis of non-monotonicity as the failure of a sentence-by-sentence translation of a sequence
of natural language sentences to persist; that is, the SR associated with a sentence may, in
light of further natural language input, need to be revised. This revision may involve
adjustments to background assumptions, implicated in presupposition accommodation.
Computer Science
Trinity College Dublin
The epistemics of presupposition
Anthony S. Gillies
Successful linguistic interaction requires agents to be able to reason about what is being
said. Given this, it is natural to pursue the extent to which general epistemic principles can
be brought to bear on semantic phenomena. In particular, I look at the interaction between the
dynamics of belief and presupposition. A simple defeasible update semantics for presupposition
is given in which accommodation is understood as belief updating, and presupposition failure as
failed belief revision.
Department of Philosophy
University of Arizona
Constructional ambiguity in conversation
Jonathan Ginzburg and Ivan Sag
The paper considers the contents associated with clauses in conversational interaction,
focussing particularly on the issue of how grammatical frameworks can accommodate uses such as
intonation questions and reprise uses. We point out that the phenomena at issue are problematic
for views of the syntax/semantics interface such as radical lexicalism and syntactic
modularity. We describe a grammatical system formulated within HPSG in which generalizations
about phrasal types, characterized by means of constraints on both the syntactic and semantic
components of the sign, are captured by means of multiple inheritance hierarchies. We show how
this system can accommodate the relevant phenomena.
Department of Linguistics
Stanford University and
Dept of Philosophy, King's College, London and
Dept of English, Hebrew University of Jerusalem
Cross-linguistic semantics of weak pronouns in doubling structures
Javier Guti\'errez-Rexach
In this paper, a formal semantics of prosodically weak pronouns or clitics is developed in
which they are treated as a type of generalized quantifier inherently restricted to a context
set (following the general treatment of strong pronouns in Westerst\aa hl (1985, 1989) and van der
Does (1995)). It is claimed that cross-linguistic variation in ``clitic doubling'' configurations,
where the pronoun has a quantifier associate, can be accounted for in terms of GQ theory
and emerges from the existence of a series of semantic parameters on the retrieval of context
sets.
Cunz Hall
Ohio State University
Substructural logic: a unifying framework for second generation datamining algorithms
Erik de Haas and Pieter Adriaans
In this paper we propose a framework for data mining algorithms based on a system of
substructural logic. We show the connections between data mining, inductive logic programming
and grammar induction. Furthermore we present a small family of substructural logics that can
represent modern complex information systems. This small family of substructural logics will in
turn enable us to design efficient data mining algorithms using techniques from the field
of inductive logic programming.
ILLC/WINS
University of Amsterdam and Syllogic
Toward a unified analysis of DP conjunction
Caroline Heycock and Roberto Zamparelli
In this paper we analyse ``split'' conjunction within DPs, as found in English examples such
as (a) {\em that man and woman} and (b) {\em those cats and dogs}. We demonstrate that many of
the well-studied Western European languages allow plural split conjunction (b) but not singular
(a); in these languages DP-internal singular conjunction can only have the intersective
``joint'' reading possible also in English examples like (c) {\em my friend and colleague}. We
show that the operation of ``set product'' (union over each possible $n$-tuple across $n$
conjuncts) can be used to define conjunction in a way that derives all the available readings.
In order to explain their cross-linguistic distribution we propose a minimal difference between
languages in the way they obtain the denotation of a singular noun phrase. Taken together,
these ideas yield a syntactic and semantic theory of conjunction which can not only account for
the DP data but also explain the distributivity properties of different conjoined categories.
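The ``set product'' operation can be sketched, for the binary case, as
\[
A \otimes B \;=\; \{\, a \cup b \;:\; a \in A,\ b \in B \,\},
\]
i.e.\ the union of each possible pair (more generally, $n$-tuple) of elements drawn from the conjunct denotations; the symbol $\otimes$ is chosen here only for illustration.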
Department of Theoretical and Applied Linguistics and
Human Communication Research Centre
University of Edinburgh
Deconstructing Jacobson's {\bf Z}
Gerhard J\"ager
Following the methodology of Moortgat 1996b, we compare two multimodal deconstructions of
Jacobson's 1999 type shifting operator {\bf Z} for anaphoric dependencies. The global analysis
of J\"ager 1998 is shown to be of limited generality: it restricts the occurrence of anaphoric
expressions to {\em associative} environments---environments where sensitivity for constituent
structure is lacking. We propose an alternative deconstruction where anaphora resolution is
independent of resource management assumptions about the structural context. The analysis is
based on the general theory of structural control proposed in Kurtonina and Moortgat 1995: the
interaction principles of two binary products are fine-tuned in terms of unary modal control
devices.
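For reference, Jacobson's {\bf Z} can be rendered, in simplified form, as
\[
\mathbf{Z}(V) \;=\; \lambda f \lambda x .\; V(f(x))(x),
\]
shifting a relation of type $\langle e, \langle e,t \rangle\rangle$ into one that takes a pronoun-like function of type $\langle e,e \rangle$ as its argument (a rough rendering of Jacobson's 1999 definition, given here only to fix ideas).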
Zentrum fuer Allgemeine Sprachwissenschaft
IF logic and informational independence
Theo M.V. Janssen
In game theoretical semantics the truth of a formula is determined by a game between two
players, $\exists $loise, who tries to verify the formula, and $\forall $belard, who tries to refute it. He
chooses on $\wedge $ and $\forall $, she on $\vee $ and $\exists $. A version of such games,
introduced by J.\ Hintikka, is IF logic: independence friendly logic. The quantifier $\exists
(y/x)$ means that $y$ has to be chosen independent of $x$, and $\psi (\vee /x)\theta $ that a
subformula has to be chosen independent of $x$. A formula is true, if $\exists $loise has a
winning strategy. Hodges has given a compositional interpretation for the logic: trump
semantics. It will be argued that this interpretation gives results that are not in accordance
with intuitions concerning independence of information. Two (equivalent) alternative
interpretations will be proposed that do correspond with intuitions, one based on playing
games, and one on sets of assignments.
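A standard illustration of the slashed quantifier:
\[
\forall x\, \exists (y/x)\; x = y
\]
is true only in one-element models: since $y$ must be chosen independently of $x$, $\exists $loise has a winning strategy just in case a single choice of $y$ matches every possible value of $x$.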
Computer Science
University of Amsterdam
A calculus for direct deduction with dominance constraints
Jan Jaspars and Alexander Koller
Underspecification has recently been a popular approach to dealing with ambiguity. An important
operation in this context is {\em direct deduction}, deduction on underspecified descriptions
which is justified by the meaning of the described formulae. Here we instantiate an abstract
approach to direct deduction to dominance constraints, a concrete underspecification formalism,
and obtain a sound and complete calculus for this formalism.
Department of Mathematics and Computer Science
University of Amsterdam and
Department of Computational Linguistics
Universit\"at des Saarlandes
True to Facts
Jacques Jayez and Dani\`ele Godard
Ever since Vendler, the denotation of the term {\em fact} has been considered problematic.
Facts seem to be entities of the world and informational items at the same time, located
somewhere between states of affairs or events and propositions. Using Zalta's theory of
abstract objects, we investigate where this somewhere is. We propose that facts are abstract
objects which {\em encode} (in Zalta's sense) the property of being such that a given
proposition holds. Thus, the correspondence between facts and propositions is not an external
relation between two types of entities, but a built-in property of facts. This allows us to
distinguish facts from propositions and events but also to connect those categories and to
account for their linguistic interplay.
EHESS and
CNRS and Lille III
Semantic composition for partial proof trees
Aravind K. Joshi, Seth Kulick and Natasha Kurtonina
We address the problem of semantic composition in a categorial system based on a hybrid logic.
One logic is used for unfolding a categorial type, resulting in a partial derivation. The
second logic computes the semantic representation from those partial derivations. We encode the
history of the derivation from the first logic by using bound variables to represent the
missing assumptions. Since the application of the partial derivations can take place in either
direction in the second logic, this results in $\lambda $-terms inside the resulting semantic
term. We show that by allowing such terms to be moved to the outermost position,
compositionality can be maintained in the hybrid logic.
Institute for Research in Cognitive Science
University of Pennsylvania
A proof-theoretic view of intensionality
Reinhard Kahle
We discuss a proof-theoretic view of intensionality. Based on a notion of the {\em use} of a
formula in a proof we show how a proof-theoretic account can avoid some well-known difficulties
of the representation of intensional phenomena. The key example is binary necessity, where we
read {\em ``A is necessary for B.''} as {\em ``Every proof of B uses A.''} Provided that {\em
A} is an axiom or an atomic proposition we can give a formalized version of this reading. This
theory is compared with the standard modal logic approach to necessity and two examples are
given. Finally we give an outlook on further applications of the proof-theoretic view of
intensionality which turn out to be a nice example of interdisciplinarity between logic,
philosophy, linguistics and computer science.
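The reading of binary necessity described above admits a compact rendering; writing $N(A,B)$ for ``$A$ is necessary for $B$'' and $use(d)$ for the formulas used in a proof $d$ (notation assumed here):
\[
N(A,B) \quad\mbox{iff}\quad \mbox{every proof } d \mbox{ of } B \mbox{ satisfies } A \in use(d).
\]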
WSI
Universit\"at T\"ubingen
Factoring predicate argument and scope semantics: underspecified semantics with LTAG
Laura Kallmeyer and Aravind Joshi
This paper proposes a compositional semantics for lexicalized tree-adjoining grammar (LTAG).
The use of tree-local multicomponent derivations allows separation of the semantic contribution
of a lexical item into two parts: one component contributes to the predicate argument structure
whereas the second component contributes to scope semantics. Starting from this idea a
syntax-semantics interface is presented where the composition of a semantic representation
depends only on the derivation structure. It is shown that the derivation structure (and
indirectly the restrictions resulting from the locality of the formalism) allows an appropriate
amount of underspecification. This is illustrated by showing the generation of underspecified
semantic representations for quantifier scope ambiguities.
Sonderforschungsbereich 441 and IRCS
Universit\"at T\"ubingen and University of Pennsylvania and
Institute for Research in Cognitive Science
University of Pennsylvania
The dynamics of tree growth and quantifier construal
Ruth Kempson and Wilfried Meyer-Viol
This paper demonstrates how quantifier-construal can be globally defined without sacrificing
an incremental left-right process of projecting interpretation. Scope statements, collected
during the course of the left-right parsing process and subject to lexically specified
restrictions, are separated from the projection of content for quantifying expressions. Amongst
other scope restrictions, construal of indefinites is shown to resemble pronominal anaphora,
with indefinites defined as taking narrow scope with respect to some element to be chosen, a
free pragmatic choice determining the scope relation feeding algorithmic processes determining
the content of quantified expressions.
Philosophy Department
King's College London
Relating polyadic quantifiers: on the cumulativity-distributivity interplay
Brenda Kennelly and Fabien Reniers
We show how to relate tri-adic quantifiers that express mixed readings of distributivity and
cumulativity within a single 3-place predicate to the dyadic quantifiers that express
distributivity (function composition of monadic quantifiers) and cumulativity (Scha 1981). We
discuss problems with the standard approaches and propose that cumulativity necessarily takes
precedence over distributivity. Consequently, for mixed readings cumulativity is reanalyzed as
a relation between a type $<2>$ and a type $<1>$ quantifier. This new account of cumulativity
generalizes conveniently to cumulative quantifiers of arbitrary type.
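For orientation, the cumulative reading of a two-place predicate in the spirit of Scha (1981) can be sketched, for a sentence of the form ``$m$ As $R$ $n$ Bs'', as
\[
\exists X \subseteq A\; \exists Y \subseteq B\; \big[\, |X| = m \;\wedge\; |Y| = n \;\wedge\; \forall x \in X\, \exists y \in Y\, R(x,y) \;\wedge\; \forall y \in Y\, \exists x \in X\, R(x,y) \,\big],
\]
a simplified formulation given only to fix terminology; the paper's reanalysis concerns how such readings interact with distributivity in triadic cases.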
OTS
University of Utrecht
Using centering theory to plan coherent texts
Rodger Kibble and Richard Power
Centering theory (CT) has been mostly discussed from the point of view of interpretation
rather than generation, and research has tended to concentrate on problems of anaphora
resolution. This paper examines how centering could fit into the generation task, concentrating
on the implications for {\em planning} of texts and sentences. We show that the CT rules as
they stand do not fit neatly into a ``consensus'' pipe-lined NLG architecture as they appear to
rely on feedback from surface grammatical relations to text planning. We suggest some ways this
problem can be overcome or circumvented and report on initial implementation efforts.
ITRI
University of Brighton
Identification language games
Peter Krause
Identification dialogues are inquiries into which individual a speaker intends to refer to.
Typical identification dialogues occur when a riddle is solved and when a reference is
clarified in a subdialogue. Here, the simplest kind of identification dialogue is studied
abstractly. A version of discourse representation theory (DRT) with presuppositions and an
epistemic operator, and a specification of the propositional attitudes of the participants
together with anchoring relations, are used to specify the scoreboard of a language game of
identification dialogues with two participants, the identifier and the informant. The rules are
based on the standard semantics for the representations. A {\em defeasible} notion of {\em
knowing which} object is meant can be formulated. The meaning of the locution {\em which one X
is depends on which one Y is.} is specified.
Institute for Computational Linguistics
University of Stuttgart
Binding by implicit arguments
Alice ter Meulen
A game-theoretic account of anaphoric definite descriptions is first presented for discourse
binding with an implicit existential argument. The Verifier's claimed existence of a verifying
strategy for the inferred existential statement explains why pronouns cannot be bound by
implicit arguments, until the Falsifier demands execution of that claimed verifying strategy.
It is discussed how DRT, as representational modeltheoretic dynamic semantics, and DPL, as
compositional modeltheoretic dynamic semantics, would differentiate between asserted, inferred
and presupposed indefinites to account for the observations. Implicit arguments are
linguistically economical as they circumvent scope-disambiguation, forcing the inferred
existential to remain in focus, and constrain accommodation of presuppositions.
Department of English
University of Groningen
Modeling ambiguity in a multi-agent system
Christof Monz
This paper investigates the formal pragmatics of ambiguous expressions by modeling ambiguity
in a multi-agent system. Such a framework allows us to give a more refined notion of the kind
of information that is conveyed by ambiguous expressions. We analyze how ambiguity affects the
knowledge of the dialogue participants, and, especially, what they know about each other after
an ambiguous sentence has been uttered. The agents communicate with each other by means of a
{\tt tell}-function, whose application is constrained by an implementation of some of Grice's
maxims. The multi-agent system itself is represented as a Kripke structure and {\tt tell} is an
update function on those structures. This framework enables us to distinguish between the
information conveyed by an ambiguous sentence vs.\ the information conveyed by disjunctions,
and between semantic ambiguity vs.\ perceived ambiguity.
Institute for Logic, Language and Computation
University of Amsterdam
DPL with control elements
Rick W.F. Nouwen
Focus in conditionals can cause a deviating external dynamic behaviour. To account for these
exceptional cases, we constructed a variant of dynamic predicate logic, which uses control
elements to direct information in need of a special treatment to a special location in
information states. At the top level, meaning consists of various relations, each contributing
some relevant part of the semantics. Further down, meaning is still given by means of ordinary
DPL-relations. The result is a fully incremental logic, which can be used to construct meanings
straight from the natural word order. The system shows that the construction of variants of DPL
can be very useful.
UiL-OTS
University of Utrecht
Semantics for attribute-value theories
Rainer Osswald
First, we show how to reconstruct attribute-value (AV) descriptions from natural language by
regimentation and formalization within first-order predicate logic. The introduction of
appropriate predicate operators then leads to AV-expressions of the usual Kasper-Rounds type.
We present a slight extension which permits relations between attribute values. A
straightforward modification of standard AV-logic turns out to be sound and complete with
respect to first-order derivability, granted that attributes are functional. Demonstrating this
is part of our second concern which is to apply geometric logic and locale theory to
AV-theories like HPSG. Viewing AV-theories as propositional geometric theories provides a crisp
characterization of the denotation of an AV-theory as the point space of its classifying
locale.
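The functionality assumption on attributes corresponds, in the first-order regimentation, to axioms of the familiar shape
\[
\forall x \forall y \forall z\, \big( A(x,y) \wedge A(x,z) \rightarrow y = z \big)
\]
for each attribute symbol $A$ (a sketch; the paper's actual regimentation may differ in detail).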
Informatikzentrum
Fernuniversit\"at Hagen
Modeling coalitional power in modal logic
Marc Pauly
We introduce a modal logic for describing what groups of agents can bring about. Using a
neighborhood semantics, we arrive at the notion of a {\em coalitional model} which associates
with every group of agents the sets of states for which the coalition is effective. We show how
specific classes of coalition models or frames can be used to obtain a dynamic model of
collective action in (1) strategic game forms and (2) extensive game forms of perfect
information. These results draw upon the study of {\em effectivity functions} in social choice
theory and generalize some of the results obtained there.
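A minimal sketch of the coalitional modality, writing $E(s)(C)$ for the collection of sets of states for which coalition $C$ is effective at state $s$ (notation assumed here):
\[
M, s \models [C]\varphi \quad\mbox{iff}\quad \|\varphi\|_M \in E(s)(C),
\]
i.e.\ $C$ can bring it about that the outcome lies in the set of states where $\varphi$ holds.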
Centrum voor Wiskunde en Informatica
Questioning to resolve decision problems
Robert van Rooy
Why do we ask questions? Because we want to have some information. But why this particular
kind of information? Because only information of this particular kind is helpful to resolve the
{\em decision problem} that an agent faces. In this paper I argue that questions are asked
because their answers help to resolve the questioner's decision problem. By relating questions
to decision problems I show (i) how we can measure the values of questions, and (ii) how
answers can resolve questions in particular circumstances, although the answer is not exhaustive
and complete in the sense of Groenendijk and Stokhof (1984).
Department of Philosophy
University of Amsterdam
Towards a semantic-based theory of language learning
Isabelle Tellier
The notion of Structural Example has recently emerged in the domain of grammatical inference.
It makes it possible to solve the old and difficult problem of learning a grammar from positive examples, but
seems to be a very {\em ad hoc} structure for this purpose. In this article, we first propose a
formal version of the Principle of Compositionality based on Structural Examples. We then
explain under which conditions the Structural Examples used in the domain of grammatical
inference can be obtained very easily from sentences and their semantic representations, which
are naturally available in the environment of children learning their mother tongue. Structural
Examples thus appear as an interesting intermediate representation between syntax and
semantics. This leads to a new formal model of language learning where semantic information
plays a crucial role.
Laboratoire d'Informatique Fondamentale de Lille
Universit\'e Charles de Gaulle-Lille3
Two approaches to modal interaction in discourse
C.F.M. Vermeulen
We have seen several examples of the interaction of modal expressions in discourse. The
intuitive explanation of this interaction is that a modal depends on a contextually given set
of possibilities {\em and} adds a new set to the context, that can be used by subsequent
modalities. \par Two styles of formalisation of this explanation were considered, both in
dynamic semantics. One uses a representation language with indexed modalities and gives an
update semantics for this language. The other, `algebraic' approach uses a string language with
an interpretation in an algebra of m-{\sc states}. Both styles allow us to represent the
crucial example. For a more general comparison we have a systematic translation between the
formalisms that preserves meaning. We conclude that the algebraic approach is at least as well
suited for the representation of modal interaction in discourse. In addition the algebraic
approach forces us to avoid magical updates and thus ensures a level of computational realism
that update semantics {\em per se} does not offer.
Departments of Philosophy and Linguistics
Utrecht University
A different game? Game theoretical semantics as a new paradigm
Louise Vigeant
Hintikka claims that Game Theoretical Semantics (GTS) is different enough from the dominant
language theories, e.g. Montague Semantics, to represent a new paradigm in the Kuhnian sense.
GTS differs in the following three ways: minimal syntax, non-compositionality, and a definition of
truth in terms of strategies. Of these three, only the strategic definition of truth proves to be
an irreducible difference between the two types of semantic theories. This new definition of
truth is not enough to define GTS as a different paradigm. What leads Hintikka to claim that
GTS is a new paradigm is his belief that this semantic theory is representative of
Wittgenstein's language games. One of the main arguments in his interpretation of Wittgenstein
is that language games are a major innovation in explaining rules. The attempt to incorporate
this innovation into a semantic theory is only a partial step towards a new type of semantic
theory. A Wittgensteinian paradigm must also answer the question of what it is about truth that
`fits' a proposition.
Department of Linguistics
Stanford University
Plural type quantification
Yoad Winter
This paper develops a type theoretical semantics for quantification with plural noun phrases.
This theory, unlike previous ones, sticks to the standard treatment of singular quantification
and uses only one lifting operator per semantic category (predicate, quantifier and determiner)
for plural quantification. Following Bennett (1974), plural individuals are treated as
functions of type $et$. Plural nouns and other plural predicates accordingly denote $(et)t$
functions. Such predicates do not match the standard $(et)((et)t)$ type of determiners.
Following Partee and Rooth (1983), type mismatches are resolved using {\em type shifting
operators}. These operators derive collectivity with plurals, keeping the analysis of singular
noun phrases, where no type mismatch arises, as in Barwise and Cooper (1981). A single type
shifting operator for determiners combines into one reading the {\em existential} shift and the
{\em counting} (neutral) shift of Scha (1981) and Van der Does (1993). This operator combines
the {\em conservativity} principle of generalized quantifier theory with Szabolcsi's (1997)
existential quantification over {\em witness sets}. The unified lift prevents unmotivated
ambiguity as well as the monotonicity ill of existential lifts pointed out by Van Benthem
(1986:52-53).
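The type mismatch at issue can be displayed schematically (a simplified restatement of the type assignments above):
\[
\mbox{singular noun: } et, \qquad \mbox{plural noun: } (et)t, \qquad \mbox{determiner: } (et)((et)t),
\]
so a determiner expecting an $et$ argument cannot combine directly with a plural predicate of type $(et)t$; the type shifting operators mediate exactly this mismatch.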
Computer Science
Technion
A dynamic solution for the problem of validity of practical inference
Berislav \v {Z}arni\'c
The paper develops a dynamic framework suitable for analysis of practical propositional
inference. The application of the functional approach in dynamic semantics confirms the results
obtained in philosophical analyses of practical inference, regarding, in particular,
defeasibility, undetachability and distinctive quality of conclusion. Contradictory results of
proposed tests of validity and conflicting intuitions on valid forms of practical inference are
reconciled.
Teacher's College
University of Split
Explaining presupposition triggers
Henk Zeevat
The paper proposes three revisions to the standard view of presuppositions: the employment of
optimality theory for the defaults and preferences, the admission of weak antecedents for
presupposition resolution/satisfaction and a fine-grained classification of presupposition
triggers, based on the availability of expression alternatives and the logical requirement of
the presupposition. The treatment is able to deal with a wide range of phenomena that are
outside the scope of any current presupposition theory.
ILLC/Computational Linguistics
University of Amsterdam
Paul J.E. Dekker