Kutaisi 2011: Ninth International Tbilisi Symposium on Language, Logic and Computation

Abstracts General Programme


Abstracts Tutorials

Ulle Endriss Computational Social Choice

Social choice theory deals with questions regarding the design and analysis of methods for collective decision making. In recent years there has been a growing interest in the computational aspects of collective decision making, giving rise to the field of computational social choice. This tutorial will provide an introduction to this field, highlighting in particular the role of logic in computational social choice. The first lecture will be devoted to a general introduction and an exposition of the axiomatic method in social choice theory; the second lecture will be about logical languages for preference representation and social choice; and the third lecture will be an introduction to judgment aggregation. No special background knowledge will be expected. Extensive teaching materials are available at

http://www.illc.uva.nl/~ulle/teaching/kutaisi-2011/

[Slides]
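As a small, self-contained illustration of the kind of question the tutorial addresses (our toy example, not part of the course materials): different voting rules can disagree on one and the same preference profile, which is one motivation for studying voting rules axiomatically. The Python sketch below compares the plurality and Borda winners on an invented nine-voter profile.

    # Toy comparison of two voting rules on the same preference profile.
    # The profile is invented for illustration; it is not from the tutorial.
    from collections import Counter

    # Each ballot ranks the candidates from most to least preferred.
    ballots = [("a", "b", "c")] * 4 + [("b", "c", "a")] * 3 + [("c", "b", "a")] * 2

    def plurality_winner(ballots):
        """Candidate with the most first-place votes."""
        return Counter(ballot[0] for ballot in ballots).most_common(1)[0][0]

    def borda_winner(ballots):
        """Candidate with the highest Borda score (m-1, m-2, ..., 0 points per ballot)."""
        scores = Counter()
        for ballot in ballots:
            for points, candidate in enumerate(reversed(ballot)):
                scores[candidate] += points
        return scores.most_common(1)[0][0]

    print(plurality_winner(ballots))  # 'a' (4 first-place votes)
    print(borda_winner(ballots))      # 'b' (ranked first or second by every voter)

Plurality elects a while Borda elects b; in this profile b also beats both other candidates in pairwise majority contests, and axioms (Condorcet consistency, monotonicity, etc.) are then used to argue about which behaviour is desirable.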

Daniel Hole Binding: data, theory, typology

In this tutorial, we will look at binding as found, for instance, in reflexive constructions from three different perspectives. In a first step we will review the data patterns that render necessary a distinction between coreference and binding. Then we will compare different ways to implement the difference in the theory. Our third goal will be to gain an overview of the different ways in which binding is expressed in the languages of the world. [Slides]

Enzo Marra Łukasiewicz infinite-valued propositional logic: from the philosophy of vagueness to the geometry of rational polyhedra

It is the eve of the 2006 Football World Cup Final, when Italy is going to play France. Consider the sentence X=“Italy will score in the match against France”. You do not know for sure whether X will turn out to be true or false after the match. That is why bookmakers take bets on (the event described by) the sentence X: because the information conveyed by X is uncertain. In the last few centuries, science has developed the theory of probability to deal with uncertainty. And although this is not always stressed in standard accounts, it is a fact that probability depends on logic. Consider the (compound) sentence X ∨ ¬X=“Either Italy will score or Italy will not score in the match against France”. The information conveyed by this new sentence is no longer uncertain: whatever happens on the day of the match, X ∨ ¬X will turn out to be true; betting on such tautologies, as these sentences are called in logic after Wittgenstein, is anything but exciting. So why is X uncertain, whereas X ∨ ¬X, regardless of the factual content of X, is not? The point is that the logic of the original sentence X is classical — and so, in particular, X satisfies the tertium non datur law that X ∨ ¬X is always true. [Continued]
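As background for the truth-value considerations above, here is a reminder of the standard [0,1]-valued Łukasiewicz semantics (textbook clauses, not taken from the continuation of the abstract):

    % standard Łukasiewicz [0,1]-valued truth functions
    \[
      v(\neg A) = 1 - v(A), \qquad
      v(A \oplus B) = \min(1,\, v(A) + v(B)), \qquad
      v(A \vee B) = \max(v(A), v(B)), \qquad
      v(A \to B) = \min(1,\, 1 - v(A) + v(B)).
    \]

Under these clauses v(X ⊕ ¬X) = 1 for every valuation, but the lattice disjunction can fall short of full truth: if v(X) = 1/2 then v(X ∨ ¬X) = 1/2. This is one way of making precise how tertium non datur fails once intermediate truth values are admitted.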

[Slides1a]

[Slides1b]

[Slides2]

[Slides3]


Abstracts Invited Lectures

Alexandru Baltag Learnability via Iterated Belief Revision

We consider iterated belief revision induced by a stream of incoming "inputs" (e.g. successive observations), all assumed to be truthful (pieces of) information (satisfied in the "actual world"). The problem we address is the possibility of learning the truth: under what conditions can we guarantee that the beliefs will eventually converge (after some finite, but not necessarily bounded, number of revisions) to the truth? So we are interested in (guaranteed) learnability (of the "actual world", no matter which of the worlds happens to be the actual one). This issue is closely related to Gold's concept of "identifiability in the limit" in Computational Learning Theory, though there are a number of differences: our investigation takes place in the much more general setting of possible-worlds semantics; the restriction to computable learning methods is no longer central for us; instead, the important restriction in our context is to learning methods that are based on any of the standard belief-revision methods, and moreover to those based on *iterated* revision (i.e. they are history-independent, being induced by successive iterations of a one-step method, without keeping the memory of all past beliefs or past observations).

First, we use a known characterization of this concept to show that learnability is equivalent to a topological "separation" property of the set of possible worlds (whose strength lies in between the separation conditions T_0 and T_1). Second, we characterize learnability in game-theoretic terms. Third, we analyze the learning power of the most common qualitative belief-revision methods (such as conditioning, lexicographic revision, minimal revision and other methods for updating the standard qualitative models for belief revision, given either order-theoretically in terms of a plausibility preorder or topologically in terms of a family of nested spheres), as well as of the most common quantitative, probabilistic revision methods (Bayesian conditioning, Jeffrey conditioning and their various extensions to generalized settings such as conditional probability spaces). In order to compare these with other learning methods, we consider the question of whether any of them is "universal" (i.e. as good at tracking the truth as any other learning method). We show that this is not the case, as long as we keep the standard (probabilistic or qualitative) setting for belief revision. In particular, neither the family of totally well-preordered (plausibility) models, nor the family of classical probability spaces, nor that of lexicographic probability spaces, and not even the family of all countably additive conditional-probability spaces, is rich enough to allow (either qualitative or probabilistic) conditioning to be a "universal" learning method.
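For readers less familiar with the revision policies listed above, the following are the usual one-step textbook formulations on a plausibility preorder (our notation, not necessarily the paper's); write w ≤ v for "w is at least as plausible as v", and let P be the truthful input:

    % standard one-step belief-revision policies on a plausibility preorder
    \[
      \begin{array}{ll}
        \text{conditioning (update):} & \text{delete all } \neg P\text{-worlds and restrict } \le \text{ to the remaining } P\text{-worlds};\\[2pt]
        \text{lexicographic revision:} & w \le' v \iff (w \in P \text{ and } v \notin P) \text{ or } (w, v \text{ agree on } P \text{ and } w \le v);\\[2pt]
        \text{minimal (conservative) revision:} & \text{promote the } \le\text{-best } P\text{-worlds to the top; leave all other comparisons unchanged.}
      \end{array}
    \]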

On the positive side, we show that if we consider appropriate generalizations of conditioning in a non-standard, non-wellfounded setting, then universality is achieved for some (though not all) of these learning methods. In the qualitative case, this means that we need to allow the prior plausibility relation to be a non-wellfounded (though total) preorder. In the probabilistic case, this means moving to a generalized conditional-probability setting, in which the family of so-called probabilistic "cores" (in the sense of van Fraassen) or "strong beliefs" (in the sense of Battigalli and Siniscalchi) may be non-wellfounded (when ordered by inclusion or logical entailment). (Joint work with Nina Gierasimczuk and Sonja Smets.)

Nikolaj Bjorner The Satisfiability Modulo Theories solver Z3 and its Applications

Modern software analysis and model-based tools are increasingly complex and multi-faceted software systems. However, at their core is invariably a component using logical formulas for describing states and transformations between system states. In a nutshell, symbolic logic is the calculus of computation. The state-of-the-art Satisfiability Modulo Theories (SMT) solver Z3, developed by Leonardo de Moura and the speaker at Microsoft Research, can be used to check the satisfiability of logical formulas over one or more theories. SMT solvers offer a compelling match for software tools, since several common software constructs map directly into supported theories.
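To make the kind of satisfiability query described above concrete, here is a minimal sketch using Z3's publicly available Python bindings (the z3-solver package); the constraints are an arbitrary toy example over linear integer arithmetic, not taken from the talk.

    # Minimal satisfiability check with Z3's Python API (pip install z3-solver).
    # The constraints are a toy example, not from the talk.
    from z3 import Ints, Solver, sat

    x, y = Ints("x y")
    s = Solver()
    s.add(x > 2, y < 10, x + 2 * y == 7)

    if s.check() == sat:
        print(s.model())   # prints one satisfying assignment, e.g. x = 7, y = 0
    else:
        print("unsatisfiable")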

SMT solvers have been the focus of increased recent attention thanks to technological advances and an increasing number of applications. The Z3 SMT solver is used in several program analysis, testing and design tools at Microsoft Research, in the Windows 7 static driver verifier and in many tools outside of Microsoft. It offers a rich interface and set of features and is also the most efficient solver available in most areas exercised by SMT solvers. The talk presents the background of Z3 and offers an overview of some of the many applications.

Peter Beim Graben The horse raced past: Gardenpath processing in dynamical systems

Recently, Hale (2011) has revived classical gardenpath theory (Frazier 1987; Frazier & Clifton 1996) in terms of search heuristics on weighted graphs as problem spaces of syntactic language processing. In his approach, gardenpath effects as well as attachment paradoxes and local coherence emerge from misleading estimates of future processing costs. If processing costs are correctly assessed after sufficient training, however, gardenpath interpretations could be avoided. Therefore, estimated processing costs can be interpreted as control parameters (beim Graben et al. 2004) for sequential decision problems (Rabinovich et al. 2006; Rabinovich et al. 2008) from a dynamical systems point of view.

I shall illustrate this idea by means of Bever's (1970) famous gardenpath "the horse raced past the barn fell". In a first step, I construct the graph of the problem space, also taking into account the diagnosis and repair models of Fodor & Inoue (1994) and Lewis (1998). In this framework, syntactic reanalysis of the severe gardenpath appears as a switching process from one search sequence (the gardenpath) to another sequence (the cure). In a dynamical systems approach this switching corresponds to a so-called bifurcation governed by a suitable control parameter. [Slides]

Agi Kurucz Two-dimensional products of modal logics: new results on axiomatisation and decision problems

The product construction is a conceptually simple yet powerful way of combining propositional modal logics. Products of Kripke frames allow us to reflect interactions between modal operators representing time, space, knowledge, actions, etc. Products are also connected to finite-variable fragments of classical and intuitionistic predicate logic and to finite-dimensional cylindric algebras in algebraic logic, and many of the tools and techniques used to study them have come from these areas.
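For reference, the product construction mentioned above is standardly defined as follows: given Kripke frames F1 = (W1, R1) and F2 = (W2, R2),

    % standard definition of the product of two Kripke frames
    \[
      \mathfrak{F}_1 \times \mathfrak{F}_2 = (W_1 \times W_2,\; R_h,\; R_v), \quad \text{where} \quad
      (u_1,u_2)\, R_h\, (v_1,v_2) \iff u_1 R_1 v_1 \text{ and } u_2 = v_2, \qquad
      (u_1,u_2)\, R_v\, (v_1,v_2) \iff u_1 = v_1 \text{ and } u_2 R_2 v_2.
    \]

The horizontal and vertical relations of such product frames validate the familiar commutativity and confluence (Church-Rosser) interaction principles; the axiomatisation questions concern what, if anything, must be added on top of them.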

It has been a kind of 'consensus' in algebraic logic that 'bad' things like undecidability and non-finite axiomatisability show up at dimensions 3 or greater only. In this talk we survey recent, mostly 'negative' results on 2D product logics, showing that this is not the case in general, and discuss some relevant techniques. Joint work with Stanislav Kikot and Sergio Marcelino. [Slides]

Prakash Panangaden Duality for Transition Systems

In this talk we consider the problem of representing and reasoning about systems, especially probabilistic systems, with hidden state.  We consider transition systems where the state is not completely visible to an outside observer.  Instead, there are observables that partly identify the state. We show that one can interchange the notions of state and observation and obtain what we call a dual system.  The double dual gives a minimal representation of the behaviour of the original system.  We extend this to nondeterministic systems and to probabilistic transition systems and finally to partially observable Markov decision processes (POMDPs).  In the case of finite automata restricted to one observable, we obtain Brzozowski's algorithm for minimizing finite-state language acceptors.  I will explain the category theoretic reason why this works at the end.  In this more general version we can handle weighted automata and also probabilistic transition systems.
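Since Brzozowski's algorithm is mentioned above but not spelled out, here is a compact, self-contained Python sketch of it (minimize by determinizing the reversal, twice). The automaton encoding and the example are ours, purely for illustration.

    # Brzozowski minimization: determinize(reverse(determinize(reverse(A)))).
    # An automaton is a dict with keys "states", "transitions", "starts", "accepts";
    # transitions[state][symbol] is a set of successor states.

    def reverse(nfa):
        """Reverse every edge and swap start and accepting states."""
        trans = {q: {} for q in nfa["states"]}
        for q, edges in nfa["transitions"].items():
            for a, targets in edges.items():
                for r in targets:
                    trans[r].setdefault(a, set()).add(q)
        return {"states": set(nfa["states"]), "transitions": trans,
                "starts": set(nfa["accepts"]), "accepts": set(nfa["starts"])}

    def determinize(nfa):
        """Subset construction; the result is again in the same dict format."""
        start = frozenset(nfa["starts"])
        states, trans, todo = {start}, {}, [start]
        while todo:
            S = todo.pop()
            edges = {}
            for a in {a for q in S for a in nfa["transitions"].get(q, {})}:
                T = frozenset(r for q in S
                              for r in nfa["transitions"].get(q, {}).get(a, set()))
                edges[a] = {T}
                if T not in states:
                    states.add(T)
                    todo.append(T)
            trans[S] = edges
        accepts = {S for S in states if S & set(nfa["accepts"])}
        return {"states": states, "transitions": trans,
                "starts": {start}, "accepts": accepts}

    def brzozowski(nfa):
        """Return the minimal deterministic acceptor for the language of nfa."""
        return determinize(reverse(determinize(reverse(nfa))))

    # Example: an NFA accepting the strings over {a, b} that end in "ab".
    nfa = {"states": {0, 1, 2},
           "transitions": {0: {"a": {0, 1}, "b": {0}}, 1: {"b": {2}}, 2: {}},
           "starts": {0}, "accepts": {2}}
    print(len(brzozowski(nfa)["states"]))  # 3, the size of the minimal DFA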

This research began with joint work with colleagues from McGill especially Doina Precup and Chris Hunt.  The recent categorical understanding is joint work with Clemens Kupke and Nick Bezhanishvili.

Vangelis Paschos Moderately exponential approximation: bridging the gap between exact computation and polynomial approximation

The talk presents a possible trade-off between polynomial approximation and exact computation. We show how, using ideas from both fields, one can design approximation algorithms for several combinatorial problems achieving ratios that cannot be achieved in polynomial time (unless a very unlikely complexity conjecture is confirmed), with worst-case complexity much lower (though super-polynomial) than that of an exact computation. The talk discusses several techniques developed in this new framework.

 


Abstracts Contributed Talks

Session Language (La1)

Rusudan Asatiani Typological Peculiarities of the Georgian Passive and the Information Structure

Passive constructions are usually defined functionally: they are characterized as conversives of the corresponding active constructions, in which a patient is promoted to the subject position and an agent is demoted and transferred into a prepositional phrase. In Georgian, the morphologically marked “passive” verb form does not always reflect such a conversion of an active construction, and sometimes it actually expresses active semantics: dgeba ‘S/he is standing up’, ekačeba ‘S/he tugs hard at smth./smb.’, ac’veba ‘S/he pushes smth./smb.’, etc. These peculiarities of the Georgian passive define the restrictions on its usage in the process of information structuring. The analysis of the information structure of Georgian sentences gives a strong argument for interpreting the Georgian passive as a grammatical category governed mostly by semantic (and not by syntactic) features. The paper presents a cognitive generative model based on semantic features which supposedly mirrors the hierarchically organized processes of grammaticalization of active, passive and so-called medial verbs. [Slides]

Umut Özge Turkish Accusative Indefinites and Presuppositionality

The paper concerns the interpretative effects of accusative marking on indefinite direct objects in Turkish. We consider Enç's (1991) proposal associating the accusative marker with familiarity/partitivity. We argue that once the role of information structure and of the amount of descriptive content in familiarity judgements is discerned, the familiarity thesis becomes less tenable. We also argue that an alternative proposal due to Diesing (1992), which associates the marker with existence presuppositions, has some problems under closer scrutiny. On the positive side, we argue that the behavior of accusative-marked indefinites in Turkish is parallel to that of anaphoric expressions, and suggest analyzing them in a setting like that of van der Sandt (1992), taking the information-structural environment in which the accusative-marked indefinite occurs as a factor affecting the possibility and/or ease of accommodation. [Slides]

George Chikoidze, Nino Javashvili and Nino Amirezashvili Georgian “Ancestors” of the logical implication

The lion's share of mental (philosophical, logical) concepts has its origin in natural language, the “body of thinking”. The most important of these concepts should have their origin in all languages, though with some possible additional nuances which make them more understandable and closer to ordinary users of a given language. In particular, a short review will be given of Georgian conjunctions whose semantics corresponds to logical implication.

 


Session Language (La2)

Margot Colinet The French indefinite pronoun 'quoi que ce soit' and the notion of entropy

This work studies the general meaning of the French indefinite pronoun quoi que ce soit (anything whatsoever), which is used in contemporary French as an NPI, sometimes emphatically, and less representatively as an FCI. The work investigates recent proposals which make use of the notion of entropy in order to account for the semantics of NPIs as well as FCIs. Van Rooij (2003), studying NPIs in questions, proposes to reinterpret the notion of strengthening as entropy, and quite independently Jayez (2010), studying FCIs, reinterprets the notion of widening in terms of 'entropy'. Both proposals take stock of the literature on polarity items since the 1990s, especially Kadmon and Landman (1993) and Krifka (1995), but go an important step further in the study of indefinites.

Yael Greenberg and Keren Khrizman Bixlal: A general strengthening operator in Hebrew

This paper offers an analysis of the Hebrew particle bixlal, which is inspired by preliminary observations and ideas in Migron (2003) and according to which bixlal is a flexible emphatic particle, i.e. one which leads to a strengthening effect in a variety of ways. We discuss the interpretation and distribution of bixlal in light of theories of polarity sensitivity and focus (Kadmon & Landman 1993, Krifka 1995, Chierchia 2011). We further compare our analysis of bixlal to Anderssen’s (2006) “Widening” analysis of the similar German particle überhaupt, and claim that at least for bixlal widening is just one of the ways in which strengthening can be satisfied. We support our conclusion with the behavior of bixlal with multidimensional adjectives involving universal and existential quantification over ‘respects’ (Sassoon 2011).

Jeroen Groenendijk and Floris Roelofsen Algebraic foundations for inquisitive semantics

The paper develops an inquisitive semantics for a first-order language that is motivated by general algebraic concerns. As far as the connectives are concerned, it coincides with, and thus provides an algebraic foundation for, the semantics developed in previous work. However, its treatment of quantifiers diverges from previous work, and a careful assessment of these differences will deepen our understanding of the semantics.

 


Session Language (La3)

Bjorn Jespersen and Giuseppe Primiero Alleged assassins: realist and constructivist semantics for modal modification

Modal modifiers such as Alleged oscillate between being subsective and being privative. If individual a is an alleged assassin (at some parameter of evaluation) then it is an open question whether a is an assassin (at that parameter). Standardly, modal modifiers are negatively defined, in terms of failed inferences or non-intersectivity or non-extensionality. Such modifiers are in want of a positive definition and a worked-out logical semantics. This paper offers two positive definitions and formal accounts. The realist one is elaborated within Tichý's Transparent Intensional Logic and builds on Montague's model-theoretic semantics for modifiers. The constructivist one is based on a variant of Martin-Löf's Constructive Type Theory to accommodate modifiers. [Slides]
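For readers who want the standard background behind 'non-intersectivity' and the subsective/privative distinction mentioned above, the usual textbook classification of a modifier M applied to a noun N is:

    % standard modifier classes (textbook definitions, not the paper's own)
    \[
      \text{intersective: } \llbracket M\,N \rrbracket = \llbracket M \rrbracket \cap \llbracket N \rrbracket, \qquad
      \text{subsective: } \llbracket M\,N \rrbracket \subseteq \llbracket N \rrbracket, \qquad
      \text{privative: } \llbracket M\,N \rrbracket \cap \llbracket N \rrbracket = \emptyset.
    \]

An alleged assassin need not be an assassin, so Alleged is not simply subsective, and need not fail to be one, so it is not privative either; this is the oscillation the abstract starts from.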

Paul Meurer Constructing a large annotated corpus for Georgian – Tools and resources

Large annotated corpora are an invaluable resource in linguistic research. Although such corpora exist for many of the larger languages of the world, there is to date no such resource publicly available for Georgian. In my talk, I will describe my efforts towards building an annotated Georgian corpus, which until now has resulted in a lemmatized and in part morphosyntactically annotated corpus of 120 million words, based on several freely available collections of fiction and non-fiction texts on the Internet. For morphosyntactic annotation, a morphological analyzer with a tagset of 200 tags was developed using the FST (Xerox Finite State Tools) software, whereas disambiguation is done in the Constraint Grammar framework. [Slides]

 

Liana Lortkipanidze and Marina Beridze The issue of Morphological Annotation of the Georgian Dialect Corpus

The Georgian Dialect Corpus is being created at the Arn. Chikobava Institute of Linguistics. Its purpose is to present text collections of all Georgian dialects using corpus technology. At present, the corpus contains narrative and lexicographic data from 16 Georgian dialects; potentially, however, it can also present data from the other Kartvelian languages. Morphological annotation of the corpus is one of the most important stages of the work and is now being carried out by the working group. This article discusses the five-level process of developing the morphological annotation and presents the mechanism for forming and expanding the databases containing the relevant linguistic information.

 


Session Language (La4)

Anton Benz and Fabienne Salfner Discourse Structuring Questions and Scalar Implicatures

In this talk, we discuss the dependency between scalar implicatures and discourse structuring questions. We argue that even the most prototypical examples of scalar implicatures depend on explicitly or implicitly given questions under discussion (Roberts, 1996). In particular, we argue against the idea that scalar implicatures are automatically generated by logical form and hence are independent of the general discourse context as proposed by the so-called localist approaches which integrate implicature calculation into compositional semantics (Chierchia, 2004). Our approach is in line with attempts to show that implicatures in general are discourse phenomena, e.g. (Asher and Lascarides, 2003; Breheny et al., 2006; Geurts, 2007; van Kuppevelt, 1996). However, there is no systematic account of implicatures along these lines as yet.

Nino Nijaradze Synonymy of Syntactic Structures

The paper discusses definitions of the concept of syntactic synonymy suggested in previous research. Tracing the evolution of the notion, it attempts to draw parallels and identify differences between the definitions provided by various authors. Among the many aspects of syntactic synonymy that cause debate, the interpretation of the meaning shared by the synonyms is of particular interest. The article claims that defining syntactic synonyms in terms of their shared grammatical meaning, denotation or even signification fails to provide a satisfactory basis for their identification. The paper suggests the concept of propositional structure as the main criterion for defining syntactic synonymy and reviews four categories of syntactic synonyms that can be identified on the basis of this definition. [Slides]

Sumiyo Nishiguchi CCG for Discourse

In Combinatory Categorial Grammar (CCG) (Steedman 1996, 2000, Szabolcsi 1987), questions and focused sentences have been taken to be of category S. On the other hand, from a semantic perspective questions have been considered to be sets of possible answers (Hamblin 1973), and pragmatically, focus induces a set of alternatives (Rooth 1985, 1992). I claim that, in view of their semantics, interrogatives and focused sentences should be functions from a sentence to another sentence. Such novel categories can then combine with the following sentence in a discourse by means of functional application. [Slides]
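As background for the categorial claim above, the core CCG application rules are, schematically:

    % forward (>) and backward (<) application in CCG
    \[
      X/Y \quad Y \;\Rightarrow\; X \;\; (>), \qquad
      Y \quad X\backslash Y \;\Rightarrow\; X \;\; (<).
    \]

On the proposal sketched above, a question or focused sentence would carry a functional category of the shape S/S rather than plain S (our schematic rendering of "a function from a sentence to another sentence"), so that it combines with the following sentence of the discourse by forward application to yield S.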

 


Session Language (La5)

James Kilbury, Katina Bontcheva, Natalia Mamerow and Younes Samih Historical-Comparative Reconstruction in Finite-State Technology

Computational linguistics (CL) provides tools for many aspects of work in historical linguistics. With software like xfst and foma it is now possible to program replacement rules for sound changes and thus to simulate not only the "downward" derivation of later forms from antecedents, but also efficient "upward" derivation of possible antecedents for given later forms. This corresponds directly to the model of historical sound change in terms of ordered replacement rules, which continues to receive wide acceptance in historical linguistics. We have developed a finite-state based tool to test comparative reconstructions of language families and to compute proto-forms from proposed cognate sets, and have tested the technique with simple examples and data from Slavic and Germanic languages. Moreover, we have built a Java GUI to facilitate the input and display of Unicode characters. [Slides]
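As a toy illustration of the rule-based simulation described above: the actual tool is built with xfst/foma transducers, but the Python sketch below mimics only the "downward" direction with ordinary regular-expression replacement, and all rules and forms are invented for illustration.

    import re

    # Ordered sound-change rules, applied top-down; both rules are invented examples.
    SOUND_CHANGES = [
        (r"p(?=[aeiou])", "f"),   # hypothetical change: p becomes f before a vowel
        (r"k$", "x"),             # hypothetical change: word-final k becomes x
    ]

    def derive(antecedent: str) -> str:
        """Simulate the 'downward' derivation of a later form from an antecedent."""
        form = antecedent
        for pattern, replacement in SOUND_CHANGES:
            form = re.sub(pattern, replacement, form)
        return form

    print(derive("patak"))  # -> 'fatax'

The point of compiling such rules into finite-state transducers rather than plain replacements is that transducers can be run in reverse, which is what makes the efficient "upward" derivation of candidate antecedents possible.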

Zakharia Pourtskhvanidze Pragmatic Word Order and Particle “ḳi” in Georgian

This paper is a contribution to the discussion of information structure and its interface with syntax. The aim of the paper is to show that the pragmatic theme-rheme relations play a significant role in the selection of word order in Georgian. The fixed syntactic position of the particle “ḳi” marks the border between two contrasted informational sets, and this contrasting of informational components can be recognized as one of the pragmatic functions of the particle. The syntactic position reserved for “ḳi” is fixed and functions obligatorily: to the left of the particle, old, given or presupposed information is placed, while to the right we observe the preferred position of new, focused information. In other words, the object of the statement, the theme, stands before the particle “ḳi”, and the statement about the object, the rheme, follows it. The particle “ḳi” is required by the splitting of the message into two contrasted informational sets. Additional evidence in favor of this observation is the fact that an explicit focus particle such as the suffix “-ġa”, which conceptualizes additional focus in Georgian, is always located in the text section after the particle “ḳi”. A general examination of Georgian word order in terms of pragmatic parameters may prompt a rethinking of the widely accepted view of (extremely) free word order in Georgian in favor of pragmatically regular (fixed) word order.

Kerstin Schwabe Propositional correlates and matrix predicates in German

The paper is about German propositional proforms like 'es' or 'darüber' in constructions like 'Frank hat sich (darüber) gefreut, dass p' (Frank has Refl.Acc ProPP enjoyed that p) or 'Frank hat (es) bedauert, dass p' (Frank has it regretted that p). It discusses their syntactic and semantic relationship to their related clause as well as to their matrix predicate. Arguing against the claim that there are different correlate categories, the paper advocates a uniform analysis, i.e. regarding each correlate as a proform referring to a statement expressed by the related clause. Additionally, it shows by example that, and how, a correlate modifies the meaning of the matrix predicate.

 


Session Logic (Lo1)

Philip Kremer Strong completeness of S4 wrt any dense-in-itself metric space

In the topological semantics for modal logic, S4 is well known to be complete wrt the rational line, Q, and wrt the real line, R: these are special cases of completeness wrt any dense-in-itself metric space. It is customary to strengthen completeness to strong completeness, the claim that any consistent set of formulas is satisfiable at some point in the space in question. For countable languages, the completeness proof for Q can be slightly amended to show not only completeness but strong completeness wrt Q. But no similar tweak is available for R, and the question of strong completeness wrt R has remained open. In the current paper, we prove that S4 is strongly complete wrt any dense-in-itself metric space -- and therefore wrt R.
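For orientation, recall the topological (McKinsey-Tarski) semantics referred to above: a formula denotes a subset of the space, Boolean connectives are interpreted set-theoretically, and

    % interior semantics for S4
    \[
      \llbracket \Box\varphi \rrbracket = \mathrm{int}\,\llbracket \varphi \rrbracket, \qquad
      \llbracket \Diamond\varphi \rrbracket = \mathrm{cl}\,\llbracket \varphi \rrbracket .
    \]

Strong completeness wrt a space X then asks that every S4-consistent set of formulas be satisfied at a single point of X under some valuation.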

Okan Açıksöz and Tahsin Öner On the Expressivity of Dynamic Topological Logics

Since dynamic topological logics were first introduced as formalisms to study topological dynamical systems in 1997, most of the research in this area has been concerned with the axiomatization and decidability issues. We choose to study expressive powers of dynamic topological languages. We aim to put dynamic topological logic in use for reasoning about the orbits of the points in topological dynamical systems. We show that the basic dynamic language does not have enough expressive power to express most of the topologically interesting properties of the orbits. To obtain the enriched language which has intended expressive power, we first expand the language by adding some operators and nominals. Then, we suggest a new interpretation for the dynamic modality. Our results indicate that this new interpretation provides more expressive power than the old one does and so, it is more convenient for reasoning about topological dynamical systems. [Slides]

Levan Uridia Nonmonotonic Logics in Topology

In the paper we introduce the topological semantics of nonmonotonic modal logics by translating the minimal model semantics into topological terms.

 


Session Logic (Lo2)

Özgüç Güven Are there 'Bolzanian Roots' of Frege?

Bolzano and Frege are two great logicians who influenced the foundation of modern logic. Frege was born in 1848, the year Bolzano died. Although they published their works in the same language and shared similar intellectual interests in the same academic atmosphere, there is no reference to Bolzano in Frege's writings. Under these circumstances, a question arises: “Did Frege read Bolzano?” Commentators have been discussing this point. Some argue that, due to a lack of exact evidence, it cannot be straightforwardly claimed that Frege read Bolzano. Others assert that Frege did read Bolzano, especially after 1905, while drawing attention to the observation that “there was no influence from Bolzano in Frege’s writings from the Hochleistungsperiode 1890-95”. My point is that, even if there is no direct evidence, one can assess whether Frege read Bolzano by digging into the possible ‘Bolzanian roots’ of Frege’s philosophy. So in this paper, I want to show the close relationship of Bolzano’s philosophy to Frege’s by comparing their philosophical attitudes. Related to this idea, I will also claim that one can find similarities between Bolzano and Frege even before 1905, since Frege put forward his main philosophical attitude in works such as Begriffsschrift (1879), Die Grundlagen der Arithmetik (1884), Funktion und Begriff (1891), Über Sinn und Bedeutung (1892), Über Begriff und Gegenstand (1892), and Grundgesetze der Arithmetik (1893). To support my claims, I will compare Frege's and Bolzano's philosophies on the following points: i) Bolzano’s and Frege’s conceptions of logic. Why did they try to provide a new foundation of logic? Was the leitmotif of these foundations the same? ii) They were both against psychologism. Why were they against psychologism? Could their understandings of psychologism be said to be the same? iii) Bolzano’s ‘Sätze an sich’ are those which have absolute truth values. Are they similar to Frege’s ‘Gedanken’? iv) Their reactions against Kantian philosophy were akin. What problem did the two have with Kant’s views? v) Frege tried to derive arithmetic from logical principles without any appeal to other cognitive sources. Bolzano, too, was a mathematician who rejected the traditional approach to geometry and demanded purely logical foundations. vi) Bolzano used the terms ‘Sinn’ and ‘Bedeutung’ before Frege. Was his intention concerning these terms similar to Frege’s? vii) Bolzano thought that a proposition contains substitutable parts, and that if the truth value of the proposition does not change when those parts are substituted, it is an analytic proposition. Is this conception an anticipation of Frege’s conception of analyticity in terms of function and argument? viii) Bolzano proposed a concept of consequence, namely derivability/deducibility (Ableitbarkeit), on which a truth is discovered by reference to its consequences. Are Frege’s logical inferences compatible with Bolzano’s concept of consequence? Based on these points, I will discuss whether there could be any possible “Bolzanian roots” of Frege.

Nazli Inonu Abduction and Its Uses

Abduction is a logical inference introduced by Charles Sanders Peirce in 1867 as a kind of inference different from deduction and induction. Peirce derived induction and abduction from deduction. He first classified inference into deductive (analytic) inference and synthetic inference, and then classified synthetic inference into induction and hypothesis. He defined induction as a generalization we make from a number of elements of a class to the whole class. On the other hand, he defined hypothesis as an explanation of a curious circumstance or as an inference from two objects’ resemblance in certain respects to their resemblance in other respects.

Georgy Bronnikov Dynamic sentential doxastic logic

A class of constructions is considered that supports logical inferences of certain classes within their sentential complements, but not full closure under logical entailment. These include indirect speech, belief reports, hearsay evidentials and clarity assertions. A logic is proposed that is able to characterise such classes of inferences and is thus suitable for the analysis of these constructions. The logic is based on a sentential model of belief, with rule application by an agent leading to a change of state. The apparatus of dynamic logic is used to describe these changes.

 


Session Logic (Lo3)

Dirk Pattinson and Luigi Santocanale How to Cover without Lifting Relations

In his paper 'Coalgebraic Logic', Larry Moss introduces a general principle for constructing a modal logic that can be interpreted over coalgebras for an arbitrary endofunctor. The coalgebras play the role of frames, and the endofunctor describes their structure: we can have Kripke frames, probabilistic frames, game frames, to name just a few, by varying the endofunctor. This logic is by now well studied, but depends on a crucial assumption: that the endofunctor under consideration preserves weak pullbacks, excluding important examples like (monotone) neighbourhood frames and conditional frames. The conceptual reason is that the semantics is defined in terms of relation lifting -- which is only well-behaved if the functor under consideration preserves weak pullbacks. Here we present a different take on Moss' approach and show how a (modal) logic can be defined in the same spirit, where relation lifting is replaced by applying the functor to the (local) theory map. [Continued] [Slides]
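For reference, the relation lifting on which Moss' original semantics relies is the Barr extension: given a relation R ⊆ X × Y with projections π1 and π2, an endofunctor T lifts R to

    % Barr extension (relation lifting) of an endofunctor T
    \[
      \overline{T}R \;=\; \{\, (T\pi_1(u),\; T\pi_2(u)) \;:\; u \in T R \,\} \;\subseteq\; TX \times TY .
    \]

This lifting interacts well with relation composition exactly when T preserves weak pullbacks, which is the obstacle that replacing relation lifting by the local theory map is meant to remove.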

Oliver Fasching Extension of Gödel logics by unary operators with real-valued semantics

We consider an extension of [0,1]-Gödel logic by a unary operator o that is interpreted by a function f on [0,1] such that f(1)=1 and such that f(x)<f(y) or f(x)=f(y)=1 whenever x<y, for x, y in [0,1]. We show how the set of propositional formulas that are valid for all Gödel interpretations and all such f with the above monotonicity property can be described by a Hilbert-Frege system with infinitely many axiom schemes. This is achieved by exploiting an idea from an original paper of Dummett, in which propositional Gödel logic was axiomatized. Although the semantics of this logic is real-valued, the operator shares many properties with typical modal operators, which often have a possible-worlds semantics.
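For orientation, the underlying [0,1]-Gödel semantics being extended is the standard one, with the new unary operator interpreted through f (textbook clauses; the symbol o for the operator is taken from the abstract):

    % standard Gödel truth functions on [0,1], plus the unary operator
    \[
      v(A \wedge B) = \min(v(A), v(B)), \qquad
      v(A \vee B) = \max(v(A), v(B)), \qquad
      v(A \to B) =
        \begin{cases}
          1 & \text{if } v(A) \le v(B),\\
          v(B) & \text{otherwise,}
        \end{cases}
      \qquad
      v(\mathrm{o}A) = f(v(A)).
    \]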

Ramaz Liparteliani and Phridon Alshibaia MV-algebras with constant elements

In this work we deal with algebraic counterparts of Łukasiewicz logic enriched by a finite number of truth constants. A propositional many-valued logical system which turned out to be equivalent to the expansion of Łukasiewicz logic L by adding into the language a truth constant r for each real r in (0, 1), together with a number of additional axioms, was proposed by Pavelka in [1]. We investigate the varieties of algebras corresponding to Łukasiewicz logic enriched by a finite number of truth constants. Specifically, we show that these varieties contain a non-trivial minimal subvariety generated by a finite linearly ordered algebra which is functionally equivalent to a Post algebra. The analysis and characterization of the appropriate varieties and corresponding logical systems are given. Free and projective algebras are studied in these varieties, as well as projective formulas and the unification problem.

 


Session Logic (Lo4)

Johannes Ebbing, Peter Lohmann and Fan Yang Model Checking for Modal Intuitionistic Dependence Logic

In this paper we consider the complexity of model checking for modal intuitionistic dependence logic (MIDL). MIDL arises from standard modal logic by adding dependence atoms and intuitionistic implication. A dependence atom is written =(p_1,...,p_n), where p_1,...,p_n are atomic propositions; it means that the value of p_n is fully determined by those of p_1,...,p_(n-1). Intuitionistic implication is written F -> G and means that for every part of the structure where F holds, G must hold as well. In this paper we show that the model checking problem for MIDL is PSPACE-complete in general. Furthermore, we consider fragments of MIDL obtained by restricting the operators allowed in the logic. It turns out that apart from known NP-complete as well as tractable fragments, there are also some coNP-complete fragments, e.g. propositional intuitionistic dependence logic (PIDL).
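The two connectives discussed above have the following team-semantics readings (standard formulations; the paper's official clauses may differ in presentation). A team T is a set of points of the model, each point s assigning values s(p) to the atomic propositions:

    % dependence atom and intuitionistic implication in team semantics
    \[
      T \models\; =\!(p_1,\dots,p_n) \iff
      \forall s,t \in T\;\big( s(p_1)=t(p_1) \wedge \dots \wedge s(p_{n-1})=t(p_{n-1}) \;\rightarrow\; s(p_n)=t(p_n) \big),
    \]
    \[
      T \models F \to G \iff
      \forall T' \subseteq T\;\big( T' \models F \;\Rightarrow\; T' \models G \big).
    \]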

Samuel Bucheli, Roman Kuznets and Thomas Studer Decidability for Justification Logics Revisited

Justification logics are propositional modal-like logics that, instead of statements 'A is known', include statements of the form 'A is known for reason t', where the term t can represent an informal justification for A or a formal proof of A. In the present work, we introduce model-theoretic tools, like filtrations and a certain form of generated submodels, in the context of justification logic in order to obtain decidability results. We not only reprove already known results in a new and simple way but also obtain new results. In particular, we use our submodel construction to establish decidability for a justification logic with common knowledge for which no decidability proof was previously available. [Slides]

 


Session Computation (Co1)

Pavel Braslavski and Vadim Kantorov Automatically Assessing Content Quality in Community Question Answering Sites: a Work-in-Progress

We report work in progress on the automatic classification of community question answering content into informative and non-informative. We collected a considerable dataset from Otvety@Mail.Ru and manually labeled about 1,500 pages. We use three groups of features for classification: content features (separately for questions and answers), thread activity features, and interaction-based features. The expected results might be incorporated into the indexing policy of a general-purpose search engine. [Slides]
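A minimal sketch of the kind of supervised set-up described above, assuming a scikit-learn-style classifier; the individual features and numbers below are invented placeholders standing in for the three feature groups named in the abstract (content, thread activity, interaction), not the authors' actual feature set or data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # One row per question-answer thread; placeholder features
    # [question_length, number_of_answers, asker_marked_best_answer].
    X = np.array([[120, 5, 1],
                  [ 15, 1, 0],
                  [ 80, 3, 1],
                  [ 10, 0, 0]])
    y = np.array([1, 0, 1, 0])  # 1 = informative, 0 = non-informative (toy labels)

    clf = LogisticRegression().fit(X, y)
    print(clf.predict([[60, 2, 1]]))  # classify a previously unseen thread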

 


 

Session Computation (Co2)

Mikheil Rukhaia CERES and Fast Cut-elimination

It is well known that cut-elimination is of nonelementary complexity in general. In this paper we identify classes where cut-elimination is elementary. We describe two methods of cut-elimination, Gentzen's method and the CERES method. It has been proved that CERES gives a nonelementary speed-up over Gentzen's method, and we use CERES to identify fast classes of cut-elimination. The idea is to identify classes of LK-proofs whose characteristic clause set falls into one of the decidable subclasses of first-order logic. We extend some existing classes to more general ones and prove that cut-elimination remains fast. Finally, we introduce a new class of LK-proofs and prove that cut-elimination is fast on it.

Avtandil Arabuli, Gia Shervashidze, Eter Soselia, Marine Ivanishvili and Rusudan Asatiani Thesaurus of the Georgian Language

Georgian is one of the South Caucasian languages with the oldest literary traditions. The Georgian script was devised around 400 A.D. in order to facilitate the dissemination of Christian literature, and one of the most important current tasks in the preservation of the Georgian cultural heritage is the study of the historical development of the Georgian language based on the analysis of Georgian literary monuments spanning sixteen centuries. To this end, it seems essential to publish thesauri, and the project Thesaurus of the Georgian Language serves this purpose. The goal of the project is the creation of a thesaurus database through the computer processing of computerized lexical archives. The resulting program, Thesaurus of the Georgian Language (www.saunje.nekeri.net), arranges lexical items in alphabetical order, each of them provided with all its attested word-forms together with illustrative contexts; the word-forms themselves are arranged alphabetically in the “nest” for each lexical item, while the illustrative material from dated manuscripts and published texts is arranged chronologically. In this way, it is possible to date more exactly the appearance of a given word in Georgian and to establish the course of its historical development. Moreover, each word-form is given in the form in which it occurs in the respective original manuscripts, preserving abbreviations and spelling specificities. The Thesaurus of the Georgian Language will be a significant acquisition not only for Georgian philology, Georgian linguistics and Kartvelology, but also for Georgian culture in general, and it will pave the way for many other projects of great importance: ontological, concise, historical and etymological dictionaries, various kinds of morphological and syntactic models for computer processing, etc. It attracts widespread public interest as well as the scientific interest of scholars in Georgia and abroad. The software for the Thesaurus of the Georgian Language has been developed, and on the basis of its database a large number of other electronic resources can be created as well.

 

Stanislav Kikot, Roman Kontchakov and Michael Zakharyaschev A note on Ontology-Based Data Access

In ontology-based data access (OBDA), potentially incomplete relational data is enriched by means of ontologies representing intensional knowledge of the application domain. We consider the problem of query answering in OBDA. Certain ontology languages have been identified as FO-rewritable (e.g., DL-Lite and sticky-join TGDs), which means that the ontology can be incorporated into the user's query, thus reducing OBDA to standard relational query evaluation. However, all known query rewriting techniques produce exponentially large queries (in the size of the user's query), which presents a difficulty for relational database engines. In this talk, we will analyze which ontology languages do and do not admit polynomial query rewritings.
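A toy illustration of FO-rewriting in the DL-Lite spirit (ours, not an example from the talk): with the ontology axiom Professor ⊑ Teacher and the user query q(x) ← Teacher(x), the rewriting folds the axiom into the query,

    % folding one inclusion axiom into an atomic query
    \[
      q'(x) \;=\; \mathit{Teacher}(x) \,\vee\, \mathit{Professor}(x),
    \]

which can then be evaluated over the data alone. The blow-up problem arises because, for larger ontologies and conjunctive queries, the number of such disjuncts (or of rewritten conjunctive queries) can grow exponentially in the size of the original query.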

 
