The Deducteam Seminar

This page has not been maintained since August 2015. See the new page here.



2015-07-10, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Term Inference and Unification in Lambda Pi Calculus Modulo, Gaëtan Gilbert (Deducteam, Inria Paris-Rocquencourt)

I will present the algorithm providing term inference in the Dedukti checker.
Unification constraints are produced by bidirectional elaboration, and solving them is delayed safely using guarded terms. Heuristics are then applied as simplification rules with backtracking.

2015-07-03, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Every super-polynomial proof in purely implicational minimal logic has a polynomially sized proof in classical implicational propositional logic, Hermann Haeusler (PUC Rio)

In this talk we show that any formula A with a super-polynomially sized proof in minimal implicational logic has a polynomially sized proof in classical implicational propositional logic. This fact provides an argument in favor of the claim that any classical propositional tautology has short proofs, i.e., that NP = CoNP.

2015-06-26, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Lambda-calculus, Sharing, and Time Cost Models, Beniamino Accattoli (Parsifal, Inria Saclay)

This talk will survey recent works about the understanding of time cost models for the lambda-calculus (in collaboration with Dal Lago, Barenbaum & Mazza, and Sacerdoti Coen). We will first focus on the weak lambda-calculus, the model underlying functional programming languages, for which it is well-known that the number of beta-steps is a reasonable cost model. The proofs of such a result rely on alternative representations of the lambda-calculus extended with some form of sharing, such as abstract machines or graphical formalisms. We will provide an abstract view, using as a tool a lambda-calculus with explicit sharing, the Linear Substitution Calculus, and encompassing all main evaluation schemes (call-by-name/value/need). First, we will discuss the overhead due to sharing itself, and then the overhead of implementing it via an abstract machine. The second part of the talk will discuss strong reduction, the model underlying proof assistants, focusing on the recent result that the number of beta-steps is a reasonable cost model also in this case. We will explain why an additional layer of sharing, called useful sharing, is required, and then we will reconsider sharing and abstract machine overheads in this new setting.

2015-06-12, 10:00, Rose room, INRIA, 23 avenue d'Italie

From linear logic to process algebras: focusing and partial-order reduction, David Baelde (LSV, ENS Cachan)

I will present some joint work with Lucca Hirschi and Stéphanie Delaune, on designing partial-order reduction techniques for checking trace equivalence in (a fragment of) the applied pi-calculus. These techniques have led to significant optimizations of the Apte tool for checking equivalence-based security properties of protocols. In this talk, I will focus on the roots of this work in, and its connections with, focusing from linear logic.

2015-06-05, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

The exp-log normal form of types and canonical forms of terms, Danko Ilik (Parsifal, Inria Saclay)

In the presence of sum types, the lambda calculus does not enjoy uniqueness of eta-long normal forms. The canonicity problem also appears for proof trees of intuitionistic logic. In this talk, I will show how the problem becomes easier if, before choosing a canonical representative from a class of beta-eta-equal terms, one coerces the term into the exp-log normal form of its type. This type normal form arises from the one for exponential polynomials.
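To hint at where the name comes from: the type normal form is driven by the standard type isomorphisms that mirror identities of exponentials (a sketch of the standard isomorphisms, not necessarily the exact presentation used in the talk):

```latex
A \to (B \times C) \;\cong\; (A \to B) \times (A \to C)
  % mirrors (b \cdot c)^{a} = b^{a} \cdot c^{a}
(A + B) \to C \;\cong\; (A \to C) \times (B \to C)
  % mirrors c^{a+b} = c^{a} \cdot c^{b}
A \to (B \to C) \;\cong\; (A \times B) \to C
  % mirrors (c^{b})^{a} = c^{a \cdot b}
```

Reading a type through such identities is analogous to putting an exponential polynomial into a normal form.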

2015-05-29, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Synthesis for Communication-centred Programming, Mariangiola Dezani (Turin University)

The increasing number of heterogeneous devices interacting in networks calls for a new programming style, usually called communication-centred programming. In this scenario, possible participants in choreographies expose their interfaces, describing the communications they offer. Interaction protocols can be synthesised through a phase of negotiation between participants, in which different pieces of code can be composed in order to obtain the desired behaviour. At run time, some unexpected event can make the current choreography no longer executable. In this case the participants should be able to adapt themselves in order to continue the interaction successfully. In this adaptation, both new interfaces and new code for participants may need to be synthesised.

Works in progress in type theory modulo type isomorphisms, Alejandro Díaz-Caro (National University of Quilmes)

I will recall my last seminar at Deducteam on the simply typed lambda calculus modulo type isomorphisms; then I will show some ongoing work derived from it, namely an implementation of this system in Haskell and two non-trivial extensions to polymorphic types modulo isomorphisms.

2015-05-22, 10:30, Vert 2 room, INRIA, 23 avenue d'Italie

Automated verification of security protocols: the case of indistinguishability, Steve Kremer (Cassis, Inria Nancy & LORIA)

Formal, symbolic techniques are extremely useful for modelling and automatically analyzing security protocols. Initially, these techniques were mainly developed to analyze authentication and confidentiality properties. Both these properties are trace properties and efficient tools for their verification exist. In more recent years anonymity-like properties have received increasing interest. Many flavors of anonymity properties are naturally expressed in terms of indistinguishability and modeled as an observational equivalence in process calculi. However, only very few verification tools are currently able to automate the analysis of such indistinguishability properties. In this talk I will describe the techniques behind the aKiss tool with examples from privacy properties in electronic voting.

2015-04-17, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

The cost of usage in the lambda-calculus, Jean-Jacques Lévy (Pi.R2, Inria)

Joint work with Andrea Asperti (Bologna)

A new "inductive" approach to standardization for the lambda-calculus has been introduced by Xi, allowing him to establish a double-exponential upper bound for the length of the standard reduction relative to an arbitrary reduction. We refine Xi's analysis, obtaining better bounds, especially for computations producing small normal forms. For instance, for terms reducing to a boolean, we are able to prove that the length of the standard reduction is at most a mere factorial of the length of the shortest reduction sequence. The methodological innovation of our approach is that instead of trying to count the cost of producing something, as customary, we count the cost of consuming things. The key observation is that the part of a lambda-term that is needed to produce the normal form (or an arbitrary rigid prefix) may grow rapidly along a computation, but can only decrease very slowly (actually, linearly). Work presented at LICS 2013.

2015-04-03, 10:00, Rose room, INRIA, 23 avenue d'Italie

Protocol security: a proof theoretic perspective, Hubert Comon (LSV, ENS Cachan)

I will show how to formulate the insecurity (reachability property) of a cryptographic protocol as the existence of a proof in a well-chosen inference system.
Then, I'll study some proof simplification rules that yield a small-attack property.
This was the subject of Vincent Bernat's PhD (defended in 2006).

2015-03-27, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Formal verification of numerical computations: application to the 1D wave equation, Sylvie Boldo (Toccata, Inria Saclay)

Computer arithmetic has applied formal methods and formal proofs for years. As the systems may be critical and as the properties may be complex to prove (many sub-cases, error-prone computations), a formal guarantee of correctness is a wish that can now be fulfilled. This talk will present a chain of tools to formally verify numerical programs. The idea is to precisely specify what the program requires and ensures. Then, using deductive verification, the tools produce proof obligations that may be proved either automatically or interactively in order to guarantee the correctness of the specifications. Finally, an application from numerical analysis will be presented in detail.

2015-03-20, 10:00, Rose room, INRIA, 23 avenue d'Italie

Analysing privacy-type properties in cryptographic protocols, Stéphanie Delaune (LSV, CNRS & ENS Cachan)

Cryptographic protocols aim at securing communications over insecure networks such as the Internet, where dishonest users may listen to communications and interfere with them. For example, passports are no longer purely paper documents. Instead, they contain a chip that stores additional information such as pictures and fingerprints of their holder. In order to ensure privacy, these chips include a mechanism, i.e. a cryptographic protocol, that does not let the passport disclose private information to external users. This is just a single example, but of course privacy appears in many other contexts, such as RFID technologies or electronic voting.
Actually, the analysis of cryptographic protocols relies on several concepts which are well-established in the field of process algebras. In particular, the notion of behavioral equivalence allows one to express indistinguishability, which is a crucial concept to model privacy-type security properties (e.g. anonymity, unlinkability, ...). We will discuss some recent advances that we have made in analysing equivalence-based security properties automatically, and we will review some issues that remain to be solved to obtain efficient verification tools.

2015-03-25, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Special session on encodings in Dedukti (Raphaël Cauderlier & Ali Assaf)

2015-03-13, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Semantics for Higher Order Constraint Logic Programming, Jim Lipton (Wesleyan University)

We define a Kripke semantics for Intuitionistic Higher-Order Logic with constraints formulated within Church's Theory of Types via the addition of a new constraint base type. We then define an executable fragment, hohh(C), of the logic: a higher-order constraint logic programming language with typed λ-abstraction, implication and universal quantification in goals, and give a modified model theory for this fragment. Both formal systems are sound and complete for their respective semantics. We will sketch a simple semantics-based proof that the language hohh(C) satisfies the uniformity property of Miller et al.

2015-03-06, 10:00, Rose room, INRIA, 23 avenue d'Italie

Confluence of layered systems, Jean-Pierre Jouannaud (Deducteam, Inria & Université Paris Sud)

Joint work with: Jiaxiang Liu and Mizuhito Ogawa

Knuth and Bendix showed long ago that confluence of a terminating first-order rewrite system can be reduced to the joinability of its finitely many critical pairs. Felgenhauer showed recently that confluence of a left-linear first-order rewrite system can be reduced to the existence of decreasing diagrams for its finitely many parallel critical pairs.
In this presentation, we investigate the case of layered rewrite systems, whose rules are non-overlapping at strict subterms and redex-depth non-increasing. Root overlaps are allowed in either finite or infinite trees. Rules may be non-terminating, non-left-linear, or non-right-linear. Using novel techniques (sub-rewriting and cyclic unifiers), we show that layered systems are confluent provided their root critical pairs, in finite or infinite trees, have decreasing diagrams. Robustness of the result will be discussed.

2015-02-20, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Teaching Zenon how to count, Guillaume Bury (Deducteam, Inria & ENS)

Program analysis and verification often involve verifying arithmetic constraints. In order to deal with these constraints, we will show an extension of the tableaux rules for linear arithmetic (also called Presburger arithmetic), as well as a proof search method that relies on the general simplex (i.e. the usual simplex algorithm but without the optimization problem) for rational and real arithmetic, and uses a branch & bound strategy for integer arithmetic. We will then present the results of the implementation of this proof search method in the automated theorem prover Zenon.
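To make the branch & bound step concrete (a hypothetical sketch, not Zenon's actual code): when the rational simplex returns a solution assigning a fractional value q to an integer variable x, the problem splits into two subproblems, one with the added constraint x ≤ ⌊q⌋ and one with x ≥ ⌈q⌉.

```ocaml
(* Hypothetical sketch of the branching step in branch & bound:
   find a variable with a fractional value in the rational solution
   and split on the two integer bounds around it. *)
let branch (solution : (string * float) list) =
  match List.find_opt (fun (_, q) -> Float.rem q 1.0 <> 0.0) solution with
  | None -> None (* every value is integral: an integer solution already *)
  | Some (x, q) ->
      (* two subproblems: add x <= floor q, or add x >= ceil q *)
      Some ((x, `Le, floor q), (x, `Ge, ceil q))
```

Repeating this split on each subproblem, and pruning those whose rational relaxation is unsatisfiable, yields the usual branch & bound search tree.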

2015-02-13, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

A Constructive Proof of the Topological Kruskal Theorem, Jean Goubault-Larrecq (LSV, ENS Cachan)

We give a constructive proof of Kruskal's Tree Theorem—precisely, of a topological extension of it. The proof is in the style of a constructive proof of Higman's Lemma due to Murthy and Russell (1990), and illuminates the role of regular expressions there. In the process, we discover an extension of Dershowitz' recursive path ordering to a form of cyclic terms which we call μ-terms. This all came from recent research on Noetherian spaces, and serves as a teaser for their theory.

2015-02-06, 10:30, Orange 2 room, INRIA, 23 avenue d'Italie

Intuitionistic checker for classical proofs, The Imitation Game, Zakaria Chihani (Parsifal, Inria & Ecole polytechnique)

Given a Foundational Proof Certificate in classical logic, with or without cuts, we would like to check it in an intuitionistic checker without any changes required to the proof evidence that was intended for a classical checker.

We will see an overview of existing double negation translations and the flaws that make them unsuitable for our purpose. Then we will describe a unique embedding proposed by Kaustuv Chaudhuri that offers a one-to-one correspondence between focused classical proofs and their focused intuitionistic counterparts.
We show how we take advantage of that to check the FPC independently of the checker's logic (classical or intuitionistic).

2015-01-30, 10:30, Orange 2 room, INRIA, 23 avenue d'Italie

Defining the semantics of proof evidence, Dale Miller (Inria Saclay & LIX)

Automatic and interactive theorem provers build proofs in many different formats. These formats range from, say, natural deduction to resolution refutations and from Herbrand instances to oracle strings. I will describe our progress on a multi-year project to design a framework for defining the semantics of a wide range of "proof languages". Recent advances in structural proof theory, particularly results surrounding focused proof systems, provide our framework with the flexibility to integrate both deterministic and non-deterministic computation into deductions over classical and intuitionistic logics. I will also outline our development of a reference proof checker for this framework.

2015-01-23, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Wedding Boolean Solvers with Superposition: a Societal Reform, Simon Cruanes (Deducteam, Inria & Ecole polytechnique)

Ever since Mankind learnt how to tame fire and propositional logic, those became efficient tools for solving many other problems. In the context of automated theorem proving, the latter technology proves especially useful. In 2014, A. Voronkov presented Avatar, a simple and efficient way to integrate a boolean SAT solver within a superposition prover. We will present the core ideas of Avatar, its advantages over regular splitting; then we will present a (work in progress) new way to perform structural induction in a superposition prover by using QBF technology -- QBF is an extension of SAT where every boolean variable is explicitly quantified either universally or existentially.


2014-12-16, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Past, Present and Future of Nuprl, Vincent Rahli (PRL, Cornell University)

In this talk I will present the framework we have built within the Nuprl proof assistant to specify, verify and generate provably correct distributed systems. I will then discuss an interesting impact this work had on Nuprl, namely how it prompted us to undertake the verification of Nuprl in Coq. Finally, I will discuss major changes that we have made to Nuprl, many of them driven by our work on formally verifying Nuprl.

2014-12-12, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Double Negation Translations as Morphisms, Olivier Hermant (CRI, MINES ParisTech)

Double negation translations are a tool for translating proofs in classical logic into proofs in intuitionistic logic. They have been known since Kolmogorov (1925) and Gödel-Gentzen (1933), and have been given many variants and refinements.

In this talk, we will be interested in new variants of double-negation translations which are morphisms, in the sense that only the connectives are transformed. This kind of translation was introduced by Dowek (2012) and can serve to introduce some classical connectives into the intuitionistic predicate calculus, while the others remain intuitionistic. Unfortunately, this translation is verbose. We will show how, by the use of focusing, we can lessen the number of introduced negation symbols.
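For reference, the classical Gödel-Gentzen translation is already close to being such a morphism: it acts connective by connective (a standard presentation; the variants discussed in the talk may differ in details):

```latex
P^{g} = \neg\neg P \quad (P \text{ atomic}) \qquad
(A \land B)^{g} = A^{g} \land B^{g} \qquad
(A \to B)^{g} = A^{g} \to B^{g}
(A \lor B)^{g} = \neg(\neg A^{g} \land \neg B^{g}) \qquad
(\forall x\, A)^{g} = \forall x\, A^{g} \qquad
(\exists x\, A)^{g} = \neg\forall x\, \neg A^{g}
```

The verbosity is visible here: every atom, disjunction and existential quantifier introduces fresh negation symbols.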

2014-11-28, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

New advances on mimp-graph representation, Vaston Costa (Universidade Federal de Goiás)

In this talk we are going to present some advances involving mimp-graphs, a graph representation developed by the TecMF team at PUC-Rio to represent natural deduction derivations. In particular, we will present the normalisation process, which generates a new mimp-graph whose size does not exceed that of the original one.

2014-11-21, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Zelus, a synchronous language with Ordinary Differential Equations, Marc Pouzet (PARKAS, ENS & Inria)

Zelus is a new programming language for modeling and implementing hybrid systems in which software interacts with a physical environment. From a user's perspective, its main originality is to extend an existing synchronous language with Ordinary Differential Equations (ODEs). The extension is conservative: any synchronous program expressed as data-flow equations and hierarchical automata can be composed arbitrarily with ODEs in the same source code. A dedicated type system and causality analysis ensure that all discrete changes are aligned with zero-crossing events. Programs are statically scheduled and translated into sequential code, and the final code is paired with an off-the-shelf numerical solver (Sundials CVODE).

During the talk, I will show some causality issues with Simulink and Modelica and how they are statically detected in Zelus.

This is joint work with Albert Benveniste, Benoit Caillaud, Timothy Bourke and Bruno Pagano.

2014-11-14, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Implementing an interactive proof system, Arnaud Spiwack (CRI, MINES ParisTech)

In this talk, I want to share my experience in implementing a new interactive proof system for Coq, and suggest answers on practical and theoretical grounds to the questions "what?" and "how?". What should an interactive proof system for a rich type theory such as Coq's feature? And how can such an interactive proof system be implemented? To answer the latter question, in particular, I will summon a pot-pourri of computational effects, monads, and impredicative encodings. Specifically, the interactive proof system is a monad, implemented as a stack of monadic effects, and uses impredicative encodings for additional features and efficiency.

If you have always been bewildered by the double-continuation implementation of backtracking, or have forever wondered whether impredicative encodings had any application whatsoever, I will address these questions. The answer to the latter, by the way, is: yes, the former.

2014-11-07, 10:30, Orange 2 room, INRIA, 23 avenue d'Italie

A Superposition Calculus for Abductive Reasoning, Nicolas Peltier (CAPP team, LIG, CNRS)

Abduction can be defined as the process of inferring plausible explanations (or hypotheses) from observed facts (conclusions). This form of reasoning plays a central role in system verification, particularly for identifying bugs and providing hints to correct them.
We describe an approach to perform abductive reasoning that is based on the superposition calculus. We show how the inference rules of the superposition calculus can be adapted to obtain a calculus that is deductively complete for clauses built on a given set of ground terms, thus guaranteeing that all required explanations can be generated. This calculus enjoys the same termination properties as the superposition calculus: in particular, it is terminating on ground extensions of decidable theories of interest in software verification.

We describe techniques for storing sets of abduced clauses efficiently, and show how usual trie-based approaches for representing sets of propositional clauses in a compact way can be adapted and extended in order to denote equational clauses up to equivalence (modulo the axioms of equality).
We show how some specific theories such as arithmetic can be handled by relying on external tools for checking satisfiability and eliminating useless terms.
We also identify hints for improvements and provide lines of on-going and future research.

2014-10-31, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

dTL2: Differential Temporal Dynamic Logic with Nested Modalities for Hybrid Systems, Jean-Baptiste Jeannin (Carnegie Mellon University)

Differential Dynamic Logic can express important properties about Cyber-Physical Systems, by combining discrete assignments and control structures with differential equations. However it can only express properties about the end state of a system. In this talk, we introduce the differential temporal dynamic logic dTL2, a logic to specify properties about the intermediate states reached by a hybrid system. It combines differential dynamic logic with temporal logic, and supports some linear time temporal properties of LTL. It extends differential temporal dynamic logic dTL with nested temporalities. We provide a semantics and a proof system for the logic dTL2, and show its usefulness for nontrivial temporal properties of hybrid systems. We take particular care to handle the case of alternating universal dynamic and existential temporal modalities and its dual, solving an open problem formulated in previous work.

2014-10-24, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Cut admissibility by saturation, Guillaume Burel (CPR, CEDRIC, ENSIIE & CNAM)

Deduction modulo is a framework in which theories are integrated into proof systems such as natural deduction or sequent calculus by presenting them using rewriting rules. When only terms are rewritten, cut admissibility in those systems is equivalent to the confluence of the rewriting system, as shown by Dowek, RTA 2003, LNCS 2706. This is no longer true when considering rewriting rules involving propositions. We show that, in the same way that it is possible to recover confluence using Knuth-Bendix completion, one can regain cut admissibility in the general case using standard saturation techniques, namely ordered resolution with selection, and superposition. This work relies on a view of proposition rewriting rules as oriented clauses, just as term rewriting rules can be seen as oriented equations. This also leads us to introduce an extension of deduction modulo with conditional term rewriting rules.
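To illustrate the clause view (a standard kind of example from deduction modulo, assumed here for concreteness rather than taken from the talk): a proposition rewriting rule unfolds an atomic proposition into a formula, and can be read as a clause in which the left-hand atom is the selected literal, just as a term rewriting rule l → r is the equation l = r oriented by l > r.

```latex
% proposition rewriting rule
x \times y = 0 \;\longrightarrow\; x = 0 \lor y = 0
% read as the oriented clause, with the left-hand atom selected
\neg(x \times y = 0) \lor x = 0 \lor y = 0
```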

2014-10-17, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Relation Algebra, Allegories, and Logic Programming, Emilio Jesús Gallego Arias (CRI, MINES ParisTech)

[Joint work with Jim Lipton and Julio Mariño]

Logic Programming is built on the idea that the correspondence between proof search and computation can be effective as a programming tool.
Efficient execution of logic programs depends on the use of existential (also called "logic") variables: unknowns that allow an existential witness to be provided later in the proof search, carrying a global scope.
Traditional semantics for CLP tackle logic variables in the meta-level: they have no place in the actual semantics. In this scenario, algebraic --- or point-free style --- semantics for logic programs are a challenge.
In the 1940s, Tarski took up the development of Peirce and Schröder's relation algebra, proposing its use as a purely algebraic "variable-free" device capturing the whole of set theory.
We draw on Tarski's (and Freyd and Maddux's) work to develop an algebraic semantics for constraint logic programming.
In the first part of the work [1], a logic program is interpreted or compiled into a relation algebra. All reasoning --- including execution --- is captured by the associated equational theory.
In the second part of the work [2], we build an efficient declarative execution mechanism for Logic Programming. Our machine is based on the categorical version of the calculus of relations --- allegory theory. In particular, we start from the Lawvere Category of a logic program. Lawvere Categories represent algebraic theories and in our setting they capture the signature of the program.
The regular completion of the Lawvere Category of a program generates a tabular allegory --- where every relation is tabulated by a pair of functions --- giving a new semantics to logic programs.
Execution is based on computing normal forms of diagrams for relation composition. This single primitive encompasses unification, garbage collection, parameter passing, and environment creation and destruction.
The categorical model seems well suited for accommodating extensions, and we discuss constraints, types, functions and monads.

[1] /

2014-10-10, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

A point on fixpoints in posets, Frédéric Blanqui (Deducteam, Inria)

Let (X,≤) be a non-empty strictly inductive poset, that is, a non-empty partially ordered set such that every non-empty chain Y has a least upper bound lub(Y) ∈ X, a chain being a subset of X totally ordered by ≤. We are interested in sufficient conditions such that, given an element a0 ∈ X and a function f:X → X, there is some ordinal k such that ak+1 = ak, where (ak) is the transfinite sequence of iterates of f starting from a0, defined by ak+1 = f(ak) and, for every limit ordinal k, ak = lub{al | l < k}. Such an ak is then a fixpoint of f.

This note summarizes known results about this problem and provides a slight generalization of some of them.
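The ω-length prefix of this iteration can be sketched in code (an illustrative sketch, assuming the fixpoint is reached in finitely many steps, as happens for instance when X is finite and f is inflationary):

```ocaml
(* Iterate f from a0 until a fixpoint is reached: this computes the
   omega-length prefix a0, f a0, f (f a0), ... of the transfinite
   sequence, stopping at the first k with a_{k+1} = a_k. *)
let rec iterate f a =
  let a' = f a in
  if a' = a then a else iterate f a'

(* Example on the chain 0 <= 1 <= ... <= 5: f x = min (x + 1) 5 is
   monotone and inflationary; iteration from 0 stops at the fixpoint 5. *)
let fix_example = iterate (fun x -> min (x + 1) 5) 0
```

The interesting cases in the note are precisely those where this finite iteration is not enough and the sequence must be continued through limit ordinals via lub.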

2014-10-03, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Models and termination of proof-reduction in the lambda-Pi-calculus modulo theory, Gilles Dowek (Deducteam, Inria)

We define a notion of model for the lambda-Pi-calculus modulo theory, a notion of super-consistent theory, and prove that proof-reduction terminates in the lambda-Pi-calculus modulo a super-consistent theory. We prove this way the termination of proof-reduction in two theories in the lambda-Pi-calculus modulo theory, and their consistency: an embedding of Simple type theory and an embedding of the Calculus of constructions.

2014-09-26, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Mechanising large formal proofs: an example, Jean-Raymond Abrial

My thesis is that the formal construction of software systems and that of mathematical proofs take advantage of similar approaches: abstraction, decomposition, design patterns, etc. We have a certain experience in developing both kinds of construction with the Rodin Platform tool. We felt that the lessons learned from one kind of construction can help improve the other one, and vice versa. Our experience in constructing formal proofs of pure mathematical theorems is not very wide but, we think, nevertheless significant; it includes the Cantor-Bernstein theorem, the well-ordering theorem, the recursion theorem, the axiomatic definition of real numbers together with the Archimedean property and the intermediate value theorem, etc.

Recently I came across the very interesting topological proof of the infinitude of prime numbers developed in 1955 by Hillel Furstenberg. In my presentation, I will use this example to explain how we develop the mechanisation of mathematical proofs with the Rodin tool.

2014-09-19, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Dependent Typing for Multirate Faust, Pierre Jouvelot (CRI, MINES ParisTech)

Faust (Functional Audio Stream) is a purely functional Domain-Specific Language dedicated to signal processing applications in the audio and music domains. Faust was created more than 10 years ago by Yann Orlarey at Grame, a Centre national de création musicale, and is used in various institutions around the world such as IRCAM or Stanford CCRMA. We will provide a general introduction to Faust, while focusing on its sophisticated type system. We will discuss some of the features expected to be introduced into the next version of Faust, in particular its multirate extension and vector operations, important for spectral audio processing, and their impact on its static semantics. This work is funded partly by the ANR FEEVER project.

2014-09-12, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

SMT and Traces, Pascal Fontaine (Veridis, Inria & LORIA)

Satisfiability Modulo Theories (SMT) solvers check the satisfiability of logic formulas written in a language where symbols are interpreted in a theory, most often some combination of the empty theory (uninterpreted symbols), an arithmetic theory (e.g. linear arithmetic on integers and reals, or non-linear real arithmetic), and various data-structure theories (e.g. lists, arrays, bit-vectors, strings). In this talk, we present the open-source SMT solver veriT. After reviewing the underlying techniques, we briefly address some current work, in particular quantifier handling and non-linear reasoning. veriT has been designed with proofs in mind; the proof format is introduced, and we finally discuss some challenges about proof production and exchange for SMT.

2014-09-05, 11:00, Orange 2 room, INRIA, 23 avenue d'Italie

Theorem Proving using Deep Inference, Kaustuv Chaudhuri (Inria)

"Deep Inference" is a term used to describe logical inference systems where the principal formulas in an inference rule need not be a "top level" formula but can be found in a deeply nested structure. In the most extreme form of deep inference, the calculus of structures, the notion of "formula" and "structure" (or "sequent") coincide, and every logical entailment can be "applied" to any arbitrary subformula. The inference system then behaves exactly as a rewrite system. In this talk I will give just enough of an introduction to the calculus of structures to be able to focus on one aspect that I find personally interesting: exploiting depth in theorem proving. For interactive theorem proving, in particular, I will try to explain why I think deep inference can show us a way out of the linguistic tarpit of formal proof languages.

2014-07-11, 10:00, Bleu 1 room, INRIA, 23 avenue d'Italie

Sharing resources in a proof by means of reference instead of copying, Vaston Costa (Universidade Federal de Goiás)

In logic we face the problem of dealing with big deductions, since SAT is NP-complete. In this talk I want to put forward some ideas on how to shorten reductions in Dedukti using a new graph structure developed for minimal propositional logic. The main idea is to exchange knowledge about the subject in order to produce a more efficient Dedukti (if possible).

2014-06-27, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Logtk : A Logic ToolKit for Automated Reasoning and its Implementation, Simon Cruanes (Ecole polytechnique & Deducteam, Inria)

We describe the design and implementation of Logtk, an OCaml library for writing automated reasoning tools that deal with (possibly typed) first-order logic. The library provides data structures to represent terms, formulas and substitutions, and algorithms to index and unify terms. It also contains parsers and a few tools to demonstrate its use. It is the basis of a full-fledged superposition prover.

2014-06-13, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Quantitative Types for Higher-Order Languages, Delia Kesner (PPS, Univ. Paris-Diderot)

In this talk we consider non-idempotent intersection types to characterize different properties of higher-order lambda calculi.
The first part of the talk focuses on the linear substitution calculus, a lambda-calculus with partial substitution acting at a distance that naturally expresses linear-head reduction, a notion of evaluation of proof nets that is strongly related to abstract machines. We define two non-idempotent intersection type systems for the linear substitution calculus. We show that our first (resp. second) quantitative type system characterizes the sets of linear-head and weakly (resp. strongly) normalizing terms. All such characterizations are given by means of combinatorial arguments, i.e. there is a measure based on type derivations which decreases with respect to each considered reduction relation.
The second part of the talk focuses on the inhabitation problem for intersection types. While this problem is known to be undecidable in the case of idempotent intersection, we prove decidability (through a sound and complete algorithm) for the non-idempotent case. We then explain how the inhabitation problem is related to the characterization of the solvability property, and then we focus on the extended framework of pattern calculi, where abstractions are allowed not only on variables but also on algebraic patterns. The solution of the inhabitation problem for non-idempotent intersection types in the algebraic pattern calculus turns out to give a complete characterization of solvability for such a language.
This talk is based on joint works with Antonio Bucciarelli, Simona Ronchi Della Rocca and Daniel Ventura.

2014-05-23, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Alt-Ergo: An SMT Solver For Program Verification, Mohamed Iguernelala (OCamlPro)

In this talk, I will start by introducing the foundations of the Satisfiability Modulo Theories (SMT) technology. Then, I will concentrate on Alt-Ergo: an SMT solver tuned for program verification. In particular, I'll present its main architecture, explain how it handles quantified formulas and conclude by some use cases and recent results. The development of Alt-Ergo was initiated in 2006 by Sylvain Conchon and Evelyne Contejean at "Laboratoire de Recherche en Informatique". Since September 2013, it is maintained and distributed by the OCamlPro Company.

2014-05-16, 11:00, Orange 2 room, INRIA, 23 avenue d'Italie

Agda's Secret -- On the Design and Implementation of the Agda Language, Andreas Abel (Gothenburg University)

In this talk, I will give a quick overview of the Agda language and then present some aspects of the implementation: internal intermediate languages, type-checking and compilation phases, unification, and the termination checker. I hope to pass on some valuable experience to implementors of dependently-typed languages.
The file used during the presentation is available here.

2014-05-16, 09:30, Orange 2 room, INRIA, 23 avenue d'Italie

Church-Rosser Properties of Positional Abstract Rewriting Systems, Jean-Pierre Jouannaud (Ecole Polytechnique, Palaiseau and Tsinghua University, Beijing)

We describe a general-purpose abstract Church-Rosser result that captures the existing results of this kind that rely on termination of computations. This is achieved by studying a novel abstract rewriting framework, abstract positional rewriting, which allows positions to be incorporated at the abstract level. We also prove a new lifting lemma, which improves over the classical one in the modulo case. Many existing concrete Church-Rosser results are improved, while new ones are obtained, in particular for higher-order rewriting at higher types.
The slides of the presentation are available here.

2014-04-25, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Towards an Internalization of the Groupoid Model of Type Theory, Matthieu Sozeau (π.r2, INRIA & Université Paris Diderot)

Joint work with Nicolas Tabareau (Inria Rennes, École des Mines de Nantes).
Homotopical interpretations of Martin-Löf type theory lead toward an interpretation of equality as a richer, more extensional notion. Extensional or axiomatic presentations of the theory with principles based on such models do not yet fully benefit from the power of dependent type theory, namely its computational character. Reconciling intensional type theory with this richer notion of equality requires moving to higher-dimensional structures where equality reasoning is explicit.
We follow this idea and develop an internalization of a groupoid interpretation of Martin-Löf type theory with one universe respecting the invariance by isomorphism principle. Our formal development relies crucially on ad-hoc polymorphism to overload notions of equality and on a conservative extension of Coq’s universe mechanism with polymorphism.

2014-04-25, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

About equality in type theory, Hugo Herbelin (π.r2, INRIA & Université Paris Diderot)

Let us call equality the smallest subset of provable equalities from a given axiomatic equational theory. Some equational theories can be formulated at least partially as rewriting systems and thus reduced to computation. This is e.g. the approach taken in deduction modulo, with, for instance, the equational theories of first-order arithmetic, higher-order logic, or Zermelo set theory formulated as normalising computation rules. In a proof system with proof terms, such as deduction modulo, but also lambda-pi modulo, let us call conversion rule a rule that captures computations without leaving a trace in the proof term.
In Martin-Löf's Intensional Type Theory, in the Calculus of Inductive Constructions and in the general theory of Pure Type Systems, there is a conversion rule which captures the intensional notion of definitional equality. The latter is decidable, and, because it is decidable, a full derivation can be reconstructed from its associated proof term. By construction, conversion is also closed under functional extensionality and it inherently satisfies the uniqueness of equality proofs. In these theories, equality is the smallest subset of equalities provable from definitional equality, i.e. closed under deductive reasoning from definitional equality.
In Martin-Löf's Extensional Type Theory, the conversion rule however embeds equality, making it at the same time extensional, uniquely inhabited and closed under deductive reasoning. It becomes undecidable and a full derivation cannot be reconstructed from the mere knowledge of its associated proof term.
In Martin-Löf's type theory, there is an alternative possible characterisation of equality as the smallest subset of equalities provable from a set of typed congruence rules defined by induction on the constructors of the type. In this view, equality in Pi-types can be defined to be extensional equality and equality in equality types can be defined to be trivial, leading to an intensional theory whose strength is the one of Extensional Type Theory (see Altenkirch's work).
Homotopy Type Theory introduces a form of extensional equality on universes called univalence which is similar to the one early defined by Hofmann and Streicher: two types are equal when they have the same cardinal. This extensionality implies functional extensionality but it contradicts uniqueness of equality proofs: for instance, the identity function and the negation are two distinct proofs of Bool=Bool.
In the talk, we will discuss some issues about equality:

2014-04-11, 14:00, Rose room, INRIA, 23 avenue d'Italie

The FoCaLiZe language: mixing program and proofs, François Pessaux (ENSTA ParisTech)

FoCaLiZe is a programming language that allows one to state, within a single framework, algorithms (programming aspects), the properties (logical aspects) that these algorithms must satisfy, and proofs that the latter hold. It provides high-level constructs to ease development structuring and reuse (modularity, inheritance, late binding, parameterization) and a proof language allowing proofs to be written in a hierarchical way, in a natural deduction style, relying on the Zenon automated theorem prover to discharge goals ... and the burden of the user. The core programming language is a pure functional language, while properties are stated as first-order logic formulae. A FoCaLiZe development is compiled into two target languages: OCaml source code leading to an executable (or object file), and Coq source code containing the complete model of the development (i.e. definitions, properties and proofs) that will ultimately be checked by Coq as assessment. Crucial care is taken in this process to avoid any error from the target languages being returned to the user: a program accepted by the FoCaLiZe compiler must be accepted by both OCaml and Coq. To increase confidence that "what is run is what is proved" despite the two target languages, the compiler has to use a code generation model that is identical and traceable for both of them.
In this presentation we will first introduce the motivations and the design and feature choices that impacted the language semantics. A short presentation of the language features will follow. Then we will explore the code generation model, emphasizing the central notion of "dependencies" and its static computation. Not all aspects of the compilation will be addressed, but the global process will be sketched. Finally, we may discuss possible extensions of the framework and their impact on the semantics, analyses and code generation model.
The slides of the talk are available here.

2014-04-04, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

The Implicit Calculus of Constructions with Σ-types (ICCΣ), Bruno Bernardo (Ecole polytechnique)

Joint work with Bruno Barras.

We present two type systems: ICCΣ and its annotated variant, AICCΣ. ICCΣ is an extension of Alexandre Miquel's Implicit Calculus of Constructions (ICC) with Σ-types. Since type-checking is (most likely) not decidable in ICCΣ, we also introduce a more verbose variant, AICCΣ, which should fix this issue. The purpose of these systems is to solve the problem of interaction between logical and computational data.

2014-03-28, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Reflecting Inductive Types in Type Theory, Pierre-Evariste Dagand (Gallium, INRIA)

With Conor McBride, I have developed a presentation of inductive families based on a type-theoretic universe. Unlike the traditional, purely syntactic, approaches, our treatment allows us to reflect inductive families within type theory. As a result, inductive types become first-class citizens.
In this talk, I will present such a universe of inductive types and hint at its bootstrap. I will then illustrate the benefits of first-class citizenship. I shall conclude with a few exploratory results, at the intersection between inductive types, equational theories, and rewriting.

Files associated with the talk: DeducteamTalk.agda (another link to this file), Enum.agda and IProp.agda.
A link with comments on the talk.

2014-03-14, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Some Methods of proof compression, Vaston Costa (Universidade Federal de Goiás)

It is well-known that the size of propositional classical proofs can be huge. Proof-theoretical studies discovered exponential gaps between cut-free (or normal) proofs and the corresponding (non-normal) proofs with cuts (or modus ponens). The task of automatic theorem proving is, on the other hand, usually based on the construction of cut-free proofs or proofs with only atomic cuts, since this procedure produces fewer alternative choices. The aim of this presentation is to discuss some methods of size reduction for deductions.

2014-03-07, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Natural Algorithms, Bernadette Charron-Bost (LIX, CNRS & Ecole polytechnique)

Algorithms offer a rich and expressive language for modeling physical, biological, and social systems. They lay the grounds for numerical simulations and, more importantly, provide a powerful framework for their analysis. In the life sciences, natural algorithms thus appear to play the same role as differential equations in the physical sciences. This entails the need for the development of an "algorithmic calculus", an analog of differential calculus.
I shall introduce this new field of research and some fundamental questions regarding the achievement of consensus, synchronization, and more generally coordination among agents of biological or social systems. I shall present various algorithmic tools that have been recently proposed to analyze natural algorithms and which use classical notions in dynamical systems, in Markov chains, or in spectral graph theory.
The slides of the talk are available here.

2014-02-21, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Objects and Subtyping in the λΠ-calculus modulo, Raphaël Cauderlier (CPR, CNAM & Deducteam, Inria)

Abadi and Cardelli defined in 1996 several object calculi in order to formalize the behaviour of Object Oriented programs and to study object type systems. The major feature of these type systems is subtyping.
Translating subtyping in Dedukti is both challenging, because convertibility between types is symmetrical, and needed, because the type system of Coq has subtyping between sorts.
I will present a shallow encoding of a typed functional object calculus with subtyping in the λΠ-calculus modulo and its implementation in Dedukti.

2014-02-07, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Special session on type isomorphisms: Alejandro Díaz-Caro & Danko Ilik


2013-12-13, 10:00, Bleu 1 room, INRIA, 23 avenue d'Italie

Edward Hermann Haeusler (PUC Rio), A short discussion on non-standard finite computation

This talk will discuss how computational models defined in toposes may lead to non-standard computational models. A possible comparison with hyper-computation is briefly motivated.

2013-11-29, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Edward Hermann Haeusler (PUC Rio), A proof-theoretical discussion on the mechanization of propositional logics

The computational complexity of SAT and TAUT for purely implicational propositional logic is known to be PSPACE-complete. Intuitionistic Propositional Logic is also known to be PSPACE-complete, while Classical Propositional Logic is coNP-complete for tautology checking and NP-complete for satisfiability checking. We show how proof-theoretical results on Natural Deduction help analyse the complexity of Purely Implicational Propositional Logic with regard to its polynomial relationship to Intuitionistic and Classical Propositional Logics. The main feature in this analysis is the sub-formula principle. We extend the polynomial reduction of purely implicational logic to any propositional logic satisfying the sub-formula principle. Hence, any propositional logic satisfying the sub-formula principle is in PSPACE. We conclude that some well-known logics, such as Propositional Dynamic Logic and S5C (n = 2), for example, hardly satisfy the sub-formula property. On the other hand, these logics have proof procedures: they are said to be mechanizable despite being EXPTIME-complete.

We point out some facts and discuss some questions on how the sub-formula property is related to the mechanization of theorem proving for propositional logics. The relationship between feasible interpolants and the sub-formula property is discussed. Some examples remind us that the relationship between normalization of proofs and the sub-formula property is weak. Finally, we discuss some alternative criteria for considering a logic to be mechanisable. This discussion is strongly influenced by A. Carbone's analysis of propositional proof complexity. Our purpose is to show how structural proof theory can help us in analysing hard relationships between propositional logic systems.

2013-11-15, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Stéphane Graham-Lengrand (CNRS & Ecole polytechnique), Re-designing the LCF-style architecture for tableaux-like/Prolog-like proof-search and SMT-solving

Psyche is a modular proof-search engine designed for either interactive or automated theorem proving, and aiming at two things: a high level of confidence about the output of the theorem proving process and the ability to apply and combine a wide range of techniques.
It addresses the first aim by adopting and extending the LCF architecture to guarantee, using private types, not only the correctness but also the completeness of proof-search.
It addresses the second by organising a new division of labour between the LCF kernel and its programmed plugins, based on a focused sequent calculus for polarised logic. This calculus generalises tableaux-like proof-search and logic programming, with backtracking mechanisms for the exploration of the search space.
Finally, Psyche features the ability to call decision procedures such as those used in Sat-Modulo-Theories solvers. We therefore illustrate Psyche by using it for SMT-solving (quantifier-free), but we also discuss how to extend Psyche to handle quantifiers, especially the use of meta-variables in presence of backtracking and a background theory.

2013-10-18, 14:00, Vert 2 room, INRIA, 23 avenue d'Italie

Ronan Saillard (CRI, MINES ParisTech & Deducteam, INRIA), Yet Another Complete Rewrite of Dedukti

I will present the new version of Dedukti. I will give benchmarks attesting that it is far more efficient than the previous implementations, and I will compare the results with Coq, Twelf and Maude. Then I will discuss the typing conditions a rewrite rule should satisfy to be added to the system, and I will show how to get rid of dot patterns.

2013-10-18, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Bruno Barras (Marelle, LIX, CNRS & INRIA Saclay & Ecole polytechnique), An introduction to Homotopy Type Theory

Homotopy Type Theory (HoTT) is a variant of Type Theory aiming at formalizing synthetic proofs in homotopy theory. After defining its main distinctive features, I will show that it is not merely a domain-specific formalism, but rather a foundation which happens to be richer even for general purposes.

An important issue with this formalism is that it has no known computational interpretation yet, as the model of HoTT based on simplicial sets proposed by Voevodsky is not constructive. I will sketch a new model proposed by Coquand that may address this issue.

Finally, if time permits, I'll talk about higher inductive types, that provide a way to define non-trivial homotopy types in a fashion inspired from inductive types.

The slides of the talk are available here.

2013-09-13, 10:00, Orange 1 room, INRIA, 23 avenue d'Italie

Simon Cruanes (Deducteam, INRIA & Ecole polytechnique), Detection of First Order Axiomatic Theories

Automated theorem provers for first-order logic with equality have become very powerful and useful, thanks to both advanced calculi -- such as superposition and its refinements -- and mature implementation techniques. Nevertheless, dealing with some axiomatic theories remains a challenge because it gives rise to a search space explosion. Most attempts to deal with this problem have focused on specific theories, like AC (associative commutative symbols) or ACU (AC with neutral element). Even detecting the presence of a theory in a problem is generally solved in an ad-hoc fashion. We present here a generic way of describing and recognizing axiomatic theories in clausal form first-order logic with equality. Subsequently, we show some use cases for it, including a redundancy criterion that can be applied to some equational theories, extending some work that has been done by Avenhaus, Hillenbrand and Löchner.
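
To give a flavour of theory detection (a simplified sketch of ours, not the paper's generic description mechanism), here is a direct recognizer for the AC axioms among unit equations:

```python
# Illustrative sketch: recognizing that a binary symbol f is AC from
# unit equations. Variables are strings; applications are tuples
# (symbol, arg1, arg2). Not the paper's actual algorithm.

def is_commutativity(eq):
    # Matches f(x, y) = f(y, x) with x, y distinct variables.
    lhs, rhs = eq
    return (isinstance(lhs, tuple) and isinstance(rhs, tuple)
            and lhs[0] == rhs[0] and len(lhs) == 3
            and isinstance(lhs[1], str) and isinstance(lhs[2], str)
            and lhs[1] != lhs[2]
            and rhs[1:] == (lhs[2], lhs[1]))

def is_associativity(eq):
    # Matches f(f(x, y), z) = f(x, f(y, z)) with x, y, z distinct.
    lhs, rhs = eq
    if not (isinstance(lhs, tuple) and isinstance(rhs, tuple)):
        return False
    f = lhs[0]
    ok_l = (len(lhs) == 3 and isinstance(lhs[1], tuple) and lhs[1][0] == f
            and isinstance(lhs[1][1], str) and isinstance(lhs[1][2], str)
            and isinstance(lhs[2], str))
    if not (ok_l and rhs[0] == f):
        return False
    x, y, z = lhs[1][1], lhs[1][2], lhs[2]
    return rhs == (f, x, (f, y, z)) and len({x, y, z}) == 3

def detect_ac(equations):
    # A symbol is AC if both axioms occur among the input equations.
    comm = {eq[0][0] for eq in equations if is_commutativity(eq)}
    assoc = {eq[0][0] for eq in equations if is_associativity(eq)}
    return comm & assoc

eqs = [(("+", "x", "y"), ("+", "y", "x")),
       (("+", ("+", "x", "y"), "z"), ("+", "x", ("+", "y", "z")))]
print(detect_ac(eqs))  # {'+'}
```

The paper's point is precisely to replace such ad-hoc, per-theory recognizers by a generic description language for axiomatic theories.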

2013-07-11, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Kokichi Futatsugi (JAIST-RCSV), Generate and check method for invariant verification in CafeOBJ

Effective coordination of inference (a la theorem proving) and search (a la model checking) is one of the most important and interesting research topics in formal methods. We have developed several kinds of techniques for coordinating inference and search. The generate and check method is a recent development for invariant verification with proof scores in CafeOBJ. The method is based on (1) state representations as sets of observers, and (2) systematic generation of finite state patterns which cover all possible states.

2013-07-05, 10:00, Bleu 1 room, INRIA, 23 avenue d'Italie

Jean-Pierre Jouannaud (University Paris 11, LIX and Tsinghua University), The Computability Path Ordering

Joint work with Frédéric Blanqui, Jianqi Li and Albert Rubio

We shall describe in this talk the computability path ordering (CPO), aimed at carrying out termination proofs for higher-order calculi. CPO appears to be the ultimate improvement of the higher-order recursive path ordering HORPO (by Albert Rubio and myself) in the sense that its definition captures the essence of computability arguments à la Tait and Girard, hence the name of the improved ordering. CPO makes it possible to prove strong normalization of higher-order rewrite rules in a rich type discipline which includes function types, inductive types, and dependent types.
Well-foundedness of CPO for function and inductive types is shown by adapting Girard's technique. The case of dependent types is shown by using a termination/non-termination preserving reduction to the former case. Polymorphic types are not allowed so far, nor are arbitrarily many predicative universes.
We will show that our definition of CPO cannot be improved any further, in the sense that the guards present in the recursive calls cannot be relaxed in any natural way without losing well-foundedness. In this sense, CPO is the end of the road.
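
As background (an illustration of ours; CPO itself handles higher-order terms and rich types), the first-order recursive path ordering that HORPO and CPO generalize can be sketched as follows:

```python
# Sketch of the first-order recursive path ordering (RPO) with
# lexicographic status, the classical ancestor of HORPO/CPO.
# Terms: variables are strings, applications tuples (symbol, args...).
# 'prec' maps function symbols to integers (higher = greater precedence).

def occurs_in(x, t):
    if t == x:
        return True
    return isinstance(t, tuple) and any(occurs_in(x, u) for u in t[1:])

def rpo_gt(s, t, prec):
    # Is s strictly greater than t in RPO?
    if isinstance(t, str):                       # t is a variable
        return occurs_in(t, s) and s != t
    if isinstance(s, str):
        return False
    f, g = s[0], t[0]
    # Case 1: some immediate subterm of s is >= t.
    if any(u == t or rpo_gt(u, t, prec) for u in s[1:]):
        return True
    # Case 2: f > g in precedence, and s dominates every subterm of t.
    if prec[f] > prec[g]:
        return all(rpo_gt(s, v, prec) for v in t[1:])
    # Case 3: equal heads: compare arguments lexicographically,
    # and s must still dominate every subterm of t.
    if f == g:
        for u, v in zip(s[1:], t[1:]):
            if u == v:
                continue
            return rpo_gt(u, v, prec) and \
                   all(rpo_gt(s, w, prec) for w in t[1:])
        return False
    return False

# Orient distributivity: x*(y+z) > x*y + x*z when '*' > '+' in precedence.
prec = {"*": 2, "+": 1}
lhs = ("*", "x", ("+", "y", "z"))
rhs = ("+", ("*", "x", "y"), ("*", "x", "z"))
print(rpo_gt(lhs, rhs, prec))  # True
```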

2013-05-03, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Benoît Valiron (University of Pennsylvania), A Quantum Programming Language

Quantum devices are the unicorns of the IT world: they do not quite exist, they have semi-magical properties, and they are supposed to be nasty to control. Like the mythical knight, we seek a magical sword to tame the monster. We designed the quantum language Quipper to this end: it is the topic of the talk.

I will present the hypothetical properties of our quantum unicorn, the features that are required to program it, and code snippets of our personal excalibur.

Homework for this talk can be found here.

2013-04-12, 10:00, Orange 1 & 2 rooms, INRIA, 23 avenue d'Italie

KWARC-Deducteam workshop

Slides, abstracts and references can be found here.

2013-03-29, 10:00, Bleu 1 room, INRIA, 23 avenue d'Italie

Géraud Sénizergues (LABRI, Université de Bordeaux), LALBLC: a program testing the equivalence of dpda's

We present a preliminary version of the program "LALBLC" (i.e. LA=LB? Let us Compute).

The equivalence problem for dpda's consists in deciding, for two given deterministic pushdown automata A and B, whether they recognize the same language. It was proved in [TCS 2001, G.S.] that this problem is decidable by means of some complete proof systems.

The program LALBLC gives, in some sense, life to the above completeness proof: on an input A, B, the program either computes a proof of their equivalence (w.r.t. the system D5 of the above reference) or computes a witness of their non-equivalence (i.e. a word that belongs to the symmetric difference of the two languages).
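
For intuition only (a drastic simplification: for deterministic *finite* automata, unlike dpda's, the problem is easy), a non-equivalence witness is just a breadth-first search of the product automaton for a state pair on which the two machines disagree; the representation below is ours:

```python
from collections import deque

# A DFA is (initial state, set of accepting states,
#           dict mapping (state, letter) -> state), assumed complete.

def non_equivalence_witness(dfa_a, dfa_b, alphabet):
    (qa, fa, da), (qb, fb, db) = dfa_a, dfa_b
    seen = {(qa, qb)}
    queue = deque([(qa, qb, "")])
    while queue:
        pa, pb, word = queue.popleft()
        if (pa in fa) != (pb in fb):
            return word          # word is in the symmetric difference
        for c in alphabet:
            nxt = (da[(pa, c)], db[(pb, c)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append((*nxt, word + c))
    return None                  # the two automata are equivalent

# A accepts words with an even number of a's; B accepts everything.
A = (0, {0}, {(0, "a"): 1, (1, "a"): 0})
B = (0, {0}, {(0, "a"): 0})
print(non_equivalence_witness(A, B, "a"))  # 'a'
```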

We shall outline the main mathematical objects that are manipulated as well as the main functions performed over these objects. We shall also describe some experimental results obtained so far.

The talk is based on ongoing joint work with P. Henry.

2013-03-22, 10:00, Rose room, INRIA, 23 avenue d'Italie

Bruno Lopes (PUC Rio), Extending Propositional Dynamic Logic to Petri Nets

Propositional Dynamic Logic (PDL) is a multi-modal logic used for specifying and reasoning about sequential programs. Petri Nets are a widely used formalism for specifying and analyzing concurrent programs, with a very nice graphical representation.
In this work, we propose a Propositional Dynamic Logic for reasoning about Petri Nets, together with a stochastic variant.
First, we define a compositional encoding of Petri Nets, built from basic nets, as terms.
Second, we use these terms as PDL programs and provide a compositional semantics for PDL formulas. Finally, we present an axiomatization and prove completeness w.r.t. our semantics.
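
For readers unfamiliar with Petri nets, firing a transition is a partial update of a marking, which is what lets transitions be read as PDL-style programs; a toy sketch of ours, not the paper's encoding:

```python
# Toy Petri net semantics: markings are dicts place -> token count,
# a transition is a pair (consumed, produced) of such dicts.
# Firing a transition is a partial function on markings.

def fire(marking, transition):
    consumed, produced = transition
    if any(marking.get(p, 0) < n for p, n in consumed.items()):
        return None  # transition not enabled at this marking
    new = dict(marking)
    for p, n in consumed.items():
        new[p] = new[p] - n
    for p, n in produced.items():
        new[p] = new.get(p, 0) + n
    return new

# A handshake: t1 moves a token from place 'ready' to place 'sent'.
t1 = ({"ready": 1}, {"sent": 1})
print(fire({"ready": 1}, t1))  # {'ready': 0, 'sent': 1}
print(fire({"ready": 0}, t1))  # None: not enabled
```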

2013-03-15, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Danko Ilik (independent researcher), Applications and meta-mathematical implications of Double-negation Shift

Double-negation Shift [DNS], or Kuroda's Conjecture, is a logical principle independent of intuitionistic logic, and provable in classical logic, whose foundational importance was realized by Kreisel, who noticed that DNS proves the double-negation translation of the Axiom of Choice. DNS thus allows one to prove the relative consistency of classical Analysis with respect to intuitionistic Arithmetic [HA] (plus DNS). Following Kreisel, Spector showed that the Dialectica interpretation of DNS can be realized by functions definable by extending higher-type primitive recursion with so-called bar recursion. This approach, Dialectica interpretation plus bar recursion, has so far been the most fruitful one when it comes to extracting programs from proofs in classical Analysis, as can be seen from Kohlenbach's Proof Mining programme. Recently, the speaker, building on observations of Herbelin, developed a logical system around delimited control operators, from the theory of programming languages, that can derive DNS in pure logic, i.e. in the absence of arithmetic axioms. In this talk, I will speak about DNS in the presence of arithmetic, showing that HA+DNS still satisfies the Disjunction and Existence properties and can hence be considered a constructive logical system. Moreover, HA+DNS satisfies a form of Church's Rule which says that from a proof of a closed sentence, ⊢∀n∃m(A(n,m)), one can extract a recursive function f such that ⊢∀n(A(n,f(n))). However, it is a known fact that DNS *refutes* the formal version of Church's *Thesis*. I will discuss this situation and its consequences for undecidability proofs that reduce the solvability of a problem to that of the Halting Problem. At the end, I will show an application of DNS, based on work of Veldman, to extracting a new realizer for Open Induction on Cantor space by delimited control operators.
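
For reference, the DNS scheme under discussion is:

```latex
% Double-negation Shift (DNS), Kuroda's conjecture:
\forall n\,\lnot\lnot A(n) \;\rightarrow\; \lnot\lnot\,\forall n\, A(n)
```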

2013-02-15, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Hugo Macedo (Deducteam, INRIA), Mining malware specifications through static reachability analysis

(joint work with Tayssir Touili)
The amount of malicious software (malware) is growing out of control. Syntactic signature-based detection cannot cope with such growth, and manual construction of malware signature databases needs to be replaced by machine learning based approaches. Currently, a single modern signature capturing the semantics of a malicious behavior can replace an arbitrarily large number of old-fashioned syntactic signatures. However, teaching computers to learn malware behaviors is a challenge. Existing work relies on dynamic analysis to extract malicious behaviors, but such a technique does not guarantee the coverage of all behaviors. To sidestep this limitation, we show how to learn malware signatures using static reachability analysis. The idea is to model binary programs using pushdown systems (which can model the stack operations occurring during binary code execution), use reachability analysis to extract behaviors in the form of trees, and use subtrees that are common among the trees extracted from a training set of malware files as signatures. To detect malware, we propose to use a tree automaton to compactly store malicious behavior trees and check whether any of the trees extracted from the file under analysis contains a subtree corresponding to a malicious behavior. Experimental data shows that our approach can be used to learn signatures from a set of malware files and use them to detect a much larger set of malware, with promising results.

2013-02-01, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Session on automatic theorem proving

2013-01-25, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Ronan Saillard (CRI, Mines ParisTech and Deducteam, INRIA), The Univalence Axiom


2012-11-30, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Florence Clerc (Parsifal, INRIA-Saclay), Embedding constructiveness into classical logic via double negation translations or polarization

Although originally developed as a tool for proving the consistency of classical logic with respect to constructive reasoning, double negation translations have recently been used to embed classical reasoning into intuitionistic proofs. Such translations can then be used to associate constructive functions to classical proofs. In this paper, we show that if we pick a particular double negation translation from a set of known translations, it is possible to establish an isomorphism between the cut free classical sequent proofs of formula B and the intuitionistic sequent calculus proofs of the translation of B. In order to get such a strong correspondence of proofs, this isomorphism is constructed between focused sequent proofs in these two logics. Using the LJF and LKF focused proof systems of Liang & Miller (2009), we can show that focused cut-free classical proofs of a certain polarization of B are isomorphic to a certain double negation translation. In particular, we describe the polarization of classical proofs needed to exactly model the Kuroda translation.
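
As a point of comparison (a sketch of ours of the formula-level translation only; the paper's isomorphism is between focused *proofs*, not formulas), the Kuroda translation double-negates the whole formula and inserts a double negation under each universal quantifier:

```python
# Sketch of the Kuroda double-negation translation on formula ASTs.
# Formulas: ("atom", p), ("not", A), ("and"/"or"/"imp", A, B),
# ("forall"/"exists", x, A).

def neg(a):
    return ("not", a)

def kuroda_inner(f):
    tag = f[0]
    if tag == "atom":
        return f
    if tag == "not":
        return ("not", kuroda_inner(f[1]))
    if tag in ("and", "or", "imp"):
        return (tag, kuroda_inner(f[1]), kuroda_inner(f[2]))
    if tag == "exists":
        return ("exists", f[1], kuroda_inner(f[2]))
    if tag == "forall":
        # The only non-homomorphic case: insert a double negation
        # under each universal quantifier.
        return ("forall", f[1], neg(neg(kuroda_inner(f[2]))))
    raise ValueError(tag)

def kuroda(f):
    # Kuroda translation: double-negate the whole formula as well.
    return neg(neg(kuroda_inner(f)))

# forall x. P(x)  becomes  not not forall x. not not P(x)
print(kuroda(("forall", "x", ("atom", "P(x)"))))
```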

2012-11-23, 10:00, Algorithme room (3rd floor), INRIA, 23 avenue d'Italie

Paulo A. S. Veloso (Federal University of Rio de Janeiro, COPPE: Systems and Computer Engineering Programme), On Graph Calculi for Relations

Binary relations have a long and interesting history with important connections to Logic, (Foundations of) Mathematics and Computing. In many areas, such as these, graphs provide convenient visualization.
This talk will present a graph approach to binary relations. The basic idea is quite natural: represent the fact that a is related to b via relation R by an arrow from a to b labelled by R; then some operations on relations become manipulations on arrows (which represent restrictions).
The plan is as follows: start with a calculus for a simple fragment (which conveys the main ideas) and extend it successively so as to include the other relational operations (hence the plural 'calculi'). Graph notation renders the manipulations (expressed by a sound and complete set of rules) rather intuitive, almost playful.
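
A miniature version of the arrow reading (our illustration, not the talk's calculi): a relation is a set of labelled arrows (a, b); composition chains arrows and converse reverses them.

```python
# Relations as sets of arrows (pairs); the operations the graph
# calculi manipulate visually, in executable miniature.

def compose(r, s):
    # a -R-> b -S-> c yields a -(R;S)-> c
    return {(a, c) for (a, b1) in r for (b2, c) in s if b1 == b2}

def converse(r):
    # Reverse every arrow.
    return {(b, a) for (a, b) in r}

parent = {("alice", "bob"), ("bob", "carol")}
print(compose(parent, parent))  # {('alice', 'carol')}  (grandparent)
print(converse(parent) == {("bob", "alice"), ("carol", "bob")})  # True
```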

2012-11-16, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Tristan Crolard (CPR, CEDRIC, CNAM), Defining co-inductive types in second-order subtractive logic (and getting a coroutine-based implementation of streams)

Subtractive logic is a conservative extension of intuitionistic logic with a new connective, subtraction, which is dual to implication. Subtraction is of course definable in classical logic and, as a consequence, we can derive the computational content of this connective: we obtain a restricted form of first-class continuations which corresponds to coroutines. In this talk, we will focus on the following application: co-inductive types are definable in second-order subtractive logic using an encoding dual to the usual encoding of inductive types (from Fortune, Leivant & O'Donnell). Surprisingly, as a special case, we recover the conventional implementation of streams using coroutines (as in Kahn networks).
Slides of the talk: big and small formats.
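
The stream/coroutine correspondence mentioned at the end of the abstract can be glimpsed with Python generators (an analogy only; the talk's setting is second-order subtractive logic, not Python):

```python
# A stream as a coroutine: a suspended computation that hands control
# back to its consumer at each element.

def nats(start=0):
    n = start
    while True:
        yield n        # control returns to the consumer here,
        n += 1         # exactly the coroutine hand-off pattern

def take(n, stream):
    # Observe finitely many elements of an infinite stream.
    return [next(stream) for _ in range(n)]

print(take(5, nats()))  # [0, 1, 2, 3, 4]
```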

2012-11-09, 10:00, Orange 1 room, INRIA, 23 avenue d'Italie

Jaime Gaspar (PiR2, INRIA Paris-Rocquencourt), Constructivism: from non-rigorous philosophy to rigorous mathematics

Mathematicians commonly use "non-constructive proofs": for example, they show that an equation has a solution without constructing (telling which is) the solution. However, it is more informative to use "constructive proofs": for example, to show that an equation has a solution by constructing (telling which is) the solution. "Constructivism" is a school of thought that insists on constructive proofs.
In this talk we are going to take a look at constructivism both as a non-rigorous philosophy and as rigorous mathematics. We keep the talk sweet, simple and short, avoiding technicalities and aiming at a general audience.

2012-10-19, 10:00, Orange 2 room, INRIA, 23 avenue d'Italie

Cecilia Englander (PUC Rio), Arbitrary multi-truth-value functions and Natural Deduction

Segerberg presented a general completeness proof for propositional logics. For this purpose, a deductive system was defined in a way that its rules were rules for an arbitrary k-place Boolean operator in a given propositional logic. Each of those rules corresponds to a row of the operator's truth-table. This talk extends Segerberg's idea to finite-valued propositional logic. We maintain the idea of defining a deductive system whose rules correspond to rows of truth-tables, but instead of having n types of rules (one for each truth-value), we use a bivalent representation that uses separating formulas as defined by Carlos Caleiro and João Marcos.

2012-10-12, 16:00, Orange 1 room, INRIA, 23 avenue d'Italie

Alejandro Díaz-Caro (Université de Paris Ouest and INRIA), Non-determinism through type isomorphism

We define an equivalence relation on propositions and a proof system where equivalent propositions have the same proofs. The system obtained this way resembles several known non-deterministic and algebraic lambda-calculi. This system enjoys subject reduction and strong normalisation.

2012-10-05, 14:00, Orange 1 room, INRIA, 23 avenue d'Italie

Simon Cruanes (INRIA, CNAM), Reasoning modulo Equality and Equations

2012-09-28, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Vincent Benayoun (CPR, CEDRIC, CNAM), ML Dependency Analysis for Assessors

Critical software needs to obtain an assessment before commissioning. This assessment is given after a long task of software analysis performed by assessors. They may be helped by tools, used interactively, to build models using information-flow analysis. Tools like SPARK-Ada exist for Ada subsets used for critical software. But some emergent languages such as those of the ML family lack such adapted tools. Providing similar tools for ML languages requires special attention to specific features such as higher-order functions and pattern-matching. This talk presents an information-flow analysis for such a language, specifically designed according to the needs of assessors. This analysis can be parametrized to allow assessors to get a view of dependencies at several levels of abstraction, and gives the basis for an efficient fault-tolerance analysis.

2012-09-07, 10:00, Orange 1 room, INRIA, 23 avenue d'Italie

Nachum Dershowitz (Tel-Aviv University), Termination by Jumping and Escaping

A weak commutation property, called "jumping" (due to Doornbos, Backhouse, and van der Woude), along with an eventual lifting property, called "escaping", are used for proving well-foundedness of a generic, abstract version of the recursive path orderings, and for other termination arguments.

2012-08-03, 10:00, Vert 2 room, INRIA, 23 avenue d'Italie

Mitsuhiro Okada (Keio University), Proof Theory for some diagrammatic logical proofs

We consider Euler-diagram style diagrammatic proofs as an intuitionistic version of diagrammatic logical proofs (in contrast to the Venn-style classical diagrammatic logic). We develop some proof theories for these intuitionistic diagrams. We also give a shallow phase-model-theoretic completeness result, and we discuss cognitive-scientific observations related to diagrammatic inferences, comparing them with the corresponding linguistic version.

2012-07-20, 10:00, Rose room, INRIA, 23 avenue d'Italie

Nachum Dershowitz (Tel Aviv University), Canonical Inference

An abstract proof-theoretic framework, inspired by rewriting methods, is proposed. Normal-form proof objects are minimal in some well-founded proof ordering. The approach applies in the equational, Horn-clause, and deduction-modulo cases, among others, as they incorporate formal notions of simplification and redundancy.

2012-07-06, 10:00, Vert 1 room, INRIA, 23 avenue d'Italie

Catherine Dubois (CPR, Cedric, CNAM & ENSIIE and Deducteam, INRIA), A Certified Constraint Solver over Finite Domains

joint work with Matthieu Carlier and Arnaud Gotlieb

Constraint programs such as those written in modern Constraint Programming languages and platforms aim at solving problems coming from optimization, scheduling, planning, etc. Recently, CP programs have been used in business-critical or safety-critical areas as well, e.g., e-Commerce, air-traffic control applications, or software verification. This implies a more skeptical regard on the implementation of constraint solvers, especially when the result is that a constraint problem has no solution, i.e., unsatisfiability. For example, in software model checking, using an unsafe constraint solver may result in a dramatically wrong answer saying that a safety property is satisfied while there exist counterexamples. In this talk, we present a Coq formalisation of a constraint filtering algorithm — AC3 and one of its variants, AC2001 — and a simple labeling procedure. The proof of their soundness and completeness has been completed using Coq. As a result, a formally certified constraint solver written in OCaml has been automatically extracted from the Coq specification of the filtering and labeling algorithms. The solver, though not as efficient as specialized existing (unsafe) implementations, can be used to formally certify that a constraint system is unsatisfiable.
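For readers unfamiliar with arc-consistency filtering, here is a minimal sketch of the classic AC-3 algorithm in Python — an illustration of the technique, not the certified Coq/OCaml solver; all variable and helper names are ours.

```python
# Minimal AC-3 sketch: prune variable domains until every arc
# (x, y) is consistent, i.e. every value of x has a support in y.

from collections import deque

def ac3(domains, constraints):
    """domains: dict var -> set of values (mutated in place).
    constraints: dict (x, y) -> predicate on (value_of_x, value_of_y).
    Returns False as soon as some domain is wiped out."""
    queue = deque(constraints.keys())
    while queue:
        x, y = queue.popleft()
        pred = constraints[(x, y)]
        # Remove values of x with no support in y's domain.
        removed = {vx for vx in domains[x]
                   if not any(pred(vx, vy) for vy in domains[y])}
        if removed:
            domains[x] -= removed
            if not domains[x]:
                return False  # unsatisfiable
            # Re-examine arcs pointing at x, except the one from y.
            for (a, b) in constraints:
                if b == x and a != y:
                    queue.append((a, b))
    return True

# Example: enforce x < y on small domains.
doms = {'x': {1, 2, 3}, 'y': {1, 2, 3}}
cons = {('x', 'y'): lambda a, b: a < b,
        ('y', 'x'): lambda a, b: b < a}
ok = ac3(doms, cons)
```

After filtering, x loses the value 3 (no larger value remains for y) and y loses the value 1, without any search having taken place.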

2012-06-29, 14:00, Jaune room, INRIA, 23 avenue d'Italie

Vivien Maisonneuve (CRI, Mines ParisTech), From physics to interrupt handlers: the real to float step

The design and the implementation of a control-command application on a microcontroller consist of many steps bridging the physical world model to interrupt handler code. Proofs on systems are usually performed at the early stages using analysis techniques. These proofs provide invariants we would like to transpose down to the interrupt-handler level, thus allowing us to check the correctness of the system implementation. In this talk, we focus on the step leading from a high-level discrete controller description using real numbers to a similar description using floating-point numbers.

2012-06-08, Vert 1 and Rose rooms, INRIA, 23 avenue d'Italie

Day on computation in quantum physics and logic

Speakers: Pablo Arrighi (IMAG), Maël Pégny, Pierre Néron (Ecole polytechnique and INRIA), Raphaël Bost (Ecole polytechnique and INRIA), Ali Assaf (Ecole polytechnique and INRIA), Quentin Carbonneaux (Ponts ParisTech and INRIA), Raphaël Cauderlier (ENS Cachan and INRIA), Guillaume Burel (CPR, Cédric, ENSIIE), David Delahaye (CPR, Cédric, CNAM) and Mélanie Jacquel (Siemens and CPR, Cédric, CNAM)
Participants: Jean-Baptiste Joinet, Gilles Dowek, Olivier Hermant, Catherine Dubois, Nachum Dershowitz

2012-05-11, 14:00, Bleu 2 room, INRIA, 23 avenue d'Italie

14h00: Mélanie Boudard (ISEP), Translations between Intuitionistic and Classical Logic

15h00: Olivier Hermant (ISEP), Polarized Translation for Polarized Deduction Modulo

2012-04-06, 10:00, Rose room, INRIA, 23 avenue d'Italie

Simon Cruanes, (SRI, EPFL and Ecole polytechnique), First order reasoning in Yices 2

In this talk, we present different techniques we have explored to make first-order reasoning possible in Yices 2 (a state-of-the-art SMT solver), and we describe their implementation. In addition to instantiation mechanisms inspired by the widely used "heuristic E-matching", we discuss the coupling of the ground satisfiability solver with a superposition prover — the technology behind many high-performance first-order provers.


2011-12-16, 14:30, Equation room, INRIA, 23 avenue d'Italie

14h30: David Baelde, Rewriting and fixed point definitions

In this talk, we shall see why and how to combine a logic featuring (co)inductive definitions with a modulo (rewriting) layer. We argue that the two features are orthogonal, and show that combining them is a natural way to support both fixed-point and recursive definitions. While the former are given over a monotonic (predicate) operator and come with (co)induction rules, the latter are defined over a pre-existing inductive structure, following an extra-logical recursive computation. The combination of the two kinds of definitions is key in arguments such as Tait's logical relations and the like. Although normalization has been studied for both deduction modulo and logics of definitions, the combination of the two is difficult due to a very different treatment of equality. This work is thus an excellent opportunity to better understand the differences between modulo and definitions, between various treatments of equality, and more generally to explore the interplay between reasoning and computation.

16h30: Pierre Néron (INRIA and Ecole Polytechnique), Eliminating Division and Square Root in Embedded Programs

This talk presents a semantics-based program transformation, for a certain class of programs, that improves the accuracy of computations on real-number representations. This transformation aims to remove the square root and division operations from the original program, since these two operations are among the main causes of loss of accuracy. This transformation is meant to be used on embedded systems; therefore the produced programs have to respect the constraints relative to this kind of code.

2011-12-02, 17:30, Vert 1 room, INRIA, 23 avenue d'Italie

17h30: Luiz Carlos Pereira (PUC-Rio), A constructive approach to classical modal logic

2011-11-25, 10:00, CNAM, annexe Montgolfier, rue Conté, Gate 33, 1st Floor, room 19 (seminar CPR)

Sylvain Conchon (LRI, Proval), Alt-Ergo: AC(X) and "lightweight" traces in Coq

In this talk, we will present the automated prover Alt-Ergo. After a brief overview of its use, we will detail the workings of its kernel, based on the AC(X) algorithm, as well as its decision procedure for arithmetic. We will also present our ongoing work on producing "lightweight" traces in Coq, interpreted with the help of S. Lescuyer's reflexive ergo tactic.

2011-07-04, 10:00, INRIA, 23 avenue d'Italie

10h00: David Delahaye (CPR, CEDRIC, CNAM) and Mélanie Jacquel (CPR, CEDRIC, CNAM and Siemens), An introduction to Atelier B

11h00: Tomás Lungenstrass (Ecole polytechnique), Realizability and Super-Consistency

Proof normalization is an important property for theories to have. The fundamental criterion for establishing the normalization of proofs in a theory is super-consistency, a notion that distills the essential tools that the algebra of Reducibility Candidates provides us with to build a model. This notion is necessarily lacking when dealing with set theory: since proof normalization implies the consistency of a theory, it cannot be proved within the theory itself, so some sort of normalization relative to a theory is called for. We develop a notion of PHA expressible within an environing theory, into which we then build a realizability interpretation that acts as the "internal model". Given that the environing theory has a standard model of arithmetic, this internal model can then be extracted from the environing theory into an actual model. We prove that the algebra of Reducibility Candidates is expressible, and so we reconcile the notions of super-consistency and proof normalization.

2011-06-22, 10:00, INRIA, 23 avenue d'Italie

Mélanie Jacquel (CEDRIC, Siemens): Validation of the Atelier B rules

We propose a framework for validating proof rules of the B method which cannot be automatically proved by the elementary prover of Atelier B, using an external automated theorem prover called Zenon. This framework contains in particular a set of tools, named BCARe and developed by Siemens SAS I MO, which relies on a deep embedding of the B theory within the logic of the Coq proof assistant and allows us to automatically generate the properties to be checked for a given proof rule. Currently, this tool chain is able to automatically validate a part of the derived rules of the B-Book, as well as some added rules coming from Atelier B and the rule database maintained by Siemens SAS I MO.

Denis Cousineau (Microsoft Research): On completeness of reducibility candidates as a semantics of strong normalization

In this talk, we define a sound and complete semantic criterion, based on reducibility candidates, for strong normalization of theories expressed in minimal deduction modulo /à la Curry/. The use of Curry-style proof-terms allows us to build this criterion on the classic notion of pre-Heyting algebras, and makes the criterion apply to /all/ theories expressed in minimal deduction modulo. Compared to using Church-style proof-terms, this method provides both a simpler definition of the criterion and a simpler proof of its completeness.

Olivier Hermant (LISITE, ISEP): Presentation of HETS, the "Heterogeneous Tool Set"

A quick and light presentation of HETS, "a parsing, static analysis and proof management tool incorporating various provers and different specification languages, thus providing a tool for heterogeneous specifications. Logic translations are first-class citizens."

2011-05-27, INRIA, 23 avenue d'Italie

Mathieu Boespflug (McGill), Dedukti

Chantal Keller (Ecole polytechnique), Translation of HOL into Dedukti

2011-05-10, 13:30, Rose room, INRIA, 23 avenue d'Italie

13h30: Jianhua Gao (INRIA), Clausal presentation of theories in deduction modulo

Resolution modulo is an extension of first-order resolution where axioms are replaced by rewrite rules, used to rewrite clauses during the search. In the first version of this method, clauses were rewritten to arbitrary propositions, which needed to be dynamically transformed into clauses. This unpleasant feature can be eliminated when the rewrite system is clausal, i.e. when it transforms clauses into clauses. We show in this talk how to transform any rewrite system into a clausal one, preserving the existence of a cut-free proof of any sequent.

15h00: Guillaume Burel (CEDRIC), Experimenting with deduction modulo

Deduction modulo is a generic framework to describe proofs in a theory better than using raw axioms. This is done by presenting the theory through rules rewriting terms and propositions. In CSL 2010, LNCS 6247, p.155--169, we gave theoretical justifications why it is possible to embed a proof search method based on deduction modulo, namely Ordered Polarized Resolution Modulo, into an existing prover. Here, we describe the implementation of these ideas, starting from iProver. We test it by confronting Ordered Polarized Resolution Modulo and other proof-search calculi, using benchmarks extracted from the TPTP Library. For the integration of rewriting, we also compare several implementation techniques, based for instance on discrimination trees or on compilation. These results reveal that deduction modulo is a promising approach to handle proof search in theories in a generic but efficient way.

16h15: Alberto Naibo (U. Paris 1), Fitch's paradox in the light of structural proof theory

Anti-realist epistemic conceptions of truth imply what is called the knowability principle: All truths are possibly known. The principle can be formalized in a bi-modal propositional logic, with an alethic modality and an epistemic modality, by an axiom scheme (KP). The use of classical logic and minimal assumptions about the two modalities lead to the paradoxical conclusion that all truths are known (Om-Cl) [1]. A sequent-calculus reconstruction of Fitch's derivation shows that the presence of axiom scheme KP makes cut elimination problematic. The method of conversion of axioms into rules, successfully employed in [3], cannot be used because of the lack of reducibility of the principle to its atomic instances. Therefore, a labelled approach to modal logic, in the style of [2], is applied. A sound and complete labelled system for intuitionistic and classical bi-modal logic is obtained by which a proof analysis of Fitch's derivation can be made. In particular, the conversion of KP into an inference rule is possible and the cut-elimination procedure can be fully applied. The main consequence is the following admissibility result: if KP is valid, then Om-Cl is valid, and this admissibility holds in intuitionistic logic. The result states that collective omniscience is already assumed in the knowability principle. In conclusion, the knowability principle formalized as KP makes Fitch's derivation a petitio principii rather than a genuine paradox.
[1] Fitch, F., A logical analysis of some value concepts, The Journal of Symbolic Logic, vol.28 (1963), no.2, pp.135-142.
[2] Negri, S., Proof analysis in modal logic, Journal of Philosophical Logic, vol.34 (2005), no.5-6, pp.507-544.
[3] Negri, S. & J. von Plato, Structural Proof Theory, Cambridge University Press, 2001.

2011-04-01, INRIA, 23 avenue d'Italie

The slides of the presentation of this day are available here.

9h: Denis Cousineau (Microsoft Research), Curry-style vs Church-style in deduction modulo.

10h15: Olivier Hermant (LISITE), Generalizing Boolean Algebra for Deduction Modulo.

11h30: Gilles Dowek (INRIA), First-order logic in Dedukti

14h: Guillaume Burel (CEDRIC), Coqine, a translator of Coq proofs into Dedukti

15h15: David Delahaye/Catherine Dubois (CEDRIC), Semantics of Focalize

Home, Members, Software, Seminars