This page has not been maintained since August 2015. See the new page here.
I will present the algorithm providing term inference in the Dedukti checker.
Unification constraints are produced by bidirectional elaboration, and solving them is delayed safely using guarded terms. Heuristics are then applied as simplification rules with backtracking.
In this talk we show that any formula A with a super-polynomially sized proof in minimal implicational logic has a polynomially-sized proof in classical implicational propositional logic. This fact provides an argument in favor of the claim that any classical propositional tautology has short proofs, i.e., that NP = CoNP.
This talk will survey recent work on time cost models for the lambda-calculus (in collaboration with Dal Lago, Barenbaum & Mazza, and Sacerdoti Coen). We will first focus on the weak lambda-calculus, the model underlying functional programming languages, for which it is well known that the number of beta-steps is a reasonable cost model. The proofs of such a result rely on alternative representations of the lambda-calculus extended with some form of sharing, such as abstract machines or graphical formalisms. We will provide an abstract view, using as a tool a lambda-calculus with explicit sharing, the Linear Substitution Calculus, and encompassing all main evaluation schemes (call-by-name/value/need). First, we will discuss the overhead due to sharing itself, and then the overhead of implementing it via an abstract machine. The second part of the talk will discuss strong reduction, the model underlying proof assistants, focusing on the recent result that the number of beta-steps is a reasonable cost model in this case as well. We will explain why an additional layer of sharing, called useful sharing, is required, and then we will reconsider the sharing and abstract machine overheads in this new setting.
I will present some joint work with Lucca Hirschi and Stéphanie Delaune on designing partial-order reduction techniques for checking trace equivalence in (a fragment of) the applied pi-calculus. These techniques have led to significant optimizations of the Apte tool for checking equivalence-based security properties of protocols. In this talk, I will focus on the roots of this work in, and its connections to, focusing in linear logic.
In presence of sum types, the lambda calculus does not enjoy uniqueness of eta-long normal forms. The canonicity problem also appears for proof trees of intuitionistic logic. In this talk, I will show how the problem becomes easier if, before choosing a canonical representative from a class of beta-eta-equal terms, one coerces the term into the exp-log normal form of its type. This type normal form arises from the one for exponential polynomials.
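To give the flavour of the type isomorphisms involved (a standard selection, not necessarily the speaker's exact presentation), the exp-log view reads arrows as exponentials and sums as polynomial addition:

```latex
\begin{align*}
(A + B) \to C &\;\cong\; (A \to C) \times (B \to C) && c^{a+b} = c^a \cdot c^b\\
A \to (B \times C) &\;\cong\; (A \to B) \times (A \to C) && (b \cdot c)^a = b^a \cdot c^a\\
A \to (B \to C) &\;\cong\; (A \times B) \to C && (c^b)^a = c^{a \cdot b}
\end{align*}
```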
The increasing number of heterogeneous devices interacting in networks calls for a new programming style, usually called communication-centered programming. In this scenario, possible participants in choreographies expose their interfaces, describing the communications they offer. Interaction protocols can be synthesised through a phase of negotiation between participants, in which different pieces of code can be composed in order to obtain the desired behaviour. At run time, some unexpected event can make the current choreography no longer executable. In this case, the participants should be able to adapt themselves in order to successfully continue the interaction. In this adaptation, both new interfaces and new code for participants may need to be synthesised.
I will recall my last seminar at Deducteam about the simply typed lambda calculus modulo type isomorphisms [https://hal.inria.fr/hal-01109104], then I will show some ongoing work derived from it, namely an implementation of this system in Haskell and two non-trivial extensions to polymorphic types modulo isomorphisms.
Formal, symbolic techniques are extremely useful for modelling and automatically analyzing security protocols. Initially, these techniques were mainly developed to analyze authentication and confidentiality properties. Both these properties are trace properties and efficient tools for their verification exist. In more recent years anonymity-like properties have received increasing interest. Many flavors of anonymity properties are naturally expressed in terms of indistinguishability and modeled as an observational equivalence in process calculi. However, only very few verification tools are currently able to automate the analysis of such indistinguishability properties. In this talk I will describe the techniques behind the aKiss tool with examples from privacy properties in electronic voting.
Joint work with Andrea Asperti (Bologna)
A new "inductive" approach to standardization for the lambda-calculus has been introduced by Xi, allowing him to establish a double-exponential upper bound for the length of the standard reduction relative to an arbitrary reduction. We refine Xi's analysis, obtaining better bounds, especially for computations producing small normal forms. For instance, for terms reducing to a boolean, we are able to prove that the length of the standard reduction is at most a mere factorial of the length of the shortest reduction sequence. The methodological innovation of our approach is that, instead of trying to count the cost of producing something, as is customary, we count the cost of consuming things. The key observation is that the part of a lambda-term that is needed to produce the normal form (or an arbitrary rigid prefix) may grow rapidly along a computation, but can only decrease very slowly (in fact, linearly). Work presented at LICS 2013.
I will show how to formulate the insecurity (a reachability property) of a
cryptographic protocol as the existence of a proof in a well-chosen deduction system.
Then, I will study some proof simplification rules that yield a small-attack property.
This was the subject of Vincent Bernat's PhD, defended in 2006.
Computer arithmetic has applied formal methods and formal proofs for years. As the systems may be critical and as the properties may be complex to prove (many sub-cases, error-prone computations), a formal guarantee of correctness is a wish that can now be fulfilled. This talk will present a chain of tools to formally verify numerical programs. The idea is to precisely specify what the program requires and ensures. Then, using deductive verification, the tools produce proof obligations that may be proved either automatically or interactively in order to guarantee the correctness of the specifications. Finally, an application coming from numerical analysis will be thoroughly presented.
Cryptographic protocols aim at securing communications over insecure
networks such as the Internet, where dishonest users may listen to
communications and interfere with them. For example, passports are no longer
pure paper documents. Instead, they contain a chip that stores additional information
such as pictures and fingerprints of their holder. In order to ensure privacy, these chips
include a mechanism, i.e. a cryptographic protocol, that does not let
the passport disclose private information to external users.
This is just one example; of course, privacy appears in many other contexts
such as RFID technologies or electronic voting.
Actually, the analysis of cryptographic protocols relies on several concepts which are well-established in the field of process algebras. In particular, the notion of behavioral equivalence allows one to express indistinguishability, which is a crucial concept for modeling privacy-type security properties (e.g. anonymity, unlinkability, ...). We will discuss some recent advances that we have made in automatically analysing equivalence-based security properties, and we will review some issues that remain to be solved to obtain efficient verification tools.
I will discuss recent improvements of Focalide, a translator from the FoCaLiZe language to Dedukti. In addition to FoCaLiZe object mechanisms, it is now able to translate proofs using Zenon and the functional programming constructs of FoCaLiZe. Thanks to these enhancements, almost all the standard library of FoCaLiZe can now be checked by Dedukti.
I will present Krajono, the latest addition to the family of Dedukti translations. Krajono translates the proofs of Matita, an interactive theorem prover based on the calculus of inductive constructions (CIC), to Dedukti. What happened to Coqine? How did we solve the obstacles that plagued it? What lessons did we learn from this? Tune in to find out!
We define a Kripke semantics for Intuitionistic Higher-Order Logic with constraints formulated within Church's Theory of Types via the addition of a new constraint base type. We then define an executable fragment, hohh(C), of the logic: a higher-order constraint logic programming language with typed λ-abstraction, implication and universal quantification in goals, and give a modified model theory for this fragment. Both formal systems are sound and complete for their respective semantics. We will sketch a simple semantics-based proof that the language hohh(C) satisfies the uniformity property of Miller et al.
Knuth and Bendix showed long ago that confluence of a terminating first-order rewrite system
can be reduced to the joinability of its finitely many critical pairs. Felgenhauer showed recently that
confluence of a left-linear first-order rewrite system can be reduced to the existence of decreasing diagrams
for its finitely many parallel critical pairs.
In this presentation, we investigate the case of layered rewrite systems, whose rules are non-overlapping at strict subterms and redex-depth non-increasing. Root overlaps are allowed in either finite or infinite trees. Rules may be non-terminating, non-left-linear, or non-right-linear. Using novel techniques (sub-rewriting and cyclic unifiers), we show that layered systems are confluent provided their root critical pairs, in finite or infinite trees, have decreasing diagrams. Robustness of the result will be discussed.
Program analysis and verification often involve verifying arithmetic constraints. In order to deal with these constraints, we will show an extension of the tableaux rules for linear arithmetic (also called Presburger arithmetic), as well as a proof-search method that relies on the general simplex (i.e., the usual simplex algorithm without the optimization problem) for rational and real arithmetic, and uses a branch & bound strategy for integer arithmetic. We will then present the results of the implementation of this proof-search method in the automated theorem prover Zenon.
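As an illustration of the branching step of branch & bound (a generic sketch; the names below are illustrative, not Zenon's actual API): when the rational relaxation assigns a non-integral value q to a variable x, the problem is split into two subproblems with the extra constraints x ≤ ⌊q⌋ and x ≥ ⌊q⌋ + 1.

```python
# Hypothetical sketch of the branch & bound branching step for
# integer arithmetic.  `branch` is an illustrative helper, not
# part of Zenon.
import math
from fractions import Fraction

def branch(var: str, q: Fraction):
    """Return the two branching constraints, or None if q is integral."""
    if q.denominator == 1:
        return None                       # already an integer: no branching
    lo = math.floor(q)                    # floor of the rational value
    return (f"{var} <= {lo}", f"{var} >= {lo + 1}")

print(branch("x", Fraction(7, 2)))        # ('x <= 3', 'x >= 4')
print(branch("x", Fraction(4, 1)))        # None
```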
We give a constructive proof of Kruskal's Tree Theorem—precisely, of a topological extension of it. The proof is in the style of a constructive proof of Higman's Lemma due to Murthy and Russell (1990), and illuminates the role of regular expressions there. In the process, we discover an extension of Dershowitz' recursive path ordering to a form of cyclic terms which we call μ-terms. This all came from recent research on Noetherian spaces, and serves as a teaser for their theory.
Given a Foundational Proof Certificate in classical logic, with or without cuts,
we would like to check it in an intuitionistic checker without any changes required
to the proof evidence that was intended for a classical checker.
We will see an overview of existing double negation translations and the flaws that make them unsuitable for our purpose. Then we will describe a unique embedding proposed by Kaustuv Chaudhuri that offers a one-to-one correspondence between focused classical proofs and their focused intuitionistic counterparts.
We show how we take advantage of this to check the FPC independently of the checker's logic (classical or intuitionistic).
Automatic and interactive theorem provers build proofs in many different formats. These formats range from, say, natural deduction to resolution refutations and from Herbrand instances to oracle strings. I will describe our progress on a multi-year project to design a framework for defining the semantics of a wide range of ``proof languages''. Recent advances in structural proof theory---particularly results surrounding focused proof systems---provide our framework with the flexibility to integrate both deterministic and non-deterministic computation into deductions over classical and intuitionistic logics. I will also outline our development of a reference proof checker for this framework.
Ever since Mankind learnt how to tame fire and propositional logic, these have been efficient tools for solving many other problems. In the context of automated theorem proving, the latter technology proves especially useful. In 2014, A. Voronkov presented Avatar, a simple and efficient way to integrate a boolean SAT solver within a superposition prover. We will present the core ideas of Avatar and its advantages over regular splitting; then we will present a (work in progress) new way to perform structural induction in a superposition prover by using QBF technology -- QBF is an extension of SAT in which every boolean variable is explicitly quantified either universally or existentially.
In this talk I will present the framework we have built within the Nuprl proof assistant to specify, verify and generate provably correct distributed systems. I will then discuss an interesting impact this work had on Nuprl, namely how it prompted us to undertake the verification of Nuprl in Coq. Finally, I will discuss major changes that we have made to Nuprl, many of them driven by our work on formally verifying Nuprl.
Double negation translations are a tool that allows one to translate proofs in classical logic
into proofs in intuitionistic logic. They have been known since Kolmogorov (1925) and Gödel-Gentzen (1933),
and have been given many variants and refinements.
In this talk, we will be interested in new variants of double negation translations which are morphisms, in the sense that only the connectives are transformed. This kind of translation was introduced by Dowek (2012) and can serve to introduce into the intuitionistic predicate calculus some classical connectives, while the others remain intuitionistic. Unfortunately, this translation is verbose. We will show how, by the use of focusing, we can lessen the number of introduced negation symbols.
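For reference, Kolmogorov's translation is the archetypal morphism-style translation: a double negation is inserted at every connective, which is exactly the verbosity that focusing can help reduce:

```latex
\begin{align*}
|P| &= \neg\neg P \quad (P \text{ atomic})\\
|A \wedge B| &= \neg\neg(|A| \wedge |B|)\\
|A \vee B| &= \neg\neg(|A| \vee |B|)\\
|A \Rightarrow B| &= \neg\neg(|A| \Rightarrow |B|)\\
|\forall x.\, A| &= \neg\neg\,\forall x.\, |A|\\
|\exists x.\, A| &= \neg\neg\,\exists x.\, |A|
\end{align*}
```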
In this talk we are going to present some advances involving minp-graphs, a graph representation developed by the TecMF team of PUC-Rio to represent natural deduction derivations. In particular, we will present the normalisation process, which generates a new minp-graph whose size does not exceed that of the original one.
Zelus is a new programming language for modeling and implementing
hybrid systems in which software interacts with a physical environment.
From a user's perspective, its main originality is to extend an existing
synchronous language with Ordinary Differential Equations (ODEs). The
extension is conservative: any synchronous program expressed as
data-flow equations and hierarchical automata can be composed
arbitrarily with ODEs in the same source code. A dedicated type
system and causality analysis ensure that all discrete changes are
aligned with zero-crossing events. Programs are statically
scheduled and translated into sequential code; the final code is paired with an
off-the-shelf numerical solver (Sundials CVODE).
During the talk, I will show some causality issues with Simulink and Modelica and how they are statically detected in Zelus.
This is joint work with Albert Benveniste, Benoit Caillaud, Timothy Bourke and Bruno Pagano.
In this talk, I want to share my experience in implementing a new interactive proof system for Coq,
and suggest answers on practical and theoretical grounds to the questions "what?" and "how?".
What should an interactive proof system for a rich type theory such as Coq's feature? And how can
such an interactive proof system be implemented? To answer the latter question, in particular,
I will summon a pot-pourri of computational effects, monads, and impredicative encodings.
Specifically, the interactive proof system is a monad, implemented as a stack of monadic effects,
and uses impredicative encodings for additional features and efficiency.
If you have always been bewildered by the double-continuation implementation of backtracking, or have forever wondered whether impredicative encodings have any application whatsoever, I will address both. The answer to the latter, by the way, is: yes, the former.
Abduction can be defined as the process of inferring plausible explanations
(or hypotheses) from observed facts (conclusions). This form of reasoning plays
a central role in system verification, particularly for identifying bugs and
providing hints to correct them.
We describe an approach to perform abductive reasoning that is based on the superposition calculus. We show how the inference rules of the superposition calculus can be adapted to obtain a calculus that is deductive complete for clauses built on a given set of ground terms, thus guaranteeing that all required explanations can be generated. This calculus enjoys the same termination properties as the superposition calculus: in particular, it is terminating on ground extensions of decidable theories of interest in software verification.
We describe techniques for storing sets of abduced clauses efficiently, and show how usual trie-based approaches for representing sets of propositional clauses in a compact way can be adapted and extended in order to denote equational clauses up to equivalence (modulo the axioms of equality).
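As an illustration of the trie-based idea for propositional clause sets (a generic sketch, not the authors' actual data structure): clauses are stored as sorted sequences of literals, so clauses sharing a prefix of literals share trie nodes.

```python
# Minimal sketch of a clause trie for sets of propositional clauses.
# `ClauseTrie` is an illustrative name; the real structure also has to
# handle equational clauses up to equivalence, which this sketch omits.
class ClauseTrie:
    def __init__(self):
        self.children = {}     # literal -> ClauseTrie
        self.is_clause = False # True if a stored clause ends here

    def insert(self, clause):
        node = self
        for lit in sorted(clause):         # canonical literal order
            node = node.children.setdefault(lit, ClauseTrie())
        node.is_clause = True

    def contains(self, clause):
        node = self
        for lit in sorted(clause):
            if lit not in node.children:
                return False
            node = node.children[lit]
        return node.is_clause

t = ClauseTrie()
t.insert(["p", "-q", "r"])
t.insert(["p", "-q"])                      # shares the ["-q", "p"] prefix
print(t.contains(["-q", "p"]))             # True (order-insensitive)
print(t.contains(["p", "r"]))              # False
```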
We show how some specific theories such as arithmetic can be handled by relying on external tools for checking satisfiability and eliminating useless terms.
We also identify hints for improvements and provide lines of on-going and future research.
Differential Dynamic Logic can express important properties about Cyber-Physical Systems, by combining discrete assignments and control structures with differential equations. However it can only express properties about the end state of a system. In this talk, we introduce the differential temporal dynamic logic dTL2, a logic to specify properties about the intermediate states reached by a hybrid system. It combines differential dynamic logic with temporal logic, and supports some linear time temporal properties of LTL. It extends differential temporal dynamic logic dTL with nested temporalities. We provide a semantics and a proof system for the logic dTL2, and show its usefulness for nontrivial temporal properties of hybrid systems. We take particular care to handle the case of alternating universal dynamic and existential temporal modalities and its dual, solving an open problem formulated in previous work.
Deduction modulo is a framework in which theories are integrated into proof systems such as natural deduction or sequent calculus by presenting them using rewriting rules. When only terms are rewritten, cut admissibility in those systems is equivalent to the confluence of the rewriting system, as shown by Dowek, RTA 2003, LNCS 2706. This is no longer true when considering rewriting rules involving propositions. We show that, in the same way that it is possible to recover confluence using Knuth-Bendix completion, one can regain cut admissibility in the general case using standard saturation techniques, namely ordered resolution with selection, and superposition. This work relies on a view of proposition rewriting rules as oriented clauses, like term rewriting rules can be seen as oriented equations. This also leads us to introduce an extension of deduction modulo with conditional term rewriting rules.
[Joint work with Jim Lipton and Julio Mariño]
Logic Programming is built on the idea that the correspondence between proof search and computation can be effective as a programming tool.
Efficient execution of logic programs depends on the use of existential --- also called "logic" --- variables: unknowns that allow an existential witness to be provided later in the proof search, carrying a global scope.
Traditional semantics for CLP tackle logic variables at the meta-level: they have no place in the actual semantics. In this scenario, algebraic --- or point-free style --- semantics for logic programs are a challenge.
In the 1940s, Tarski took up the development of Peirce and Schröder's relation algebra, proposing its use as a purely algebraic "variable-free" device capturing the whole of set theory.
We draw from Tarski (and Freyd-Maddux) work to develop an algebraic semantics for constraint logic programming.
In the first part of the work, a logic program is interpreted or compiled into a relation algebra. All reasoning --- including execution --- is captured by the associated equational theory.
In the second part of the work, we build an efficient declarative execution mechanism for Logic Programming. Our machine is based on the categorical version of the calculus of relations --- allegory theory. In particular, we start from the Lawvere Category of a logic program. Lawvere Categories represent algebraic theories and, in our setting, they capture the signature of the program.
The regular completion of the Lawvere Category of a program generates a tabular allegory --- where every relation is tabulated by a pair of functions --- giving a new semantics to logic programs.
Execution is based on computing normal forms of diagrams for relation composition. This single primitive encompasses unification, garbage collection, parameter passing, and environment creation and destruction.
The categorical model seems well suited for accommodating extensions, and we discuss constraints, types, functions and monads.
Let (X,≤) be a non-empty strictly inductive poset, that is, a non-empty partially ordered set such that every non-empty chain Y has a least upper bound lub(Y) ∈ X, a chain being a subset of X totally ordered by ≤. We are interested in sufficient conditions such that, given an element a0 ∈ X and a function f:X → X, there is some ordinal k such that ak+1 = ak, where (ak) is the transfinite sequence of iterates of f starting from a0 (ak is then a fixpoint of f): ak+1 = f(ak) for every ordinal k, and al = lub{ak | k < l} for every limit ordinal l.
We define a notion of model for the lambda-Pi-calculus modulo theory, a notion of super-consistent theory, and prove that proof-reduction terminates in the lambda-Pi-calculus modulo a super-consistent theory. We prove this way the termination of proof-reduction in two theories in the lambda-Pi-calculus modulo theory, and their consistency: an embedding of Simple type theory and an embedding of the Calculus of constructions.
My thesis is that the formal construction of software systems and that of mathematical proofs
take advantage of using similar approaches: abstraction, decomposition, design patterns, etc.
We have a certain experience in developing both kinds of constructions with the Rodin Platform tool.
We felt that the lessons learned from one kind of construction can help improve the other one,
and vice versa. Our experience in constructing formal proofs of pure mathematical theorems is
not very wide but, we think, nevertheless significant, among which are the Cantor-Bernstein theorem,
the well-ordering theorem, the recursion theorem, the axiomatic definition of real numbers together
with the Archimedean property and the intermediate value theorem, etc.
Recently I came across the very interesting topological proof of the infinitude of prime numbers developed in 1955 by Hillel Fürstenberg. In my presentation, I will use this example to explain how we develop the mechanisation of mathematical proofs with the Rodin tool.
Faust (Functional Audio Stream) is a purely functional Domain-Specific Language dedicated to signal processing applications in the audio and music domains. Faust was created more than 10 years ago by Yann Orlarey at Grame, a Centre national de création musicale, and is used in various institutions around the world, such as IRCAM or Stanford CCRMA. We will provide a general introduction to Faust, while focusing on its sophisticated type system. We will discuss some of the features expected to be introduced into the next version of Faust, in particular its multirate extension and vector operations, important for spectral audio processing, and their impact on its static semantics. This work is funded partly by the ANR FEEVER project.
Satisfiability Modulo Theories (SMT) solvers check the satisfiability of logic formulas written in a language where symbols are interpreted in a theory, most often some combination of the empty theory (uninterpreted symbols), an arithmetic theory (e.g. linear arithmetic on integers and reals, or non-linear real arithmetic), and various data-structure theories (e.g. lists, arrays, bit-vectors, strings). In this talk, we present the open-source SMT solver veriT. After reviewing the underlying techniques, we briefly address some current works, in particular quantifier handling and non-linear reasoning. veriT has been designed with proofs in mind; the proof format is introduced and we finally discuss some challenges about proof production and exchange for SMT.
"Deep Inference" is a term used to describe logical inference systems where the principal formulas in an inference rule need not be a "top level" formula but can be found in a deeply nested structure. In the most extreme form of deep inference, the calculus of structures, the notion of "formula" and "structure" (or "sequent") coincide, and every logical entailment can be "applied" to any arbitrary subformula. The inference system then behaves exactly as a rewrite system. In this talk I will give just enough of an introduction to the calculus of structures to be able to focus on one aspect that I find personally interesting: exploiting depth in theorem proving. For interactive theorem proving, in particular, I will try to explain why I think deep inference can show us a way out of the linguistic tarpit of formal proof languages.
In logic we face the problem of dealing with big deductions, since SAT is NP-complete. In this talk I want to put forward some ideas on how to shorten deductions in Dedukti, using a new graph structure developed for minimal propositional logic. The main idea is to exchange knowledge about the subject in order to produce a more efficient Dedukti (if possible).
We describe the design and implementation of Logtk, an OCaml library for writing automated reasoning tools that deal with (possibly typed) first-order logic. The library provides data structures to represent terms, formulas and substitutions, and algorithms to index and unify terms. It also contains parsers and a few tools that demonstrate its use. It is the basis of a full-fledged superposition prover.
In this talk we consider non-idempotent intersection types to
characterize different properties of higher-order lambda calculi.
The first part of the talk focuses on the linear substitution calculus, a lambda-calculus with partial substitution acting at a distance that naturally expresses linear-head reduction, a notion of evaluation of proof nets that is strongly related to abstract machines. We define two non-idempotent intersection type systems for the linear substitution calculus. We show that our first (resp. second) quantitative type system characterizes linear-head and weak (resp. strong) normalizing sets of terms. All such characterizations are given by means of combinatorial arguments, i.e. there is a measure based on type derivations which decreases with respect to each considered reduction relation.
The second part of the talk focuses on the inhabitation problem for intersection types. While this problem is known to be undecidable in the case of idempotent intersection, we prove decidability (through a sound and complete algorithm) for the non-idempotent case. We then explain how the inhabitation problem is related to the characterization of the solvability property, and then we focus on the extended framework of pattern calculi, where abstractions are allowed not only on variables but also on algebraic patterns. The solution of the inhabitation problem for non-idempotent intersection types in the algebraic pattern calculus turns out to give a complete characterization of solvability for such a language.
This talk is based on joint works with Antonio Bucciarelli, Simona Ronchi Della Rocca and Daniel Ventura.
In this talk, I will start by introducing the foundations of the Satisfiability Modulo Theories (SMT) technology. Then, I will concentrate on Alt-Ergo: an SMT solver tuned for program verification. In particular, I'll present its main architecture, explain how it handles quantified formulas, and conclude with some use cases and recent results. The development of Alt-Ergo was initiated in 2006 by Sylvain Conchon and Evelyne Contejean at the "Laboratoire de Recherche en Informatique". Since September 2013, it has been maintained and distributed by the OCamlPro company.
In this talk, I will give a quick overview of the Agda language and
then present some aspects of the implementation: internal intermediate
languages, type-checking and compilation phases, unification, termination checker.
I hope to pass on some valuable experiences to implementors of dependently-typed languages.
The file used during the presentation is available here.
We describe a general purpose abstract Church-Rosser result that captures the existing
such results that rely on termination of computations. This is achieved by studying a
novel abstract rewriting framework, abstract positional rewriting, which allows one to incorporate
positions at the abstract level. We also prove a new lifting lemma, which improves over
the classical one in the modulo case. Many existing concrete Church-Rosser results are improved,
while new ones are obtained in particular for higher-order rewriting at higher types.
The slides of the presentation are available here.
Joint work with Nicolas Tabareau (Inria Rennes, École des Mines de Nantes).
Homotopical interpretations of Martin-Löf type theory lead toward an interpretation of equality as a richer, more extensional notion. Extensional or axiomatic presentations of the theory with principles based on such models do not yet fully benefit from the power of dependent type theory, that is, its computational character. Reconciling intensional type theory with this richer notion of equality requires moving to higher-dimensional structures where equality reasoning is explicit.
We follow this idea and develop an internalization of a groupoid interpretation of Martin-Löf type theory with one universe respecting the invariance by isomorphism principle. Our formal development relies crucially on ad-hoc polymorphism to overload notions of equality and on a conservative extension of Coq’s universe mechanism with polymorphism.
Let us call equality the smallest subset of provable equalities from a
given axiomatic equational theory. Some equational theories can be
formulated at least partially as rewriting systems and thus reduced to
computation. This is e.g. the approach taken in deduction modulo,
with, for instance, the equational theories of first-order arithmetic,
higher-order logic, or Zermelo set theory formulated as normalising
computation rules. In a proof system with proof terms, such as
deduction modulo, but also lambda-pi modulo, let us call conversion
rule a rule that captures computations without leaving a trace in the proof term.
In Martin-Löf's Intensional Type Theory, in the Calculus of Inductive Constructions and in the general theory of Pure Type Systems, there is a conversion rule which captures the intensional notion of definitional equality. The latter is decidable, and, because it is decidable, a full derivation can be reconstructed from its associated proof term. By construction, conversion is also closed under functional extensionality and it inherently satisfies the uniqueness of equality proofs. In these theories, equality is the smallest subset of equalities provable from definitional equality, i.e. closed under deductive reasoning from definitional equality.
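The conversion rule in question has the familiar shape (with ≡ denoting definitional equality); since the premise A ≡ B is decided by computation, it leaves no trace in the proof term t:

```latex
\frac{\Gamma \vdash t : A \qquad A \equiv B}{\Gamma \vdash t : B}
```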
In Martin-Löf's Extensional Type Theory, the conversion rule however embeds equality, making it at the same time extensional, uniquely inhabited and closed under deductive reasoning. It becomes undecidable and a full derivation cannot be reconstructed from the mere knowledge of its associated proof term.
In Martin-Löf's type theory, there is an alternative possible characterisation of equality as the smallest subset of equalities provable from a set of typed congruence rules defined by induction on the constructors of the type. In this view, equality in Pi-types can be defined to be extensional equality and equality in equality types can be defined to be trivial, leading to an intensional theory whose strength is the one of Extensional Type Theory (see Altenkirch's work).
Homotopy Type Theory introduces a form of extensional equality on universes, called univalence, which is similar to the one defined earlier by Hofmann and Streicher: two types are equal when they have the same cardinal. This extensionality implies functional extensionality, but it contradicts uniqueness of equality proofs: for instance, the identity function and the negation give two distinct proofs of Bool=Bool.
In the talk, we will discuss some of these issues about equality.
FoCaLiZe is a programming language allowing one to state, within a single framework, algorithms (programming aspects), the properties (logical aspects) that these algorithms must satisfy, and proofs that the latter hold. It provides high-level constructs to ease development structuring and reuse (modularity, inheritance, late binding, parameterization) and a proof language allowing proofs to be written in a hierarchical way, in a natural deduction style, relying on the Zenon automated theorem prover to discharge goals ... and the burden of the user. The core programming language is a pure functional language, while properties are stated as first-order logic formulae.
A FoCaLiZe development is compiled to two target languages: an OCaml source code leading to an executable (or object file), and a Coq source code containing the complete model of the development (i.e. definitions, properties and proofs) that is ultimately checked by Coq as an assessment. Crucial care is taken in this process to avoid any error from the target languages being returned to the user: a program accepted by the FoCaLiZe compiler must be accepted by both OCaml and Coq. To increase confidence that "what is run is what is proved" despite the two target languages, the compiler has to use a code generation model that is identical and traceable for both of them.
In this presentation we will first introduce the motivations, design and feature choices that impacted the language semantics, together with a short presentation of the language features. Then we will explore the code generation model, emphasizing the central notion of "dependencies" and its static computation. Not all aspects of the compilation will be addressed, but the global process will be sketched. Finally, we may discuss possible extensions of the framework and their impact on the semantics, the analyses and the code generation model.
The slides of the talk are available here.
Joint work with Bruno Barras.
We present two type systems: ICCΣ and its annotated variant, AICCΣ. ICCΣ is an extension of Alexandre Miquel's Implicit Calculus of Constructions (ICC) with Σ-types. Since type-checking is (most likely) not decidable in ICCΣ, we also introduce a more verbose variant, AICCΣ, which should fix this issue. The purpose of these systems is to solve the problem of interaction between logical and computational data.
With Conor McBride, I have developed a presentation of inductive families based on
a type-theoretic universe. Unlike the traditional, purely syntactic, approaches,
our treatment allows us to reflect inductive families within type theory. As a result,
inductive types become first-class citizens.
In this talk, I will present such a universe of inductive types and hint at its bootstrap. I will then illustrate the benefits of first-class citizenship. I shall conclude with a few exploratory results, at the intersection between inductive types, equational theories, and rewriting.
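The actual universe is developed in the Agda files below; as a much simplified sketch of the general idea of a universe of codes decoded into types, in Lean 4 syntax (our own toy version, covering only sums and products, with no recursive positions beyond a single variable):

```lean
-- A toy universe of codes for polynomial functors, in the spirit of
-- (but far simpler than) a universe of inductive families.
inductive Desc where
  | one   : Desc                 -- constant unit
  | var   : Desc                 -- the recursive position
  | plus  : Desc → Desc → Desc   -- choice between constructors
  | times : Desc → Desc → Desc   -- pairing of arguments

-- Decoding: each code denotes an endofunctor on Type, so codes are
-- first-class data that can be inspected and transformed.
def Desc.interp : Desc → Type → Type
  | .one,       _ => Unit
  | .var,       X => X
  | .plus d e,  X => Desc.interp d X ⊕ Desc.interp e X
  | .times d e, X => Desc.interp d X × Desc.interp e X
```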
Files associated to the talk: DeducteamTalk.agda (another link to this file), Enum.agda and IProp.agda.
A link with comments on the talk.
It is well-known that the size of propositional classical proofs can be huge. Proof-theoretical studies have discovered exponential gaps between cut-free (or normal) proofs and the corresponding (non-normal) proofs with cuts (or modus ponens). Automatic theorem proving, on the other hand, is usually based on the construction of cut-free proofs, or proofs with only atomic cuts, since this procedure produces fewer alternative choices. The aim of this presentation is to discuss some methods of size reduction of deductions.
Algorithms offer a rich and expressive language for modeling physical, biological, and social systems. They lay the groundwork for numerical simulations and, more importantly, provide a powerful framework for their analysis. In the life sciences, natural algorithms thus appear to play the same role as differential equations in the physical sciences. This entails the need for the development of an "algorithmic calculus", an analogue of differential calculus.
I shall introduce this new field of research and some fundamental questions regarding the achievement of consensus, synchronization, and more generally coordination among agents of biological or social systems. I shall present various algorithmic tools that have been recently proposed to analyze natural algorithms and which use classical notions in dynamical systems, in Markov chains, or in spectral graph theory.
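One of the simplest such dynamics is averaging consensus, where each agent repeatedly replaces its value by a local mean; a minimal sketch (our own toy example on a path of four agents, not taken from the talk):

```python
# Averaging dynamics on a path of four agents: each agent repeatedly
# moves to the mean of itself and its neighbours. The update matrix is
# stochastic and primitive, so all values converge to a common limit.
def step(xs):
    n = len(xs)
    out = []
    for i in range(n):
        window = xs[max(0, i - 1):min(n, i + 2)]  # self + neighbours
        out.append(sum(window) / len(window))
    return out

xs = [0.0, 0.0, 0.0, 12.0]
for _ in range(200):
    xs = step(xs)
# after 200 rounds the spread max(xs) - min(xs) is numerically zero
```

Tools from Markov chains (spectral gap of the update matrix) then bound how fast the spread shrinks.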
The slides of the talk are available here.
In 1996, Abadi and Cardelli defined several object calculi in order to formalize the behaviour of object-oriented programs and to study object type systems. The major feature of these type systems is subtyping.
Translating subtyping into Dedukti is both challenging, because convertibility between types is symmetrical, and needed, because the type system of Coq has subtyping between sorts.
I will present a shallow encoding of a typed functional object calculus with subtyping in the λΠ-calculus modulo and its implementation in Dedukti.
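As a rough analogy only (not the actual λΠ-modulo encoding; the names below are ours), one way to see what making subtyping explicit means is to implement width subtyping on records as a coercion function:

```python
# Width subtyping made explicit: a record with more fields is coerced
# to a supertype by forgetting the extra fields. An explicit coercion
# replaces the asymmetric subtyping judgement, which a purely
# symmetric notion of convertibility cannot express directly.
def coerce(record, supertype_fields):
    return {k: record[k] for k in supertype_fields}

colored_point = {"x": 1, "y": 2, "color": "red"}
point = coerce(colored_point, ["x", "y"])  # {"x": 1, "y": 2}
```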
We defined a simply typed, non-deterministic lambda-calculus in which isomorphic types are equated. To this end, an equivalence relation was defined at the term level. We provide a proof of strong normalisation modulo this equivalence; the proof is a non-trivial adaptation of the reducibility method.
This work opens several paths for future work, which I will try to detail in this talk.
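The calculus and its equivalence relation are specific to this work; as a plain illustration of what a type isomorphism is, here is the classic currying isomorphism A→(B→C) ≅ (A×B)→C, witnessed by two mutually inverse conversions in Python:

```python
# The isomorphism  A -> (B -> C)  and  (A x B) -> C, witnessed by the
# standard, mutually inverse curry/uncurry pair.
def curry(f):
    return lambda a: lambda b: f((a, b))

def uncurry(g):
    return lambda ab: g(ab[0])(ab[1])

add_pair = lambda ab: ab[0] + ab[1]

curried = curry(add_pair)        # use the function one argument at a time
assert curried(2)(3) == 5
# round trip: uncurry . curry is the identity (extensionally)
assert uncurry(curry(add_pair))((2, 3)) == add_pair((2, 3))
```

Equating the two types amounts to treating `curry` and `uncurry` as invisible, which is what pushes the equivalence down to the term level.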
In this talk, I will survey results of Mathematical logic from the 1980s that are relevant
to axiomatizability and decidability of type isomorphism in simultaneous presence of sum
and function types, and I will discuss some related open questions.
The slides of the talk.
This talk will discuss how computational models defined in toposes may lead to non-standard computational models. A possible comparison with hypercomputation will be briefly motivated.
The computational complexity of SAT and TAUT for purely implicational propositional logic is known to be PSPACE-complete. Intuitionistic Propositional Logic is also PSPACE-complete, while Classical Propositional Logic is CoNP-complete for tautology checking and NP-complete for satisfiability checking. We show how proof-theoretical results on Natural Deduction help analyse the complexity of Purely Implicational Propositional Logic with regard to its polynomial relationship to Intuitionistic and Classical Propositional Logics. The main feature in this analysis is the sub-formula principle. We extend the polynomial reduction from purely implicational logic to any propositional logic satisfying the sub-formula principle. Hence, any propositional logic satisfying the sub-formula principle is in PSPACE. We conclude that some well-known logics, such as Propositional Dynamic Logic and S5C (n = 2), for example, are unlikely to satisfy the sub-formula property. On the other hand, these logics have proof procedures, and they are said to be mechanizable despite being EXPTIME-complete.
We point out some facts and discuss some questions on how the sub-formula property relates to the mechanization of theorem proving for propositional logics. The relationship between feasible interpolants and the sub-formula property is discussed. Some examples remind us that the relationship between normalization of proofs and the sub-formula property is weak. Finally, we discuss some alternative criteria for considering a logic to be mechanizable. This discussion is strongly influenced by A. Carbone's analysis of propositional proof complexity. Our purpose is to show how structural proof theory can help in analysing hard relationships between propositional logic systems.
Homotopy Type Theory (HoTT) is a variant of Type Theory aiming at formalizing synthetic proofs in homotopy theory. After defining its main distinctive features, I will show that it is not merely a domain-specific formalism, but rather a foundation which happens to be richer even for general purposes.
An important issue with this formalism is that it has no known computational interpretation yet, as the model of HoTT based on simplicial sets proposed by Voevodsky is not constructive. I will sketch a new model proposed by Coquand that may address this issue.
Finally, if time permits, I'll talk about higher inductive types, which provide a way to define non-trivial homotopy types in a fashion inspired by inductive types.
The slides of the talk are available here.
We present a preliminary version of the program "LALBLC" (i.e. LA=LB? Let us Compute).
The equivalence problem for dpda consists in deciding, for two given deterministic pushdown automata A,B, whether they recognize the same language. It was proved in [TCS 2001, G.S.] that this problem is decidable by means of some complete proof systems.
The program LALBLC gives, in some sense, life to the above completeness proof: on an input A,B, the program either computes a proof of their equivalence (w.r.t. the system D5 of the above reference) or computes a witness of their non-equivalence (i.e. a word that belongs to the symmetric difference of the two languages).
We shall outline the main mathematical objects that are manipulated as well as the main functions performed over these objects. We shall also describe some experimental results obtained so far.
The talk is based on ongoing joint work with P. Henry.
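The dpda case is far subtler, but the "proof of equivalence or witness of non-equivalence" behaviour can be mimicked on a toy problem (our own sketch, unrelated to LALBLC's actual algorithms): deciding equivalence of two DFAs by exploring their product automaton:

```python
from collections import deque

# Toy analogue of "prove equivalence or produce a witness": BFS over
# the product of two DFAs. A reachable state pair with differing
# acceptance yields a shortest word in the symmetric difference.
# A DFA is (initial_state, {(state, letter): state}, accepting_states).
def equiv(dfa_a, dfa_b, alphabet):
    ia, ta, fa = dfa_a
    ib, tb, fb = dfa_b
    seen = {(ia, ib)}
    queue = deque([(ia, ib, "")])
    while queue:
        p, q, w = queue.popleft()
        if (p in fa) != (q in fb):
            return w                    # witness of non-equivalence
        for c in alphabet:
            nxt = (ta[(p, c)], tb[(q, c)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt[0], nxt[1], w + c))
    return None                         # the two DFAs are equivalent

# "even number of a's" vs "all words over {a}": they differ on "a"
even = (0, {(0, "a"): 1, (1, "a"): 0}, {0})
alla = (0, {(0, "a"): 0}, {0})
witness = equiv(even, alla, "a")        # "a"
```

For dpda the state space is infinite, which is why a complete proof system such as D5 is needed instead of this finite exploration.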
Given a proposition, if it is provable in predicate logic, it is true in all models. However, in the area of model checking, the model is given, and the model-checking problem is to decide whether some properties are true in this model. What we do is define a theory in predicate logic such that the propositions provable in this theory are exactly those that are true in a given universe. We use the one-way resolution method to prove such propositions automatically. There are both positive and negative results in proving such propositions by the resolution method.
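For illustration, here is a plain propositional resolution prover by saturation (a generic sketch only, not the one-way resolution method of the talk):

```python
# Plain propositional resolution: clauses are frozensets of literals,
# a literal is a (name, polarity) pair. Saturate under the resolution
# rule; deriving the empty clause proves unsatisfiability.
def resolve(c1, c2):
    for (p, s) in c1:
        if (p, not s) in c2:
            yield frozenset((c1 - {(p, s)}) | (c2 - {(p, not s)}))

def unsatisfiable(clauses):
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                for r in resolve(a, b):
                    if not r:
                        return True     # empty clause: contradiction
                    new.add(r)
        if new <= clauses:
            return False                # saturated without contradiction
        clauses |= new

# {p, p -> q, not q} is unsatisfiable, so the prover refutes it.
result = unsatisfiable([{("p", True)},
                        {("p", False), ("q", True)},
                        {("q", False)}])  # True
```

Saturation terminates because only finitely many clauses exist over a finite set of literals; restricted strategies such as one-way resolution prune this search.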
Deduction modulo aims at providing automated theorem provers with an efficient way of dealing with some well-known theories, like Peano arithmetic, set theory, or group theory. However, many logic problems that use those theories do not advertise it explicitly; in some cases, axioms of the theory may only be discovered during the proof search. We present here a technique to recognize the axioms of a theory during the proof search, independently of the concrete signature of the problem, and to use this information to detect theories defined as sets of axioms.
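A toy version of signature-independent recognition (our own sketch, not the technique presented in the talk) might match an axiom against a shape while abstracting over the concrete symbol name:

```python
# Toy signature-independent matcher: recognize a commutativity axiom
# f(x, y) = f(y, x) whatever the concrete name of the binary symbol
# f is in the problem's signature.
def is_commutativity(axiom):
    # axiom: ("eq", lhs, rhs); terms are (symbol, arg, arg) tuples
    # over the variables "x" and "y"
    _, lhs, rhs = axiom
    return (lhs[0] == rhs[0]            # same symbol, name irrelevant
            and lhs[1:] == ("x", "y")
            and rhs[1:] == ("y", "x"))

ok = is_commutativity(("eq", ("mul", "x", "y"), ("mul", "y", "x")))    # True
ok2 = is_commutativity(("eq", ("union", "x", "y"), ("union", "y", "x")))  # True
no = is_commutativity(("eq", ("f", "x", "y"), ("f", "x", "y")))        # False
```

Collecting several such recognized shapes (associativity, identity, inverses, ...) then lets the prover detect that a whole theory, e.g. group theory, is present.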