
09:00-10:00 Session 34C: DCM/ITRS joint invited talk (joint with ITRS)
Quantitative types: from Foundations to Applications

ABSTRACT. Quantitative techniques are emerging in different areas of computer science, such as model checking, logic, and automata theory, to meet the challenge of today's resource-aware computation.

In this talk we give a clean theoretical understanding of the use of resources in quantitative systems, and we introduce different alternatives suitable for a wide range of powerful models of computation, such as pattern matching, control operators, and infinitary computation.

We survey some interesting results related to these type systems, such as:
(1) characterization of operational properties,
(2) computation of exact measures for reduction sequences and normal forms,
(3) inhabitation problems, and
(4) observational equivalence.
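
A small illustration (ours, not from the talk) of item (2): one concrete "exact measure" of a reduction sequence is the number of beta-steps a term takes to reach normal form. The toy term representation and naive substitution below are our own; substitution is only safe here because the arguments used are closed terms.

```python
# Toy lambda-terms: ('var', x), ('lam', x, body), ('app', f, a).

def subst(t, x, s):
    """Naive substitution of s for x in t (safe only for closed s)."""
    k = t[0]
    if k == 'var':
        return s if t[1] == x else t
    if k == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, s))
    return ('app', subst(t[1], x, s), subst(t[2], x, s))

def step(t):
    """One normal-order beta-step, or None if t is in normal form."""
    if t[0] == 'app' and t[1][0] == 'lam':
        return subst(t[1][2], t[1][1], t[2])
    if t[0] == 'app':
        f = step(t[1])
        if f is not None:
            return ('app', f, t[2])
        a = step(t[2])
        return None if a is None else ('app', t[1], a)
    if t[0] == 'lam':
        b = step(t[2])
        return None if b is None else ('lam', t[1], b)
    return None

def measure(t):
    """Exact number of beta-steps from t to its normal form."""
    n = 0
    while (t2 := step(t)) is not None:
        t, n = t2, n + 1
    return n

# (\x. x x) (\y. y) reduces to \y. y in exactly 2 steps.
I = ('lam', 'y', ('var', 'y'))
assert measure(('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))), I)) == 2
```

Quantitative type systems of the kind surveyed in the talk can predict such step counts statically, without running the term.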

10:00-10:30 Session 37
Models of Computation that Conserve Data

ABSTRACT. Conservation of data is an idea to build programming languages and hardware that support fixed-space programs: data cannot be created or destroyed. In this paper we give some motivations for this work and discuss some related ideas such as linearity and reversibility. We then give some examples of models of computation that can be adapted to work in this way. We conclude by discussing some ongoing work in this area.
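
A minimal sketch (ours, not from the paper) of what data conservation can look like operationally: the state is a fixed tuple of bits, and every operation is a bijection on states, so nothing is created or destroyed and space usage never changes. The reversible gates below are standard (CNOT and swap); their names and this encoding are our illustration.

```python
def cnot(state, ctrl, tgt):
    """Flip bit `tgt` iff bit `ctrl` is set; self-inverse and size-preserving."""
    bits = list(state)
    if bits[ctrl]:
        bits[tgt] ^= 1
    return tuple(bits)

def swap(state, i, j):
    """Exchange two bits; also its own inverse."""
    bits = list(state)
    bits[i], bits[j] = bits[j], bits[i]
    return tuple(bits)

s0 = (1, 0, 1)
s1 = cnot(s0, 0, 1)
assert cnot(s1, 0, 1) == s0    # reversible: applying the gate again undoes it
assert len(s1) == len(s0)      # fixed-space: the state never grows or shrinks
```

Linearity shows up here as the discipline that no bit is duplicated or dropped; reversibility as the fact that every step has an inverse.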

10:30-11:00 Coffee Break
11:00-12:30 Session 38C
A syntactic model of mutation and aliasing

ABSTRACT. Traditionally, semantic models of imperative languages use an auxiliary structure which mimics memory. In this way, ownership and other encapsulation properties need to be reconstructed from the graph structure of such global memory. We present an alternative "syntactic" model where memory is encoded as part of the program rather than as a separate resource. This means that execution can be modelled by just rewriting source code terms, as in semantic models for functional programs. Formally, this is achieved by the block construct, introducing local variable declarations, which play the role of memory once their initializing expressions have been evaluated. In this way, we obtain a language semantics which is more abstract and directly represents aliasing constraints at the syntactic level, allowing simpler reasoning about related properties. We illustrate this advantage by expressing in the calculus the "capsule" property, which characterizes an isolated portion of memory that cannot be reached through external references. Capsules can be safely moved to another location in memory without introducing sharing. We state that the syntactic model can be encoded in the conventional one, hence efficiently implemented, and outline the proof that the two dynamic semantics are equivalent.
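
A toy rendering (ours, not the paper's calculus) of the key idea that memory is part of the program: a block is a list of local declarations plus a body, and executing an assignment rewrites the block itself instead of updating a separate store.

```python
def step_assign(decls, var, value):
    """Rewrite block { decls; var = value; ... } by updating var's declaration,
    i.e. the declarations themselves serve as the memory."""
    return [(v, value if v == var else init) for (v, init) in decls]

block = [('x', 1), ('y', 2)]          # models  { int x = 1; int y = 2; ... }
block = step_assign(block, 'x', 5)    # executing  x = 5  rewrites the block
assert block == [('x', 5), ('y', 2)]
```

Because the "store" is just the enclosing declarations, properties such as which references can reach a value are visible in the term itself, which is what makes capsule-style reasoning syntactic.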

Pointing to Private Names

ABSTRACT. Scoped channels, in the pi-calculus, are not nameable, as they are bound and subject to alpha-renaming. For program analysis purposes, however, to identify properties of these channels, it is necessary to talk about them. We present herein a method for uniquely identifying scoped channels.
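
A hypothetical illustration of the problem and one possible shape of a solution (the representation and naming scheme here are ours, not the paper's): identify each scoped channel by the syntactic path to its binder, which is stable under alpha-renaming of the bound name.

```python
def label_binders(term, path=()):
    """Walk a toy pi-term where ('new', x, body) binds channel x and
    ('par', p, q, ...) runs subterms in parallel; return a map from
    each binder's syntactic path to its (arbitrary) bound name."""
    labels = {}
    if isinstance(term, tuple) and term[0] == 'new':
        _, name, body = term
        labels[path] = name
        labels.update(label_binders(body, path + (0,)))
    elif isinstance(term, tuple) and term[0] == 'par':
        for i, sub in enumerate(term[1:]):
            labels.update(label_binders(sub, path + (i,)))
    return labels

t1 = ('new', 'a', ('par', ('new', 'b', 'P'), 'Q'))
t2 = ('new', 'x', ('par', ('new', 'y', 'P'), 'Q'))  # alpha-variant of t1
# The identifying paths agree even though the bound names differ:
assert label_binders(t1).keys() == label_binders(t2).keys()
```

An analysis can then refer to "the channel bound at path (0, 0)" without ever mentioning the alpha-convertible name.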

A Category Theoretic Interpretation of Gandy’s Principles for Mechanisms
SPEAKER: Joseph Razavi

ABSTRACT. Based on Gandy's principles for models of computation we give category-theoretic axioms describing locally deterministic updates to finite objects. Rather than fixing a particular category of states, we describe what properties such a category should have. The computation is modelled by a functor that encodes updating the computation, and we give an abstract account of such functors. We show that every updating functor satisfying our conditions is computable.

12:30-14:00 Lunch Break
14:00-15:30 Session 40C
On Higher-Order Probabilistic Computation

ABSTRACT. Ideas and tools from probability theory are extremely useful in various areas of theoretical computer science, and become essential in fields like cryptography, in which being able to flip a fair coin is a necessity rather than a choice. Recently, there has been growing interest in studying the role of higher-order functions in probabilistic computation. We give some hints about why this indeed deserves to be studied, focusing on cryptography and machine learning. We then overview the metatheory of a probabilistic lambda-calculus, highlighting the (striking) differences from that of the pure, deterministic lambda-calculus.
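
A toy illustration (ours, not from the talk) of one such striking difference: duplicating a sampled value is not the same as duplicating the sampler, so evaluation strategies that coincide in the deterministic lambda-calculus come apart in the probabilistic one.

```python
import random

def coin():
    """A fair coin: returns 0 or 1 with equal probability."""
    return random.randint(0, 1)

def cbv_pair(sampler):
    v = sampler()                    # call-by-value: sample once, copy the value
    return (v, v)

def cbn_pair(sampler):
    return (sampler(), sampler())    # call-by-name: copy the computation, sample twice

a, b = cbv_pair(coin)
assert a == b                        # the components always agree under CBV
c, d = cbn_pair(coin)                # under CBN they are independent coin flips
assert c in (0, 1) and d in (0, 1)
```

In the pure lambda-calculus both strategies yield the same result for terminating terms; with a probabilistic choice operator they yield different distributions, which is one reason the metatheory has to be rebuilt.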

Towards a Formal System for Topological Quantum Computation

ABSTRACT. We study Topological Quantum Computing (TQC) from the perspective of computability theory, with the aim of defining a formal system able to capture the computational features of TQC. We discuss the mathematical model for TQC, namely Modular Tensor Categories, and their suitability for the construction of a domain of denotational objects similar to the Scott domain of the lambda calculus. This leads us to believe that a formalism similar to the classical lambda calculus can also be defined for TQC.

15:30-16:00 Coffee Break
16:00-16:30 Session 42C
Generic Graph Semantics

ABSTRACT. The goal of this ongoing investigation is a more general, more flexible, and more precise generic language for describing algorithms than ordinary sequential abstract state machines. More general, since it will incorporate greater nondeterminism. More flexible, because it allows sequences of assignments in a single step. More precise, on account of the exact control over the order in which values are accessed and the number of times each is accessed.

16:30-17:30 Session 43 (joint with ITRS)
Polyadic approximations and intersection types (ITRS/DCM joint invited talk)

ABSTRACT. In his paper introducing linear logic, Girard briefly mentioned how full linear logic could be seen, in an informal sense, as the limit of its purely linear fragment. Several ways of expanding and formalizing this idea have been developed through the years, including what I call polyadic approximations. In this talk, I will show how these are deeply related to intersection type systems, via a general construction that is sufficiently broad to allow recovering every well-known intersection type system and their normalization properties, as well as introducing new type systems in wildly disparate contexts, with applications ranging from deadlock-freedom in the pi-calculus to a type-theoretic proof of the Cook-Levin theorem.

(Part of this work was developed jointly with Luc Pellissier and Pierre Vial).