FLOC 2018: FEDERATED LOGIC CONFERENCE 2018
PROGRAM FOR TUESDAY, JULY 17TH

09:00-10:30 Session 117A: Theory and Security (CAV)
Location: Maths LT1
09:00
Automata vs Linear-Programming Discounted-Sum Inclusion

ABSTRACT. The problem of quantitative inclusion formalizes the goal of comparing quantitative dimensions between systems, such as worst-case execution time, resource consumption, and the like. Such systems are typically represented by formalisms such as weighted logics or weighted automata. Despite its significance in analyzing the quality of computing systems, the study of quantitative inclusion has mostly been conducted from a theoretical standpoint. In this work, we conduct the first empirical study of quantitative inclusion for discounted-sum weighted automata (DS-inclusion, for short).

Currently, two contrasting approaches to DS-inclusion exist: the linear-programming-based DetLP and the purely automata-theoretic BCV. In theory, DetLP is exponential in time and space, while BCV is PSPACE-complete. All practical implementations of BCV, however, are also exponential in time and space. Hence, it is not clear which of the two algorithms yields the superior implementation.

In this work we present the first implementations of these algorithms and perform extensive experiments to compare the two approaches. Our empirical analysis shows how the two approaches complement each other, a nuanced picture that is much richer than the one obtained from the theoretical study alone.
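
For orientation, the standard definition underlying this work (included here for reference; the notation is ours, not quoted from the paper): the discounted sum of an infinite weight sequence w_0 w_1 w_2 ... under a discount factor d > 1 is

    \[ \mathit{DS}(w, d) \;=\; \sum_{i \geq 0} \frac{w_i}{d^{i}} \]

and DS-inclusion asks whether, on every word, the value assigned by one weighted automaton is bounded by the value assigned by the other.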

09:15
Model checking indistinguishability of randomized security protocols
SPEAKER: Prasad Sistla

ABSTRACT. The design of security protocols is extremely subtle and vulnerable to potentially devastating flaws. As a result, many tools and techniques for the automated verification of protocol designs have been developed. Unfortunately, these tools cannot model and reason about protocols with randomization, which are becoming increasingly prevalent in systems providing privacy and anonymity guarantees. The security metrics of these systems are often formulated through a notion of indistinguishability. In this paper, we give the first algorithms for model checking indistinguishability properties of randomized security protocols against the powerful threat model of a Dolev-Yao attacker. Our techniques are implemented in the Stochastic Protocol ANalyzer (SPAN) and evaluated on several examples. As part of our evaluation, we conduct the first automated analysis of an electronic voting protocol based on the 3-ballot design.

09:30
Lazy Self-Composition for Security Verification
SPEAKER: Aarti Gupta

ABSTRACT. The secure information flow problem, which checks whether low-security outputs of a program are influenced by high-security inputs, has many applications in verifying security properties of programs. In this paper we present lazy self-composition, an approach for verifying secure information flow. It is based on self-composition, where two copies of a program are created on which a safety property is checked. However, rather than an eager duplication of the given program, it uses duplication lazily to reduce the cost of verification. This lazy self-composition is guided by an interplay between symbolic taint analysis on an abstract (single-copy) model and safety verification on a refined (two-copy) model. We propose two verification methods based on lazy self-composition. The first is a CEGAR-style procedure, where the abstract model associated with taint analysis is refined, on demand, by using a model generated by lazy self-composition. The second is a method based on bounded model checking, where taint queries are generated dynamically during program unrolling to guide lazy self-composition and to conclude an adequate bound for correctness. We have implemented these methods on top of the SEAHORN verification platform and our evaluations show the effectiveness of lazy self-composition.
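
To make the classical construction concrete, here is a minimal sketch of eager self-composition, the baseline that lazy self-composition refines. The code is illustrative only (a hypothetical toy program; none of these names are from the paper): two copies of the program share the low-security input, and the safety property is that their outputs agree.

    # Eager self-composition for secure information flow (illustrative sketch).
    # The toy program and all names are hypothetical, not from the paper.
    def program(high: int, low: int) -> int:
        # the output must not depend on the high-security input `high`
        return 2 * low

    def self_composed_safety(high1: int, high2: int, low: int) -> bool:
        # two copies run on inputs agreeing on `low`; equal outputs = secure
        return program(high1, low) == program(high2, low)

    # a verifier checks this for all inputs; here, a single concrete test
    assert self_composed_safety(high1=0, high2=42, low=7)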

09:45
SCInfer: Refinement-based Verification of Software Countermeasures against Side-Channel Attacks
SPEAKER: Jun Zhang

ABSTRACT. Power side-channel attacks, capable of deducing secret data using statistical analysis techniques, have become a serious threat to devices in cyber-physical systems and the Internet of Things. Random masking is a widely used countermeasure for removing the statistical dependence between secret data and side-channel leaks. Although there are some techniques for verifying whether software code has been perfectly masked, they are limited in accuracy and scalability.

To bridge this gap, we propose a refinement-based method for formally verifying masking countermeasures. Our method is more accurate than prior syntactic type inference rule based approaches and more scalable than model-counting based approaches using SAT/SMT solvers.

Indeed, our method can be viewed as a gradual refinement of a set of carefully designed semantic type inference rules for reasoning about distribution types. These rules are kept abstract initially to allow fast deduction, and then made more concrete if necessary, i.e., when the abstract version is not able to completely resolve the verification problem.

We have implemented our method in a tool named SCInfer and evaluated it on cryptographic benchmarks including AES and MAC-Keccak. Our results show that SCInfer significantly outperforms state-of-the-art techniques in terms of both accuracy and scalability.

10:00
Symbolic Algorithms for Graphs and Markov Decision Processes with Fairness Objectives
SPEAKER: Simin Oraee

ABSTRACT. Given a model and a specification, the fundamental model-checking problem asks for algorithmic verification of whether the model satisfies the specification. We consider graphs and Markov decision processes (MDPs), which are fundamental models for reactive systems. One of the very basic specifications that arise in verification of reactive systems is the strong fairness (aka Streett) objective. Given different types of requests and corresponding grants, the objective requires that for each type, if the request event happens infinitely often, then the corresponding grant event must also happen infinitely often. All ω-regular objectives can be expressed as Streett objectives and hence they are canonical in verification. To handle the state-space explosion, symbolic algorithms are required that operate on a succinct implicit representation of the system rather than explicitly accessing the system. While explicit algorithms for graphs and MDPs with Streett objectives have been widely studied, there has been no improvement of the basic symbolic algorithms. The worst-case numbers of symbolic steps required for the basic symbolic algorithms are as follows: quadratic for graphs and cubic for MDPs. In this work we present the first sub-quadratic symbolic algorithm for graphs with Streett objectives, and our algorithm is sub-quadratic even for MDPs. Based on our algorithmic insights we present an implementation of the new symbolic approach and show that it improves the existing approach on several academic benchmark examples.
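
For reference, the Streett objective described above is the standard LTL condition over k request/grant pairs (R_i, G_i) (textbook material, not quoted from the paper):

    \[ \bigwedge_{i=1}^{k} \big( \Box\Diamond R_i \;\rightarrow\; \Box\Diamond G_i \big) \]

that is, for each pair, if requests of type i occur infinitely often, then the corresponding grants must also occur infinitely often.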

10:15
Attracting Tangles to Solve Parity Games

ABSTRACT. Parity games have important practical applications in formal verification and synthesis, especially to solve the model-checking problem of the modal mu-calculus. They are also interesting from the theory perspective, because they are widely believed to admit a polynomial solution, but so far no such algorithm is known.

We propose a new algorithm to solve parity games based on learning tangles, which are strongly connected subgraphs for which one player has a strategy to win all cycles in the subgraph. We argue that tangles play a fundamental role in the prominent parity game solving algorithms. We show that tangle learning is competitive in practice and the fastest solver for large random games.
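
For reference, the winning condition of a parity game (standard definition in the common max-priority convention; not quoted from the paper): every vertex v carries a priority p(v), and an infinite play v_0 v_1 v_2 ... is won by player Even iff

    \[ \max \{\, p(v) \mid v \text{ occurs infinitely often in } v_0 v_1 v_2 \ldots \,\} \ \text{is even.} \]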

09:00-10:30 Session 117B (FM)
Location: Blavatnik LT1
09:00
Falsification of Cyber-Physical Systems Using Deep Reinforcement Learning

ABSTRACT. With the rapid development of software and distributed computing, Cyber-Physical Systems (CPS) are widely adopted in many application areas, e.g., smart grids and autonomous automobiles. It is difficult to detect defects in CPS models due to the complexities involved in the software and physical systems. To find defects in CPS models efficiently, robustness-guided falsification of CPS was introduced. Existing methods use several optimization techniques to generate counterexamples that falsify the given properties of a CPS. However, those methods may require a large number of simulation runs to find a counterexample and are far from practical. In this work, we explore state-of-the-art Deep Reinforcement Learning (DRL) techniques to reduce the number of simulation runs required to find such counterexamples. We report our method and preliminary evaluation results.

09:30
Dynamic Symbolic Verification of MPI Programs
SPEAKER: Dhriti Khanna

ABSTRACT. The success of dynamic verification techniques for C MPI (Message Passing Interface) programs rests on their ability to address communication non-determinism. As the number of processes in the program grows, dynamic verification techniques suffer from an exponential growth of the reachable state space. In this work, we provide a hybrid verification technique that combines explicit-state dynamic verification with symbolic analysis of message passing programs. The dynamic verification component deterministically replays the execution runs of the program, while the symbolic component encodes a set of interleavings of the observed run of the program in a quantifier-free first-order logic formula and checks the formula for property violations such as assertion failures and communication deadlocks. In the absence of property violations, it performs analysis to generate a different run of the program that does not fall in the set of already verified runs. We demonstrate the effectiveness of our methodology with our prototype tool Hermes. Our evaluation indicates that for non-single-path MPI programs, verification time is reduced by at least 3 times when Hermes is compared against state-of-the-art verification tools.

10:00
To Compose, Or Not to Compose, That is the Question: An Analysis of Compositional State Space Generation

ABSTRACT. To combat state space explosion, several compositional verification approaches have been proposed. One such approach is compositional aggregation, where a given system consisting of a number of parallel components is iteratively composed and minimised. Compositional aggregation has been shown to perform better (in the size of the largest state space in memory at one time) than classical monolithic composition in a number of cases. However, there are also cases in which compositional aggregation performs much worse. It is unclear when one should apply compositional aggregation in favour of other techniques, and how it is affected by action hiding and the scale of the model. This paper presents a descriptive analysis following the quantitative experimental approach. The experiments were conducted in a controlled test bed setup in a computer laboratory environment. A total of eight scalable models with different network topologies, considering a number of varying properties, were investigated, comprising 119 subjects. This makes it the most comprehensive study done so far on the topic. We investigate whether there is any systematic difference in the success of compositional aggregation based on the model, scaling, and action hiding. Our results indicate that both scaling up the model and hiding more behaviour have a positive influence on compositional aggregation.

09:00-10:30 Session 117C: FM I-Day (FM)
Location: Blavatnik LT2
09:00
From Formal Requirements to Highly Assured Software for Unmanned Aircraft Systems
SPEAKER: Cesar Munoz

ABSTRACT. Operational requirements of safety-critical systems are often written in restricted specification logics. These restricted logics are amenable to automated analysis techniques such as model-checking, but are not rich enough to express complex requirements of unmanned systems. This short paper advocates for the use of expressive logics, such as higher-order logic, to specify the complex operational requirements and safety properties of unmanned systems. These rich logics are less amenable to automation and, hence, require the use of interactive theorem proving techniques. However, these logics support the formal verification of complex requirements such as those involving the physical environment. Moreover, these logics enable validation techniques that increase confidence in the correctness of numerically intensive software. These features result in highly-assured software that may be easier to certify. The feasibility of this approach is illustrated with examples drawn from NASA's unmanned aircraft systems.

09:30
Interlocking Design Automation using Prover Trident

ABSTRACT. In this article we present an industrial-strength approach based on formal methods to develop and check safety-critical interlocking software for railway signaling systems. The Prover Trident approach is developed by Prover Technology to meet industry needs for reduced cost and time-to-market, by capitalizing on the inherent repetitive nature of interlocking systems, in the sense that specific systems can be created and verified efficiently as specific instances of generic principles. This enables a high degree of automation, supported by an industrial-strength toolkit for creation of design and code, with seamless integration of push-button tools for simulation and formal verification. Using this approach, safety assessment relies on formal verification, performed on the design, the software code as well as the binary code. An independent toolset for formal verification that has been developed to meet the applicable certification requirements is used to verify the revenue service code. The basic ideas of this approach have been around for some time [1,2], while the methodology and tools have matured over many industrial application projects for revenue service systems. The presentation highlights the main ingredients in this successful application of formal methods, as well as challenges in establishing this approach for production use in a conservative industry domain.

10:00
Model-Based Testing for Avionics Systems
SPEAKER: Joerg Brauer

ABSTRACT. Model-based testing is considered state-of-the-art in the verification and validation of safety-critical systems. This paper discusses experiences of applying the model-based testing tool RTT-MBT to the evacuation function of an aircraft cabin controller. A major challenge of this project was the parametric design of the software, which allows the software to be tailored to a certain aircraft configuration via a large number of application parameters. Application parameters thus need to be integrated into the test models. Further challenges consisted of mapping multiple detailed signals of the system under test to a single abstract model variable, and vice versa, and handling incremental test model development during an ongoing test campaign. We discuss solutions that we developed to successfully conduct this test campaign.

09:00-10:30 Session 117D: Test of Time Awards and Best DC Paper (ICLP)
09:00
Test of Time 20

ABSTRACT. TBD

09:30
Test of Time 10

ABSTRACT. TBD

10:00
Best DC Paper

ABSTRACT. TBD

09:00-10:00 Session 117E: IJCAR Invited Talk: Erika Abraham (IJCAR)
Location: Maths LT2
09:00
Symbolic Computation Techniques in SMT Solving: Mathematical Beauty meets Efficient Heuristics

ABSTRACT. Checking the satisfiability of quantifier-free real-arithmetic formulas is a practically highly relevant but computationally hard problem. Some beautiful mathematical decision procedures implemented in computer algebra systems are capable of solving such problems; however, they were developed for more general tasks like quantifier elimination, and therefore their applicability to satisfiability checking is often restricted. In computer science, recent advances in satisfiability-modulo-theories (SMT) solving have led to elegant embeddings of such decision procedures in SMT solvers in a way that combines the strengths of symbolic computation methods and heuristic-driven search techniques. In this talk we discuss such embeddings and show that they might be quite challenging but can lead to powerful synergies and open new lines of research.

 

10:00-10:30 Session 118: SMT 3 (IJCAR)
Location: Maths LT2
10:00
A Separation Logic with Data: Small Models and Automation
SPEAKER: Jens Katelaan

ABSTRACT. Separation logic has become a stock formalism for reasoning about programs with dynamic memory allocation. We introduce a variant of separation logic that supports lists and trees as well as inductive constraints on the data stored in these structures. We prove that this logic has the small model property, meaning that for each satisfiable formula there is a small domain in which the formula is satisfiable. As a consequence, the satisfiability and entailment problems for our fragment are in NP and coNP, respectively. Leveraging this result, we describe a polynomial SMT encoding that allows us to decide satisfiability and entailment for our separation logic.

10:30-11:00 Coffee Break
11:00-12:15 Session 119A: SAT, SMT and Decision Procedures (CAV)
Location: Maths LT1
11:00
Delta-Decision Procedures for Exists-Forall Problems over the Reals
SPEAKER: Soonho Kong

ABSTRACT. Solving nonlinear SMT problems over real numbers has wide applications in robotics and AI. While significant progress has been made in solving quantifier-free SMT formulas in this domain, quantified formulas have been much less investigated. We propose the first delta-complete algorithm for deciding the satisfiability of nonlinear SMT over real numbers with universal quantification and a wide range of nonlinear functions. Our methods combine ideas from counterexample-guided synthesis, interval constraint propagation, and local optimization. In particular, we show how special care is required in handling the interleaving of numerical and symbolic reasoning to ensure delta-completeness. In experiments, we show that the proposed algorithms can handle many new problems beyond the reach of existing SMT solvers.
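
A representative problem shape (illustrative, following the general delta-decision framework rather than the paper's specific benchmarks) is an exists-forall constraint such as

    \[ \exists x \in [a, b]\ \forall y \in [c, d] :\ f(x, y) \leq 0 , \]

where a delta-complete procedure must either report that the formula is unsatisfiable or report that its delta-weakening, with the relaxed constraint f(x, y) ≤ δ, is satisfiable for a user-chosen δ > 0.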

11:15
Solving Quantified Bit-Vectors using Invertibility Conditions

ABSTRACT. We present a novel approach for solving quantified bit-vector formulas in Satisfiability Modulo Theories (SMT) based on computing symbolic inverses of bit-vector operators. We derive conditions that precisely characterize when bit-vector constraints are invertible for a representative set of bit-vector operators commonly supported by SMT solvers. We utilize syntax-guided synthesis techniques to aid in establishing these conditions and verify them independently by using several SMT solvers. We show that invertibility conditions can be embedded into quantifier instantiations using Hilbert choice expressions and give experimental evidence that a counterexample-guided approach for quantifier instantiation utilizing these techniques leads to performance improvements with respect to state-of-the-art solvers for quantified bit-vector constraints.
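
As one concrete example of such a condition (reported in the literature on this technique; stated here for illustration): the bit-vector constraint x & s = t is solvable for x exactly when

    \[ t \mathbin{\&} s = t , \]

so an occurrence of ∃x. (x & s = t) can be replaced by this quantifier-free condition.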

11:30
Understanding and Extending Incremental Determinization for 2QBF

ABSTRACT. Incremental determinization is a recently proposed algorithm for solving quantified Boolean formulas with one quantifier alternation. In this paper, we formalize incremental determinization as a set of inference rules to help understand the design space of similar algorithms. We then present additional inference rules that extend incremental determinization in two ways. The first extension integrates the popular CEGAR principle and the second extension allows us to expand cases and analyze them in isolation. The experimental evaluation demonstrates that the extensions significantly improve performance.

11:45
The Proof Complexity of SMT Solvers
SPEAKER: Robert Robere

ABSTRACT. The resolution proof system has been enormously helpful in deepening our understanding of conflict-driven clause-learning (CDCL) SAT solvers. In the interest of providing a similar proof complexity-theoretic analysis of satisfiability modulo theories (SMT) solvers, we introduce a generalization of resolution called Res(T). We show that many of the known results comparing resolution and CDCL solvers lift to the SMT setting, such as the result of Pipatsrisawat and Darwiche showing that CDCL solvers with "perfect" non-deterministic branching and an asserting clause-learning scheme can polynomially simulate general resolution. We also describe a stronger version of Res(T), Res*(T), capturing SMT solvers allowing introduction of new literals.

We analyze the theory EUF of equality with uninterpreted functions, and show that the Res*(EUF) system is able to simulate an earlier calculus introduced by Bjorner and De Moura for the purpose of analyzing DPLL(EUF). Further, we show that Res*(EUF) (and thus SMT algorithms with clause learning over EUF, new literal introduction rules and perfect branching) can simulate the Frege proof system, which is well-known to be far more powerful than resolution. Finally, we prove under the Exponential Time Hypothesis (ETH) that any reduction from EUF to SAT (such as the Ackermann reduction) must, in the worst case, produce an instance of size at least n log n from an instance of size n.
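
For readers unfamiliar with the base system, the propositional resolution rule that Res(T) generalizes is (standard material, not from the paper):

    \[ \frac{C \vee x \qquad D \vee \neg x}{C \vee D} \]

Roughly, Res(T) additionally permits introducing clauses that are valid in the theory T (theory lemmas), mirroring what a DPLL(T)-style SMT solver may learn during search.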

12:00
Model Generation for Quantified Formulas: A Taint-Based Approach

ABSTRACT. We focus in this paper on generating models of quantified first-order formulas over built-in theories, which is paramount in software verification and bug finding. While standard methods are either geared toward proving the absence of solutions or targeted to specific theories, we propose a generic approach based on a reduction to the quantifier-free case. Our technique thus allows us to reuse all the efficient machinery developed for that context. Experiments show a substantial improvement over state-of-the-art methods.

11:00-12:00 Session 119B: FM Invited Talk: Kim G. Larsen (FM)
Location: Blavatnik LT1
11:00
20 Years of "Real" Real-Time Model Checking

ABSTRACT. In this keynote we review 20 years of industrial application of the UPPAAL Tool Suite for model-based verification, testing, performance evaluation and synthesis. The talk will highlight a number of selected cases, discuss successes, failures and pitfalls in achieving industrial impact as well as tool sustainability in an academic setting.

11:00-12:30 Session 119C: Applications (ICLP)
11:00
Phylotastic: An Experiment in Creating, Manipulating, and Evolving Phylogenetic Biology Workflows Using Logic Programming
SPEAKER: Thanh Nguyen

ABSTRACT. Evolutionary Biologists have long struggled with the challenge of developing analysis workflows in a flexible manner, thus facilitating the reuse of phylogenetic knowledge. An evolutionary biology workflow can be viewed as a plan which composes web services that can retrieve, manipulate, and produce phylogenetic trees.

The Phylotastic project was launched two years ago as a collaboration between evolutionary biologists and computer scientists, with the goal of developing an open architecture to facilitate the creation of such analysis workflows. While composition of web services is a problem that has been extensively explored in the literature, including within the logic programming domain, the incarnation of the problem in Phylotastic provides a number of additional challenges. Along with the need to integrate preferences and formal ontologies in the description of the desired workflow, evolutionary biologists tend to construct workflows in an incremental manner, successively refining the workflow by indicating desired changes (e.g., exclusion of certain services, modifications of the desired output). This leads to the need for successive iterations of incremental replanning, to develop a new workflow that integrates the requested changes while minimizing the changes to the original workflow. This paper illustrates how Phylotastic has addressed the challenges of creating and refining phylogenetic analysis workflows using logic programming technology and how such solutions have been used within the general framework of the Phylotastic project.

11:30
Experimenting with robotic intra-logistics domains

ABSTRACT. We introduce the asprilo [1] framework to facilitate experimental studies of approaches addressing complex dynamic applications. For this purpose, we have chosen the domain of robotic intra-logistics. This domain is not only highly relevant in the context of today's fourth industrial revolution but it moreover combines a multitude of challenging issues within a single uniform framework. This includes multi-agent planning, reasoning about action, change, resources, strategies, etc. In return, asprilo allows users to study alternative solutions as regards effectiveness and scalability. Although asprilo relies on Answer Set Programming and Python, it is readily usable by any system complying with its fact-oriented interface format. This makes it attractive for benchmarking and teaching well beyond logic programming. More precisely, asprilo consists of a versatile benchmark generator, solution checker and visualizer as well as a bunch of reference encodings featuring various ASP techniques. Importantly, the visualizer's animation capabilities are indispensable for complex scenarios like intra-logistics in order to inspect valid as well as invalid solution candidates. Also, it allows for graphically editing benchmark layouts that can be used as a basis for generating benchmark suites.

[1] asprilo stands for Answer Set Programming for robotic intra-logistics

12:00
Optimal Scheduling for Exposed Datapath Architectures with Buffered Processing Units by ASP
SPEAKER: Marc Dahlem

ABSTRACT. Conventional processor architectures are restricted in exploiting instruction level parallelism (ILP) due to the relatively low number of programmer-visible registers. Therefore, more recent processor architectures expose their datapaths so that the compiler (1) can schedule parallel instructions to different processing units and (2) can make effective use of local storage of the processing units. Among these architectures, the Synchronous Control Asynchronous Dataflow (SCAD) architecture is a new exposed datapath architecture whose processing units are equipped with first-in first-out (FIFO) buffers at their input and output ports.

In contrast to register-based machines, the optimal code generation for SCAD is still a matter of research. In particular, SAT and SMT solvers were used to generate optimal resource constrained and optimal time constrained schedules for SCAD, respectively. As Answer Set Programming (ASP) offers better flexibility in handling such scheduling problems, we focus in this paper on using an answer set solver for both resource and time constrained optimal SCAD code generation. As a major benefit of using ASP, we are able to generate all optimal schedules for a given program which allows one to study their properties. Furthermore, the experimental results of this paper demonstrate that the answer set solver can compete with SAT solvers and outperforms SMT solvers.

11:00-12:30 Session 119D: AR Miscellanea (IJCAR)
Location: Maths LT2
11:00
Probably Half True: Probabilistic Satisfiability over Lukasiewicz Infinitely-valued Logic

ABSTRACT. We study probabilistic reasoning in a context that allows for "partial truths", investigating computational and algorithmic properties of non-classical Lukasiewicz Infinitely-valued Probabilistic Logic. In particular, we study the decision problem over Lukasiewicz Infinitely-valued Probabilistic assignments which we call LIPSAT. Although the search space is initially infinite, we provide linear algebraic methods that guarantee polynomial size witnesses, so that the problem is shown to be NP-complete. An exact algorithm is presented which employs, as a subroutine, the decision problem for Lukasiewicz Infinitely-valued (Non-Probabilistic) Logic, which is also an NP-complete problem. We develop implementations of the algorithms described and discuss the empirical presence of a phase transition behavior for those problems.
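
For context, Lukasiewicz infinitely-valued logic interprets formulas over truth values in the real interval [0, 1]; its standard connectives (included for reference, not quoted from the paper) are

    \[ v(\neg\varphi) = 1 - v(\varphi), \qquad v(\varphi \oplus \psi) = \min(1,\ v(\varphi) + v(\psi)), \qquad v(\varphi \odot \psi) = \max(0,\ v(\varphi) + v(\psi) - 1). \]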

11:30
Automated Reasoning about Key Sets
SPEAKER: Miika Hannula

ABSTRACT. Codd's rule of entity integrity stipulates that every table in a database must have a primary key. This means that the attributes that form the primary key must carry no missing information and have unique value combinations. In practice, and in particular in modern applications, data records cannot always meet such requirements. Previous work has proposed the notion of a key set, which can identify more data records uniquely when information is missing. Apart from the proposal, key sets have not been investigated much further in the literature or in real systems. We outline important database applications, and investigate computational limits and techniques to reason automatically about key sets. We establish a binary axiomatization for the implication problem of key sets, and prove its coNP-completeness. In addition, we show that perfect models do not always exist for key sets. Finally, we show that the implication problem for unary key sets by arbitrary key sets has better computational properties. The fragment enjoys a unary axiomatization, is decidable in time quadratic in the input, and perfect models can always be generated.

12:00
An abstraction-refinement framework for reasoning with large theories

ABSTRACT. In this paper we present an abstraction-refinement framework for reasoning with large theories. We consider several types of abstractions based on over- and under-approximations of first-order theories. We implemented the proposed approach in the theorem prover iProver and evaluated it over the TPTP library.

12:00-12:30 Session 120 (FM)
Location: Blavatnik LT1
12:00
View abstraction for systems with component identities

ABSTRACT. The parameterised verification problem seeks to verify all members of some family of systems. We consider the following instance: each system is composed of an arbitrary number of similar component processes, together with a fixed number of server processes; processes communicate via message passing; in particular, each component process has an identity, which may be included in messages, and passed to third parties. We extend the technique of view abstraction of Abdulla et al. to this setting. We give an algorithm and implementation that allows such systems to be verified for an arbitrary number of components. We show how this technique can be applied to a concurrent datatype built from reference-linked nodes, such as a linked list.

12:30-14:00 Lunch Break
14:00-15:15 Session 121A: Concurrency (CAV)
Location: Maths LT1
14:00
Partial Order Aware Concurrency Sampling
SPEAKER: Xinhao Yuan

ABSTRACT. We present POS, a concurrency testing approach that samples the partial order of concurrent programs. POS uses a novel priority-based scheduling algorithm that dynamically reassigns priorities according to the partial order information and formally ensures that each partial order will be explored with significant probability. POS is simple to implement and provides a probabilistic guarantee of error detection better than state-of-the-art sampling approaches. Evaluations show that POS is effective in covering the partial-order space of micro-benchmarks and finding concurrency bugs in real-world programs, such as Firefox's JavaScript engine SpiderMonkey.

14:15
Reasoning About TSO Programs Using Reduction and Abstraction

ABSTRACT. We present a method for proving that a program running under the Total Store Ordering (TSO) memory model is robust, i.e., all its TSO computations are equivalent to computations under the Sequential Consistency (SC) semantics. This method is inspired by Lipton's reduction theory for proving atomicity of concurrent programs. For programs which are not robust, we introduce an abstraction mechanism that allows the construction of robust programs over-approximating their TSO semantics. This enables the use of proof methods designed for the SC semantics in proving invariants that hold on the TSO semantics of a non-robust program. These techniques have been evaluated on a large set of benchmarks using the infrastructure provided by CIVL, a generic tool for reasoning about concurrent programs under the SC semantics.

14:30
Quasi-Optimal Partial Order Reduction

ABSTRACT. A dynamic partial order reduction (DPOR) algorithm is optimal when it always explores at most one representative per Mazurkiewicz trace. Existing literature suggests the reduction obtained by the non-optimal state-of-the-art Source-DPOR (SDPOR) algorithm is comparable to optimal DPOR. We show the first program with O(n) Mazurkiewicz traces where SDPOR explores O(2^n) redundant schedules and identify the cause of the blow-up as an NP-hard problem. Our main contribution is a new approach, called Quasi-Optimal POR, that can arbitrarily approximate an optimal exploration using a provided constant k. We present an implementation of our method in a new tool called DPU using specialized data structures. Experiments with DPU, including Debian packages, show that optimality is achieved with low values of k, outperforming state-of-the-art tools.

14:45
On the Completeness of Verifying Message Passing Programs under Bounded Asynchrony

ABSTRACT. We address the problem of verifying message passing programs, defined as a set of parallel processes communicating through unbounded FIFO buffers. We introduce a bounded analysis that explores a special type of computations, called k-synchronous. These computations can be viewed as (unbounded) sequences of interaction phases, each phase allowing at most k send actions (by different processes), followed by a sequence of receives corresponding to sends in the same phase. We give a procedure for deciding k-synchronizability of a program, i.e., whether every computation is equivalent (has the same happens-before relation) to one of its k-synchronous computations. We also show that reachability over k-synchronous computations and checking k-synchronizability are both PSPACE-complete. Furthermore, we introduce a class of programs called flow-bounded, for which the problem of deciding whether there exists a k>0 such that the program is k-synchronizable is decidable.

15:00
Constrained Dynamic Partial Order Reduction
SPEAKER: Miguel Isabel

ABSTRACT. The cornerstone of dynamic partial order reduction (DPOR) is the notion of independence that is used to decide whether each pair of concurrent events p and t are in a race and thus both p.t and t.p must be explored. We present constrained dynamic partial order reduction (CDPOR), an extension of the DPOR framework which is able to avoid redundant explorations based on the notion of conditional independence: the execution of p and t commutes only when certain independence constraints (ICs) are satisfied. ICs can be declared by the programmer, but importantly, we present a novel SMT-based approach to automatically synthesize ICs in a static pre-analysis. A unique feature of our approach is that we have succeeded in exploiting ICs within the state-of-the-art DPOR algorithm, achieving exponential reductions over existing implementations.

14:00-15:30 Session 121B (FM)
Location: Blavatnik LT1
14:00
Compositional Reasoning for Shared-variable Concurrent Programs
SPEAKER: Fuyuan Zhang

ABSTRACT. Scalable and automatic formal verification of concurrent systems is in constant demand. In this paper, we propose a verification framework to support automated compositional reasoning for concurrent programs with shared variables. Our framework models concurrent programs as succinct automata and supports the verification of multiple important properties. Safety verification and simulations of succinct automata are parallel compositional, and safety properties of succinct automata are preserved under refinements. We generate succinct automata from infinite-state concurrent programs in an automated manner. Furthermore, we propose the first automated approach to checking rely-guarantee based simulations between infinite-state concurrent programs. We have prototyped our algorithms and applied our tool to the verification of multiple refinements.

14:30
Statistical Model Checking of LLVM Code

ABSTRACT. We present the new tool Lodin for statistical model checking of LLVM bitcode. Lodin implements a simulation engine for LLVM bitcode and implements classic statistical model checking algorithms on top of it. The simulation engine implements only the core of LLVM but supports extending this core through a plugin architecture. Besides the statistical model checking algorithms, Lodin also provides an interactive simulation front-end. The simulation front-end was integral to our second contribution, an integration of Lodin into Plasma-Lab. The integration with Plasma-Lab is essential for reasoning about rare properties of programs.

15:00
A lightweight deadlock analysis technique of object-oriented programs

ABSTRACT. Deadlock analysis of object-oriented programs that dynamically create threads and objects is complex because these programs may have infinitely many states.

We define a simple calculus featuring recursion, threads and synchronizations that guarantee exclusive access to objects. We detect deadlocks by associating an abstract model with programs -- the lam model -- where it is decidable whether a problematic object dependency (e.g., a circularity) between threads will ever be manifested.

The analysis is lightweight because the deadlock detection problem is fully reduced to the corresponding one in lams (without using other models). The technique is intended to be an effective tool for the deadlock analysis of mainstream programming languages by defining ad-hoc extraction processes.

14:00-15:30 Session 121C: FM I-Day (FM)
Location: Blavatnik LT2
14:00
Software Safety and Security and AI

ABSTRACT. Static code analysis can be applied to show compliance with coding guidelines, and to demonstrate the absence of critical programming errors, including runtime errors and data races. In recent years, security concerns have become more and more relevant for safety-critical systems, not least due to the increasing importance of highly-automated driving and pervasive connectivity. While in the past sound static analyzers have primarily been applied to demonstrate classical safety properties, they are also well suited to address data safety and to discover security vulnerabilities. This article gives an overview and discusses practical experience.

14:30
Variant Analysis with QL

ABSTRACT. As new security problems and innovative attacks continue to be discovered, program analysis remains a burgeoning area of research. However, writing new analyses has remained a specialist task, and although declarative and logic programming has been observed to be an excellent fit for phrasing complex analyses, the goal of democratising the creation of new checks has remained elusive.

QL builds on previous work to use Datalog for this purpose, but solves some of the traditional challenges: its object-oriented nature enables the creation of extensive libraries, and the query optimiser minimises the performance cost of the abstraction layers introduced in this way. QL enables agile security analysis, allowing security response teams to find all variants of a newly discovered vulnerability. Their work can then be leveraged to provide automated ongoing checking, thus ensuring that the same mistake never makes it into the code base again.

We will introduce the principles behind QL, and show how we can perform declarative variant analysis on high-profile open-source code bases.

15:00
Object-Oriented Security Proofs

ABSTRACT. By reasoning about security as object equivalence, one can leverage familiar OO reasoning techniques to write formal, modular, human-readable, machine-checkable proofs.

14:00-15:30 Session 121D: Probabilistic and Constraint LP (ICLP)
14:00
Shape Neutral Analysis of Graph-based Data-structures
SPEAKER: Roland Yap

ABSTRACT. Malformed data-structures can lead to runtime errors such as arbitrary memory access or corruption. Despite this, reasoning over data-structure properties for low-level heap manipulating programs remains challenging. In this paper we present a constraint-based program analysis that checks data-structure integrity, w.r.t. given target data-structure properties, as the heap is manipulated by the program. Our approach is to automatically generate a solver for properties using the type definitions from the target program. The generated solver is implemented in Constraint Handling Rules (CHR), extending builtin heap, integer and equality solvers. A key property of our program analysis is that the target data-structure properties are shape neutral, i.e., the analysis does not check for properties relating to a given data-structure graph shape, such as doubly-linked lists versus trees. Nevertheless, the analysis can detect errors in a wide range of data-structure manipulating programs, including those that use lists, trees, DAGs, graphs, etc. We present an implementation based on a specialized shape-neutral constraint solver implemented in the Satisfiability Modulo Constraint Handling Rules (SMCHR) system. Experimental results show that our approach works well for real-world C programs.

14:30
A Probabilistic Extension of Action Language BC+
SPEAKER: Yi Wang

ABSTRACT. We present a probabilistic extension of action language BC+. Just like BC+ is defined as a high-level notation of answer set programs for describing transition systems, the proposed language, which we call pBC+, is defined as a high-level notation of LPMLN programs---a probabilistic extension of answer set programs. We show how probabilistic reasoning about transition systems, such as prediction, postdiction, and planning problems, as well as probabilistic diagnosis for dynamic domains, can be modeled in pBC+ and computed using an implementation of LPMLN.

15:00
Constraint-Based Inference in Probabilistic Logic Programs
SPEAKER: Arun Nampally

ABSTRACT. Probabilistic Logic Programs (PLPs) generalize traditional logic programs and allow the encoding of models combining logical structure and uncertainty. In PLP, inference is performed by summarizing the possible worlds which entail the query in a suitable data-structure, and using it to compute the answer probability. Systems such as ProbLog, PITA, etc., use propositional data-structures like explanation graphs, BDDs, SDDs, etc., to represent the possible worlds. While this approach saves inference time due to substructure sharing, there are a number of problems where a more compact data-structure is possible. We propose a data-structure called Ordered Symbolic Derivation Diagram (OSDD) which captures the possible worlds by means of constraint formulas. We describe a program transformation technique to construct OSDDs via query evaluation, and give procedures to perform exact and approximate inference over OSDDs. Our approach has two key properties. Firstly, the exact inference procedure is a generalization of traditional inference, and results in speedup over the latter in certain settings. Secondly, the approximate technique is a generalization of likelihood weighting in Bayesian Networks, and allows us to perform sampling-based inference with lower rejection rate and variance. We evaluate the effectiveness of the proposed techniques through experiments on several problems.

14:00-15:30 Session 121E: System Descriptions (IJCAR)
Location: Maths LT2
14:00
The Higher-Order Prover Leo-III

ABSTRACT. The automated theorem prover Leo-III for classical higher-order logic with Henkin semantics and choice is presented. Leo-III is based on extensional higher-order paramodulation and accepts every common TPTP dialect (FOF, TFF, THF), including their recent extensions to rank-1 polymorphism (TF1, TH1). In addition, the prover natively supports almost every normal higher-order modal logic. Leo-III cooperates with first-order reasoning tools using translations to (polymorphic) many-sorted first-order logic and produces verifiable proof certificates. The prover is evaluated on heterogeneous benchmark sets.

14:15
ATPboost: Learning Premise Selection in Binary Setting with ATP Feedback

ABSTRACT. ATPboost is a system for solving sets of large-theory problems by interleaving ATP runs with state-of-the-art machine learning of premise selection from the proofs. Unlike many previous approaches that use a multilabel setting, the learning is implemented as binary classification that estimates the pairwise relevance of (theorem, premise) pairs. ATPboost uses for this the XGBoost gradient boosting algorithm, which is fast and has state-of-the-art performance on many tasks. Learning in the binary setting, however, requires negative examples, which is nontrivial due to many alternative proofs. We discuss and implement several solutions in the context of the ATP/ML feedback loop, and show that ATPboost with such methods significantly outperforms the k-nearest neighbors multilabel classifier.

14:30
Cubicle-W: Parameterized Model Checking on Weak Memory

ABSTRACT. We present Cubicle-W, a new version of the Cubicle model checker to verify parameterized systems under weak memory models. Its main originality is to implement a backward reachability algorithm modulo weak memory reasoning using SMT. Our experiments show that Cubicle-W is expressive and efficient enough to automatically prove safety of concurrent algorithms, for an arbitrary number of processes, ranging from mutual exclusion to synchronization barriers.

14:45
MaedMax: A Maximal Ordered Completion Tool
SPEAKER: Sarah Winkler

ABSTRACT. The equational reasoning tool MaedMax implements maximal ordered completion. This new approach extends the maxSMT-based method for standard completion developed by Klein and Hirokawa (2011) to ordered completion and theorem proving. MaedMax incorporates powerful ground completeness checks and supports certification of proofs by an Isabelle-based certifier. It also provides an order generation mode which can be used to synthesize term orderings for other tools. Experiments show the potential of our approach.

15:00
Cops and CoCoWeb: Infrastructure for Confluence Tools
SPEAKER: Nao Hirokawa

ABSTRACT. In this paper we describe the infrastructure supporting confluence tools and competitions: Cops, the confluence problems database, and CoCoWeb, a convenient web interface for tools that participate in the annual confluence competition.

15:15
FORT 2.0

ABSTRACT. FORT is a tool that implements the first-order theory of rewriting for the decidable class of left-linear right-ground rewrite systems. It can be used to decide properties of a given rewrite system and to synthesize rewrite systems that satisfy arbitrary properties expressible in the first-order theory of rewriting. In this paper we report on the extensions that were incorporated in the latest release (2.0) of FORT. These include witness generation for existentially quantified variables in formulas, support for combinations of rewrite systems, as well as an extension to deal with non-ground terms for properties related to confluence.

15:30-16:00 Coffee Break
16:00-17:45 Session 122A: CPS, Hardware, Industrial Applications (CAV)
Location: Maths LT1
16:00
Formal Verification of a Vehicle-to-Vehicle (V2V) Messaging System
SPEAKER: Mark Tullsen

ABSTRACT. Vehicle-to-Vehicle (V2V) communications is a "connected vehicles" standard that will likely be mandated in the U.S. within the coming decade. V2V, in which automobiles broadcast to one another, promises improved safety by providing collision warnings, but it also poses a security risk. At the heart of V2V is the communication messaging system, specified in SAE J2735 using the Abstract Syntax Notation One (ASN.1) data description language. Motivated by numerous previous ASN.1 related vulnerabilities, we present the formal verification of an ASN.1 encoder/decoder pair. We describe generating an encoder/decoder pair, implemented in C, using our internally developed ASN.1 toolset. We define self-consistency for encoder/decoder pairs that approximates functional correctness without requiring a formal specification of ASN.1. We then verify self-consistency and memory-safety using symbolic simulation via the Software Analysis Workbench.
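
The self-consistency property can be pictured as a round-trip check. The following sketch is hypothetical (invented names and a testing-style formulation; the paper establishes the property by symbolic simulation over all inputs rather than by finite testing):

    # Round-trip property approximating encoder/decoder self-consistency:
    # decoding an encoded message must recover the original message.
    from typing import Callable, Iterable, TypeVar

    M = TypeVar("M")  # abstract message type

    def self_consistent(encode: Callable[[M], bytes],
                        decode: Callable[[bytes], M],
                        messages: Iterable[M]) -> bool:
        return all(decode(encode(m)) == m for m in messages)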

16:15
Continuous formal verification of Amazon s2n
SPEAKER: Josiah Dodds

ABSTRACT. We describe formal verification of the open source TLS implementation used in numerous Amazon services, ranging from the Amazon Simple Storage Service (S3) to the Amazon Web Services Software Development Kit (AWS SDK). A key aspect of this proof is continuous checking: to ensure that properties proved remain proved during the lifetime of the software, at each change to the code, proofs are automatically re-established with little to no interaction from the development community. Technical decisions made about the structure of the proof and tools employed were driven by this goal.

16:30
Symbolic Liveness Analysis of Real-World Software

ABSTRACT. Liveness violation bugs are notoriously hard to detect, especially due to the difficulty inherent in applying formal methods to real-world programs. We present a generic and practically useful liveness property which defines a program as being live as long as it will eventually either consume more input or terminate. We show that this property naturally maps to many different kinds of real-world programs. To demonstrate the usefulness of our liveness property, we also present an algorithm that can be efficiently implemented to dynamically find fixed points of the target program during Symbolic Execution. This extends Symbolic Execution, a well-known dynamic testing technique, to find a new class of program defects, namely liveness violations, while only incurring a small runtime and memory overhead, as evidenced by our evaluation. The implementation of our method found a total of five previously undiscovered software defects in BusyBox and the GNU Coreutils. All five defects have been confirmed and fixed by the respective maintainers after shipping for years, most of them well over a decade.

16:45
Model Checking Boot Code from AWS Data Centers

ABSTRACT. This paper describes our experience with symbolic model checking in an industrial setting. We have proved that the initial boot code running in data centers at Amazon Web Services is memory safe, an essential step in establishing the security of any data center. Standard static analysis tools cannot be easily used on boot code without modification due to issues not commonly found in higher-level code, including memory-mapped device interfaces, byte-level memory access, and linker scripts. This paper describes automated solutions to these issues and their implementation in the C Bounded Model Checker (CBMC). CBMC is now the first source-level static analysis tool to extract the memory layout described in a linker script for use in its analysis.

17:00
Android Stack Systems
SPEAKER: Zhilin Wu

ABSTRACT. In this paper, we propose Android Stack Systems (ASS), a formal model to capture key mechanisms of Android multi-tasking such as activities, the back stack, launch modes, as well as task affinities. The model is based on pushdown systems with multiple stacks, and focuses on the evolution of the back stack of the Android system when interacting with activities carrying specific launch modes and task affinities. For formal analysis, we study the reachability problem of ASS. While the general problem is shown to be undecidable, we identify expressive fragments for which various verification techniques for pushdown systems or their decidable extensions are harnessed to show that the reachability problem is decidable.

17:15
Formally Verified Montgomery Multiplication

ABSTRACT. We report on a machine-assisted verification of an efficient implementation of Montgomery Multiplication, a widely used method in cryptography for efficient computation of modular exponentiation. We briefly describe the method, survey the VeriFun system used for verification, verify a classical as well as a more recent algorithm for computing multiplicative inverses, illustrate proof-technical obstacles encountered during verification, present the formal proofs, and finally report on the effort of creating the proofs. Our work uncovered a serious fault in a state-of-the-art algorithm for computing multiplicative inverses based on Newton-Raphson iteration, thus providing further evidence for the benefit of computer-aided verification.

17:30
Inner and Outer Approximating Flowpipes for Delay Differential Equations
SPEAKER: Sylvie Putot

ABSTRACT. Delay differential equations are fundamental for modeling networked control systems where the underlying network induces delays for retrieving values from sensors or delivering orders to actuators. They are notoriously difficult to integrate, as they are actually functional equations, the initial state being a function. We propose a scheme to compute inner- and outer-approximated flowpipes for such equations with uncertain initial states and parameters. Inner-approximated flowpipes are guaranteed to contain only reachable states, while outer-approximated flowpipes enclose all reachable states. We also introduce a notion of robust inner-approximation, which we believe opens promising perspectives for verification, beyond property falsification. The efficiency of our approach relies on the combination of Taylor models in time with an abstraction or parameterization in space based on affine forms, or zonotopes. It also relies on an extension of the mean-value theorem, which allows us to deduce inner-approximated flowpipes from flowpipes outer-approximating the solution of the DDE and its Jacobian with respect to constant but uncertain parameters and initial conditions. We present some experimental results obtained with our C++ implementation.
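
The systems in question (standard form, included for orientation, not quoted from the paper) are delay differential equations such as

    \[ \dot{x}(t) = f\big(x(t),\ x(t - \tau)\big), \qquad x(t) = x_0(t) \ \ \text{for } t \in [-\tau, 0], \]

where the initial condition is a whole function x_0 on [-τ, 0]; this is what makes DDEs functional equations and hard to integrate with guarantees.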

16:00-18:00 Session 122B (FM)
Location: Blavatnik LT1
16:00
CompoSAT: Specification-Guided Coverage for Model Finding
SPEAKER: Tim Nelson

ABSTRACT. Model-finding tools like the Alloy Analyzer produce concrete examples of how a declarative specification can be satisfied. These formal tools are useful in a wide range of domains: software design, security, networking, and more. By producing concrete examples, they assist in exploring system behavior and can help find surprising faults.

Specifications usually have many potential candidate solutions, and yet model-finders tend to leave the choice of which examples to present entirely to the underlying solver. This paper closes that gap by exploring notions of coverage for the model-finding domain, yielding a novel, rigorous metric for output quality. These ideas are realized in the tool CompoSAT, which interposes itself between Alloy's constraint-solving and presentation stages to produce ensembles of examples that maximize coverage.

We show that high-coverage ensembles like those CompoSAT produces are useful for, among other things, detecting overconstraint---a particularly insidious form of specification error. We detail the underlying theory of CompoSAT, discuss its implementation, and evaluate it on numerous specifications.

16:30
Approximate Partial Order Reduction
SPEAKER: Sayan Mitra

ABSTRACT. We present a new partial order reduction method for reachability analysis of nondeterministic labeled transition systems over metric spaces. Nondeterminism arises from both the choice of the initial state and the choice of actions, and the number of executions to be explored grows exponentially with their length. We introduce a notion of ε-independence over actions that relates approximately commutative actions; ε-equivalent action sequences are obtained by swapping ε-independent consecutive action pairs. Our reachability algorithm generalizes individual executions to cover sets of executions that start from different, but δ-close, initial states, and follow different, but ε-independent, action sequences. The constructed over-approximations can be made arbitrarily precise by reducing the δ, ε parameters. Exploiting both the continuity of actions and their approximate independence, the algorithm can yield an exponential reduction in the number of executions explored. We illustrate this with experiments on consensus, platooning, and distributed control examples.
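
One plausible formalization of ε-independence, consistent with the abstract's "approximately commutative actions" (this reading is our paraphrase, not the paper's exact definition): actions a and b are ε-independent if, from every state x, applying them in either order yields states at most ε apart in the metric d:

    \[ d\big(a(b(x)),\ b(a(x))\big) \leq \varepsilon . \]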

17:00
SDN-Actors: Modeling and Verification of SDN Programs
SPEAKER: Albert Rubio

ABSTRACT. Software-Defined Networking (SDN) is a recent networking paradigm that has become increasingly popular in the last decade. It gives unprecedented control over the global behavior of the network and provides a new opportunity for formal methods. Much work has appeared in the last few years on providing bridges between SDN and verification. This paper advances this research line and provides a link between SDN and traditional work on formal methods for verification of distributed software---actor-based modelling. We show how SDN programs can be seamlessly modelled using actors, and thus existing advanced model checking techniques developed for actors can be directly applied to verify a range of properties of SDN networks, including consistency of flow tables, violation of safety policies, and forwarding loops.
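
A minimal sketch, with invented names, of the actor view of a switch (the paper's models are richer and are fed to an actor model checker; the controller's send method is assumed here):

# Minimal, invented sketch of the actor view of an SDN switch.
class SwitchActor:
    def __init__(self, controller):
        self.flow_table = {}          # match -> action to apply
        self.controller = controller  # the controller actor

    def receive(self, packet, match):
        action = self.flow_table.get(match)
        if action is None:
            # Table miss: notify the controller, which may install a rule.
            self.controller.send(('packet_in', self, packet, match))
        else:
            action(packet)

    def install_rule(self, match, action):
        # Message from the controller updating the flow table.
        self.flow_table[match] = action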

17:30
Formal Specification and Verification of Dynamic Parametrized Architectures
SPEAKER: Ivan Stojic

ABSTRACT. We propose a novel approach to the formal specification and verification of dynamic architectures that are at the core of adaptive systems such as critical-infrastructure protection. Key features include run-time reconfiguration based on adding and removing components and connections, resulting in systems with an unbounded number of components. We provide a logic-based specification of a Dynamic Parametrized Architecture (DPA), where parameters represent the infinite-state space of possible configurations, and first-order formulas represent the sets of initial configurations and reconfiguration transitions. We encode information flow properties as reachability problems of such DPAs, define a translation into an array-based transition system, and use an SMT-based model checker to tackle a number of case studies.

16:00-17:30 Session 122C: FM I-Day (FM)
Location: Blavatnik LT2
16:00
Z3 and SMT in Industrial R&D

ABSTRACT. Theorem proving has a proud history of elite academic pursuit and select industrial use. Impact, when predicated on acquiring the internals of a formalism or proof environment, is gated on skilled and idealistic adopters. In the case of the automatic theorem provers known as Satisfiability Modulo Theories (SMT) solvers, the barrier of entry is shifted to tool builders and their domains. SMT solvers typically provide convenient support for domains that are prolific in software engineering and have in the past decade found widespread use cases in both academia and industry. We describe some of the background behind the development of the Z3 solver, the factors that have played an important role in shaping its use, and an outlook on further development and adoption.
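
For context, a small example of the style of use Z3 enables through its Python API (illustrative only, not taken from the talk):

# Solve a tiny system of integer constraints with Z3's Python bindings.
from z3 import Ints, Solver, sat

x, y = Ints('x y')
s = Solver()
s.add(x > 0, y > 0, x + y == 10, 4 * x == y)
if s.check() == sat:
    print(s.model())  # a satisfying assignment: [x = 2, y = 8]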

16:30
Evidential and Continuous Integration of Software Verification Tools
SPEAKER: Harald Ruess

ABSTRACT. We propose an integrated verification framework for developing certifiable safety- and security-critical software in an agile way. First, the framework supports integrated verification: it applies a combination of complementary formal software analysis methods. Second, the framework is evidential: verification evidences, which form the basis for certification, are automatically generated from pre-defined verification workflow patterns by chaining results from the integrated software analysis tools. Third, the framework is continuous: it is aimed at executing verification and generating corresponding evidences during each iteration of an agile development process.

17:00
Disruptive Innovations for the Development and the Deployment of Fault-Free Software

ABSTRACT. Developing safety-critical systems is a very difficult task. Such systems require talented engineers, strong experience, and dedication when designing the safety principles of these systems. Indeed, it must be demonstrated that no failure or combination of failures may lead to a catastrophic situation where people could be injured or could die because of that system. This article presents disruptive technologies that reduce the effort required to develop such systems by providing integrated building blocks that are easier to use.

16:00-16:30 Session 122D: Doctoral Consortium teaser talks (ICLP)

The session hosts 3-minute summaries by each Doctoral Consortium Ph.D. student, who will give a longer presentation during the ICLP-DC on the 18th of July.

16:30-18:00 Session 123: Technical Communications II (ICLP)
16:30
Towards Static Performance Guarantees for Programs with Run-time Checks

ABSTRACT. Instrumenting programs to perform run-time checking of properties, such as regular shapes, is a common and useful technique that helps programmers detect incorrect program behaviors. This is especially true in dynamic languages such as Prolog. However, such run-time checks inevitably introduce run-time overhead (in execution time, memory, energy, etc.). Several approaches have been proposed for reducing this overhead, such as eliminating the checks that can statically be proved to always succeed, and/or optimizing the way in which the (remaining) checks are performed. However, there are cases in which it is not possible to remove all checks statically (e.g., open libraries which must check their interfaces, complex properties, unknown code, etc.) and in which, even after optimizations, the remaining checks may still introduce an unacceptable level of overhead. It is thus important for programmers to be able to determine the additional cost due to the run-time checks and compare it to some notion of admissible cost. The common practice for estimating run-time checking overhead is profiling, which is not exhaustive by nature. Instead, we propose a method that uses static analysis to estimate such overhead, with the advantage that the estimations are functions parameterized by input data sizes. Unlike profiling, this approach can provide guarantees for all possible execution traces, and allows assessing how the overhead grows as the size of the input grows. Our method also extends an existing assertion verification framework to express “admissible” overheads, and statically and automatically checks whether the instrumented program conforms to such specifications. Finally, we present an experimental evaluation of our approach that suggests that our method is feasible and promising.
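
To make the idea concrete, a toy version of the comparison against an admissible overhead (the cost bounds and the 4x admissibility threshold below are invented; the framework infers such functions by static analysis):

import sympy as sp

# Toy illustration: cost bounds as symbolic functions of input size n.
n = sp.symbols('n', positive=True)
cost_plain = 2 * n        # bound inferred for the original program
cost_checked = 7 * n      # bound with run-time checks instrumented
ratio = sp.simplify(cost_checked / cost_plain)  # 7/2, independent of n here
print(ratio <= 4)         # True: within an admissible 4x overhead spec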

16:45
Towards Incremental and Modular Context-sensitive Analysis

ABSTRACT. Context-sensitive global analysis of large code bases can be expensive, which can be especially problematic in interactive uses of analyzers. However, in practice each development iteration implies small modifications, which are often isolated within a few modules, and analysis cost can be reduced by reusing the results of previous analyses.

To date this has been achieved, on the one hand, through modular analysis, which can reduce memory consumption and often localizes the computation during reanalysis to the modules affected by changes. On the other hand, context-sensitive incremental fixpoint algorithms have been proposed that achieve cost reductions at finer levels of granularity, such as changes in program lines. However, these fine-grained techniques are not directly applicable to modular programs.

This paper describes, implements, and evaluates a context-sensitive fixpoint analysis algorithm for (Constraint) Logic Programs aimed at achieving both inter-modular (coarse-grain) and intra-modular (fine-grain) incrementality, solving the problems related to propagating fine-grain change information and its effects across module boundaries, for additions and deletions in multiple modules.

The implementation and evaluation of our algorithm show encouraging results: the expected advantages of fine-grain incremental analysis carry over to the modular analysis setting. Furthermore, the fine-grained propagation of analysis information in our algorithm improves performance with respect to traditional modular analysis even when analyzing from scratch.
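
A rough sketch (with invented interfaces) of the coarse-grain part of such incrementality: after an edit, only modules that transitively depend on the changed ones need reanalysis, and results for the rest are reused.

# Invented sketch: compute which modules an edit invalidates.
def modules_to_reanalyze(changed, depends_on):
    """changed: set of edited modules.
    depends_on: module -> set of modules it imports."""
    dependents = {}  # invert the graph: who depends on m?
    for m, deps in depends_on.items():
        for d in deps:
            dependents.setdefault(d, set()).add(m)
    work, stale = list(changed), set(changed)
    while work:
        m = work.pop()
        for d in dependents.get(m, ()):
            if d not in stale:
                stale.add(d)
                work.append(d)
    return stale

# Example: editing 'lists' forces reanalysis of 'lists' and 'main' only.
deps = {'main': {'lists', 'io'}, 'lists': set(), 'io': set()}
print(modules_to_reanalyze({'lists'}, deps))  # {'lists', 'main'}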

17:00
Learning Commonsense Knowledge through Interactive Dialogue
SPEAKER: Benjamin Wu

ABSTRACT. One of the most difficult problems in Artificial Intelligence is acquiring commonsense knowledge -- creating a collection of facts and information that an ordinary person should know. In this work, we present a system that, from limited background knowledge, is able to learn to form simple concepts through interactive dialogue with a user. We approach the problem using a syntactic parser, along with a mechanism to check for synonymy, to translate sentences into logical formulas represented in the Event Calculus using Answer Set Programming (ASP). Reasoning and learning tasks are then automatically generated for the translated text, with learning being initiated through question answering. The system is capable of learning with no contextual knowledge prior to the dialogue. The system has been evaluated on stories inspired by Facebook's bAbI question-answering tasks, and through appropriate question answering is able to respond accurately in these dialogues.

17:15
A New Proof-theoretical Linear Semantics for CHR

ABSTRACT. Constraint Handling Rules is a committed-choice language consisting of multi-headed guarded rules that rewrite constraints into simpler ones until they are solved. We propose a new proof-theoretical declarative linear semantics for Constraint Handling Rules. We demonstrate soundness and completeness of our semantics w.r.t. the operational $\omega_t$ semantics. We also propose a translation from this semantics to linear logic.

17:30
CHRvis: Syntax and Semantics
SPEAKER: Nada Sharaf

ABSTRACT. This paper presents CHRvis, an animation extension to Constraint Handling Rules (CHR). Visualizations have always helped programmers understand data and debug programs. A picture is worth a thousand words: it can help identify where a problem is, show how something works, and even illustrate a relation that was not clear otherwise.

17:45
Declarative Algorithms in Datalog with Aggregates: user-friendly formal semantics conducive to performance and scalability
SPEAKER: Carlo Zaniolo

ABSTRACT. Pre-mappable (PreM ) extrema constraints in recursive Datalog programs enable concise declarative formulations for classical algorithms (Zaniolo et al. 2017). The programs expressing these algorithms have formal non- monotonic semantics with efficient and scalable support on multiple platforms (Shkapsky et al. 2016) (Yang et al. 2017). However proving PreM for different programs can be challenging for programmers; thus, in this paper, we introduce simple templates that allow users to verify with ease that their programs have the PreM property along with the rigorous semantics and the efficient and scalable implementation associated with it. We thus obtain simple declarative formulation for classical algorithms in two equivalent versions: one with perfect model semantics and the other with stable model semantics.