09:00 | SPEAKER: Takumi Akazaki ABSTRACT. With the rapid development of software and distributed computing, Cyber-Physical Systems (CPS) are widely adopted in many application areas, e.g., smart grids and autonomous automobiles. It is difficult to detect defects in CPS models due to the complexities involved in the software and physical systems. To find defects in CPS models efficiently, robustness-guided falsification of CPS has been introduced. Existing methods use several optimization techniques to generate counterexamples that falsify the given properties of a CPS. However, those methods may require a large number of simulation runs to find a counterexample, which makes them far from practical. In this work, we explore state-of-the-art Deep Reinforcement Learning (DRL) techniques to reduce the number of simulation runs required to find such counterexamples. We report our method and preliminary evaluation results.
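The falsification loop itself is simple to sketch. Below is a minimal, purely illustrative version in Python: plain random search stands in for the DRL agent, and the toy dynamics, property bound, and all names are invented rather than taken from the paper.

```python
import random

def simulate(actions):
    """Toy CPS: a scalar state driven by the chosen input actions."""
    x, trace = 0.0, []
    for a in actions:
        x += 0.1 * a                       # simplistic dynamics, purely illustrative
        trace.append(x)
    return trace

def robustness(trace, bound=0.5):
    """Robustness of 'always x < bound': positive iff the property holds."""
    return min(bound - x for x in trace)

def falsify(horizon=20, episodes=500):
    """Search for an input sequence with negative robustness (a counterexample)."""
    best, best_rob = None, float("inf")
    for _ in range(episodes):
        actions = [random.uniform(-1.0, 1.0) for _ in range(horizon)]
        rob = robustness(simulate(actions))
        if rob < best_rob:
            best, best_rob = actions, rob
        if best_rob < 0:                   # property falsified
            return best
    return None

print("counterexample found" if falsify() else "none found")
```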
09:30 | SPEAKER: Dhriti Khanna ABSTRACT. The success of dynamic verification techniques for C MPI (Message Passing Interface) programs rests on their ability to address communication non-determinism. As the number of processes in the program grows, dynamic verification techniques suffer from an exponential growth in the size of the reachable state space. In this work, we provide a hybrid verification technique that combines explicit-state dynamic verification with symbolic analysis of message passing programs. The dynamic verification component deterministically replays the execution runs of the program, while the symbolic component encodes a set of interleavings of the observed run in a quantifier-free first-order logic formula and checks the formula for property violations such as assertion failures and communication deadlocks. In the absence of property violations, it performs analysis to generate a different run of the program that does not fall in the set of already verified runs. We demonstrate the effectiveness of our methodology with our prototype tool Hermes. Our evaluation indicates that for non-single-path MPI programs, verification time is reduced by at least a factor of 3 when Hermes is compared against state-of-the-art verification tools.
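The symbolic component can be pictured as an SMT query over the observed run. A hedged sketch, assuming the z3-solver Python package; the encoding below (two sends racing for one wildcard receive) is invented for illustration and is not the Hermes encoding:

```python
# pip install z3-solver
from z3 import Ints, Or, Solver, sat

s1, s2, r = Ints("s1 s2 r")    # symbolic timestamps of two sends and one receive
solver = Solver()
solver.add(s1 >= 0, s2 >= 0, r >= 0)
solver.add(Or(r == s1, r == s2))   # a wildcard receive matches either send
solver.add(r == s2)                # negation of 'receive never matches send 2'
if solver.check() == sat:
    print("interleaving violating the property:", solver.model())
```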
10:00 | To Compose, Or Not to Compose, That is the Question: An Analysis of Compositional State Space Generation SPEAKER: Sander de Putter ABSTRACT. To combat state space explosion, several compositional verification approaches have been proposed. One such approach is compositional aggregation, where a given system consisting of a number of parallel components is iteratively composed and minimised. Compositional aggregation has been shown to perform better (in the size of the largest state space in memory at one time) than classical monolithic composition in a number of cases. However, there are also cases in which compositional aggregation performs much worse. It is unclear when one should apply compositional aggregation in favor of other techniques and how it is affected by action hiding and the scale of the model. This paper presents a descriptive analysis following the quantitative experimental approach. The experiments were conducted in a controlled test bed setup in a computer laboratory environment. A total of eight scalable models with different network topologies, considering a number of varying properties, were investigated, comprising 119 subjects. This makes it the most comprehensive study done so far on the topic. We investigate whether there is any systematic difference in the success of compositional aggregation based on the model, scaling, and action hiding. Our results indicate that both scaling up the model and hiding more behaviour have a positive influence on compositional aggregation.
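The aggregation strategy under study can be summarised in a few lines. A sketch, with compose(), minimise(), and pick_pair() as stand-ins for parallel composition, bisimulation minimisation, and the heuristic that chooses which pair to merge:

```python
def compositional_aggregation(components, compose, minimise, pick_pair):
    """Iteratively compose a chosen pair of components and minimise the result."""
    work = [minimise(c) for c in components]
    while len(work) > 1:
        i, j = pick_pair(work)           # heuristic choice of which pair to merge
        merged = minimise(compose(work[i], work[j]))
        work = [c for k, c in enumerate(work) if k not in (i, j)] + [merged]
    return work[0]

# Demo with trivial stand-ins: components are just sets of state names.
compose_ = lambda a, b: {sa + "|" + sb for sa in a for sb in b}
identity = lambda a: a                   # no real minimisation here
first_two = lambda work: (0, 1)
print(compositional_aggregation([{"p"}, {"q"}, {"r"}], compose_, identity, first_two))
```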
09:00 | SPEAKER: Cesar Munoz ABSTRACT. Operational requirements of safety-critical systems are often written in restricted specification logics. These restricted logics are amenable to automated analysis techniques such as model-checking, but are not rich enough to express complex requirements of unmanned systems. This short paper advocates for the use of expressive logics, such as higher-order logic, to specify the complex operational requirements and safety properties of unmanned systems. These rich logics are less amenable to automation and, hence, require the use of interactive theorem proving techniques. However, these logics support the formal verification of complex requirements such as those involving the physical environment. Moreover, these logics enable validation techniques that increase confidence in the correctness of numerically intensive software. These features result in highly-assured software that may be easier to certify. The feasibility of this approach is illustrated with examples drawn from NASA's unmanned aircraft systems.
09:30 | ABSTRACT. In this article we present an industrial-strength approach based on formal methods to develop and check safety-critical interlocking software for railway signaling systems. The Prover Trident approach is developed by Prover Technology to meet industry needs for reduced cost and time-to-market, by capitalizing on the inherent repetitive nature of interlocking systems, in the sense that specific systems can be created and verified efficiently as instances of generic principles. This enables a high degree of automation, supported by an industrial-strength toolkit for creation of design and code, with seamless integration of push-button tools for simulation and formal verification. Using this approach, safety assessment relies on formal verification, performed on the design, the software code, as well as the binary code. An independent toolset for formal verification that has been developed to meet the applicable certification requirements is used to verify the revenue service code. The basic ideas of this approach have been around for some time [1,2], while the methodology and tools have matured over many industrial application projects for revenue service systems. The presentation highlights the main ingredients in this successful application of formal methods, as well as challenges in establishing this approach for production use in a conservative industry domain.
10:00 | SPEAKER: Joerg Brauer ABSTRACT. Model-based testing is considered state-of-the-art in verification and validation of safety critical systems. This paper discusses some experiences of applying the model-based testing tool RTT-MBT for the evacuation function of an aircraft cabin controller. A major challenge of this project was the parametric design of the software, which allows the software to be tailored to a certain aircraft configuration via a large number of application parameters. Application parameters thus need to be integrated into the test models. Further challenges consisted of mapping multiple detailed signals of the system under test to a single abstract model variable and vice versa, and handling incremental test model development during an ongoing test campaign. We discuss solutions that we developed to successfully conduct this test campaign.
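The signal-mapping challenge can be illustrated concretely. A hypothetical sketch (all signal names invented, not taken from RTT-MBT) of folding several concrete signals into one abstract model variable and back:

```python
def abstract(concrete):
    """All evacuation-lane lamps on -> abstract state 'EVAC_ON'."""
    return "EVAC_ON" if all(concrete[f"lamp_{i}"] for i in range(4)) else "EVAC_OFF"

def concretise(abstract_value):
    """Expand one abstract command into the individual lamp signals."""
    on = abstract_value == "EVAC_ON"
    return {f"lamp_{i}": on for i in range(4)}

print(abstract(concretise("EVAC_ON")))   # round-trip: prints EVAC_ON
```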
14:00 | SPEAKER: Fuyuan Zhang ABSTRACT. Scalable, automatic formal verification for concurrent systems is in constant demand. In this paper, we propose a verification framework to support automated compositional reasoning for concurrent programs with shared variables. Our framework models concurrent programs as succinct automata and supports the verification of multiple important properties. Safety verification and simulations of succinct automata are parallel compositional, and safety properties of succinct automata are preserved under refinements. We generate succinct automata from infinite state concurrent programs in an automated manner. Furthermore, we propose the first automated approach to checking rely-guarantee based simulations between infinite state concurrent programs. We have prototyped our algorithms and applied our tool to the verification of multiple refinements.
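The flavour of the compositional safety check can be conveyed with an explicit-state toy, keeping in mind that succinct automata are symbolic and infinite-state. A sketch over the product of two components:

```python
from collections import deque

def safe(init, step, is_bad):
    """Explore the product state space and test a safety predicate."""
    seen, queue = {init}, deque([init])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            return False
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Two counters sharing a bound: bad if their sum exceeds 3.
step = lambda s: [(min(s[0] + 1, 3), s[1]), (s[0], min(s[1] + 1, 3))]
print(safe((0, 0), step, lambda s: sum(s) > 3))   # False: e.g. (3, 1) is reachable
```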
14:30 | SPEAKER: Danny Bøgsted Poulsen ABSTRACT. We present the new tool Lodin for statistical model checking of LLVM-bitcode. Lodin implements a simulation engine for LLVM-bitcode and implements classic statistical model checking algorithms on top of it. The simulation engine implements only the core of LLVM but supports extending this core through a plugin architecture. Besides the statistical model checking algorithms, Lodin also provides an interactive simulation front-end. The simulation front-end was integral to our second contribution: an integration of Lodin into Plasma-Lab, which enables reasoning about rare properties of programs.
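The statistical core such tools build on is a standard Monte Carlo estimation loop. A sketch using the Okamoto/Chernoff sample-size bound; this is the textbook algorithm, not Lodin's actual code:

```python
import math, random

def smc_estimate(run_once, eps=0.05, delta=0.01):
    """Estimate P(property holds) to within eps with confidence 1 - delta."""
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))  # Okamoto bound
    hits = sum(run_once() for _ in range(n))
    return hits / n

# Toy 'program run': the property holds in ~30% of simulations.
print(smc_estimate(lambda: random.random() < 0.3))
```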
15:00 | ABSTRACT. Deadlock analysis of object-oriented programs that dynamically create threads and objects is complex because these programs may have infinitely many states. We define a simple calculus featuring recursion, threads and synchronizations that guarantee exclusive access to objects. We detect deadlocks by associating an abstract model, the lam model, with programs; in this model it is decidable whether a problematic object dependency (e.g., a circularity) between threads will ever be manifested. The analysis is lightweight because the deadlock detection problem is fully reduced to the corresponding one in lams (without using other models). The technique is intended to be an effective tool for the deadlock analysis of mainstream programming languages by defining ad-hoc extraction processes.
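The decidable core of the analysis can be pictured as a cycle check on a wait-for graph; the lam model itself is richer (it tracks unboundedly many object names). A finite-state sketch:

```python
def has_cycle(deps):
    """deps: dict mapping each object to the objects it waits for."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in deps}
    def visit(n):
        colour[n] = GREY
        for m in deps.get(n, ()):
            if colour.get(m) == GREY or (colour.get(m) == WHITE and visit(m)):
                return True              # back edge found: circular dependency
        colour[n] = BLACK
        return False
    return any(colour[n] == WHITE and visit(n) for n in deps)

print(has_cycle({"a": ["b"], "b": ["a"]}))   # True: a and b deadlock
```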
14:00 | SPEAKER: Christian Ferdinand ABSTRACT. Static code analysis can be applied to show compliance to coding guidelines, and to demonstrate the absence of critical programming errors, including runtime errors and data races. In recent years, security concerns have become more and more relevant for safety-critical systems, not least due to the increasing importance of highly-automated driving and pervasive connectivity. While in the past sound static analyzers have primarily been applied to demonstrate classical safety properties, they are also well suited to addressing data safety and discovering security vulnerabilities. This article gives an overview and discusses practical experience.
14:30 | SPEAKER: Pavel Avgustinov ABSTRACT. As new security problems and innovative attacks continue to be discovered, program analysis remains a burgeoning area of research. However, writing new analyses has remained a specialist task, and although declarative and logic programming has been observed to be an excellent fit for phrasing complex analyses, the goal of democratising the creation of new checks has remained elusive. QL builds on previous work to use Datalog for this purpose, but solves some of the traditional challenges: its object-oriented nature enables the creation of extensive libraries, and the query optimiser minimises the performance cost of the abstraction layers introduced in this way. QL enables agile security analysis, allowing security response teams to find all variants of a newly discovered vulnerability. Their work can then be leveraged to provide automated ongoing checking, thus ensuring that the same mistake never makes it into the code base again. We will introduce the principles behind QL, and show how we can perform declarative variant analysis on high-profile open-source code bases.
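The Datalog essence of variant analysis is a fixpoint over relations. A toy sketch (the call facts and taint rule are invented; real QL is object-oriented and far more expressive):

```python
calls = {("parse", "strcpy"), ("main", "parse"), ("main", "log")}
tainted = {"main"}                  # seed: functions handling user input

changed = True
while changed:                      # naive Datalog fixpoint
    changed = False
    for caller, callee in calls:
        if caller in tainted and callee not in tainted:
            tainted.add(callee)
            changed = True

print("strcpy reachable from user input:", "strcpy" in tainted)
```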
15:00 | ABSTRACT. By reasoning about security as object equivalence, one can leverage familiar OO reasoning techniques to write formal, modular, human-readable, machine-checkable proofs.
16:00 | SPEAKER: Tim Nelson ABSTRACT. Model-finding tools like the Alloy Analyzer produce concrete examples of how a declarative specification can be satisfied. These formal tools are useful in a wide range of domains: software design, security, networking, and more. By producing concrete examples, they assist in exploring system behavior and can help find surprising faults. Specifications usually have many potential candidate solutions, and yet model-finders tend to leave the choice of which examples to present entirely to the underlying solver. This paper closes that gap by exploring notions of coverage for the model-finding domain, yielding a novel, rigorous metric for output quality. These ideas are realized in the tool CompoSAT, which interposes itself between Alloy's constraint-solving and presentation stages to produce ensembles of examples that maximize coverage. We show that high-coverage ensembles like those CompoSAT produces are useful for, among other things, detecting overconstraint, a particularly insidious form of specification error. We detail the underlying theory of CompoSAT, discuss its implementation, and evaluate it on numerous specifications.
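The ensemble-selection step can be viewed as greedy set cover over coverage targets. A sketch in that spirit, with invented representations; CompoSAT's actual provenance-based metric is developed in the paper:

```python
def select_ensemble(models, coverage_of, budget=3):
    """models: candidate solutions; coverage_of(m): set of targets m covers."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(models, key=lambda m: len(coverage_of(m) - covered), default=None)
        if best is None or not (coverage_of(best) - covered):
            break                    # nothing new left to cover
        chosen.append(best)
        covered |= coverage_of(best)
    return chosen

cov = {"m1": {1, 2}, "m2": {2, 3}, "m3": {3}}
print(select_ensemble(list(cov), cov.__getitem__))   # ['m1', 'm2']
```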
16:30 | SPEAKER: Sayan Mitra ABSTRACT. We present a new partial order reduction method for reachability analysis of nondeterministic labeled transition systems over metric spaces. Nondeterminism arises from both the choice of the initial state and the choice of actions, and the number of executions to be explored grows exponentially with their length. We introduce a notion of $\varepsilon$-independence relation over actions that relates approximately commutative actions; $\varepsilon$-equivalent action sequences are obtained by swapping $\varepsilon$-independent consecutive action pairs. Our reachability algorithm generalizes individual executions to cover sets of executions that start from different, but $\delta$-close initial states, and follow different, but $\varepsilon$-independent, action sequences. The constructed over-approximations can be made arbitrarily precise by reducing the $\delta,\varepsilon$ parameters. Exploiting both the continuity of actions and their approximate independence, the algorithm can yield an exponential reduction in the number of executions explored. We illustrate this with experiments on consensus, platooning, and distributed control examples.
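The reduction idea can be illustrated with a toy canonicalisation: sequences that differ only by swapping $\varepsilon$-independent adjacent actions are explored once. The independence check below is a stub; the paper defines it over metric spaces:

```python
from itertools import product

def eps_independent(a, b):
    return {a, b} == {"left", "right"}   # stub: only these two actions commute

def canonical(seq):
    seq, changed = list(seq), True
    while changed:                       # bubble independent pairs into a fixed order
        changed = False
        for i in range(len(seq) - 1):
            if eps_independent(seq[i], seq[i + 1]) and seq[i] > seq[i + 1]:
                seq[i], seq[i + 1] = seq[i + 1], seq[i]
                changed = True
    return tuple(seq)

seqs = {canonical(s) for s in product(["left", "right", "brake"], repeat=3)}
print(len(seqs), "representatives instead of", 3 ** 3)
```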
17:00 | SPEAKER: Albert Rubio ABSTRACT. Software-Defined Networking (SDN) is a recent networking paradigm that has become increasingly popular in the last decade. It gives unprecedented control over the global behavior of the network and provides a new opportunity for formal methods. Much work has appeared in the last few years on providing bridges between SDN and verification. This paper advances this research line and provides a link between SDN and traditional work on formal methods for verification of distributed software: actor-based modelling. We show how SDN programs can be seamlessly modelled using actors, and thus existing advanced model checking techniques developed for actors can be directly applied to verify a range of properties of SDN networks, including consistency of flow tables, violation of safety policies, and forwarding loops.
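The actor view of a switch is easy to sketch. A toy with invented message and class names: each switch actor holds a flow table and reacts to controller "install" messages and data-plane packets:

```python
class SwitchActor:
    def __init__(self):
        self.flow_table = {}             # match -> action
        self.mailbox = []

    def receive(self, msg):
        kind, payload = msg
        if kind == "install":            # controller installs a rule
            match, action = payload
            self.flow_table[match] = action
        elif kind == "packet":           # data-plane packet arrives
            action = self.flow_table.get(payload, "send_to_controller")
            self.mailbox.append((payload, action))

sw = SwitchActor()
sw.receive(("install", ("dst=10.0.0.2", "forward:port2")))
sw.receive(("packet", "dst=10.0.0.2"))
print(sw.mailbox)                        # [('dst=10.0.0.2', 'forward:port2')]
```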
17:30 | SPEAKER: Ivan Stojic ABSTRACT. We propose a novel approach to the formal specification and verification of dynamic architectures that are at the core of adaptive systems such as critical-infrastructure protection. Key features include run-time reconfiguration based on adding and removing components and connections, resulting in systems with an unbounded number of components. We provide a logic-based specification of a Dynamic Parametrized Architecture (DPA), where parameters represent the infinite-state space of possible configurations, and first-order formulas represent the sets of initial configurations and reconfiguration transitions. We encode information flow properties as reachability problems of such DPAs, define a translation into an array-based transition system, and use an SMT-based model checker to tackle a number of case studies.
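A reachability problem over such an encoding bottoms out in SMT queries. A heavily simplified sketch, assuming the z3-solver Python package, where a single counter stands in for a DPA configuration:

```python
# pip install z3-solver
from z3 import Int, Solver, sat

K = 3                                     # unrolling depth
xs = [Int(f"x_{i}") for i in range(K + 1)]
solver = Solver()
solver.add(xs[0] == 0)                    # initial configuration
for i in range(K):                        # reconfiguration: add one component
    solver.add(xs[i + 1] == xs[i] + 1)
solver.add(xs[K] == K)                    # target configuration reachable?
print(solver.check() == sat)              # True
```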
16:00 | ABSTRACT. Theorem proving has a proud history of elite academic pursuit and select industrial use. Impact, when predicated on acquiring the internals of a formalism or proof environment, is gated on skilled and idealistic adapters. In the case of the automatic theorem provers known as Satisfiability Modulo Theories (SMT) solvers, the barrier to entry is shifted to tool builders and their domains. SMT solvers typically provide convenient support for domains that are prolific in software engineering and have in the past decade found widespread use cases in both academia and industry. We describe some of the background on the development of the Z3 solver, the factors that have played an important role in shaping its use, and an outlook on further development and adoption.
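For readers new to SMT, the kind of push-button query Z3 answers looks like this, using its public Python API (pip install z3-solver); the constraints are arbitrary illustration:

```python
from z3 import Ints, Solver, sat

x, y = Ints("x y")
s = Solver()
s.add(x > 2, y < 10, x + 2 * y == 7)
if s.check() == sat:
    print(s.model())                     # e.g. [x = 7, y = 0]
```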
16:30 | SPEAKER: Harald Ruess ABSTRACT. We are proposing an integrated verification framework for developing certifiable safety- and security-critical software in an agile way. First, the framework supports integrated verification as it applies a combination of complementary formal software analysis methods. Second, the framework is evidential as verification evidences, which form the basis for certification, are automatically generated from pre-defined verification workflow patterns by chaining results from the integrated software analysis tools. Third, the framework is continuous as it is aimed at executing verification and generating corresponding evidences during each iteration of an agile development process.
17:00 | ABSTRACT. Developing safety-critical systems is a very difficult task. Such systems require talented engineers, strong experience and dedication when designing the safety principles of these systems. Indeed, it must be demonstrated that no failure or combination of failures can lead to a catastrophic situation in which people could be injured or killed because of the system. This article presents disruptive technologies that reduce the effort to develop such systems by providing integrated building blocks that are easier to use.