SEFM Keynotes

Wednesday, 9 September 2015


Peter O'Hearn

Moving Fast with Software Verification
I spent many years doing research in the area of formal methods, and I confess that at the time I didn't have the foggiest idea about software engineering as practiced in industry. Then, in September 2013, with the acquisition of the verification startup Monoidics, I was transplanted into the fast-paced engineering world of Facebook. Now I am strongly of the opinion that understanding of software engineering practice and culture is absolutely essential for the success of formal methods. This change came about from our experiences trying to deploy the Infer static program analyzer inside Facebook. In this talk I will describe our results and experiences deploying Infer, including how it is integrated into Facebook's software development process. I will also suggest some directions for research which may increase the impact of formal methods.

Biography: I am an Engineering Manager at Facebook, where I lead the Static Analysis Tools team, and a Professor of Computer Science at University College London. My work has been in the broad areas of programming languages and logic, ranging from new logics and mathematical models to industrial applications of program proof. With John Reynolds I developed separation logic, a theory which opened up new practical possibilities for program proof. In 2009 I cofounded a software verification startup company, Monoidics Ltd, which was acquired by Facebook in 2013. I did my PhD (1991) at Queen's University in Kingston, Ontario, Canada, under the supervision of Bob Tennent. I held academic positions at Syracuse (1991-1995) and Queen Mary University of London (1996-2012), before moving to University College London in March of 2012. I have been the recipient of an EPSRC Advanced Research Fellowship, a Royal Society Wolfson Research Merit Award, a Most Influential POPL Paper Award, and a Royal Academy of Engineering/Microsoft Research Chair.

With David Pym I developed bunched logic, a general logic of resource which was a more abstract cousin of and precursor to separation logic. The work with Reynolds on separation logic extended Hoare's logic to address the problem of tractable reasoning about the mutation of data structures in computer memory; during this work I invented a theory of local reasoning where specifications and proofs mention only the resources (memory, files, etc) accessed by program components, and this led to a significant increase in the scalability of automated reasoning applied to large codebases. Then I proposed a concurrent separation logic, which has made inroads on the problem of modular reasoning about shared-memory concurrent programs.
These theoretical advances led to new software tools research: Smallfoot (Calcagno, Berdine and I, 2002-2005), Space Invader (Calcagno, Distefano, Yang and I, 2006-2008), Abductor (again Calcagno, Distefano, Yang and I, JACM'11), and Infer from Monoidics, which is now being developed at Facebook.

More details can be found on: http://www0.cs.ucl.ac.uk/staff/p.ohearn

Thursday, 10 September 2015


Cliff B Jones

Reasoning about Separation using Abstraction and Reification
Showing that concurrent threads operate on separate portions of their shared state is a way of establishing non-interference. Furthermore, in many useful programs, ownership of parts of the state is exchanged dynamically. Reasoning about separation and ownership of heap-based variables is often conducted using some form of separation logic. This talk examines the issue of separation and investigates the use of abstraction to specify and to reason about separation in program design. Two case studies demonstrate that using separation as an abstraction is a potentially useful approach.

Biography: Cliff B Jones is Professor of Computing Science at Newcastle University. He is best known for his research into "formal methods" for the design and verification of computer systems; under this heading, current topics of research include concurrency, support systems and logics. He was PI on the EPSRC-funded AI4FM project; is currently PI on Taming Concurrency; and is "Partner Investigator" on the Australian-funded project "Understanding concurrent programs using rely-guarantee thinking" (ARC grant DP130102901). As well as his academic career, Cliff has spent over twenty years in industry (which might explain why "applicability" is an issue in most of his research). His fifteen years in IBM saw, among other things, the creation, with colleagues in the Vienna Lab, of VDM, which is one of the better known "formal methods". Under Tony Hoare, Cliff wrote his Oxford doctoral thesis in two years (and enjoyed the family atmosphere of Wolfson College). From Oxford, he moved directly to a chair at Manchester University. During his time at Manchester, Cliff had a 5-year Senior Fellowship from the research council and later spent a sabbatical at Cambridge for the whole of the Newton Institute event on "Semantics". Much of his research at this time focused on formal (compositional) development methods for concurrent systems. He has been a member of IFIP Working Group 2.3 (Programming Methodology) since 1973 (and was Chair from 1987-96). Cliff is a Fellow of the Royal Academy of Engineering (FREng), ACM, BCS, and IET. He was recently awarded the first FM-E Fellowship.

More details can be found on: http://homepages.cs.ncl.ac.uk/cliff.jones

Friday, 11 September 2015


Edward A. Lee

The Internet of Important Things
Cyber-physical systems are integrations of computation, communication networks, and physical dynamics. Applications include manufacturing, transportation, energy production and distribution, biomedical, smart buildings, and military systems, to name a few. Increasingly, such systems leverage Internet technology, despite a significant mismatch in technical objectives. A major challenge today is to make this technology reliable, predictable, and controllable enough for "important" things, such as safety-critical and mission-critical systems. In this talk, I will analyze how emerging technologies can translate into better models and better engineering methods for this evolving Internet of important things. In particular, I will focus on how careful interface specification for components offers opportunities to leverage formal analysis. I will examine an architecture for IoT applications where so-called "accessors" provide an actor-oriented proxy for devices ("things") and services. Following the principles of actor models, an accessor reacts to input stimuli and produces outputs that can stimulate reactions in other accessors or actors. The talk focuses on a specialized form of actor models where inputs and outputs to accessors and actors are time-stamped events, enabling timing-sensitive IoT applications. The interaction between accessors and actors via time-stamped events forms a "horizontal contract," formalized in this talk as an interface automaton. The interaction between an accessor and the thing or service for which it is a proxy is a "vertical contract," also formalized as an interface automaton. Following common practice in network programming, our vertical contract uses an asynchronous atomic callback (AAC) pattern. The formal composition of these interface automata allows us to reason about the combination of a timed actor model and the AAC pattern, enabling careful evaluation of design choices for IoT systems.
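The accessor pattern sketched in the abstract can be illustrated in a few lines. This is a minimal sketch only: the names (`Event`, `Actor`, `Accessor`, `react`) are hypothetical and are not the Ptolemy accessors API, and the timed semantics is simplified to an integer logical clock. It shows an actor that reacts to a time-stamped input and an accessor that proxies a "thing" via an asynchronous atomic callback (AAC), answering each request exactly once with a later-stamped output event.

```typescript
// A time-stamped event, the currency of the "horizontal contract".
type Event<T> = { timestamp: number; value: T };

// An actor reacts to an input event and may emit output events.
interface Actor<I, O> {
  react(input: Event<I>, emit: (out: Event<O>) => void): void;
}

// An accessor is an actor that delegates to a "thing" or service.
// The "vertical contract": one request, one asynchronous atomic callback.
class Accessor<I, O> implements Actor<I, O> {
  constructor(
    private service: (input: I, callback: (result: O) => void) => void
  ) {}

  react(input: Event<I>, emit: (out: Event<O>) => void): void {
    // AAC: the service answers exactly once via the callback; the
    // response is stamped with a strictly later logical time.
    this.service(input.value, (result) => {
      emit({ timestamp: input.timestamp + 1, value: result });
    });
  }
}

// Usage: an accessor proxying a (simulated) temperature sensor.
const sensor = new Accessor<string, number>((_query, callback) => {
  callback(21.5); // the "thing" replies through the callback
});

sensor.react({ timestamp: 0, value: "read" }, (out) => {
  console.log(out.timestamp, out.value); // prints "1 21.5"
});
```

Because both the accessor's input/output events and the service callback are explicit, the two contracts can each be checked separately (in the talk's terms, as interface automata) before composing them.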

Biography: Edward A. Lee is the Robert S. Pepper Distinguished Professor in the Electrical Engineering and Computer Sciences (EECS) department at U.C. Berkeley. His research interests center on design, modeling, and analysis of embedded, real-time computational systems. He is the director of the nine-university TerraSwarm Research Center, a director of Chess, the Berkeley Center for Hybrid and Embedded Software Systems, and the director of the Berkeley Ptolemy project. From 2005 to 2008, he served as chair of the EE Division and then chair of the EECS Department at UC Berkeley. He is co-author of six books and hundreds of papers. He has led the development of several influential open-source software packages, notably Ptolemy and its various spinoffs. He received the B.S. degree in Computer Science from Yale University in 1979, the S.M. degree in EECS from the Massachusetts Institute of Technology (MIT) in 1981, and the Ph.D. degree in EECS from UC Berkeley in 1986. From 1979 to 1982 he was a member of technical staff at Bell Telephone Laboratories in Holmdel, New Jersey, in the Advanced Data Communications Laboratory. He is a co-founder of BDTI, Inc., where he is currently a Senior Technical Advisor, and has consulted for a number of other companies. He is a Fellow of the IEEE, was an NSF Presidential Young Investigator, and won the 1997 Frederick Emmons Terman Award for Engineering Education.

More details can be found on: http://ptolemy.eecs.berkeley.edu/~eal
