Semanticists should be the obstetricians of programming languages, not the coroners.
–John Reynolds, 2 December 2004
In the suitably historical location of the Fellows' Library and the adjoining Director's Suite of the London Science Museum, a meeting on "Program Verification and Semantics: Further Work" was held on Thursday, 2 December 2004, under the joint auspices of two British Computer Society (BCS) Specialist Groups: the Computer Conservation Society (CCS; http://www.bcs.org/sg/ccs) and Formal Aspects of Computing Science (FACS; http://www.bcs-facs.org). The event was a follow-up of a previous meeting held in 2001, titled "Program Verification and Semantics: The Early Work," organized through the CCS. (For more details on the previous meeting, see the associated Web site—http://vmoc.museophile.org/pvs01).
The invited speakers at the 2004 event (see Figure 1) were among the most important pioneers in formal methods: John Reynolds (Carnegie Mellon University), Gordon Plotkin (University of Edinburgh), and Cliff Jones (University of Newcastle upon Tyne). Roger Johnson (University of London), who is the chair of the CCS, introduced the meeting. The program was split into two sessions, with Cliff Jones chairing the first half, including presentations from Reynolds and Plotkin, and Jonathan Bowen (London South Bank University) chairing the second half.
Reynolds gave the first talk, titled "The Discoveries of Continuations."1 He concentrated on the invention and application of the concept of continuations, a key idea in the field of program verification and semantics.2 He asserted that continuations and related concepts were discovered and rediscovered many times in different contexts. He believed this was only partly due to the lack of communication between scientists; instead, it was largely the outcome of the different environments in which the idea could be applied.
Reynolds posited that continuations allow a wide range of program transformations that can be useful both in the interpretation of programs (for example, through operational semantics) and in denotational semantics, in the style of Dana Scott and Christopher Strachey. Continuations were useful because they were an abstraction (of the rest of the program), in terms of a function or procedure that clarified the control structure of imperative programs. Considering the multifunctionality of the concept, it isn't surprising that continuations were independently rediscovered several times, and can't be attributed to a single person.
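The idea that a continuation abstracts "the rest of the program" can be sketched in modern code. The following minimal Python example (the function names are ours, and Python is of course anachronistic for the period) passes the rest of the computation around explicitly, which makes nonlocal control flow, such as an early escape, visible in the program text:

```python
def product_cps(xs, k, abort):
    """Multiply the numbers in xs, in continuation-passing style.

    k is the current continuation: a function standing for "the rest of
    the program", waiting for the product of xs. abort is an escape
    continuation: invoking it discards all pending work accumulated in k,
    which is how continuations make control structure explicit.
    """
    if not xs:
        return k(1)
    if xs[0] == 0:
        return abort(0)          # the answer must be 0; skip the rest
    return product_cps(xs[1:], lambda r: k(xs[0] * r), abort)

identity = lambda r: r
print(product_cps([2, 5, 7], identity, identity))     # 70
print(product_cps([2, 5, 0, 7], identity, identity))  # 0, without multiplying
```

The escape on zero is the kind of transformation Reynolds describes: because the remaining work is reified as a function, the program can simply decline to call it.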
When the Algol 60 language (see http://en.wikipedia.org/wiki/ALGOL) was launched, there was a burgeoning of research on formal programming languages and their implementation. In an Algol compiler,3,4 the representation of a return address or the target of a jump had to provide both the relevant object code and a dynamic environment (such as a stack reference). In retrospect, according to Reynolds, this pair was a representation of a continuation. Similarly, in Peter Landin's abstract Stack-Environment-Control-Dump (SECD) machine,5 the dump, which described the computation to be executed after the control was concluded, was a representation of a continuation.
Perhaps the earliest description of a use of continuations was given by Adriaan van Wijngaarden6 in 1964 at the International Federation for Information Processing Working Conference on Formal Language Description Languages held in Baden bei Wien. The proceedings were published two years later.7
When Reynolds began to investigate the history of continuations, he was surprised that van Wijngaarden's presentation did not seem to establish the concept within the computer science community, even though Edsger Dijkstra, Tony Hoare, John McCarthy, Douglas McIlroy, and Christopher Strachey all participated in the recorded discussion of the presentation, and other conference attendees included Corrado Böhm, Calvin Elgot, Peter Landin, and Maurice Nivat. However, van Wijngaarden's contribution fell on fallow ground; even Strachey was unaware of the similarities between that work and Christopher Wadsworth's later description of continuations.
It's possible that the lack of communication was due to philosophical differences between van Wijngaarden and other researchers, particularly his abhorrence of abstract syntax and his belief that proper procedures were more basic than functions. A more likely cause, however, was his failure to pinpoint continuations themselves as a significant concept. As pointed out in McIlroy's private recollection to Reynolds:
... an idea can be understood without all its ramifications being seen, even by its originator. Since van Wijngaarden offered no practical examples of continuation passing, nor any theoretical application, save as a trick for proving one isolated and already known result, the value of continuations per se did not come through.8
In December 1969, the Polish researcher Antoni W. Mazurkiewicz circulated a working paper titled "Proving Algorithms by Tail Functions,"9 in which he used an automaton-like concept of an algorithm, for which he proposed a semantics using a tail function. This is an environment that maps labels into command continuations, similar to the continuation semantics of an imperative language with labels and goto instructions. Wadsworth has explicitly acknowledged that his later work was inspired by that of Mazurkiewicz.10
Continuations were introduced by F. Lockwood Morris during a colloquium given at Queen Mary College in London in November 1970.11 At the colloquium, Morris gave a definitional interpreter for a call-by-value functional language with extensions for assignments and labels. Reynolds was in the audience of that lecture and from the beginning appreciated the possibility of using different styles of definitional interpreters varying in their abstractness and circularity, making use of continuations.
Reynolds was himself involved in the spread of these ideas. During a visit to Edinburgh in December 1970, Reynolds described Morris's talk to Rod Burstall and Chris Wadsworth. Wadsworth explained that he had been pursuing the same idea in the context of denotational semantics. Subsequently, Strachey described the idea in a seminar at the Institut de Recherche en Informatique et en Automatique (IRIA) in May 1973.12 However, Reynolds is inclined to believe that, although the paper was written by Strachey, the idea should be attributed to Wadsworth.
The idea of continuations was rediscovered by other researchers, such as James H. Morris. In 1971, he submitted a paper to the Association for Computing Machinery (ACM) in which he used continuations to transform Algol 60 programs. When the referee pointed out that van Wijngaarden had described a similar transformation seven years earlier, the paper was reduced to a letter to the editor, in which Morris13 showed that the transformation could be used to eliminate procedures that returned compound or higher-order values.
In 1972, at the ACM Conference on Proving Assertions about Programs, in Las Cruces, New Mexico, Michael J. Fischer14 presented a paper in which he introduced continuations. This paper is notable for including the first proof about the semantics of continuations. It proved that the continuation-passing style preserves meaning. The result was obtained in a setting where λ-expressions15 denote closures.
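Fischer's preservation result can be illustrated, very loosely, with a toy expression language: a direct-style evaluator and its continuation-passing counterpart agree on every term. This Python sketch is our own illustration of the flavor of the theorem, not Fischer's construction:

```python
# Terms are ints, or tuples ("add", e1, e2) and ("mul", e1, e2).

def ev(e):
    """Direct-style evaluator: the 'meaning' of a term."""
    if isinstance(e, int):
        return e
    op, a, b = e
    x, y = ev(a), ev(b)
    return x + y if op == "add" else x * y

def ev_cps(e, k):
    """Continuation-passing evaluator: k receives the value of e."""
    if isinstance(e, int):
        return k(e)
    op, a, b = e
    return ev_cps(a, lambda x:
           ev_cps(b, lambda y:
           k(x + y if op == "add" else x * y)))

e = ("add", 3, ("mul", 4, 5))
assert ev(e) == ev_cps(e, lambda v: v) == 23   # same meaning, both styles
```

Running the CPS evaluator with the identity continuation recovers exactly the direct-style meaning, which is the sense in which the transformation preserves semantics.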
In 1973, in a short paper at the first Computer Science Conference, held in Columbus, Ohio, and in a longer preliminary report,16 S. Kamal Abdali used continuations—which he called program remainders—to translate Algol 60 into a pure call-by-name λ-calculus, rather than the call-by-value λ-calculus used by Landin. In linking call-by-name in Algol 60 with call-by-name in the λ-calculus, this work was a precursor of the view of Algol-like languages by Oles and Reynolds in the 1980s.
Reynolds stressed two concepts in his contribution: First, he claimed that major ideas such as continuations and related concepts are rarely discovered once, but they keep on being reinvented in different environments and used for various properties that they exhibit. Second, he argued that the relevance of continuations could be attributed to their applicability both in denotational and in operational semantics contexts. According to Reynolds the two approaches are actually complementary, both giving original results in the program semantics field.
The second speaker—Gordon Plotkin—gave a talk on "The Origins of Structural Operational Semantics."17 This was a personal reconstruction of his studies in the area. He was a major researcher in the field, so it was also a recollection of the evolution of operational semantics itself, as well as a passionate account in favor of the use of the operational approach for semantic descriptions of programming languages.
The reconstruction was so enthusiastic that at some points during the talk it was hard to grasp all the deep and thoughtful details of the story. But the general feeling of the audience was that this was an account of work to which Plotkin had contributed strongly and, indeed, had largely initiated, so there was a personal feel to his presentation, giving a privileged view of the research covered. The audience not only heard oral testimony, but also saw slides hand-written during Plotkin's influential visit to Aarhus University in Denmark.
As a postgraduate at Edinburgh University, Plotkin learned the λ-calculus and was impressed by the abstract SECD machine of Peter Landin. He was also impressed by McCarthy's contributions to the field, such as the 1962 introduction of abstract syntax, a fundamental tool for all semantics of programming languages. Moreover, there was the Vienna Definition Language (VDL), which the Vienna School used to specify a real programming language, PL/I (http://en.wikipedia.org/wiki/PL/I), via an abstract machine. Plotkin didn't care for this way of doing operational semantics because it was too complex, burying the semantic ideas in technicalities.
At the end of the 1960s, denotational semantics was popular; see, for example, Mike Gordon's thesis on pure LISP, supervised by Rod Burstall from 1969.18 The work by Dana Scott on the Logic of Computable Functions, a simply typed λ-calculus, was influential; in particular, Plotkin wrote a paper19 on the Programming Language for Computable Functions (PCF), in which he gave an operational semantics for a typed λ-calculus with Booleans, natural numbers, and recursion at all types. The idea behind PCF, according to the speaker, was that it was possible to consider the term calculus of the logic as a programming language only if one had an operational semantics. However, Scott did not completely agree with this. Obviously there were also other influences involved in the development of operational semantics, but these are some of the most relevant ones.
Plotkin also pointed out the role of teaching in developing "toy" languages that were suitable for students as well as for the progress of the research itself. In 1973, at the University of Edinburgh, he taught third-year undergraduates and, instead of teaching a course on the λ-calculus, he tried McCarthy-style recursive function definitions. Subsequently, in 1975, Robin Milner taught a simple imperative language to students. Milner later worked on a structural operational semantics for concurrency, developing the Calculus of Communicating Systems (CCS). In 1979, he spent six months in Aarhus on a visiting lectureship that resulted in his classic CCS book.20
Two years later Plotkin himself was invited on the same visiting lectureship, during which he developed Structural Operational Semantics (SOS). He realized at this point that researchers were then concentrating on proving the adequacy of an operational semantics with respect to the denotational semantics, but that the rule-based operational semantics was both mathematically elementary and simpler than the denotational one. So Plotkin imagined the possibility of dropping the denotational semantics and tried to follow up this idea in his course on the subject in Aarhus.21 Many of the linguistic ideas he employed came from denotational semantics, such as Bob Tennent's principles of abstraction, correspondence, and qualification.
In his recollection of the development of SOS, Plotkin focused on two major ideas. The first was that the configurations of SOS were those of an abstract machine, but without low-level detail; rules were used to avoid these complexities.
The second idea related to compositionality. Denotational semantics is compositional in the sense that the meaning of a phrase in a language is given as a function of the meaning of its immediate subphrases. In contrast, with an operational semantics, the meaning of a program is essentially the collection of all the transitions it can make. This is given by the rules and is primitively recursive (structural) rather than compositional in the subphrases, hence the term structural.
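The structural character of SOS can be sketched concretely. In the following Python illustration (our own toy example, not Plotkin's notation), the transitions of a compound phrase are defined by rules over the transitions of its subphrases:

```python
# A small-step transition function for a tiny expression language, in the
# SOS spirit: the step of ("add", a, b) is defined structurally, in terms
# of the steps of its subphrases a and b.

def step(e):
    """One transition of e, or None if e is already a value (an int)."""
    if isinstance(e, int):
        return None                                 # values do not step
    op, a, b = e
    if isinstance(a, int) and isinstance(b, int):
        return a + b if op == "add" else a * b      # axiom: n op m -> result
    if not isinstance(a, int):
        return (op, step(a), b)                     # rule: a -> a'  gives  (op a b) -> (op a' b)
    return (op, a, step(b))                         # rule: b -> b'  gives  (op a b) -> (op a b')

def run(e):
    """The meaning of e as the sequence of transitions it can make."""
    while not isinstance(e, int):
        e = step(e)
    return e

print(run(("add", ("mul", 2, 3), 4)))  # 10
```

Each clause of `step` corresponds to one inference rule, and the definition is primitively recursive on the structure of the phrase, which is exactly the sense of structural that Plotkin described.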
According to the speaker, looking back at what has been done so far makes it even clearer that there's much left to do.
Denotational and operational semantics can be seen as counterparts with respect to offering semantics for programming languages. Plotkin was strongly convinced that the operational approach is the more useful for certain goals of formal methods and his talk tried hard to argue in favor of this position. The talk was followed by some lively questions and discussion (see Figure 2).
The final speaker was Cliff Jones,22 who focused his talk on "Language Semantics: The Vienna Part of the Story." His contribution gave a wide and well-organized historical account of the different approaches to language semantics from the 1960s to the present day. He concentrated on the Vienna group's side of the history of formal language semantics, because he had spent some time in Vienna working for the IBM research group based there.
Between 1965 and 1968, the Vienna group faced the task of defining the large PL/I language. As well as including a multiplicity of features, PL/I was challenging because it included complex constructs such as goto statements, ON-units, tasking, and so on. The Vienna group called their semantics the Universal Language Definition (ULD-III); however, their operational semantics notation was more widely known as VDL.
At that time, the semantic approach was operational. The influential researchers, who came from different backgrounds, included Elgot, Landin, and McCarthy. The effort did achieve its objectives, and even resulted in clarifications of PL/I. But the definitions of the language tended to use "grand states" that caused great difficulty when used in proofs about VDL descriptions. These problems, together with a shift in the interests of the semantics community, pushed the Vienna group toward a change of paradigm, resulting in a move to what became denotational semantics.
If we consider the operational semantics approach similar to the creation of an interpreter for the language, we can think of the denotational semantics model as the creation of a compiler for the language. In this case, however, it's a compiler that maps to mathematical objects (functions). Strachey and Landin were the first researchers to move toward the denotational semantics perspective,23 and Dana Scott solved some of the theoretical problems of the denotational approach during the 1960s.
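The "compiler to mathematical objects" view can be made concrete with a toy Python sketch (our own illustration; the combinator names are invented): each command of a tiny imperative language is mapped, compositionally, to a function from states to states.

```python
# Denotational sketch: the meaning of a command is a function on states
# (here, states are dicts mapping variable names to values).

def skip():
    return lambda s: s                      # [[skip]] = identity on states

def assign(x, expr):
    # expr is the denotation of an expression: a function from states to values
    return lambda s: {**s, x: expr(s)}      # [[x := e]] updates one variable

def seq(c1, c2):
    # Compositionality: the meaning of "c1; c2" is built only from the
    # meanings of its immediate subphrases c1 and c2.
    return lambda s: c2(c1(s))

prog = seq(assign("x", lambda s: 2),
           assign("y", lambda s: s["x"] + 3))
print(prog({}))  # {'x': 2, 'y': 5}
```

Nothing here is executed step by step in the operational sense: `prog` simply *is* a function, and running the program means applying that function to an initial state.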
During the 1970s the Vienna group worked on a compiler for PL/I that was developed from a denotational definition of the language;24 the approach became known as the Vienna Development Method (VDM; http://www.csr.ncl.ac.uk/vdm). VDM didn't use continuations to model various kinds of jumps (for example, goto statements) as advocated at the Programming Research Group in Oxford. Instead, an exit mechanism was used, based on earlier functional semantics of Algol 60 by Jones and colleagues in the IBM UK Laboratories at Hursley Park.
Denotational definitions had the advantage of expressing the meaning of syntactic constructs more directly, but they couldn't completely replace operational definitions. In particular, they were technically inadequate for giving a correct denotation of concurrency. And when operational semantics avoids unneeded mechanistic features, as in the successful SOS proposal made by Gordon Plotkin during his Aarhus visit in 1981, its advantages are even clearer.
During the 1980s, Jones was convinced that the only correct approach to semantics was the denotational one. When he switched to an academic career in 1981, he decided to teach students the mathematically heavy denotational methods for modeling semantics of programming languages. Jones declared that later he changed his view about the best approach for presenting semantics. Jones is now convinced that operational semantics, particularly using the rule-oriented approach introduced by Plotkin, is more useful for modeling nondeterminacy and concurrent processes. According to Jones, operational semantics is good as a thinking tool for creating a programming language, helping to avoid useless complexities. At the end of the meeting, the audience mingled with further informal discussion (see Figure 3).
In summary, the meeting was well-attended by an audience of around 75 people who contributed to the success of the event with questions and anecdotes. Having a joint event between the BCS Computer Conservation Society and FACS boosted numbers considerably and ensured a good mix of people interested in the history of computing and formal methods.
The speakers tried to reconstruct the history of the field—from the early stages during the 1960s to the major achievements of the 1970s and the 1980s—putting this into context with the most recent advances. The passion and wisdom of the speakers was contagious for the audience and gave a sense of the challenges faced during those early years. It also gave attendees the opportunity to share with the speakers various concerns and possibilities for the future of the field.
For further details about the meeting, including photographs and slides from the talks, see the online information under the Virtual Museum of Computing Web site (http://vmoc.museophile.org/pvs04). We hope it will be possible to hold a third installment in the Program Verification and Semantics series in due course.
John Reynolds, Gordon Plotkin, and Cliff Jones all provided helpful feedback on their sections in this report. Paul Boca also gave useful comments on an early draft. BCS-FACS provided financial sponsorship for the event.
Teresa Numerico
London South Bank University
teresa.numerico@lsbu.ac.uk
Jonathan P. Bowen
London South Bank University
jonathan.bowen@lsbu.ac.uk