
From: dam@ai.mit.edu (David McAllester)
Date: Thu, 15 Apr 93 19:27:49 EDT
Message-id: <9304152327.AA00625@spock>
To: qed@mcs.anl.gov
Subject: Verification systems
Sender: qed-owner
Precedence: bulk

In an earlier message I named a variety of verification systems that might be relevant to the QED project. Bob Boyer has suggested that I give some more information and references for these systems. The following list of systems relevant to QED is by no means meant to be exhaustive; it contains just the ones I mentioned in my earlier message. A more complete list of reasoning systems can be found in a database constructed by Carolyn Talcott. The database can be accessed via ftp to sail.stanford.edu in the directory /pub/clt/ARS/. This directory contains a README file.

I have taken some liberties in describing the following systems. Authors of these systems (or anyone else) should feel free to correct anything they think is wrong or misleading in these system descriptions.

******************

1) HOL (Higher Order Logic)

Pronounced either as an acronym (by pronouncing the letters) or as a word rhyming with "small" (there was a recent discussion of this, and apparently the world is about 50% in each camp). This is probably the most widely used verification system. It is a "foundational" system in the sense that there is one underlying logic with (arguably) a single intended model (the universe of sets). It uses a higher order logic with ML-style polymorphism, so that one can define an identity function and apply it to itself. However, there is a straightforward set-theoretic denotational semantics (different occurrences of "the identity function" denote identity functions on different domains). Although it is higher order and classical, I believe it is somewhat weaker than ZFC (I do not believe that one can prove that there exists a family of sets containing an infinite set and closed under power set; in other words, it is missing the axiom of replacement). I am not sure whether it contains the axiom of choice. The interactive verification environment is a descendant of LCF (not described here, but a common ancestor of many current systems).
This means that one proves theorems by backward chaining on goals, using "tactics" to transform any given goal into a set of subgoals. A tactic is a user-written procedure (satisfying machine-checkable soundness conditions relative to the foundational logic) that reduces a given goal to subgoals sufficient to imply it. A tactic may perform an arbitrary amount of automated theorem proving in trying to produce a small set of simplified subgoals. Simple goals are proven automatically (they reduce to no subgoals) by a variety of standard "low level" tactics. The most common application of HOL is the verification of computer hardware.

o M. Gordon. HOL: A proof generating system for higher-order logic. In G. Birtwistle and P. A. Subrahmanyam, editors, VLSI Specification, Verification, and Synthesis. Kluwer, 1987.

******************

2) The Boyer-Moore prover (also known as NQTHM)

This is probably the second most widely used system. It is also foundational. The logic consists of expressions definable in a pure (no side effects) version of the programming language Lisp. These expressions have a fixed meaning, which can be defined via an operational semantics for Lisp. The atomic formulas of this logic are equations between Lisp terms (such as (EQUAL (NUMBERP X) (TRUE))). The Boyer-Moore dialect of Lisp does not allow closures (lambda expressions) and hence is strictly first order. Clearly this language can only be used to describe finite objects. It is most effective when used for proving properties of primitive recursive functions (the notion of primitive recursion being generalized from natural numbers to Lisp data structures). This is probably the most highly automated and effective foundational verification system. It has been used to verify a large number of theorems in elementary number theory. Like HOL, it is also used for hardware and software verification. I believe that the Boyer-Moore prover has been more widely used than HOL in the verification of software.
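The LCF-style tactic discipline described above for HOL can be sketched in a few lines. The following is an illustrative toy in Python, not the ML interface of any actual system; the goal representation and tactic names are invented for the example.

```python
# Toy sketch of the LCF tactic idea: a tactic maps a goal to a list of
# subgoals whose proofs jointly suffice to imply the original goal.
# Goals are just strings here; all names are invented for illustration.

def split_conj(goal):
    """Tactic: reduce a conjunction 'A & B' to the subgoals A and B."""
    if " & " in goal:
        left, right = goal.split(" & ", 1)
        return [left, right]      # subgoals sufficient to imply the goal
    return [goal]                 # tactic does not apply; goal unchanged

def by_assumption(assumptions):
    """Tactic: a goal listed among the assumptions reduces to no subgoals."""
    def tac(goal):
        return [] if goal in assumptions else [goal]
    return tac

def prove(goal, tactics):
    """Backward chaining: apply each tactic to every outstanding goal."""
    goals = [goal]
    for tac in tactics:
        goals = [sub for g in goals for sub in tac(g)]
    return goals                  # an empty list means the goal is proved

remaining = prove("p & q", [split_conj, by_assumption({"p", "q"})])
```

In a real LCF descendant each tactic also returns a validation function that rebuilds a theorem of the foundational logic from the subgoal theorems; that pairing is what makes user-written tactics machine-checkably sound.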
Thousands of programs have been proven to meet specifications using the Boyer-Moore prover. This includes compilers, assemblers, and C programs compiled to 68020 machine instructions. The user interface is "Socratic". By this I mean that the user types a series of statements (called events). The system either accepts or rejects each statement as it is given. If a statement is rejected, it must be broken down into "simpler" statements. If a statement is accepted, a new proof technique is installed (such as a rewrite rule) that can be used to verify later statements. Socratic proofs are easier to "replay" under modified definitions than are the interaction traces in LCF-derived interfaces such as that of HOL. Socratic proofs are also more human-readable as proof texts. The primary inference mechanism is term rewriting, together with a large collection of heuristics for performing mathematical induction. Although the system is controlled with user-written rewrite rules (entered as steps in a Socratic proof), there are no user-defined tactics in the sense of LCF.

@book{boyer-moore-acl,
  author = {Boyer, Robert S. and Moore, J. Strother},
  title = {A Computational Logic},
  year = 1979,
  publisher = {Academic Press},
  ISBN = {0-12-122952-1},
  comment = {description of logic, heuristics, and many examples}
}

@book{boyer-moore-aclh,
  author = {Boyer, Robert S. and Moore, J. Strother},
  title = {A Computational Logic Handbook},
  year = 1988,
  publisher = {Academic Press},
  ISBN = {0-12-122950-5},
  order-info = {This book may be ordered by calling Academic Press at (800)321-5068 or (314)528-8110 for Alaska, Hawaii, Missouri or by writing to the Academic Press Order Department, 465 S. Lincoln Drive, Troy, MO 63379.},
  comment = {comprehensive user's manual}
}

******************

3) nuPRL (pronounced "new pearl")

This is a mature and fairly widely used foundational system.
The underlying logic is derived from Martin-Löf type theory and is based on the Curry-Howard isomorphism between formulas and types. This is sometimes called the "formulas as types" paradigm. This paradigm is constructive --- nuPRL is a constructivist foundational system. The user interface is an LCF descendant based on user-defined tactics for goal reduction. It has been used to verify a variety of theorems in number theory, as well as for hardware and software verification.

o R. L. Constable et al. Implementing Mathematics with the Nuprl Proof Development System. Prentice Hall, 1986.

******************

4) IMPS

IMPS was described in an earlier message. IMPS has an LCF-like interface and a foundational logic based on the "little theories" method of organizing mathematical theorems. I believe that IMPS uses higher order logic as a foundation. It seems likely that one could translate theorem statements between HOL and IMPS in a direct, semantics-preserving manner. The same should be true of the other systems in the Talcott database (see above) that are based on higher order logic.

******************

5) Ontic

Ontic is our system. The logic is an extension of the semantics of Lisp, so that all objects denotable by formulas in ZFC can be denoted by expressions in this "Lisp". The foundational logic is somewhat stronger than ZFC (it can prove the existence of an inaccessible cardinal). The user interface is based on the text editor Emacs --- one writes a natural deduction proof as a piece of text. A keyboard command instructs the system to read the proof. The system either accepts the proof or moves the cursor to the first step that it does not accept. This is similar to the Boyer-Moore style of Socratic proof, except that the natural deduction style allows a context of suppositions to be shared by all the steps in a branch of a case analysis.
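The accept-or-point-to-first-failure behavior just described can be sketched as a simple loop over proof steps. This is an illustrative toy in Python, not Ontic's actual implementation; the step representation and the stand-in checker predicate are invented for the example.

```python
# Toy sketch of a proof-text checker in the style described above: check
# the steps in order and report the first one that is not accepted, so an
# editor could move the cursor there.  The 'checker' predicate stands in
# for the system's actual inference mechanisms (invented for illustration).

def check_proof(steps, checker):
    """Return None if every step is accepted, otherwise the index of the
    first step the checker rejects."""
    for i, step in enumerate(steps):
        if not checker(step):
            return i
    return None

# A trivial stand-in checker: a step is accepted iff it is marked valid.
steps = [("suppose x in A", True),
         ("hence x in A or B", True),
         ("hence x in B", False)]    # an unjustified step

first_bad = check_proof(steps, lambda s: s[1])   # index of first bad step
```

The point of the design is that the user edits the proof text and re-checks, rather than annotating steps with procedural hints for the prover.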
No "procedural" information is associated with proof steps (in the Boyer-Moore system, proof steps are annotated with instructions as to how the information is to be used by the prover). The inference mechanisms are constructed using a forward chaining (bottom up) logic programming language. However, a single set of inference mechanisms is used across all proofs, and there are no user-defined tactics in the sense of LCF. An early version of this system was used to verify the Stone representation theorem for Boolean lattices. This system is still highly experimental and has not been released. However, a draft user's manual can be obtained by ftp from the machine ftp.ai.mit.edu in the file /com/ontic/ontic-manual.ps. This is a user's manual, but expert readers should be able to derive the set-theoretic denotational semantics from the description in the manual.

David McAllester