Mathematical logic has been, in good part, developed and pursued with the hope of providing practical algorithmic tools for doing reasoning, both in everyday life and in mathematics, first by hand or mechanical means, and later by electronic computers. In this text elementary traditional logic is presented side by side with its algorithmic aspects, i.e., the syntax and semantics of first-order logic up to completeness and compactness, and developments in theorem proving that were inspired by the possibilities of using computers. Here we are referring to Robinson's resolution theorem proving, and to the Knuth--Bendix procedure to obtain term rewrite systems, both using the key idea of (most general) unification.
Thus we find these topics both important and accessible to a wide range of students in the mathematical sciences. They are rich in basic algorithms, giving students desirable hands-on experience. No background in abstract algebra or analysis is assumed, yet the material is definitely mathematical logic: logic for mathematics and computer science that is developed and analyzed using mathematical methods.
One simple theme is that there are really several self-contained proof systems of interest to mathematical logic, some more suitable than others for particular kinds of questions. Not counting Aristotle's syllogisms, Chapters 1-5 correspond to the five logical systems we consider, and Chapter 6 is a continuation of Chapter 5.
Chapter 1 briefly reviews Aristotle's syllogisms, which dominated logic for two millennia, and then presents Boole's work on the algebra of logic. Boole's work inaugurated modern mathematical logic. Boole said that he had reduced logical reasoning to reasoning about equations concerning classes of objects, and for the latter he employed methods analogous to well-known methods of working with equations in elementary algebra. The algebra of logic as applied in the calculus of classes was the main form of mathematical logic for half a century, and attracted such well-known persons as Lewis Carroll, the author of Alice's Adventures in Wonderland.
Chapter 2 examines propositional logic in considerable detail. Propositional logic is similar to the calculus of classes, but here the focus is on reasoning about propositions, not classes of objects. In modern treatments of mathematical logic it has completely replaced the calculus of classes, in good part because it provides a convenient stepping stone to the study of first-order logic (where you are allowed to quantify over elements). The calculus of classes more naturally leads to the calculus of relations. Then, by the addition of quantifiers over relations, it becomes essentially a second-order logic. Second-order logic is rather difficult to study, since it lacks a decent set of axioms and rules of inference, so we will have little to say about it.
After studying the notions of truth and truth-equivalence in propositional logic, we look at PC, a Frege-Hilbert style proof system. PC is designed to be reasonably easy to learn and work with, and to provide lots of simple exercises for students. It is related to the system RS (Rosser's System) used by Copi in his text ``Symbolic Logic'', a system that he derived from earlier work of Rosser. However, PC is significantly different from RS, as it is not based on natural deduction and has a very restricted replacement axiom.
Next we look at a resolution version of propositional logic, because resolution theorem proving is a mainstay of modern automated theorem proving. Both the Davis-Putnam algorithm and general resolution theorem proving are explained. The chapter concludes with examples of sets of clauses that have been used to show limitations on the speed of general resolution theorem proving.
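To give the flavor of the rule (the precise definitions appear in the chapter, and the clause notation here is only schematic): resolution takes two clauses containing a complementary pair of literals, deletes the pair, and merges what is left, so the clauses $\{P, Q\}$ and $\{\neg Q, R\}$ yield the resolvent $\{P, R\}$. A set of clauses is shown to be unsatisfiable by deriving the empty clause in this fashion.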
These notes are unusual in the detail in which they look at propositional logic. This choice has been made because propositional logic is a good training ground for students, and because it is an area of interest in modern automated theorem proving, for two reasons. First, the work of Löwenheim, Skolem, and Herbrand showed that one can reduce any mathematical question to a question about the satisfiability of a sequence of propositional formulas that one can algorithmically generate from the mathematical question. We will see how to do this in Chapter 5. Second, rather modest problems in propositional logic can be very difficult for computers. (This is directly connected with the outstanding open question of whether or not polynomial time is the same as nondeterministic polynomial time, i.e., does P = NP?)
Chapter 3 turns to equational logic. Reasoning with equations is one of the most basic skills we learn in mathematics. And, as already mentioned, Boole's calculus of classes was based on reasoning about equations. In this chapter we first show that the rules one learns in high school are all you need to do proofs with equations. This was first rigorously proved by G. Birkhoff in the mid 1930's. However, from the computational point of view, this is not particularly helpful in finding a proof that an equation follows from some other equations. This leads us into the study of term rewrite systems, a proposal of Knuth to use `directed equations' to expedite the search for proofs in equational logic. Here we learn one of the fundamental algorithms of automated theorem proving, namely how to find the most general unifier of two terms. This is then used in the Knuth-Bendix procedure for converting equations into directed equations.
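To indicate what is involved in unification, here is a small illustrative sketch (it is not the book's presentation, and the representation of terms below is simply one convenient choice) of the syntactic unification algorithm:

\begin{verbatim}
# Illustrative sketch of syntactic unification (Robinson-style).
# A variable is a plain string; a compound term is a tuple whose first
# entry is the function symbol, e.g. ("f", "x", ("a",)) stands for f(x, a).

def substitute(term, subst):
    """Apply a substitution (a dict from variables to terms) to a term."""
    if isinstance(term, str):
        return substitute(subst[term], subst) if term in subst else term
    return (term[0],) + tuple(substitute(t, subst) for t in term[1:])

def occurs(var, term):
    """Occurs check: does the variable var appear inside term?"""
    if term == var:
        return True
    return not isinstance(term, str) and any(occurs(var, t) for t in term[1:])

def unify(s, t, subst=None):
    """Return a most general unifier of s and t as a dict, or None."""
    subst = {} if subst is None else subst
    s, t = substitute(s, subst), substitute(t, subst)
    if s == t:
        return subst
    if isinstance(s, str):                    # s is a variable
        return None if occurs(s, t) else {**subst, s: t}
    if isinstance(t, str):                    # t is a variable: swap the arguments
        return unify(t, s, subst)
    if s[0] != t[0] or len(s) != len(t):      # clash of function symbols
        return None
    for a, b in zip(s[1:], t[1:]):            # unify corresponding arguments
        subst = unify(a, b, subst)
        if subst is None:
            return None
    return subst

# Unifying f(x, g(a)) with f(g(y), g(y)) gives bindings that compose
# to the most general unifier { x -> g(a), y -> a }.
print(unify(("f", "x", ("g", ("a",))), ("f", ("g", "y"), ("g", "y"))))
\end{verbatim}

A substitution found in this way is a most general unifier: every other unifier of the two terms can be obtained from it by applying a further substitution.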
Chapter 4 looks at resolution theorem proving for predicate clause logic as developed by Robinson in the mid 1960's. His goal was to short-circuit the method of Löwenheim, Skolem, and Herbrand by stopping at an intermediate step of their procedure (to reduce a mathematical statement to a sequence of propositional formulas), namely at the step where one has clauses, and then applying resolution to the clauses to arrive at the desired proof. It was in this work that Robinson used most general unifiers, also employed by Knuth a few years later in his work on term rewrite systems. One of the best known logic programming languages, Prolog, incorporates resolution theorem proving.
Chapter 5 returns to traditional first-order logic. There is a leisurely introduction to expressing mathematical ideas in first-order logic in two familiar contexts, namely that of the natural numbers and that of graphs. Then general first-order logic is formulated, fundamental equivalences are studied, and the method of putting a formula into prenex form is given. Next follows the crucial technique of skolemization, a procedure to convert a formula into a closely related quantifier-free formula. Skolemization gives the final tool needed to formulate the Löwenheim, Skolem, and Herbrand method of reducing a first-order sentence to a set of clauses for resolution theorem proving, as well as to a sequence of propositional formulas. These reductions lead to the conclusion that any of these systems is theoretically adequate to tackle heavy-duty problems in mathematics.
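As a small illustration of the flavor of skolemization (the sentence below is chosen only for this sketch and is not taken from the text): the prenex sentence $\forall x\,\exists y\,(x < y \wedge P(y))$ becomes $\forall x\,(x < f(x) \wedge P(f(x)))$ when the existentially quantified $y$ is replaced by a term built from a new function symbol $f$; the two sentences are satisfiable under exactly the same circumstances, which is all the reduction requires.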
In all of the proof systems presented here we are interested in formulating a simple set of axioms and rules of inference for proving theorems, and we make a special point in each case of proving soundness, i.e., that the axioms are true and the rules lead to correct conclusions, and completeness, i.e., that we have enough axioms and rules to prove all the theorems in the logic under consideration. The completeness proofs tend to be quite different in the different proof systems, with Gödel's completeness theorem for first-order logic being the most difficult.
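In symbols, if one writes $\Sigma \vdash \varphi$ for `$\varphi$ is provable from $\Sigma$' and $\Sigma \models \varphi$ for `$\varphi$ is a logical consequence of $\Sigma$' (the precise notation varies from system to system and is fixed in the relevant chapters), then soundness and completeness together assert that $\Sigma \vdash \varphi$ if and only if $\Sigma \models \varphi$.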
Chapter 6 gives a detailed proof of Gödel's completeness theorem for a particular first-order proof system, using Henkin's technique of adding witness formulas. For clarity each key step of the proof is allotted its own section.
Another common feature of these logics is that they have a compactness theorem. Compactness basically says that something holds in an infinite situation iff it holds in its finite parts, e.g., an infinite graph is 3-colorable iff each finite subgraph is 3-colorable. The compactness theorems follow easily from the completeness theorems. However our approach also shows that one can simply bootstrap the compactness theorem from propositional logic to predicate clause logic to first-order logic. This sequence gives a rather simple proof of compactness for first-order logic.
There are four appendices. Appendix A gives a simple timetable for the development of mathematical logic and computing. Appendix B is a worksheet detailing the steps in a logical development of the natural numbers from Peano's axioms; I like my students to be familiar with this classic work before graduating, and it is an excellent review of mathematical induction. I usually give a portion of this on the first problem assignment. Appendix C has explicit suggestions for writing up definitions and proofs by induction over propositional formulas. Appendix D is another worksheet, this one providing a detailed proof of the completeness of the elegant propositional proof system of Frege and Łukasiewicz.
There is one notational device used in this text that is not so common, namely the use of two symbols for equality, ≈ and =. Starting with Chapter 3, Equational Logic, we use the former symbol when we are working with equations in a particular logical system, and the latter in any context to mean `is the same as'. For example, an expression of the form F(x,y) = (s ≈ t) is to be read: F(x,y) is the equation s ≈ t.
These notes have been designed to give the student an understanding of the most basic ideas from mathematical logic, namely an introduction to the syntax and semantics of formal proof systems, completeness and compactness, as well as significant connections with automated theorem proving. We hope that this introduction and the historical comments made throughout the notes will help the reader see the natural unfolding of the ideas.
The reader will find supplementary text and access to software at the World Wide Web site:
Stanley Burris
Waterloo, 1997