Hierarchy of logical systems

This post is a generalization of one of my previous posts, Abstractions with Set Theory.

At its core, mathematical logic deals with mathematical concepts expressed using formal logical systems.

These systems, though they differ in many details, share the common property of considering only expressions in a fixed formal language.

Here’s the hierarchy of these logical systems:

  1. Propositional logic
    This branch of logic is concerned with the study of propositions (whether they are True or False) that are formed by other propositions with the use of logical connectives.

    The most basic logical connectives are AND \land, OR \lor, and NOT \lnot.

    The connectives are commutative. Here are their truth values (T stands for True, F for False):
    T \land T = T; everything else is F.
    F \lor F = F; everything else is T.
    \lnot F = T, \lnot T = F.
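    These connectives behave exactly like the Boolean operations in Haskell. Here is a minimal sketch (the names land, lor, and lnot are mine, chosen to avoid clashing with the Prelude's (&&), (||), and not):

    ```haskell
    -- The three basic connectives, modeled on Haskell's Bool.
    land, lor :: Bool -> Bool -> Bool
    land = (&&)  -- T `land` T = T; everything else is F
    lor  = (||)  -- F `lor` F = F; everything else is T

    lnot :: Bool -> Bool
    lnot = not   -- lnot F = T, lnot T = F
    ```

    Commutativity can be checked directly: land p q equals land q p for all four combinations of p and q, and likewise for lor.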

    We can also use variables to represent statements.

    For example, we can say “a = Salad is organic”, and thus a is a True statement.
    Another example is “a = Rock is organic”, and thus a is a False statement.
    “a = Hi there!” is neither a True nor a False statement.

    Propositional logic defines an argument to be a list of propositions (the premises) together with a conclusion. For example, from the two premises A \lor B and \lnot B we can conclude A.

    An argument is valid iff, whenever all of the premises are True, the conclusion is also True.

    An easy way to check the validity of this argument is to use the definitions above and draw a table with all possible values of A and B.

    A | B | A OR B | NOT B
    --|---|--------|------
    F | F |   F    |   T
    F | T |   T    |   F
    T | F |   T    |   T
    T | T |   T    |   F
    

    In this case, the only row where all of the premises are True is row 3 (A = T, B = F). There the conclusion A is also True, so the argument is valid, and it holds for any propositions we put in place of A and B.

    Besides using tables to check for values, we can also construct proofs given a natural deduction system.

    We can use the system’s rules to either prove or disprove a statement.
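    The table-based check can also be automated: enumerate every valuation of A and B, and test that the conclusion holds wherever all premises do. A minimal Haskell sketch (the names valid and example are mine):

    ```haskell
    -- A valuation assigns truth values to A and B.
    type Valuation = (Bool, Bool)

    -- An argument is valid iff every valuation that makes all premises
    -- True also makes the conclusion True.
    valid :: [Valuation -> Bool] -> (Valuation -> Bool) -> Bool
    valid premises conclusion =
      and [ conclusion v
          | v <- [ (a, b) | a <- [False, True], b <- [False, True] ]
          , all ($ v) premises ]

    -- The argument from the text: premises A ∨ B and ¬B, conclusion A.
    example :: Bool
    example = valid [\(a, b) -> a || b, \(_, b) -> not b] fst
    -- example == True: in the only valuation where both premises hold
    -- (A = T, B = F), the conclusion A is also True.
    ```

    Dropping the second premise makes the argument invalid: from A \lor B alone we cannot conclude A, since the valuation (F, T) satisfies the premise but not the conclusion.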

  2. First-order logic
    This logical system extends propositional logic by additionally covering predicates and quantifiers.

    A predicate P(x) takes an input x and produces either True or False. For example, given “P(x) = x is organic”, P(Salad) is True, but P(Rock) is False.

    Note that in set theory, P would correspond to a relation, i.e. P \subseteq A \times \{ True, False \}. When working with other systems we need to be careful, as this is not the case with FOL. In FOL, statements like P(Salad) = True and P(Rock) = False are atomic (i.e. they cannot be broken down into smaller statements).

    There are two quantifiers introduced: forall (universal quantifier) \forall and exists (existential quantifier) \exists.

    In the following example the universal quantifier says that the predicate holds for all possible choices of x: \forall x P(x)
    In the following example the existential quantifier says that the predicate holds for at least one choice of x: \exists x P(x)
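    Over a finite domain, the two quantifiers correspond to Haskell's all and any. A small sketch, where the domain and the isOrganic predicate are illustrative assumptions of mine:

    ```haskell
    -- A finite domain of individuals (an assumption for this example).
    domain :: [String]
    domain = ["Salad", "Rock", "Tree"]

    -- P(x) = "x is organic"
    isOrganic :: String -> Bool
    isOrganic x = x `elem` ["Salad", "Tree"]

    forallP, existsP :: Bool
    forallP = all isOrganic domain  -- ∀x P(x): False, since Rock is not organic
    existsP = any isOrganic domain  -- ∃x P(x): True, since Salad is organic
    ```

    This only works because the domain is finite and enumerable; over an infinite domain, \forall cannot be decided by exhaustive checking, which is why proof systems are needed.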

  3. Second-order logic, …, Higher-order (nth-order) logic
    First-order logic quantifies only variables that range over individuals; second-order logic, in addition, also quantifies over sets; third-order logic also quantifies over sets of sets, and so on.

For example, Peano’s axioms (the system that defines natural numbers) are a mathematical concept expressed using a combination of first-order and second-order logic.

This concept consists of a set of axioms for the natural numbers, and all of them (except the ninth, induction axiom) are statements in first-order logic.

The base axioms can be augmented with the arithmetical operations of addition and multiplication and with the order relation, all of which can also be defined using first-order axioms.
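The flavor of these definitions can be sketched in Haskell: zero and successor generate the naturals, and addition is defined by the usual recursion equations. This is an illustration of the recursive definitions, not a formal axiomatization (the names Nat, add, and toInt are mine):

```haskell
-- Peano-style naturals: every natural is Zero or the successor of one.
data Nat = Zero | Succ Nat deriving (Eq, Show)

-- Addition, defined by the first-order recursion equations:
--   0 + n = n,  S(m) + n = S(m + n)
add :: Nat -> Nat -> Nat
add Zero     n = n
add (Succ m) n = Succ (add m n)

-- Interpret a Peano numeral as an ordinary Int, for inspection.
toInt :: Nat -> Int
toInt Zero     = 0
toInt (Succ n) = 1 + toInt n
```

For example, add (Succ Zero) (Succ (Succ Zero)) evaluates to Succ (Succ (Succ Zero)), i.e. 1 + 2 = 3.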

The axiom of induction is second-order, since it quantifies over predicates.

In my next post we’ll have a look at intuitionistic logic, a logical system closely connected to type theory.


Curry–Howard correspondence

For most of my schooling (basic school, high school), I’ve had a private math teacher thanks to my parents. I believe this has shaped my career path for the better.

I’ve always wanted to be a programmer, and I used to be interested in nothing else. For the past few years I’ve been lucky to be doing exactly what I’ve always wanted, and that is programming.

Back in basic school, one of my math teachers kept reminding me: “Study math, it is very closely related to programming”. Now I think I really understand what that statement means.

In any case, I’ve recently started digging into Lean Theorem Prover by Microsoft Research. Having some Haskell experience, and experience with mathematical proofs, the tutorial is mostly easy to follow.

I don’t have much experience with type theory, but I do know some stuff about types from playing with Haskell. I’ve heard about the Curry–Howard correspondence a bunch of times, and it kind of made sense, but I had never really understood it in depth. Following the Lean tutorial finally gave me a proper introduction to it.

An excerpt from Wikipedia:

In programming language theory and proof theory, the Curry–Howard correspondence (also known as the Curry–Howard isomorphism or equivalence, or the proofs-as-programs and propositions- or formulae-as-types interpretation) is the direct relationship between computer programs and mathematical proofs.

In simpler words, a proof is a program, and the formula it proves is the type for the program.

Now as an example, consider the neat function that swaps the two values of a product type:

swap :: (a, b) -> (b, a)
swap (a, b) = (b, a)

What the Curry–Howard correspondence says is that this function has an equivalent form as a mathematical proof.

Although it may not be immediately obvious, think about the following proof:
Given P \land Q, prove Q \land P.
What you do next is use and-elimination (to extract P and Q from the premise) and and-introduction (to combine them in the opposite order).

How does this proof relate to the swap code above? To answer that, we can now consider these theorems within Lean:

variables p q : Prop
theorem and_comm : p ∧ q → q ∧ p := fun hpq, and.intro (and.elim_right hpq) (and.elim_left hpq)

variables a b : Type
theorem swap (hab : prod a b) : (prod b a) := prod.mk hab.2 hab.1

Lean is so awesome it has this #check command that can tell us the complete types:

#check and_comm -- and_comm : ∀ (p q : Prop), p ∧ q → q ∧ p
#check swap     -- swap     : Π (a b : Type), a × b → b × a

Now the shapes are apparent.

We now see the following:

  • and.intro is equivalent to prod.mk (making a product)
  • and.elim_left is equivalent to the first element of the product type
  • and.elim_right is equivalent to the second element of the product type
  • forall is equivalent to the dependent pi-type
    A dependent type is a type whose definition depends on a parameter. For example, consider the polymorphic type List a. This type depends on the parameter a: List Int is a well-defined type, and List Bool is another example.

    More formally, if we’re given A : Type and B : A -> Type, then B is a family of types indexed by A.
    That is, B gives a type B a for each a : A.
    We denote it as Pi a : A, B a.

As a conclusion, it’s interesting how the commutativity of logical AND corresponds to a swap function on product types, right? 🙂

Code checklist

Here is a checklist of some of the most important things (in my opinion) that any code change should be run through, regardless of the programming language:

  • Does what it’s supposed to do
    • Process it mentally (proof of correctness)
    • Automated tests
    • Manual tests
  • Can’t be further broken into smaller pieces
  • Is written in a generalized way (reusable)
  • Has no repeating parts
  • My future-self and coworkers can understand it
    • Is documented
    • Is readable
  • Conforms to (any) coding standards for consistency
  • Separation of concerns

With experience, one doesn’t need to think about these explicitly; they come naturally.

There might be a couple more that I missed, but these should be a good base.