This is a file in the archives of the Stanford Encyclopedia of Philosophy.


Relevance Logic

Relevance logics are non-classical logics. Called ‘relevance logics’ in North America and ‘relevant logics’ in Britain and Australasia, these systems developed as attempts to avoid the paradoxes of material and strict implication. Among the paradoxes of material implication are p → (q → p) and ~p → (p → q). Among the paradoxes of strict implication are the following: (p & ~p) → q and p → (q ∨ ~q). Relevance logicians claim that what is unsettling about these so-called paradoxes is that in each of them the antecedent seems irrelevant to the consequent.

In addition, relevance logicians have had qualms about certain inferences that classical logic makes valid. For example, the inference

The moon is made of green cheese. Therefore, either it is raining in Ecuador now or it is not.
Again here there seems to be a failure of relevance. The conclusion seems to have nothing to do with the premise. Relevance logicians have attempted to construct logics that reject theses and arguments that commit "fallacies of relevance".

At this point some confusion is natural about what relevant logicians have attempted to do. They have not given formal criteria of relevance that any true implication must meet, although some relevant logicians have interpreted the semantics for relevance logic using informal notions of relevance (see the section "Semantics" below). Instead, relevant logic is relevant in two ways: (1) Relevance logics do not force us to accept any irrelevances. That is, they do not make valid any of the paradoxes. (2) Some relevance logics, through their proof theory, yield a relevant notion of proof (see the section "Proof Theory" below).

In this article we will give a brief and relatively non-technical overview of the field of relevant logic.


Semantics

Our exposition of relevant logic is backwards to most found in the literature. We will begin, rather than end, with the semantics, since most philosophers at present are semantically inclined.

The semantics that I present here is the ternary relation semantics due to Richard Routley and Robert K. Meyer. There are algebraic semantics due to J. Michael Dunn and Alasdair Urquhart and operational semantics produced by Kit Fine. These systems are interesting in their own right, but we do not have room to discuss them here.

The idea behind the ternary relation semantics is rather simple. Consider C.I. Lewis' attempt to avoid the paradoxes of material implication. He added a new connective to classical logic, that of strict implication. In post-Kripkean semantic terms, A → B is true at a world w if and only if for all w′ such that w′ is accessible from w, either A fails at w′ or B obtains there. Now, in Kripke's semantics for modal logic, the accessibility relation is a binary relation. It holds between pairs of worlds. Unfortunately, from a relevant point of view, the theory of strict implication is still irrelevant. That is, we still make valid formulae like p → (q → q). We can see quite easily that the Kripke truth condition forces this formula on us.

Like the semantics of modal logic, the semantics of relevance logic relativises truth of formulae to worlds. But Routley and Meyer go modal logic one better and use a three-place relation on worlds. This allows there to be worlds at which q → q fails and that in turn allows worlds at which p → (q → q) fails. Their truth condition for → on this semantics is the following:

A → B is true at a world a if and only if for all worlds b and c such that Rabc (R is the accessibility relation), either A is false at b or B is true at c.
For newcomers, it takes some time to get used to this truth condition. But with a little work it can be seen to be just a generalisation of the Kanger-Kripke truth condition for strict implication (just set b = c).
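The ternary truth condition can be sketched as a small model checker. The following is a minimal illustration over a hypothetical three-world model; the names (R, val, implies_true) are illustrative only, not part of any standard relevance-logic library.

```python
# A toy three-world model: hand-picked worlds, valuation, and ternary
# relation R. All names here are illustrative.

# R contains triples (a, b, c): relative to world a, b "leads to" c.
R = {("a", "b", "b"), ("a", "c", "c"), ("a", "b", "c")}

# val[w] is the set of atoms true at world w.
val = {"a": {"p"}, "b": {"p", "q"}, "c": {"q"}}

def atom_true(atom, w):
    return atom in val[w]

def implies_true(antecedent, consequent, a):
    """A -> B holds at world a iff for all b, c with Rabc,
    either A is false at b or B is true at c."""
    return all(
        not atom_true(antecedent, b) or atom_true(consequent, c)
        for (x, b, c) in R
        if x == a
    )

# p -> q holds at a: every R-triple from a takes a p-world to a q-world.
print(implies_true("p", "q", "a"))  # -> True
# q -> p fails at a: Rabc holds for b = c = "c", where q is true but p is not.
print(implies_true("q", "p", "a"))  # -> False
```

Restricting R to triples with b = c recovers a binary accessibility relation, and the condition collapses into the Kanger-Kripke clause for strict implication.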

But how should this accessibility relation be interpreted philosophically? Oddly, there has not been very much work on this, but there has been some. Mares (1997) uses a theory of information due to David Israel and John Perry (see Israel and Perry (1990)). On this interpretation, in addition to other information a world contains informational links, such as laws of nature, conventions, and so on. Thus, for example, a Newtonian world will contain the information that all matter attracts all other matter. In information-theoretic terms, this world contains the information that two things' being material carries the information that they attract each other. On this view, Rabc if and only if, according to the links in a, all the information carried by what obtains in b is contained in c. Thus, for example, if a is a Newtonian world and the information that x and y are material is contained in b, then the information that x and y attract each other is contained in c.

Another similar interpretation is given in Barwise (1993) and developed in Restall (1996). On this view, worlds are taken to be information-theoretic "sites". Rabc means that a is an information-theoretic channel between b and c. Both this channel-theoretic interpretation of the accessibility relation and the other information-theoretic interpretation attempt to provide the formal semantics with an informal notion of relevance. On these interpretations, what is needed for an implication to be true is that the antecedent carry the information that the consequent obtains. The antecedent must be informationally relevant to the consequent.

By itself, the ternary relation is not sufficient to avoid all the paradoxes of implication. Given what we have said so far, it is not clear how the semantics can avoid paradoxes such as (p & ~p) → q and p → (q ∨ ~q). These paradoxes are avoided by the inclusion of inconsistent and non-bivalent worlds in the semantics. For, if there were no worlds at which p & ~p holds, then, according to our truth condition for the arrow, (p & ~p) → q would also hold everywhere. Likewise, if q ∨ ~q held at every world, then p → (q ∨ ~q) would be universally true.

This brings us to the semantics for negation. The use of non-bivalent and inconsistent worlds requires a non-classical truth condition for negation. In the early 1970s, Richard and Val Routley invented their "star operator" to treat negation. The operator is an operator on worlds. For each world a, there is a world a*. And

~A is true at a if and only if A is false at a*.
Once again, we have the difficulty of interpreting a part of the formal semantics. What may be the nicest interpretation of the Routley star is that of Dunn (1993). Dunn uses a binary relation, C, on worlds. ‘Cab’ means that b is compatible with a. a*, then, is the maximal world (the world containing the most information) that is compatible with a.
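The star condition itself is easy to model. Here is a minimal sketch over a hypothetical three-world model in which one world is inconsistent and its star-mate non-bivalent; the names (star, val, neg_true) are illustrative only.

```python
# A toy model of the Routley star; all names are illustrative.

# Each world is paired with its star world; b* = c and c* = b, so b
# can be inconsistent and c non-bivalent.
star = {"a": "a", "b": "c", "c": "b"}

# val[w] is the set of atoms true at world w.
val = {"a": {"p"}, "b": {"p"}, "c": set()}

def atom_true(atom, w):
    return atom in val[w]

def neg_true(atom, w):
    """~A is true at w iff A is false at w*."""
    return not atom_true(atom, star[w])

# b is inconsistent: p and ~p both hold there (p fails at b* = c).
print(atom_true("p", "b"), neg_true("p", "b"))  # -> True True
# c is non-bivalent: neither p nor ~p holds there (p is true at c* = b).
print(atom_true("p", "c"), neg_true("p", "c"))  # -> False False
```

Note that star here is an involution (a** = a for every world), which is what validates double negation in the standard relevant logics.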

There are other semantics for negation. One, due to Dunn and developed by Routley, is a four-valued semantics. This semantics is treated in the entry on paraconsistent logics.

Proof Theory

There is now a large variety of approaches to proof theory for relevant logics. There is a Gentzen system for the negation-free fragment of the logic R by J.M. Dunn and an elegant and very general Gentzen-style approach called "Display Logic" recently developed by Nuel Belnap. But here I will only deal with a treatment that most philosophers will find somewhat familiar, that is, the natural deduction system due to Anderson and Belnap.

Anderson and Belnap's natural deduction system is based on Fitch's natural deduction systems for classical and intuitionistic logic. The easiest way to understand this technique is by looking at an example.

1. A{1} hyp
2. (A → B){2} hyp
3. B{1,2} 1,2, →E
4. ((A → B) → B){1} 2,3, →I
5. A → ((A → B) → B) 1,4, →I
The numbers in set brackets indicate the assumptions used to prove the formula. We will call them ‘indices’. The idea here is that for an assumption to be counted as helping to generate the conclusion, an index denoting the assumption must appear in the deduction and at some later point be discharged. This ensures that each premise is really used in the deduction. This natural deduction system gives an intuitive understanding of relevance in proofs. The indices keep track of which assumptions are used. For an argument to be valid in this system, all assumptions stated must really be used.

Now, it might seem that the system of indices allows irrelevant premises to creep in. One way in which it might appear that irrelevances can intrude is through the use of a rule of conjunction introduction. That is, it might seem that we can always add in an irrelevant premise by doing, say, the following:

1. A{1} hyp
2. B{2} hyp
3. (A & B){1,2} 1,2, &I
4. B{1,2} 3, &E
5. (B → B){1} 2,4, →I
6. A → (B → B) 1,5, →I
To a relevance logician, the first premise is completely out of place here. To block moves like this, Anderson and Belnap give the following conjunction introduction rule:
From A{i} and B{i} to infer (A & B){i}.
This rule says that two formulae must have the same index before the rule of conjunction introduction can be used.
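The index-tracking discipline, including this restricted conjunction rule, can be sketched by representing proof steps as (formula, index-set) pairs. The function names below (hyp, arrow_elim, and_intro) are illustrative, not a standard proof-checker API.

```python
# A sketch of index tracking in the Anderson-Belnap natural deduction
# system. Formulas are strings or pairs; indices are frozensets of
# hypothesis numbers. All names are illustrative.

def hyp(formula, n):
    """A hypothesis carries its own number as its index."""
    return (formula, frozenset({n}))

def arrow_elim(step_ab, step_a):
    """From A -> B (index i) and A (index j), infer B (index i | j):
    the conclusion records every hypothesis actually used."""
    (ab, i), (a, j) = step_ab, step_a
    antecedent, consequent = ab
    assert antecedent == a
    return (consequent, i | j)

def and_intro(step_a, step_b):
    """Anderson and Belnap's restricted rule: A and B must carry the
    SAME index before (A & B) may be inferred."""
    (a, i), (b, j) = step_a, step_b
    assert i == j, "conjunction introduction requires matching indices"
    return (("&", a, b), i)

# The first proof in the text: from A{1} and (A -> B){2}, infer B{1,2}.
s1 = hyp("A", 1)
s2 = hyp(("A", "B"), 2)  # (A -> B) as an (antecedent, consequent) pair
s3 = arrow_elim(s2, s1)
print(s3)  # -> ('B', frozenset({1, 2}))

# The fallacious proof above is blocked: A{1} and B{2} carry different
# indices, so and_intro raises an AssertionError at step 3.
```

The index union in arrow_elim is what guarantees that every hypothesis contributing to the conclusion shows up in its index and so must eventually be discharged.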

There is, of course, a lot more to the natural deduction system, but this will suffice for our current purposes. The theory of relevance that is captured by at least some relevant logics can be understood in terms of how the corresponding natural deduction system counts a premise as really used and how the rules are allowed to access premises.

Systems of Relevance Logic

Historically, the central systems of relevance logic have been the logic E of entailment and the system R of relevant implication. E was supposed to capture strict relevant implication. But, when a necessity operator and the appropriate modal axioms were added to R (to produce the logic NR), it was discovered that the resulting modal system was different from E. This has left some relevant logicians with a quandary. They have to decide whether to take NR to be the system of strict relevant implication, or to claim that NR is somehow deficient and that E stands as the system of strict relevant implication.

On the other hand, there are those relevance logicians who reject both R and E. On one hand there is Arnon Avron who has used semantic arguments to motivate logics stronger than R. On the other hand there are logicians like Ross Brady, John Slaney, Steve Giambrone, Richard Sylvan, Graham Priest and Greg Restall who have argued for the acceptance of systems weaker than R or E. Among the points in favour of these weaker systems is that, unlike R or E, many of them are decidable.

On an extreme end of the spectrum is the logic S of R.K. Meyer, Errol Martin and Robin Dwyer. This logic contains no theorems of the form A A. In other words, according to S, no proposition implies itself and no argument of the form ‘A, therefore A’ is valid. Thus, this logic does not make valid any circular arguments.

Applications of Relevance Logic

Apart from the motivating applications of providing better formalisms of our pre-formal notions of implication and entailment, relevance logic has been put to various uses in philosophy and computer science. Here I will list just a few.

Dunn has developed a theory of intrinsic and essential properties based on relevant logic. This is his theory of relevant predication. Briefly put, a thing i has a property F relevantly if ∀x(x = i → F(x)). Informally, an object has a property relevantly if being that thing relevantly implies having that property. Since the truth of the consequent of a relevant implication is by itself insufficient for the truth of that implication, things can have properties irrelevantly as well as relevantly. Dunn's formulation would seem to capture at least one sense in which we use the notion of an intrinsic property. Adding modality to the language allows for a formalisation of the notion of an essential property.

Meyer has produced a variant of Peano arithmetic based on the relevance logic, R. Meyer gives a finitary proof that his relevant arithmetic does not have 0 = 1 as a theorem. Thus Meyer solves one of Hilbert's central problems in the context of relevant arithmetic: He shows using finitary means that relevant arithmetic is absolutely consistent. Unfortunately, as Meyer and Friedman have shown, relevant arithmetic does not contain all of the theorems of classical Peano arithmetic. Hence we cannot infer from this that classical Peano arithmetic is absolutely consistent (see Meyer and Friedman (1992)).

In a similar vein, Ross Brady and others have used weak relevant logics as bases for set theories. Brady shows that a weak relevant logic together with some set-theoretic axioms that include a naive comprehension principle is not trivial. That is, not every proposition can be proved in this system.

Anderson (1967) formulates a system of deontic logic based on R. This system avoids some of the standard problems with more traditional deontic logics. For example, the rule of necessitation from A's being a theorem to OA's being a theorem is rejected. Thus, it does not say that all theorems ought to be the case.

Mares and Fuhrmann (1995) present a theory of counterfactual conditionals based on relevant logic. This theory avoids the analogs of the paradoxes of implication that appear in standard logics of counterfactuals.

Relevant logics have been used in computer science as well as in philosophy. Linear logic, a branch of logic discovered by the French logician Girard, is a logic of computational resources. Linear logic is, in fact, a weak relevant logic with the addition of two operators.


Bibliography

An extremely good, although slightly out of date, bibliography on relevance logic was put together by Robert Wolff and is in Anderson, Belnap and Dunn (1992). What follows is a brief list of some of the more influential works in the field and works that are referred to above.

Books on Relevance Logic and Introductions to the Field:

Other Works Cited:

Other Internet Resources

[Please contact the author with suggestions.]

Related Entries

logic: modal | logic: paraconsistent | mathematics: inconsistent

Copyright © 1998
Edwin Mares
