
In propositional logic, a propositional formula is a type of syntactic formula which is well formed. If the values of all variables in a propositional formula are given, it determines a unique truth value. A propositional formula may also be called a propositional expression, a sentence, or a sentential formula.
A propositional formula is constructed from simple propositions, such as "five is greater than three" or propositional variables such as p and q, using connectives or logical operators such as NOT, AND, OR, or IMPLIES; for example:
- (p AND NOT q) IMPLIES (p OR q).
In mathematics, a propositional formula is often more briefly referred to as a "proposition", but, more precisely, a propositional formula is not a proposition but a formal expression that denotes a proposition, a formal object under discussion, just like an expression such as "x + y" is not a value, but denotes a value. In some contexts, maintaining the distinction may be of importance.
Propositions
For the purposes of the propositional calculus, propositions (utterances, sentences, assertions) are considered to be either simple or compound. Compound propositions are considered to be linked by sentential connectives, some of the most common of which are "AND", "OR", "IF ... THEN ...", "NEITHER ... NOR ...", "... IS EQUIVALENT TO ..." . The linking semicolon ";" and the connective "BUT" are considered to be expressions of "AND". A sequence of discrete sentences is considered to be linked by "AND"s, and formal analysis applies a recursive "parenthesis rule" with respect to sequences of simple propositions (see more below about well-formed formulas).
- For example: The assertion: "This cow is blue. That horse is orange but this horse here is purple." is actually a compound proposition linked by "AND"s: ( ("This cow is blue" AND "that horse is orange") AND "this horse here is purple" ) .
Simple propositions are declarative in nature, that is, they make assertions about the condition or nature of a particular object of sensation e.g. "This cow is blue", "There's a coyote!" ("That coyote IS there, behind the rocks."). Thus the simple "primitive" assertions must be about specific objects or specific states of mind. Each must have at least a subject (an immediate object of thought or observation), a verb (in the active voice and present tense preferred), and perhaps an adjective or adverb. "Dog!" probably implies "I see a dog" but should be rejected as too ambiguous.
- Example: "That purple dog is running", "This cow is blue", "Switch M31 is closed", "This cap is off", "Tomorrow is Friday".
For the purposes of the propositional calculus a compound proposition can usually be reworded into a series of simple sentences, although the result will probably sound stilted.
Relationship between propositional and predicate formulas
The predicate calculus goes a step further than the propositional calculus to an "analysis of the inner structure of propositions". It breaks a simple sentence down into two parts (i) its subject (the object (singular or plural) of discourse) and (ii) a predicate (a verb or possibly verb-clause that asserts a quality or attribute of the object(s)). The predicate calculus then generalizes the "subject|predicate" form (where | symbolizes concatenation (stringing together) of symbols) into a form with the following blank-subject structure " ___|predicate", and the predicate in turn is generalized to all things with that property.
- Example: "This blue pig has wings" becomes two sentences in the propositional calculus: "This pig has wings" AND "This pig is blue", whose internal structure is not considered. In contrast, in the predicate calculus, the first sentence breaks into "this pig" as the subject, and "has wings" as the predicate. Thus it asserts that object "this pig" is a member of the class (set, collection) of "winged things". The second sentence asserts that object "this pig" has an attribute "blue" and thus is a member of the class of "blue things". One might choose to write the two sentences connected with AND as:
- p|W AND p|B
The generalization of "this pig" to a (potential) member of two classes "winged things" and "blue things" means that it has a truth-relationship with both of these classes. In other words, given a domain of discourse "winged things", p is either found to be a member of this domain or not. Thus there is a relationship W (wingedness) between p (pig) and { T, F }, W(p) evaluates to { T, F } where { T, F } is the set of the Boolean values "true" and "false". Likewise for B (blueness) and p (pig) and { T, F }: B(p) evaluates to { T, F }. So one now can analyze the connected assertions "B(p) AND W(p)" for its overall truth-value, i.e.:
- ( B(p) AND W(p) ) evaluates to { T, F }
In particular, simple sentences that employ notions of "all", "some", "a few", "one of", etc. called logical quantifiers are treated by the predicate calculus. Along with the new function symbolism "F(x)" two new symbols are introduced: ∀ (For all), and ∃ (There exists ..., At least one of ... exists, etc.). The predicate calculus, but not the propositional calculus, can establish the formal validity of the following statement:
- "All blue pigs have wings but some pigs have no wings, hence some pigs are not blue".
Identity
Tarski asserts that the notion of IDENTITY (as distinguished from LOGICAL EQUIVALENCE) lies outside the propositional calculus; however, he notes that if a logic is to be of use for mathematics and the sciences it must contain a "theory" of IDENTITY. Some authors refer to "predicate logic with identity" to emphasize this extension. See more about this below.
An algebra of propositions, the propositional calculus
An algebra (and there are many different ones), loosely defined, is a method by which a collection of symbols called variables together with some other symbols such as parentheses (, ) and some sub-set of symbols such as *, +, ~, &, ∨, =, ≡, ∧, ¬ are manipulated within a system of rules. These symbols, and well-formed strings of them, are said to represent objects, but in a specific algebraic system these objects do not have meanings. Thus work inside the algebra becomes an exercise in obeying certain laws (rules) of the algebra's syntax (symbol-formation) rather than in semantics (meaning) of the symbols. The meanings are to be found outside the algebra.
For a well-formed sequence of symbols in the algebra —a formula— to have some usefulness outside the algebra the symbols are assigned meanings and eventually the variables are assigned values; then by a series of rules the formula is evaluated.
When the values are restricted to just two and applied to the notion of simple sentences (e.g. spoken utterances or written assertions) linked by propositional connectives this whole algebraic system of symbols and rules and evaluation-methods is usually called the propositional calculus or the sentential calculus.
While some of the familiar rules of arithmetic algebra continue to hold in the algebra of propositions (e.g. the commutative and associative laws for AND and OR), some behave differently: for example, OR distributes over AND (which has no analogue in arithmetic), and NOT does not distribute over AND or OR at all but instead obeys De Morgan's laws (see below).
Usefulness of propositional formulas
Analysis: In deductive reasoning, philosophers, rhetoricians and mathematicians reduce arguments to formulas and then study them (usually with truth tables) for correctness (validity). For example: Is the following argument valid?
- "Given that consciousness is sufficient for an artificial intelligence and only conscious entities can pass the Turing test, before we can conclude that a robot is an artificial intelligence the robot must pass the Turing test."
Engineers analyze the logic circuits they have designed using synthesis techniques and then apply various reduction and minimization techniques to simplify their designs.
Synthesis: Engineers in particular synthesize propositional formulas (that eventually end up as circuits of symbols) from truth tables. For example, one might write down a truth table for how binary addition should behave given the addition of variables "b" and "a" and "carry_in" "ci", and the results "carry_out" "co" and "sum" Σ:
- Example: in row 5, ( (b+a) + ci ) = ( (1+0) + 1 ) = the number "2". Written as a binary number this is 10₂, where "co"=1 and Σ=0 as shown in the right-most columns.
| row | b | a | ci | (b+a)+ci | co | Σ |
|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 0 | 0 | 1 | 1 | 0 | 1 |
| 2 | 0 | 1 | 0 | 1 | 0 | 1 |
| 3 | 0 | 1 | 1 | 2 | 1 | 0 |
| 4 | 1 | 0 | 0 | 1 | 0 | 1 |
| 5 | 1 | 0 | 1 | 2 | 1 | 0 |
| 6 | 1 | 1 | 0 | 2 | 1 | 0 |
| 7 | 1 | 1 | 1 | 3 | 1 | 1 |
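As a rough illustration of this synthesis step, the truth table above can be generated programmatically. The following Python sketch is not from the source; the helper name full_adder_row and the printed layout are assumptions, but the arithmetic mirrors the table: carry_out is the "twos" bit of (b+a)+ci and Σ is the "ones" bit.

```python
# Minimal sketch: enumerate the full-adder truth table shown above.
# The helper name and output format are illustrative assumptions.
def full_adder_row(b: int, a: int, ci: int):
    total = b + a + ci          # arithmetic sum, 0..3
    co = total // 2             # carry_out is the "twos" bit
    s = total % 2               # sum is the "ones" bit
    return co, s

print("row  b a ci | (b+a)+ci co sum")
for row in range(8):
    b, a, ci = (row >> 2) & 1, (row >> 1) & 1, row & 1
    co, s = full_adder_row(b, a, ci)
    print(f"{row:>3}  {b} {a} {ci}  | {b + a + ci:^8} {co:^2} {s}")
```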
Propositional variables
The simplest type of propositional formula is a propositional variable. Propositions that are simple (atomic), symbolic expressions are often denoted by variables named p, q, or P, Q, etc. A propositional variable is intended to represent an atomic proposition (assertion), such as "It is Saturday" = p (here the symbol = means " ... is assigned the variable named ...") or "I only go to the movies on Monday" = q.
Truth-value assignments, formula evaluations
Evaluation of a propositional formula begins with assignment of a truth value to each variable. Because each variable represents a simple sentence, the truth values are being applied to the "truth" or "falsity" of these simple sentences.
Truth values in rhetoric, philosophy and mathematics
The truth values are only two: { TRUTH "T", FALSITY "F" }. An empiricist puts all propositions into two broad classes: analytic—true no matter what (e.g. tautology), and synthetic—derived from experience and thereby susceptible to confirmation by third parties (the verification theory of meaning). Empiricists hold that, in general, to arrive at the truth-value of a synthetic proposition, meanings (pattern-matching templates) must first be applied to the words, and then these meaning-templates must be matched against whatever it is that is being asserted. For example, consider my utterance "That cow is blue!": is this statement a TRUTH? Truly I said it. And maybe I am seeing a blue cow—unless I am lying, my statement is a TRUTH relative to the object of my (perhaps flawed) perception. But is the blue cow "really there"? What do you see when you look out the same window? In order to proceed with a verification, you will need a prior notion (a template) of both "cow" and "blue", and an ability to match the templates against the object of sensation (if indeed there is one).
Truth values in engineering
Engineers try to avoid notions of truth and falsity that bedevil philosophers, but in the final analysis engineers must trust their measuring instruments. In their quest for robustness, engineers prefer to pull known objects from a small library—objects that have well-defined, predictable behaviors even in large combinations, (hence their name for the propositional calculus: "combinatorial logic"). The fewest behaviors of a single object are two (e.g. { OFF, ON }, { open, shut }, { UP, DOWN } etc.), and these are put in correspondence with { 0, 1 }. Such elements are called digital; those with a continuous range of behaviors are called analog. Whenever decisions must be made in an analog system, quite often an engineer will convert an analog behavior (the door is 45.32146% UP) to digital (e.g. DOWN=0 ) by use of a comparator.
Thus an assignment of meaning of the variables and the two value-symbols { 0, 1 } comes from "outside" the formula that represents the behavior of the (usually) compound object. An example is a garage door with two "limit switches", one for UP labelled SW_U and one for DOWN labelled SW_D, and whatever else is in the door's circuitry. Inspection of the circuit (either the diagram or the actual objects themselves—door, switches, wires, circuit board, etc.) might reveal that, on the circuit board "node 22" goes to +0 volts when the contacts of switch "SW_D" are mechanically in contact ("closed") and the door is in the "down" position (95% down), and "node 29" goes to +0 volts when the door is 95% UP and the contacts of switch SW_U are in mechanical contact ("closed"). The engineer must define the meanings of these voltages and all possible combinations (all 4 of them), including the "bad" ones (e.g. both nodes 22 and 29 at 0 volts, meaning that the door is open and closed at the same time). The circuit mindlessly responds to whatever voltages it experiences without any awareness of TRUTH or FALSEHOOD, RIGHT or WRONG, SAFE or DANGEROUS.
Propositional connectives
Arbitrary propositional formulas are built from propositional variables and other propositional formulas using propositional connectives. Examples of connectives include:
- The unary negation connective ¬. If φ is a formula, then ¬φ is a formula.
- The classical binary connectives ∧, ∨, →, ↔. Thus, for example, if φ and ψ are formulas, so is (φ → ψ).
- Other binary connectives, such as NAND, NOR, and XOR
- The ternary connective IF ... THEN ... ELSE ...
- Constant 0-ary connectives ⊤ and ⊥ (alternately, constants { T, F }, { 1, 0 }, etc.)
- The "theory-extension" connective EQUALS (alternately, IDENTITY, or the sign " = " as distinguished from the "logical connective" ↔)
Connectives of rhetoric, philosophy and mathematics
The following are the connectives common to rhetoric, philosophy and mathematics together with their truth tables. The symbols used will vary from author to author and between fields of endeavor. In general the abbreviations "T" and "F" stand for the evaluations TRUTH and FALSITY applied to the variables in the propositional formula (e.g. the assertion: "That cow is blue" will have the truth-value "T" for Truth or "F" for Falsity, as the case may be.).
The connectives go by a number of different word-usages, e.g. "a IMPLIES b" is also said "IF a THEN b". Some of these are shown in the table.
Other readings for the implication column are "IF b THEN a", "b only if a", "b IS SUFFICIENT FOR a", and "a IS NECESSARY FOR b"; for the biconditional column, "b IF AND ONLY IF a", "b IFF a", "b PRECISELY WHEN a", and "b IS NECESSARY AND SUFFICIENT FOR a". The OR shown is the inclusive OR.

| b | a | NOT b ¬(b) | NOT a ¬(a) | b AND a (b ∧ a) | b OR a (b ∨ a) | b IMPLIES a (b → a) | b IS logically equivalent TO a *** (b ↔ a) | f IS A tautology (f = formula) | NEITHER a NOR b (a NOR b) | b stroke a (NAND) | exclusive OR |
|---|---|---|---|---|---|---|---|---|---|---|---|
| F | F | T | T | F | F | T | T | T | T | T | F |
| F | T | T | F | F | T | T | F | T | F | T | T |
| T | F | F | T | F | T | F | F | T | F | T | T |
| T | T | F | F | T | T | T | T | T | F | F | F |
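The table above can be reproduced mechanically. A minimal Python sketch follows; the dictionary of lambda definitions is an assumption chosen to mirror the column headings, not notation from the source.

```python
# Sketch: reproduce the connective truth table for b, a in {F, T}.
connectives = {
    "NOT b":             lambda b, a: not b,
    "NOT a":             lambda b, a: not a,
    "b AND a":           lambda b, a: b and a,
    "b OR a":            lambda b, a: b or a,
    "b IMPLIES a":       lambda b, a: (not b) or a,   # material implication
    "b IFF a":           lambda b, a: b == a,         # biconditional
    "NEITHER a NOR b":   lambda b, a: not (b or a),
    "b stroke a (NAND)": lambda b, a: not (b and a),
    "exclusive OR":      lambda b, a: b != a,
}
for b in (False, True):
    for a in (False, True):
        row = {name: f(b, a) for name, f in connectives.items()}
        print(b, a, row)
```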
Engineering connectives

In general, the engineering connectives are just the same as the mathematics connectives except that they tend to evaluate with "1" = "T" and "0" = "F". This is done for the purposes of analysis/minimization and synthesis of formulas by use of the notion of minterms and Karnaugh maps (see below). Engineers also use the words logical product from Boole's notion (a*a = a) and logical sum from Jevons' notion (a+a = a).
| row number: b*2¹ + a*2⁰ | b | a | NOT ~(b) | NOT ~(a) | AND, logical product (b & a) | OR, logical sum (b ∨ a) | NAND ~(b & a) | NOR ~(b ∨ a) | XOR, exclusive OR, half-adder (no carry) (b ⊕ a) |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 |
| 1 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 1 |
| 2 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 |
| 3 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 |
CASE connective: IF ... THEN ... ELSE ...
The IF ... THEN ... ELSE ... connective appears as the simplest form of CASE operator of recursion theory and computation theory and is the connective responsible for conditional goto's (jumps, branches). From this one connective all other connectives can be constructed (see more below). Although " IF c THEN b ELSE a " sounds like an implication it is, in its most reduced form, a switch that makes a decision and offers as outcome only one of two alternatives "a" or "b" (hence the name switch statement in the C programming language).
The following three propositions are equivalent (as indicated by the logical equivalence sign ≡ ):
- ( IF 'counter is zero' THEN 'go to instruction b' ELSE 'go to instruction a' ) ≡
- ( (c → b) & (~c → a) ) ≡ " ( IF 'counter is zero' THEN 'go to instruction b' ) AND ( IF 'it is NOT the case that counter is zero' THEN 'go to instruction a' ) " ≡
- ( (c & b) ∨ (~c & a) ) ≡ " ( 'counter is zero' AND 'go to instruction b' ) OR ( 'it is NOT the case that counter is zero' AND 'go to instruction a' ) "
Thus IF ... THEN ... ELSE—unlike implication—does not evaluate to an ambiguous "TRUTH" when the first proposition is false i.e. c = F in (c → b). For example, most people would reject the following compound proposition as a nonsensical non sequitur because the second sentence is not connected in meaning to the first.
- Example: The proposition " IF 'Winston Churchill was Chinese' THEN 'The sun rises in the east' " evaluates as a TRUTH given that 'Winston Churchill was Chinese' is a FALSEHOOD and 'The sun rises in the east' evaluates as a TRUTH.
In recognition of this problem, the sign → of formal implication in the propositional calculus is called material implication to distinguish it from the everyday, intuitive implication.
The use of the IF ... THEN ... ELSE construction avoids controversy because it offers a completely deterministic choice between two stated alternatives; it offers two "objects" (the two alternatives b and a), and it selects between them exhaustively and unambiguously. In the truth table below, d1 is the formula: ( (IF c THEN b) AND (IF NOT-c THEN a) ). Its fully reduced form d2 is the formula: ( (c AND b) OR (NOT-c AND a) ). The two formulas are equivalent as shown by the columns "d1" and "d2". Electrical engineers call the fully reduced formula the AND-OR-SELECT operator. The CASE (or SWITCH) operator is an extension of the same idea to n possible but mutually exclusive outcomes. Electrical engineers call the CASE operator a multiplexer.
| row | c | b | a | (c → b) | (~(c) → a) | d1 = ( (c → b) & (~(c) → a) ) | (c & b) | (~(c) & a) | d2 = ( (c & b) ∨ (~(c) & a) ) |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 1 |
| 2 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 3 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 |
| 4 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 |
| 5 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
| 6 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 1 |
| 7 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 |
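The same equivalence can be checked by brute force. A minimal sketch, assuming the usual definition of material implication (the helper name implies is illustrative):

```python
from itertools import product

# d1 = (c -> b) & (~c -> a); d2 = (c & b) | (~c & a)
implies = lambda x, y: (not x) or y

for c, b, a in product((0, 1), repeat=3):
    d1 = implies(c, b) and implies(not c, a)
    d2 = (c and b) or ((not c) and a)
    assert bool(d1) == bool(d2)
print("d1 and d2 agree on all 8 rows")
```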
IDENTITY and evaluation
The first table of this section stars *** the entry logical equivalence to note the fact that "logical equivalence" is not the same thing as "identity". For example, most would agree that the assertion "That cow is blue" is identical to the assertion "That cow is blue". On the other hand, logical equivalence sometimes appears in speech as in this example: " 'The sun is shining' means 'I'm biking' ". Translated into a propositional formula, the words become: "IF 'the sun is shining' THEN 'I'm biking', AND IF 'I'm biking' THEN 'the sun is shining'":
- "IF 's' THEN 'b' AND IF 'b' THEN 's' " is written as ((s → b) & (b → s)) or in an abbreviated form as (s ↔ b). As the rightmost symbol string is a definition for a new symbol in terms of the symbols on the left, the use of the IDENTITY sign = is appropriate:
- ((s → b) & (b → s)) = (s ↔ b)
Different authors use different signs for logical equivalence: ↔ (e.g. Suppes, Goodstein, Hamilton), ≡ (e.g. Robbin), ⇔ (e.g. Bender and Williamson). Typically identity is written as the equals sign =. One exception to this rule is found in Principia Mathematica. For more about the philosophy of the notion of IDENTITY see Leibniz's law.
As noted above, Tarski considers IDENTITY to lie outside the propositional calculus, but he asserts that without the notion, "logic" is insufficient for mathematics and the deductive sciences. In fact the sign comes into the propositional calculus when a formula is to be evaluated.
In some systems there are no truth tables, but rather just formal axioms (e.g. strings of symbols from a set { ~, →, (, ), variables p1, p2, p3, ... }) and formula-formation rules (rules about how to make more symbol strings from previous strings by use of e.g. substitution and modus ponens). The result of such a calculus will be another formula (i.e. a well-formed symbol string). Eventually, however, if one wants to use the calculus to study notions of validity and truth, one must add axioms that define the behavior of the symbols called "the truth values" {T, F} (or {1, 0}, etc.) relative to the other symbols.
For example, Hamilton uses two symbols = and ≠ when he defines the notion of a valuation v of any well-formed formulas (wffs) A and B in his "formal statement calculus" L. A valuation v is a function from the wffs of his system L to the range (output) { T, F }, given that each variable p1, p2, p3 in a wff is assigned an arbitrary truth value { T, F }.
- (i) v(A) ≠ v(~A)
- (ii) v(A → B) = F if and only if v(A) = T and v(B) = F
The two definitions (i) and (ii) define the equivalent of the truth tables for the ~ (NOT) and → (IMPLICATION) connectives of his system. The first one derives F ≠ T and T ≠ F, in other words " v(A) does not mean v(~A)". Definition (ii) specifies the third row in the truth table, and the other three rows then come from an application of definition (i). In particular (ii) assigns the value F (or a meaning of "F") to the entire expression. The definitions also serve as formation rules that allow substitution of a value previously derived into a formula:
| v(A) | v(B) | v(A → B) |
|---|---|---|
| F | F | T |
| F | T | T |
| T | F | F |
| T | T | T |
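Read computationally, definitions (i) and (ii) amount to a recursive valuation function. The sketch below is an assumed formalization, not Hamilton's notation: formulas are nested tuples, variables are looked up in an assignment, ~ flips the value, and → returns F exactly when the antecedent evaluates to T and the consequent to F.

```python
# Sketch of a valuation v for formulas built from variables, ~ and ->.
# A formula is a nested tuple: ("var", name), ("not", A), or ("imp", A, B).
def v(formula, assignment):
    tag = formula[0]
    if tag == "var":
        return assignment[formula[1]]            # arbitrary T/F assignment to p1, p2, ...
    if tag == "not":
        return not v(formula[1], assignment)     # definition (i): v(A) != v(~A)
    if tag == "imp":
        a, b = v(formula[1], assignment), v(formula[2], assignment)
        return not (a and not b)                 # definition (ii): F only when v(A)=T, v(B)=F
    raise ValueError(tag)

p1 = ("var", "p1"); p2 = ("var", "p2")
print(v(("imp", p1, ("not", p2)), {"p1": True, "p2": True}))   # -> False
```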
Some formal systems specify these valuation axioms at the outset in the form of certain formulas such as the law of contradiction or laws of identity and nullity. The choice of which ones to use, together with laws such as commutation and distribution, is up to the system's designer as long as the set of axioms is complete (i.e. sufficient to form and to evaluate any well-formed formula created in the system).
More complex formulas
As shown above, the CASE (IF c THEN b ELSE a ) connective is constructed either from the 2-argument connectives IF ... THEN ... and AND or from OR and AND and the 1-argument NOT. Connectives such as the n-argument AND (a & b & c & ... & n), OR (a ∨ b ∨ c ∨ ... ∨ n) are constructed from strings of two-argument AND and OR and written in abbreviated form without the parentheses. These, and other connectives as well, can then be used as building blocks for yet further connectives. Rhetoricians, philosophers, and mathematicians use truth tables and the various theorems to analyze and simplify their formulas.
Electrical engineering uses drawn symbols and connects them with lines that stand for the mathematical acts of substitution and replacement. Engineers then verify their drawings with truth tables and simplify the expressions as shown below by use of Karnaugh maps or the theorems. In this way engineers have created a host of "combinatorial logic" (i.e. connectives without feedback) such as "decoders", "encoders", "multifunction gates", "majority logic", "binary adders", "arithmetic logic units", etc.
Definitions
A definition creates a new symbol and its behavior, often for the purposes of abbreviation. Once the definition is presented, either form of the equivalent symbol or formula can be used. The symbolism =Df follows the convention of Reichenbach. Some examples of convenient definitions drawn from the symbol set { ~, &, (, ) } and variables follow. Each definition produces a logically equivalent formula that can be used for substitution or replacement.
- definition of a new variable: (c & d) =Df s
- OR: ~(~a & ~b) =Df (a ∨ b)
- IMPLICATION: (~a ∨ b) =Df (a → b)
- XOR: (~a & b) ∨ (a & ~b) =Df (a ⊕ b)
- LOGICAL EQUIVALENCE: ( (a → b) & (b → a) ) =Df ( a ≡ b )
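Each of these definitions can be confirmed by exhaustive evaluation. In the sketch below (helper names are illustrative) the defined connectives are written using only ~ and &, exactly as in the definitions, and compared with the intended truth tables.

```python
from itertools import product

# Primitive connectives: ~ and &.
NOT = lambda x: not x
AND = lambda x, y: x and y

# Defined connectives, written only with ~ and & as in the definitions above.
OR  = lambda a, b: NOT(AND(NOT(a), NOT(b)))            # ~(~a & ~b)
IMP = lambda a, b: OR(NOT(a), b)                       # (~a v b)
XOR = lambda a, b: OR(AND(NOT(a), b), AND(a, NOT(b)))  # (~a & b) v (a & ~b)
EQV = lambda a, b: AND(IMP(a, b), IMP(b, a))           # (a -> b) & (b -> a)

# Check against the intended truth tables.
for a, b in product((False, True), repeat=2):
    assert OR(a, b)  == (a or b)
    assert IMP(a, b) == ((not a) or b)
    assert XOR(a, b) == (a != b)
    assert EQV(a, b) == (a == b)
print("all definitions verified")
```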
Axiom and definition schemas
The definitions above for OR, IMPLICATION, XOR, and logical equivalence are actually schemas (or "schemata"), that is, they are models (demonstrations, examples) for a general formula format but shown (for illustrative purposes) with specific letters a, b, c for the variables, whereas any variable letters can go in their places as long as the letter substitutions follow the rule of substitution below.
- Example: In the definition (~a ∨ b) =Df (a → b), other variable-symbols such as "SW2" and "CON1" might be used, i.e. formally:
- a =Df SW2, b =Df CON1, so we would have as an instance of the definition schema (~SW2 ∨ CON1) =Df (SW2 → CON1)
Substitution versus replacement
Substitution: The variable or sub-formula to be substituted with another variable, constant, or sub-formula must be replaced in all instances throughout the overall formula.
- Example: (c & d) ∨ (p & ~(c & ~d)), but (q1 & ~q2) ≡ d. Now wherever variable "d" occurs, substitute (q1 & ~q2):
- (c & (q1 & ~q2)) ∨ (p & ~(c & ~(q1 & ~q2)))
Replacement: (i) the formula to be replaced must be within a tautology, i.e. logically equivalent (connected by ≡ or ↔) to the formula that replaces it, and (ii) unlike substitution it is permissible for the replacement to occur only in one place (i.e. for one formula).
- Example: Use this set of formula schemas/equivalences:
- ( (a ∨ 0) ≡ a ).
- ( (a & ~a) ≡ 0 ).
- ( (~a ∨ b) =Df (a → b) ).
- start with "a": a
- Use 1 to replace "a" with (a ∨ 0): (a ∨ 0)
- Use the notion of "schema" to substitute b for a in 2: ( (b & ~b) ≡ 0 )
- Use 2 to replace 0 with (b & ~b): ( a ∨ (b & ~b) )
- (see below for how to distribute "a ∨" over (b & ~b), etc.)
Inductive definition
The classical presentation of propositional logic (see Enderton 2002) uses the connectives ¬, ∧, ∨, →, ↔. The set of formulas over a given set of propositional variables is inductively defined to be the smallest set of expressions such that:
- Each propositional variable in the set is a formula,
- (¬A) is a formula whenever A is, and
- (A □ B) is a formula whenever A and B are formulas and □ is one of the binary connectives ∧, ∨, →, ↔.
This inductive definition can be easily extended to cover additional connectives.
The inductive definition can also be rephrased in terms of a closure operation (Enderton 2002). Let V denote a set of propositional variables and let X_V denote the set of all strings from an alphabet including symbols in V, left and right parentheses, and all the logical connectives under consideration. Each logical connective corresponds to a formula-building operation, which takes one or two strings in X_V and returns a new string in X_V:
- Given a string z, the operation for ¬ returns (¬z).
- Given strings y and z, the operation for ∧ returns (y ∧ z). There are similar operations for ∨, →, and ↔ corresponding to the other binary connectives.
The set of formulas over V is defined to be the smallest subset of X_V containing V and closed under all the formula building operations.
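The formula-building operations can be mirrored directly as string functions. The sketch below is an assumption about one way to code them (the function names are not Enderton's); it performs one round of closure over a two-variable set.

```python
# Sketch: formula-building operations on strings, mirroring the closure definition.
def build_not(z: str) -> str:
    return f"(¬{z})"

def build_binary(connective: str, y: str, z: str) -> str:
    return f"({y} {connective} {z})"

V = {"p", "q"}
formulas = set(V)
# One round of closure: apply every operation once to what we have so far.
new = {build_not(z) for z in formulas}
new |= {build_binary(op, y, z) for op in ("∧", "∨", "→", "↔")
        for y in formulas for z in formulas}
formulas |= new
print(sorted(formulas))   # p, q, (¬p), (p ∧ q), (p → q), ...
```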
Parsing formulas
The following "laws" of the propositional calculus are used to "reduce" complex formulas. The "laws" can be verified easily with truth tables. For each law, the principal (outermost) connective is associated with logical equivalence ≡ or identity =. A complete analysis of all 2n combinations of truth-values for its n distinct variables will result in a column of 1's (T's) underneath this connective. This finding makes each law, by definition, a tautology. And, for a given law, because its formula on the left and right are equivalent (or identical) they can be substituted for one another.
- Example: The following truth table is De Morgan's law for the behavior of NOT over OR: ~(a ∨ b) ≡ (~a & ~b). To the left of the principal connective ≡ (yellow column labelled "taut") the formula ~(b ∨ a) evaluates to (1, 0, 0, 0) under the label "P". On the right of "taut" the formula (~(b) ∨ ~(a)) also evaluates to (1, 0, 0, 0) under the label "Q". As the two columns have equivalent evaluations, the logical equivalence ≡ under "taut" evaluates to (1, 1, 1, 1), i.e. P ≡ Q. Thus either formula can be substituted for the other if it appears in a larger formula.
| b | a | (b ∨ a) | P: ~(b ∨ a) | ~(b) | ~(a) | Q: (~(b) & ~(a)) | taut: P ≡ Q |
|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 |
| 0 | 1 | 1 | 0 | 1 | 0 | 0 | 1 |
| 1 | 0 | 1 | 0 | 0 | 1 | 0 | 1 |
| 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 |
Enterprising readers might challenge themselves to invent an "axiomatic system" that uses the symbols { ∨, &, ~, (, ), variables a, b, c }, the formation rules specified above, and as few as possible of the laws listed below, and then derive as theorems the others as well as the truth-table valuations for ∨, &, and ~. One set attributed to Huntington (1904) (Suppes:204) uses eight of the laws defined below.
If used in an axiomatic system, the symbols 1 and 0 (or T and F) are considered to be well-formed formulas and thus obey all the same rules as the variables. Thus the laws listed below are actually axiom schemas, that is, they stand in place of an infinite number of instances. Thus ( x ∨ y ) ≡ ( y ∨ x ) might be used in one instance, ( p ∨ 0 ) ≡ ( 0 ∨ p ) and in another instance ( 1 ∨ q ) ≡ ( q ∨ 1 ), etc.
Connective seniority (symbol rank)
In general, to avoid confusion during analysis and evaluation of propositional formulas, one can make liberal use of parentheses. However, quite often authors leave them out. To parse a complicated formula one first needs to know the seniority, or rank, that each of the connectives (excepting *) has over the other connectives. To "well-form" a formula, start with the connective with the highest rank and add parentheses around its components, then move down in rank (paying close attention to the connective's scope over which it is working). From most- to least-senior, with the predicate signs ∀x and ∃x, the IDENTITY = and arithmetic signs added for completeness:
- ≡ (LOGICAL EQUIVALENCE)
- → (IMPLICATION)
- & (AND)
- ∨ (OR)
- ~ (NOT)
- ∀x (FOR ALL x)
- ∃x (THERE EXISTS AN x)
- = (IDENTITY)
- + (arithmetic sum)
- * (arithmetic multiply)
- ' (s, arithmetic successor)
Thus the formula can be parsed—but because NOT does not obey the distributive law, the parentheses around the inner formula (~c & ~d) are mandatory:
- Example: " d & c ∨ w " rewritten is ( (d & c) ∨ w )
- Example: " a & a → b ≡ a & ~a ∨ b " rewritten (rigorously) is
- ≡ has seniority: ( ( a & a → b ) ≡ ( a & ~a ∨ b ) )
- → has seniority: ( ( a & (a → b) ) ≡ ( a & ~a ∨ b ) )
- & has seniority on both sides: ( ( (a) & (a → b) ) ≡ ( (a) & (~a ∨ b) ) )
- ~ has seniority: ( ( (a) & (a → b) ) ≡ ( (a) & (~(a) ∨ b) ) )
- check 8 ( -parentheses and 8 ) -parentheses: ( ( (a) & (a → b) ) ≡ ( (a) & (~(a) ∨ b) ) )
- Example:
- d & c ∨ p & ~(c & ~d) ≡ c & d ∨ p & c ∨ p & ~d rewritten is ( ( (d & c) ∨ ( p & ~((c & ~(d)) ) ) ) ≡ ( (c & d) ∨ (p & c) ∨ (p & ~(d)) ) )
Commutative and associative laws
Both AND and OR obey the commutative law and associative law:
- Commutative law for OR: ( a ∨ b ) ≡ ( b ∨ a )
- Commutative law for AND: ( a & b ) ≡ ( b & a )
- Associative law for OR: (( a ∨ b ) ∨ c ) ≡ ( a ∨ (b ∨ c) )
- Associative law for AND: (( a & b ) & c ) ≡ ( a & (b & c) )
Omitting parentheses in strings of AND and OR: The connectives are considered to be unary (one-variable, e.g. NOT) and binary (i.e. two-variable AND, OR, IMPLIES). For example:
- ( (c & d) ∨ (p & c) ∨ (p & ~d) ) above should be written ( ((c & d) ∨ (p & c)) ∨ (p & ~(d) ) ) or possibly ( (c & d) ∨ ( (p & c) ∨ (p & ~(d)) ) )
However, a truth-table demonstration shows that the form without the extra parentheses is perfectly adequate.
Omitting parentheses with regards to a single-variable NOT: While ~(a) where a is a single variable is perfectly clear, ~a is adequate and is the usual way this literal would appear. When the NOT is over a formula with more than one symbol, then the parentheses are mandatory, e.g. ~(a ∨ b).
Distributive laws
OR distributes over AND and AND distributes over OR. NOT does not distribute over AND or OR. See below about De Morgan's law:
- Distributive law for OR: ( c ∨ ( a & b) ) ≡ ( (c ∨ a) & (c ∨ b) )
- Distributive law for AND: ( c & ( a ∨ b) ) ≡ ( (c & a) ∨ (c & b) )
De Morgan's laws
NOT, when distributed over OR or AND, does something peculiar (again, these can be verified with a truth-table):
- De Morgan's law for OR: ¬(a ∨ b) ≡ (¬a ∧ ¬b)
- De Morgan's law for AND: ¬(a ∧ b) ≡ (¬a ∨ ¬b)
Laws of absorption
Absorption, in particular the first one, causes the "laws" of logic to differ from the "laws" of arithmetic:
- Absorption (idempotency) for OR: (a ∨ a) ≡ a
- Absorption (idempotency) for AND: (a & a) ≡ a
Laws of evaluation: Identity, nullity, and complement
The sign " = " (as distinguished from logical equivalence ≡, alternately ↔ or ⇔) symbolizes the assignment of value or meaning. Thus the string (a & ~(a)) symbolizes "0", i.e. it means the same thing as symbol "0" ". In some "systems" this will be an axiom (definition) perhaps shown as ( (a & ~(a)) =Df 0 ); in other systems, it may be derived in the truth table below:
| a | ~(a) | (a & ~(a)) | taut: ≡ | 0 |
|---|---|---|---|---|
| 0 | 1 | 0 | 1 | 0 |
| 1 | 0 | 0 | 1 | 0 |
- Commutation of equality: (a = b) ≡ (b = a)
- Identity for OR: (a ∨ 0) = a or (a ∨ F) = a
- Identity for AND: (a & 1) = a or (a & T) = a
- Nullity for OR: (a ∨ 1) = 1 or (a ∨ T) = T
- Nullity for AND: (a & 0) = 0 or (a & F) = F
- Complement for OR: (a ∨ ~a) = 1 or (a ∨ ~a) = T, law of excluded middle
- Complement for AND: (a & ~a) = 0 or (a & ~a) = F, law of contradiction
Double negative (involution)
- ¬(¬a) ≡ a
Well-formed formulas (wffs)
A key property of formulas is that they can be uniquely parsed to determine the structure of the formula in terms of its propositional variables and logical connectives. When formulas are written in infix notation, as above, unique readability is ensured through an appropriate use of parentheses in the definition of formulas. Alternatively, formulas can be written in Polish notation or reverse Polish notation, eliminating the need for parentheses altogether.
The inductive definition of infix formulas in the previous section can be converted to a formal grammar in Backus-Naur form:
<formula> ::= <propositional variable> | ( ¬ <formula> ) | ( <formula> ∧ <formula>) | ( <formula> ∨ <formula> ) | ( <formula> → <formula> ) | ( <formula> ↔ <formula> )
It can be shown that any expression matched by the grammar has a balanced number of left and right parentheses, and any nonempty initial segment of a formula has more left than right parentheses. This fact can be used to give an algorithm for parsing formulas. For example, suppose that an expression x begins with ( ¬. Starting after the second symbol, match the shortest subexpression y of x that has balanced parentheses. If x is a formula, there is exactly one symbol left after this expression, this symbol is a closing parenthesis, and y itself is a formula. This idea can be used to generate a recursive descent parser for formulas.
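A recursive descent parser for this fully parenthesized grammar is short. The sketch below is illustrative (token handling and names are assumptions): it returns a nested tuple for a well-formed input and fails an assertion otherwise.

```python
# Sketch: recursive descent parser for the fully parenthesized grammar above.
BINARY = {"∧", "∨", "→", "↔"}

def parse(tokens, i=0):
    """Parse one formula starting at index i; return (tree, next index)."""
    tok = tokens[i]
    if tok != "(":                       # a propositional variable
        return tok, i + 1
    if tokens[i + 1] == "¬":             # ( ¬ <formula> )
        sub, j = parse(tokens, i + 2)
        assert tokens[j] == ")"
        return ("¬", sub), j + 1
    left, j = parse(tokens, i + 1)       # ( <formula> op <formula> )
    op = tokens[j]
    assert op in BINARY
    right, k = parse(tokens, j + 1)
    assert tokens[k] == ")"
    return (op, left, right), k + 1

tokens = "( ( ¬ p ) ∨ ( q → r ) )".split()
tree, end = parse(tokens)
assert end == len(tokens)
print(tree)    # ('∨', ('¬', 'p'), ('→', 'q', 'r'))
```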
Example of parenthesis counting:
This method locates as "1" the principal connective — the connective under which the overall evaluation of the formula occurs for the outer-most parentheses (which are often omitted). It also locates the inner-most connective where one would begin evaluatation of the formula without the use of a truth table, e.g. at "level 6".
start | ( | ( | ( | c | & | d | ) | V | ( | p | & | ~ | ( | ( | c | & | ~ | ( | d | ) | ) | ) | ) | ) | = | ( | ( | ( | c | & | d | ) | V | ( | p | & | d | ) | ) | V | ( | p | & | ~ | ( | c | ) | ) | ) | ) | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
count | 0 | 1 | 2 | 3 | 3 | 3 | 3 | 2 | 2 | 3 | 3 | 3 | 3 | 4 | 5 | 5 | 5 | 5 | 6 | 6 | 5 | 4 | 3 | 3 | 1 | 1 | 2 | 3 | 4 | 4 | 4 | 4 | 3 | 3 | 4 | 4 | 4 | 4 | 3 | 2 | 2 | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 2 | 1 | 0 |
Well-formed formulas versus valid formulas in inferences
The notion of valid argument is usually applied to inferences in arguments, but arguments reduce to propositional formulas and can be evaluated the same as any other propositional formula. Here a valid inference means: "The formula that represents the inference evaluates to "truth" beneath its principal connective, no matter what truth-values are assigned to its variables", i.e. the formula is a tautology. Quite possibly a formula will be well-formed but not valid. Another way of saying this is: "Being well-formed is necessary for a formula to be valid but it is not sufficient." The only way to find out if it is both well-formed and valid is to submit it to verification with a truth table or by use of the "laws":
- Example 1: What does one make of the following difficult-to-follow assertion? Is it valid? "If it's sunny, but if the frog is croaking then it's not sunny, then it's the same as saying that the frog isn't croaking." Convert this to a propositional formula as follows:
- " IF (a AND (IF b THEN NOT-a) THEN NOT-a" where " a " represents "its sunny" and " b " represents "the frog is croaking":
- ( ( (a) & ( (b) → ~(a) ) ≡ ~(b) )
- This is well-formed, but is it valid? In other words, when evaluated will this yield a tautology (all T) beneath the logical-equivalence symbol ≡ ? The answer is NO, it is not valid. However, if reconstructed as an implication then the argument is valid.
- "Saying it's sunny, but if the frog is croaking then it's not sunny, implies that the frog isn't croaking."
- Other circumstances may be preventing the frog from croaking: perhaps a crane ate it.
- Example 2 (from Reichenbach via Bertrand Russell):
- "If pigs have wings, some winged animals are good to eat. Some winged animals are good to eat, so pigs have wings."
- ( ((a) → (b)) & (b) → (a) ) is well formed, but an invalid argument as shown by the red evaluation under the principal implication:
| a | b | (a → b) | ( (a → b) & b ) | arg: ( ( (a → b) & b ) → a ) |
|---|---|---|---|---|
| 0 | 0 | 1 | 0 | 1 |
| 0 | 1 | 1 | 1 | 0 |
| 1 | 0 | 0 | 0 | 1 |
| 1 | 1 | 1 | 1 | 1 |
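The same verdict can be reached by brute force. A minimal sketch (names are illustrative) evaluates the formula of Example 2 on all four assignments and shows the falsifying row a=0, b=1:

```python
from itertools import product

implies = lambda x, y: (not x) or y

# Example 2: ( ((a -> b) & b) -> a )
for a, b in product((False, True), repeat=2):
    value = implies(implies(a, b) and b, a)
    print(a, b, value)        # False when a=False, b=True, so the argument form is invalid
```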
Reduced sets of connectives

A set of logical connectives is called complete if every propositional formula is tautologically equivalent to a formula with just the connectives in that set. There are many complete sets of connectives, including { ∧, ¬ }, { ∨, ¬ }, and { →, ¬ }. There are two binary connectives that are complete on their own, corresponding to NAND and NOR, respectively. Some pairs are not complete, for example { ∧, ∨ }.
The stroke (NAND)
The binary connective corresponding to NAND is called the Sheffer stroke, and written with a vertical bar | or vertical arrow ↑. The completeness of this connective was noted in Principia Mathematica (1927:xvii). Since it is complete on its own, all other connectives can be expressed using only the stroke. For example, where the symbol " ≡ " represents logical equivalence:
- ~p ≡ p|p
- p → q ≡ p|~q
- p ∨ q ≡ ~p|~q
- p & q ≡ ~(p|q)
In particular, the zero-ary connectives ⊤ (representing truth) and ⊥ (representing falsity) can be expressed using the stroke, e.g. ⊤ ≡ (p | (p | p)) and ⊥ ≡ (⊤ | ⊤).
IF ... THEN ... ELSE
This connective together with { 0, 1 } (or { F, T } or { ⊥, ⊤ }) forms a complete set. In the following the IF...THEN...ELSE relation (c, b, a) = d represents ( (c → b) & (~c → a) ) ≡ ( (c & b) ∨ (~c & a) ) = d
- (c, b, a):
- (c, 0, 1) ≡ ~c
- (c, b, 1) ≡ (c → b)
- (c, c, a) ≡ (c ∨ a)
- (c, b, c) ≡ (c & b)
Example: The following shows how a theorem-based proof of "(c, b, 1) ≡ (c → b)" would proceed; below the proof is its truth-table verification. (Note: (c → b) is defined to be (~c ∨ b).)
- Begin with the reduced form: ( (c & b) ∨ (~c & a) )
- Substitute "1" for a: ( (c & b) ∨ (~c & 1) )
- Identity (~c & 1) = ~c: ( (c & b) ∨ (~c) )
- Law of commutation for V: ( (~c) ∨ (c & b) )
- Distribute "~c V" over (c & b): ( ((~c) ∨ c ) & ((~c) ∨ b )
- Law of excluded middle (((~c) ∨ c ) = 1 ): ( (1) & ((~c) ∨ b ) )
- Distribute "(1) &" over ((~c) ∨ b): ( ((1) & (~c)) ∨ ((1) & b )) )
- Commutivity and Identity (( 1 & ~c) = (~c & 1) = ~c, and (( 1 & b) ≡ (b & 1) ≡ b: ( ~c ∨ b )
- ( ~c ∨ b ) is defined as c → b Q. E. D.
In the following truth table the column labelled "taut" for tautology evaluates logical equivalence (symbolized here by ≡) between the two columns labelled d. Because all four rows under "taut" are 1's, the equivalence indeed represents a tautology.
| rows | c | b | a | (c & b) | (~(c) & a) | d: ( (c & b) ∨ (~(c) & a) ) | taut: ≡ | d: ( ~(c) ∨ b ) |
|---|---|---|---|---|---|---|---|---|
| 0,1 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 |
| 2,3 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 |
| 4,5 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 |
| 6,7 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 |
Normal forms
An arbitrary propositional formula may have a very complicated structure. It is often convenient to work with formulas that have simpler forms, known as normal forms. Some common normal forms include conjunctive normal form and disjunctive normal form. Any propositional formula can be reduced to its conjunctive or disjunctive normal form.
Reduction to normal form

Reduction to normal form is relatively simple once a truth table for the formula is prepared. But further attempts to minimize the number of literals (see below) require some tools: reduction by De Morgan's laws and truth tables can be unwieldy, but Karnaugh maps are very suitable for a small number of variables (5 or fewer). Some sophisticated tabular methods exist for more complex circuits with multiple outputs but these are beyond the scope of this article; for more see Quine–McCluskey algorithm.
Literal, term and alterm
In electrical engineering, a variable x or its negation ~(x) can be referred to as a literal. A string of literals connected by ANDs is called a term. A string of literals connected by OR is called an alterm. Typically the literal ~(x) is abbreviated ~x. Sometimes the &-symbol is omitted altogether in the manner of algebraic multiplication.
- Examples
- a, b, c, d are variables. ((( a & ~(b) ) & ~(c)) & d) is a term. This can be abbreviated as (a & ~b & ~c & d), or a~b~cd.
- p, q, r, s are variables. (((p ∨ ~(q) ) ∨ r) ∨ ~(s) ) is an alterm. This can be abbreviated as (p ∨ ~q ∨ r ∨ ~s).
Minterms
In the same way that a 2ⁿ-row truth table displays the evaluation of a propositional formula for all 2ⁿ possible values of its variables, n variables produce a Karnaugh map with 2ⁿ squares (even though we cannot always draw it in its full-dimensional realization). For example, 3 variables produce 2³ = 8 rows and 8 Karnaugh squares; 4 variables produce 16 truth-table rows and 16 squares and therefore 16 minterms. Each Karnaugh-map square and its corresponding truth-table evaluation represents one minterm.
Any propositional formula can be reduced to the "logical sum" (OR) of the active (i.e. "1"- or "T"-valued) minterms. When in this form the formula is said to be in disjunctive normal form. But even though it is in this form, it is not necessarily minimized with respect to either the number of terms or the number of literals.
In the following table, observe the peculiar numbering of the rows: (0, 1, 3, 2, 6, 7, 5, 4, 0). The first column is the decimal equivalent of the binary equivalent of the digits "cba", in other words:
- Example
- cba₂ = c*2² + b*2¹ + a*2⁰:
- cba = (c=1, b=0, a=1) = 101₂ = 1*2² + 0*2¹ + 1*2⁰ = 5₁₀
This numbering comes about because as one moves down the table from row to row only one variable at a time changes its value. Gray code is derived from this notion. This notion can be extended to three and four-dimensional hypercubes called Hasse diagrams where each corner's variables change only one at a time as one moves around the edges of the cube. Hasse diagrams (hypercubes) flattened into two dimensions are either Veitch diagrams or Karnaugh maps (these are virtually the same thing).
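The reflected Gray code can be generated directly with the standard conversion n XOR (n >> 1); this is a general fact about Gray codes rather than anything specific to the source. The short sketch below reproduces the row order 0, 1, 3, 2, 6, 7, 5, 4 used in the table that follows.

```python
# Sketch: the reflected Gray code n ^ (n >> 1) reproduces the row order 0,1,3,2,6,7,5,4.
gray_order = [i ^ (i >> 1) for i in range(8)]
print(gray_order)                      # [0, 1, 3, 2, 6, 7, 5, 4]
for g in gray_order:
    c, b, a = (g >> 2) & 1, (g >> 1) & 1, g & 1
    print(g, (c, b, a))
```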
When working with Karnaugh maps one must always keep in mind that the top edge wraps around to the bottom edge, and the left edge wraps around to the right edge—the Karnaugh diagram is really a three- or four- or n-dimensional flattened object.
decimal equivalent of (c, b, a) | c | b | a | minterm |
---|---|---|---|---|
0 | 0 | 0 | 0 | (~c & ~b & ~a) |
1 | 0 | 0 | 1 | (~c & ~b & a) |
3 | 0 | 1 | 1 | (~c & b & a) |
2 | 0 | 1 | 0 | (~c & b & ~a) |
6 | 1 | 1 | 0 | (c & b & ~a) |
7 | 1 | 1 | 1 | (c & b & a) |
5 | 1 | 0 | 1 | (c & ~b & a) |
4 | 1 | 0 | 0 | (c & ~b & ~a) |
0 | 0 | 0 | 0 | (~a & ~b & ~c) |
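Reading the active minterms off a truth table and OR-ing them together can be automated. The sketch below (function name and output format are assumptions) builds the minterm list for an arbitrary three-variable function.

```python
from itertools import product

def minterm_dnf(f, names=("c", "b", "a")):
    """Return the active minterms of f as strings, one per row where f is 1."""
    terms = []
    for values in product((0, 1), repeat=len(names)):
        if f(*values):
            lits = [n if v else f"~{n}" for n, v in zip(names, values)]
            terms.append("(" + " & ".join(lits) + ")")
    return terms

# Example: f(c, b, a) = c XOR a
print(" ∨ ".join(minterm_dnf(lambda c, b, a: c ^ a)))
# (~c & ~b & a) ∨ (~c & b & a) ∨ (c & ~b & ~a) ∨ (c & b & ~a)
```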
Reduction by use of the map method (Veitch, Karnaugh)
Veitch improved the notion of Venn diagrams by converting the circles to abutting squares, and Karnaugh simplified the Veitch diagram by converting the minterms, written in their literal-form (e.g. ~abc~d) into numbers. The method proceeds as follows:
Produce the formula's truth table
Produce the formula's truth table. Number its rows using the binary equivalents of the variables (usually just sequentially 0 through 2ⁿ−1) for n variables.
- Technically, the propositional function has been reduced to its (unminimized) disjunctive normal form: each row has its minterm expression and these can be OR'd to produce the formula in its (unminimized) disjunctive normal form.
Example: ((c & d) ∨ (p & ~(c & (~d)))) = q in disjunctive normal form is:
- ( (~p & d & c ) ∨ (p & d & c) ∨ (p & d & ~c) ∨ (p & ~d & ~c) ) = q
However, this formula can be reduced both in the number of terms (from 4 to 3) and in the total count of its literals (12 to 6).
| row | minterm | p | d | c | (c & d) | ~( c & ~(d) ) | ( p & ~( c & ~(d) ) ) | q | Active minterms |
|---|---|---|---|---|---|---|---|---|---|
| 0 | ( ~p & ~d & ~c ) | 0 | 0 | 0 | 0 | 1 | 0 | 0 | |
| 1 | ( ~p & ~d & c ) | 0 | 0 | 1 | 0 | 0 | 0 | 0 | |
| 2 | ( ~p & d & ~c ) | 0 | 1 | 0 | 0 | 1 | 0 | 0 | |
| 3 | ( ~p & d & c ) | 0 | 1 | 1 | 1 | 1 | 0 | 1 | (~p & d & c) |
| 4 | ( p & ~d & ~c ) | 1 | 0 | 0 | 0 | 1 | 1 | 1 | (p & ~d & ~c) |
| 5 | ( p & ~d & c ) | 1 | 0 | 1 | 0 | 0 | 0 | 0 | |
| 6 | ( p & d & ~c ) | 1 | 1 | 0 | 0 | 1 | 1 | 1 | (p & d & ~c) |
| 7 | ( p & d & c ) | 1 | 1 | 1 | 1 | 1 | 1 | 1 | (p & d & c) |

q = (~p & d & c) ∨ (p & ~d & ~c) ∨ (p & d & ~c) ∨ (p & d & c)
Create the formula's Karnaugh map

Use the values of the formula (e.g. "p") found by the truth-table method and place them into their respective (associated) Karnaugh squares (these are numbered per the Gray code convention). If values of "d" for "don't care" appear in the table, this adds flexibility during the reduction phase.
Reduce minterms
Minterms of adjacent (abutting) 1-squares (T-squares) can be reduced with respect to the number of their literals, and the number terms also will be reduced in the process. Two abutting squares (2 x 1 horizontal or 1 x 2 vertical, even the edges represent abutting squares) lose one literal, four squares in a 4 x 1 rectangle (horizontal or vertical) or 2 x 2 square (even the four corners represent abutting squares) lose two literals, eight squares in a rectangle lose 3 literals, etc. (One seeks out the largest square or rectangles and ignores the smaller squares or rectangles contained totally within it. ) This process continues until all abutting squares are accounted for, at which point the propositional formula is minimized.
For example, squares #3 and #7 abut; these two squares can lose one literal (the "p", because #3 contains ~p and #7 contains p while their other literals agree).
Example: The map method usually is done by inspection. The following example expands the algebraic method to show the "trick" behind the combining of terms on a Karnaugh map:
- Minterms #3 and #7 abut, #7 and #6 abut, and #4 and #6 abut (because the table's edges wrap around). So each of these pairs can be reduced.
Observe that by the idempotency law (A ∨ A) = A, we can create more terms. Then by the associative and distributive laws the variables to be eliminated can be paired off, and then removed with the law of excluded middle (x ∨ ~x) = 1 and the identity law (x & 1) = x. The following uses brackets [ and ] only to keep track of the terms; they have no special significance:
- Put the formula to be reduced in disjunctive normal form:
- q = ( (~p & d & c ) ∨ (p & d & c) ∨ (p & d & ~c) ∨ (p & ~d & ~c) ) = ( #3 ∨ #7 ∨ #6 ∨ #4 )
- Idempotency (absorption) ( A ∨ A ) = A:
- ( #3 ∨ [ #7 ∨ #7 ] ∨ [ #6 ∨ #6 ] ∨ #4 )
- Associative law (x ∨ (y ∨ z)) = ( (x ∨ y) ∨ z )
- ( [ #3 ∨ #7 ] ∨ [ #7 ∨ #6 ] ∨ [ #6 ∨ #4] )
- [ (~p & d & c ) ∨ (p & d & c) ] ∨ [ (p & d & c) ∨ (p & d & ~c) ] ∨ [ (p & d & ~c) ∨ (p & ~d & ~c) ].
- Distributive law ( x & (y ∨ z) ) = ( (x & y) ∨ (x & z) ) :
- ( [ (d & c) ∨ (~p & p) ] ∨ [ (p & d) ∨ (~c & c) ] ∨ [ (p & ~c) ∨ (c & ~c) ] )
- Commutative law and law of contradiction (x & ~x) = (~x & x) = 0:
- ( [ (d & c) ∨ (0) ] ∨ [ (p & d) ∨ (0) ] ∨ [ (p & ~c) ∨ (0) ] )
- Law of identity ( x ∨ 0 ) = x leading to the reduced form of the formula:
- q = ( (d & c) ∨ (p & d) ∨ (p & ~c) )
Verify reduction with a truth table
| row | minterm | p | d | c | (d & c) | (p & d) | (p & ~(c)) | q = ( (d & c) ∨ (p & d) ∨ (p & ~(c)) ) |
|---|---|---|---|---|---|---|---|---|
| 0 | ( ~p & ~d & ~c ) | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | ( ~p & ~d & c ) | 0 | 0 | 1 | 0 | 0 | 0 | 0 |
| 2 | ( ~p & d & ~c ) | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 3 | ( ~p & d & c ) | 0 | 1 | 1 | 1 | 0 | 0 | 1 |
| 4 | ( p & ~d & ~c ) | 1 | 0 | 0 | 0 | 0 | 1 | 1 |
| 5 | ( p & ~d & c ) | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
| 6 | ( p & d & ~c ) | 1 | 1 | 0 | 0 | 1 | 1 | 1 |
| 7 | ( p & d & c ) | 1 | 1 | 1 | 1 | 1 | 0 | 1 |

The reduced formula evaluates to 1 exactly on rows 3, 4, 6 and 7, the same active minterms as the original formula.
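The same verification can be done programmatically by comparing the original and reduced formulas on all eight assignments; a minimal sketch (names are illustrative):

```python
from itertools import product

original = lambda p, d, c: (c and d) or (p and not (c and not d))
reduced  = lambda p, d, c: (d and c) or (p and d) or (p and not c)

assert all(original(p, d, c) == reduced(p, d, c)
           for p, d, c in product((False, True), repeat=3))
print("reduced formula agrees with the original on all 8 rows")
```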
Impredicative propositions
Given the following examples-as-definitions, what does one make of the subsequent reasoning:
- (1) "This sentence is simple." (2) "This sentence is complex, and it is conjoined by AND."
Then assign the variable "s" to the left-most sentence "This sentence is simple". Define "compound" c = "not simple" ~s, and assign c = ~s to "This sentence is compound"; assign "j" to "It [this sentence] is conjoined by AND". The second sentence can be expressed as:
- ( NOT(s) AND j )
If truth values are to be placed on the sentences c = ~s and j, then all are clearly FALSEHOODS: e.g. "This sentence is complex" is a FALSEHOOD (it is simple, by definition). So their conjunction (AND) is a falsehood. But when taken in its assembled form, the sentence is a TRUTH.
This is an example of the paradoxes that result from an impredicative definition—that is, when an object m has a property P, but the object m is defined in terms of property P. The best advice for a rhetorician or one involved in deductive analysis is to avoid impredicative definitions but at the same time be on the lookout for them because they can indeed create paradoxes. Engineers, on the other hand, put them to work in the form of propositional formulas with feedback.
Propositional formula with "feedback"
The notion of a propositional formula appearing as one of its own variables requires a formation rule that allows the assignment of the formula to a variable. In general there is no stipulation, in either axiomatic or truth-table systems of objects and relations, that forbids this from happening.
The simplest case occurs when an OR formula becomes one of its own inputs, e.g. p = q. Begin with (p ∨ s) = q, then let p = q. Observe that q's "definition" depends on itself "q" as well as on "s" and the OR connective; this definition of q is thus impredicative. Either of two conditions can result: oscillation or memory.
It helps to think of the formula as a black box. Without knowledge of what is going on "inside" the formula-"box" from the outside it would appear that the output is no longer a function of the inputs alone. That is, sometimes one looks at q and sees 0 and other times 1. To avoid this problem one has to know the state (condition) of the "hidden" variable p inside the box (i.e. the value of q fed back and assigned to p). When this is known the apparent inconsistency goes away.
To understand [predict] the behavior of formulas with feedback requires the more sophisticated analysis of sequential circuits. Propositional formulas with feedback lead, in their simplest form, to state machines; they also lead to memories in the form of Turing tapes and counter-machine counters. From combinations of these elements one can build any sort of bounded computational model (e.g. Turing machines, counter machines, register machines, Macintosh computers, etc.).
Oscillation
In the abstract (ideal) case the simplest oscillating formula is a NOT fed back to itself: ~(p) = q with the output q fed back as the input p, i.e. p = q. Analysis of an abstract (ideal) propositional formula in a truth-table reveals an inconsistency for both p=1 and p=0 cases: when p=1, q=0, and this cannot be because p=q; ditto for when p=0 and q=1.
| p | ~(p) = q | note |
|---|---|---|
| 0 | 1 | q & p inconsistent |
| 1 | 0 | q & p inconsistent |

Oscillation with delay: If a delay (ideal or non-ideal) is inserted in the abstract formula between p and q then p will oscillate between 1 and 0: 101010...101... ad infinitum. If either the delay or the NOT is not abstract (i.e. not ideal), the type of analysis to be used will depend upon the exact nature of the objects that make up the oscillator; such things fall outside mathematics and into engineering.
Analysis requires a delay to be inserted and then the loop cut between the delay and the input "p". The delay must be viewed as a kind of proposition that has "qd" (q-delayed) as output for "q" as input. This new proposition adds another column to the truth table. The inconsistency is now between "qd" and "p", as marked in the table; two stable states result:
| qd | p | ~(p) = q | note |
|---|---|---|---|
| 0 | 0 | 1 | state 1 |
| 0 | 1 | 0 | qd & p inconsistent |
| 1 | 0 | 1 | qd & p inconsistent |
| 1 | 1 | 0 | state 0 |
Memory


Without delay, inconsistencies must be eliminated from a truth table analysis. With the notion of "delay", this condition presents itself as a momentary inconsistency between the fed-back output variable q and p = qdelayed.
A truth table reveals the rows where inconsistencies occur between p = q-delayed at the input and q at the output. After "breaking" the feed-back, the truth table construction proceeds in the conventional manner. But afterwards, in every row the output q is compared to the now-independent input p and any inconsistencies between p and q are noted (i.e. p=0 together with q=1, or p=1 and q=0); when the "line" is "remade" both are rendered impossible by the law of contradiction ~(p & ~p). Rows revealing inconsistencies are either considered transient states or just eliminated as inconsistent and hence "impossible".
Once-flip memory
About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output "q" feeding back into "p". Given that the formula is first evaluated (initialized) with p=0 & q=0, it will "flip" once when "set" by s=1. Thereafter, output "q" will sustain itself in the "flipped" condition (state q=1). This behavior, now time-dependent, is shown in the once-flip's state diagram.
| p | s | ( | s | ∨ | p | ) | = q |  |
|---|---|---|---|---|---|---|---|---|
| 0 | 0 |  | 0 | 0 | 0 |  | 0 | state 0, s=0 |
| 0 | 1 |  | 1 | 1 | 0 |  | 1 | q & p inconsistent |
| 1 | 0 |  | 0 | 1 | 1 |  | 1 | state 1 with s = 0 |
| 1 | 1 |  | 1 | 1 | 1 |  | 1 | state 1 with s = 1 |
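A time-step sketch of the same once-flip (Python; the names are illustrative): q starts at 0, a single s=1 pulse "sets" it, and the feedback then holds q at 1 no matter what s does afterwards:

```python
def once_flip(s_sequence):
    q = 0                       # initialize with p = q = 0
    trace = []
    for s in s_sequence:
        q = int(s or q)         # (s ∨ p) = q, with p fed back from q
        trace.append(q)
    return trace

#                s:  0  0  1  0  0  0
print(once_flip([0, 0, 1, 0, 0, 0]))    # [0, 0, 1, 1, 1, 1] -- q stays "flipped"
```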
Flip-flop memory
The next simplest case is the "set-reset" flip-flop, shown below the once-flip. Given that r=0 & s=0 and q=0 at the outset, it is "set" (s=1) in a manner similar to the once-flip. It, however, has a provision to "reset" q=0 when "r"=1. An additional complication occurs if both set=1 and reset=1: in this formula set=1 forces the output q=1, so the flip-flop is reset only when (s=0 & r=1), and set when (s=1 & r=0). In the abstract (ideal) instance in which s and r return to 0 simultaneously (s: 1 ⇒ 0 and r: 1 ⇒ 0), the formula for q will be indeterminate (undecidable). Due to delays in "real" OR, AND and NOT the result will be unknown at the outset but thereafter predictable.
| p | s | r | ( | s | ∨ | ( | p | & | ~ | ( | r | ) | ) | ) | = q |  |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 |  | 0 | 0 |  | 0 | 0 | 1 |  | 0 |  |  |  | 0 | state 0 with ( s=0 & r=0 ) |
| 0 | 0 | 1 |  | 0 | 0 |  | 0 | 0 | 0 |  | 1 |  |  |  | 0 | state 0 with ( s=0 & r=1 ) |
| 0 | 1 | 0 |  | 1 | 1 |  | 0 | 0 | 1 |  | 0 |  |  |  | 1 | q & p inconsistent |
| 0 | 1 | 1 |  | 1 | 1 |  | 0 | 0 | 0 |  | 1 |  |  |  | 1 | q & p inconsistent |
| 1 | 0 | 0 |  | 0 | 1 |  | 1 | 1 | 1 |  | 0 |  |  |  | 1 | state 1 with ( s=0 & r=0 ) |
| 1 | 0 | 1 |  | 0 | 0 |  | 1 | 0 | 0 |  | 1 |  |  |  | 0 | q & p inconsistent |
| 1 | 1 | 0 |  | 1 | 1 |  | 1 | 1 | 1 |  | 0 |  |  |  | 1 | state 1 with ( s=1 & r=0 ) |
| 1 | 1 | 1 |  | 1 | 1 |  | 1 | 0 | 0 |  | 1 |  |  |  | 1 | state 1 with s & r simultaneously 1 |
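The set-reset behaviour can be sketched the same way (Python; names are illustrative). Note that in this particular formula s=1 dominates r=1, as the last row of the table shows:

```python
def sr_flip_flop(sr_sequence):
    q = 0                                 # start with s=0, r=0, q=0
    trace = []
    for s, r in sr_sequence:
        q = int(s or (q and not r))       # (s ∨ (p & ~(r))) = q, p fed back from q
        trace.append(q)
    return trace

#                   (s,r): set     hold    reset   hold    s=1 & r=1
print(sr_flip_flop([(1,0), (0,0), (0,1), (0,0), (1,1)]))   # [1, 1, 0, 0, 1]
```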
Clocked flip-flop memory
The formula known as "clocked flip-flop" memory ("c" is the "clock" and "d" is the "data") is given below. It works as follows: When c = 0 the data d (either 0 or 1) cannot "get through" to affect output q. When c = 1 the data d "gets through" and output q "follows" d's value. When c goes from 1 to 0 the last value of the data remains "trapped" at output "q". As long as c=0, d can change value without causing q to change.
- Examples
- ( ( c & d ) ∨ ( p & ( ~( c & ~( d ) ) ) ) ) = q, but now let p = q:
- ( ( c & d ) ∨ ( q & ( ~( c & ~( d ) ) ) ) ) = q
The state diagram is similar in shape to the flip-flop's state diagram, but with different labelling on the transitions.
| row | q | d | c | (c & d) | ~(d) | (c & ~(d)) | ~(c & ~(d)) | (q & ~(c & ~(d))) | = q | Description |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | state 0 with ( d=0 & c=0 ), 0 is trapped |
| 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | state 0 with ( d=0 & c=1 ): q=0 is following d=0 |
| 2 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | state 0 with ( d=1 & c=0 ), 0 is trapped |
| 3 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 1 | q & p inconsistent |
| 4 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | state 1 with ( d=0 & c=0 ), 1 is trapped |
| 5 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | q & p inconsistent |
| 6 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | state 1 with ( d=1 & c=0 ), 1 is trapped |
| 7 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | state 1 with ( d=1 & c=1 ): q=1 is following d=1 |
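A corresponding sketch of the clocked (data-latch) formula (Python; names are illustrative): while c=1 the output q follows d, and when c returns to 0 the last value of d remains trapped at q:

```python
def clocked_flip_flop(dc_sequence):
    q = 0
    trace = []
    for d, c in dc_sequence:
        # ( (c & d) ∨ ( q & ~(c & ~(d)) ) ) = q, with q fed back
        q = int((c and d) or (q and not (c and not d)))
        trace.append(q)
    return trace

# (d,c): latch a 1 while c=1, then wiggle d with c=0, then latch a 0
print(clocked_flip_flop([(1, 1), (1, 0), (0, 0), (1, 0), (0, 1), (0, 0)]))
# -> [1, 1, 1, 1, 0, 0]   q changes only while the clock c is 1
```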
Historical development
Bertrand Russell (1912:74) lists three laws of thought that derive from Aristotle: (1) The law of identity: "Whatever is, is.", (2) The law of noncontradiction: "Nothing can both be and not be", and (3) The law of excluded middle: "Everything must be or not be."
- Example: Here O is an expression about an object's BEING or QUALITY:
- Law of Identity: O = O
- Law of contradiction: ~(O & ~(O))
- Law of excluded middle: (O ∨ ~(O))
The use of the word "everything" in the law of excluded middle renders Russell's expression of this law open to debate. If restricted to an expression about BEING or QUALITY with reference to a finite collection of objects (a finite "universe of discourse"), the members of which can be investigated one after another for the presence or absence of the assertion, then the law is considered intuitionistically appropriate. Thus an assertion such as: "This object must either BE or NOT BE (in the collection)", or "This object must either have this QUALITY or NOT have this QUALITY (relative to the objects in the collection)" is acceptable. See more at Venn diagram.
Although a propositional calculus originated with Aristotle, the notion of an algebra applied to propositions had to wait until the early 19th century. In an (adverse) reaction to the 2000-year tradition of Aristotle's syllogisms, John Locke's Essay concerning human understanding (1690) used the word semiotics (theory of the use of symbols). By 1826 Richard Whately had critically analyzed the syllogistic logic with a sympathy toward Locke's semiotics. George Bentham's work (1827) resulted in the notion of "quantification of the predicate" (nowadays symbolized as ∀ ≡ "for all"). A "row" instigated by William Hamilton over a priority dispute with Augustus De Morgan "inspired George Boole to write up his ideas on logic, and to publish them as MAL [Mathematical Analysis of Logic] in 1847" (Grattan-Guinness and Bornet 1997:xxviii).
About his contribution Grattan-Guinness and Bornet comment:
- "Boole's principal single innovation was [the] law [ xn = x ] for logic: it stated that the mental acts of choosing the property x and choosing x again and again is the same as choosing x once... As consequence of it he formed the equations x•(1-x)=0 and x+(1-x)=1 which for him expressed respectively the law of contradiction and the law of excluded middle" (p. xxviiff). For Boole "1" was the universe of discourse and "0" was nothing.
Gottlob Frege's massive undertaking (1879) resulted in a formal calculus of propositions, but his symbolism was so daunting that it had little influence except on one person: Bertrand Russell. First, as a student of Alfred North Whitehead, Russell studied Frege's work and suggested a (famous and notorious) emendation with respect to it (1904) around the problem of an antinomy that he had discovered in Frege's treatment (cf. Russell's paradox). Russell's work led to a collaboration with Whitehead that, in the year 1912, produced the first volume of Principia Mathematica (PM). It is here that what we consider "modern" propositional logic first appeared. In particular, PM introduces NOT and OR and the assertion symbol ⊦ as primitives. In terms of these notions they define IMPLICATION → (def. *1.01: ~p ∨ q), then AND (def. *3.01: ~(~p ∨ ~q) ), then EQUIVALENCE p ←→ q (*4.01: (p → q) & (q → p) ).
- Henry M. Sheffer (1921) and Jean Nicod demonstrate that only one connective, the "stroke" | is sufficient to express all propositional formulas.
- Emil Post (1921) develops the truth-table method of analysis in his "Introduction to a general theory of elementary propositions". He notes Nicod's stroke | .
- Whitehead and Russell add an introduction to their 1927 re-publication of PM that includes, in part, a favorable treatment of the "stroke".
Computation and switching logic:
- William Eccles and F. W. Jordan (1919) describe a "trigger relay" made from a vacuum tube.
- George Stibitz (1937) invents the binary adder using mechanical relays. He builds this on his kitchen table.
- Example: Given binary bits ai and bi and carry-in c_ini, their sum Σi and carry-out c_outi are (a short code sketch of this adder appears after this list):
- ( ( ai XOR bi ) XOR c_ini ) = Σi
- ( ( ai & bi ) ∨ ( c_ini & ( ai XOR bi ) ) ) = c_outi
- Alan Turing builds a multiplier using relays (1937–1938). He has to hand-wind his own relay coils to do this.
- Textbooks about "switching circuits" appear in the early 1950s.
- Willard Quine (1952, 1955), E. W. Veitch (1952), and M. Karnaugh (1953) develop map methods for simplifying propositional functions.
- George H. Mealy (1955) and Edward F. Moore (1956) address the theory of sequential (i.e. switching-circuit) "machines".
- E. J. McCluskey and H. Shorr develop a method for simplifying propositional (switching) circuits (1962).
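For illustration, here is a minimal Python sketch of the Stibitz-style binary adder built from the propositional formulas in the example above (the function names and the ripple-carry wrapper are illustrative, not from the source):

```python
def full_adder(a, b, c_in):
    """One bit position: ((a XOR b) XOR c_in) = sum, ((a & b) ∨ (c_in & (a XOR b))) = carry-out."""
    s = (a ^ b) ^ c_in
    c_out = (a & b) | (c_in & (a ^ b))
    return s, c_out

def ripple_add(x_bits, y_bits):
    """Add two equal-length bit lists, least-significant bit first."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 3 + 6 = 9: bits listed least-significant first
print(ripple_add([1, 1, 0], [0, 1, 1]))   # [1, 0, 0, 1], i.e. binary 1001
```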
Footnotes
- Rosenbloom discusses this problem of implication at some length. Most philosophers and mathematicians just accept the material definition as given above. But some do not, including the intuitionists; they consider it a form of the law of excluded middle misapplied.
- Rosenbloom, and Kleene 1952:73-74, rank all 11 symbols.
Citations
- Kao, Eric J.; Genesereth, Michael (2017). "Propositional Logic". Introduction to Logic (3rd ed.). doi:10.1007/978-3-031-01801-5. ISBN 9783031018015.
- Hamilton 1978:1
- Principia Mathematica (PM) p. 91 eschews "the" because they require a clear-cut "object of sensation"; they stipulate the use of "this"
- (italics added) Reichenbach p. 80.
- Tarski p.54-68. Suppes calls IDENTITY a "further rule of inference" and has a brief development around it; Robbin, Bender and Williamson, and Goodstein introduce the sign and its usage without comment or explanation. Hamilton p. 37 employs two signs ≠ and = with respect to the valuation of a formula in a formal calculus. Kleene p. 70 and Hamilton p. 52 place it in the predicate calculus, in particular with regards to the arithmetic of natural numbers.
- Empiricists eschew the notion of a priori (built-in, born-with) knowledge. "Radical reductionists" such as John Locke and David Hume "held that every idea must either originate directly in sense experience or else be compounded of ideas thus originating"; quoted from Quine reprinted in 1996 The Emergence of Logical Empiricism, Garland Publishing Inc. http://www.marxists.org/reference/subject/philosophy/works/us/quine.htm
- Neural net modelling offers a good mathematical model for a comparator as follows: Given a signal S and a threshold "thr", subtract "thr" from S and substitute this difference d into a sigmoid function: for large "gains" k, e.g. k=100, 1/( 1 + e^(−k*d) ) = 1/( 1 + e^(−k*(S−thr)) ) ≈ 0 or ≈ 1, depending on the sign of d. For example, if "The door is DOWN" means "The door is less than 50% of the way up", then a threshold thr=0.5 corresponding to 0.5*5.0 = +2.50 volts could be applied to a "linear" measuring-device with an output of 0 volts when fully closed and +5.0 volts when fully open.
- In actuality the digital 1 and 0 are defined over non-overlapping voltage ranges, e.g. { "1" = +5 volts with tolerance +0.2/−1.0 volts, "0" = 0 volts with tolerance +0.5/−0.2 volts }. When a value falls outside the defined range(s) the value becomes "u" -- unknown; e.g. +2.3 would be "u".
- While the notion of logical product is not so peculiar (e.g. 0*0=0, 0*1=0, 1*0=0, 1*1=1), the notion of logical sum 1+1=1 is peculiar; in fact (a "+" b) = (a + (b - a*b)), where "+" is the "logical sum" but the + and - on the right are the true arithmetic operations. Occasionally all four notions do appear in a formula: A AND B = 1/2*( A plus B minus ( A XOR B ) ) (cf p. 146 in John Wakerly 1978, Error Detecting Codes, Self-Checking Circuits and Applications, North-Holland, New York, ISBN 0-444-00259-6 pbk.)
- A careful look at its Karnaugh map shows that IF...THEN...ELSE can also be expressed, in a rather round-about way, in terms of two exclusive-ORs: ( (b AND (c XOR a)) OR (a AND (c XOR b)) ) = d.
- Robbin p. 3.
- Rosenbloom 1950, pp. 30 and 54ff.
- Indeed, exhaustive selection between alternatives -- mutual exclusion -- is required by the definition that Kleene gives the CASE operator (Kleene 1952:229)
- The use of quote marks around the expressions is not accidental. Tarski comments on the use of quotes in his "18. Identity of things and identity of their designations; use of quotation marks" p. 58ff.
- Hamilton p. 37. Bender and Williamson p. 29 state "In what follows, we'll replace "equals" with the symbol " ⇔ " (equivalence) which is usually used in logic. We use the more familiar " = " for assigning meaning and values."
- Reichenbach pp. 20-22 follows the conventions of PM. The symbol =Df is in the metalanguage and is not a formal symbol; it has the following meaning: "the symbol ' s ' is to have the same meaning as the formula '(c & d)'".
- Rosenbloom 1950, p. 32.
- cf Minsky 1967:75, section 4.2.3 "The method of parenthesis counting". Minsky presents a state machine that will do the job, and by use of induction (recursive definition) Minsky proves the "method" and presents a theorem as the result. A fully generalized "parenthesis grammar" requires an infinite state machine (e.g. a Turing machine) to do the counting.
- Robbin p. 7
- cf Reichenbach p. 68 for a more involved discussion: "If the inference is valid and the premises are true, the inference is called conclusive."
- As well as the first three, Hamilton pp.19-22 discusses logics built from only | (NAND), and ↓ (NOR).
- Wickes 1967:36ff. Wickes offers a good example of 8 of the 2 x 4 (3-variable) maps and 16 of the 4 x 4 (4-variable) maps. As an arbitrary 3-variable map could represent any one of 2^8 = 256 different 2 x 4 maps, and an arbitrary 4-variable map could represent any one of 2^16 = 65,536 different formula-evaluations, writing down every one is infeasible.
- This definition is given by Stephen Kleene. Both Kurt Gödel and Kleene believed that the classical paradoxes are uniformly examples of this sort of definition. But Kleene went on to assert that the problem has not been solved satisfactorily and impredicative definitions can be found in analysis. He gives as example the definition of the least upper bound (l.u.b) u of M. Given a Dedekind cut of the number line C and the two parts into which the number line is cut, i.e. M and (C - M), l.u.b. = u is defined in terms of the notion M, whereas M is defined in terms of C. Thus the definition of u, an element of C, is defined in terms of the totality C and this makes its definition impredicative. Kleene asserts that attempts to argue this away can be used to uphold the impredicative definitions in the paradoxes.(Kleene 1952:43).
- McCluskey comments that "it could be argued that the analysis is still incomplete because the word statement "The outputs are equal to the previous values of the inputs" has not been obtained"; he goes on to dismiss such worries because "English is not a formal language in a mathematical sense, [and] it is not really possible to have a formal procedure for obtaining word statements" (p. 185).
- More precisely, given enough "loop gain", either oscillation or memory will occur (cf McCluskey p. 191-2). In abstract (idealized) mathematical systems adequate loop gain is not a problem.
- The notion of delay and the principle of local causation as caused ultimately by the speed of light appears in Robin Gandy (1980), "Church's thesis and Principles for Mechanisms", in J. Barwise, H. J. Keisler and K. Kunen, eds., The Kleene Symposium, North-Holland Publishing Company (1980) 123-148. Gandy considered this to be the most important of his principles: "Contemporary physics rejects the possibility of instantaneous action at a distance" (p. 135). Gandy was Alan Turing's student and close friend.
- McCluskey pp. 194-5 discusses "breaking the loop" and inserts "amplifiers" to do this; Wickes (pp. 118-121) discusses inserting delays. McCluskey p. 195ff discusses the problem of "races" caused by delays.
References
- Rosenbloom, Paul (1950). The Elements of Mathematical Logic. Mineola, New York: Dover Publications, Inc. ISBN 0-486-44617-4.
- Kleene, Stephen (1952). Introduction to metamathematics. Amsterdam: North-Holland Publishing Company.
- Bender, Edward A. and Williamson, S. Gill, 2005, A Short Course in Discrete Mathematics, Dover Publications, Mineola NY, ISBN 0-486-43946-1. This text is used in a "lower division two-quarter [computer science] course" at UC San Diego.
- Enderton, H. B., 2002, A Mathematical Introduction to Logic. Harcourt/Academic Press. ISBN 0-12-238452-0
- Goodstein, R. L., (Pergamon Press 1963), 1966, (Dover edition 2007), Boolean Algebra, Dover Publications, Inc., Mineola, New York, ISBN 0-486-45894-6. Emphasis on the notion of "algebra of classes" with set-theoretic symbols such as ∩, ∪, ' (NOT), ⊂ (IMPLIES). Later Goodstein replaces these with &, ∨, ¬, → (respectively) in his treatment of "Sentence Logic" pp. 76–93.
- Ivor Grattan-Guinness and Gérard Bornet 1997, George Boole: Selected Manuscripts on Logic and its Philosophy, Birkhäuser Verlag, Basel, ISBN 978-0-8176-5456-6 (Boston).
- A. G. Hamilton 1978, Logic for Mathematicians, Cambridge University Press, Cambridge UK, ISBN 0-521-21838-1.
- E. J. McCluskey 1965, Introduction to the Theory of Switching Circuits, McGraw-Hill Book Company, New York. No ISBN. Library of Congress Catalog Card Number 65-17394. McCluskey was a student of Willard Quine and developed some notable theorems with Quine and on his own. For those interested in the history, the book contains a wealth of references.
- Marvin L. Minsky 1967, Computation: Finite and Infinite Machines, Prentice-Hall, Inc, Englewood Cliffs, N.J.. No ISBN. Library of Congress Catalog Card Number 67-12342. Useful especially for computability, plus good sources.
- Robbin, Joel W., 1969, 1997, Mathematical Logic: A First Course, Dover Publications, Inc., Mineola, New York, ISBN 0-486-45018-X (pbk.).
- Patrick Suppes 1957 (1999 Dover edition), Introduction to Logic, Dover Publications, Inc., Mineola, New York. ISBN 0-486-40687-3 (pbk.). This book is in print and readily available.
- On his page 204, in a footnote, Suppes references his set of axioms to E. V. Huntington, "Sets of Independent Postulates for the Algebra of Logic", Transactions of the American Mathematical Society, Vol. 5 (1904) pp. 288-309.
- Alfred Tarski 1941 (1995 Dover edition), Introduction to Logic and to the Methodology of Deductive Sciences, Dover Publications, Inc., Mineola, New York. ISBN 0-486-28462-X (pbk.). This book is in print and readily available.
- Jean van Heijenoort 1967, 3rd printing with emendations 1976, From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931, Harvard University Press, Cambridge, Massachusetts. ISBN 0-674-32449-8 (pbk.) Translation/reprints of Frege (1879), Russell's letter to Frege (1902) and Frege's letter to Russell (1902), Richard's paradox (1905), Post (1921) can be found here.
- Alfred North Whitehead and Bertrand Russell 1927 2nd edition, paperback edition to *53 1962, Principia Mathematica, Cambridge University Press, no ISBN. In the years between the first edition of 1912 and the 2nd edition of 1927, H. M. Sheffer 1921 and M. Jean Nicod (no year cited) brought to Russell's and Whitehead's attention that what they considered their primitive propositions (connectives) could be reduced to a single |, nowadays known as the "stroke" or NAND (NOT-AND, NEITHER ... NOR...). Russell-Whitehead discuss this in their "Introduction to the Second Edition" and make the definitions as discussed above.
- William E. Wickes 1968, Logic Design with Integrated Circuits, John Wiley & Sons, Inc., New York. No ISBN. Library of Congress Catalog Card Number: 68-21185. Tight presentation of engineering's analysis and synthesis methods, references McCluskey 1965. Unlike Suppes, Wickes' presentation of "Boolean algebra" starts with a set of postulates of a truth-table nature and then derives the customary theorems of them (p. 18ff).
External links
Media related to Propositional formula at Wikimedia Commons
In propositional logic a propositional formula is a type of syntactic formula which is well formed If the values of all variables in a propositional formula are given it determines a unique truth value A propositional formula may also be called a propositional expression a sentence or a sentential formula A propositional formula is constructed from simple propositions such as five is greater than three or propositional variables such as p and q using connectives or logical operators such as NOT AND OR or IMPLIES for example p AND NOT q IMPLIES p OR q In mathematics a propositional formula is often more briefly referred to as a proposition but more precisely a propositional formula is not a proposition but a formal expression that denotes a proposition a formal object under discussion just like an expression such as x y is not a value but denotes a value In some contexts maintaining the distinction may be of importance PropositionsFor the purposes of the propositional calculus propositions utterances sentences assertions are considered to be either simple or compound Compound propositions are considered to be linked by sentential connectives some of the most common of which are AND OR IF THEN NEITHER NOR IS EQUIVALENT TO The linking semicolon and connective BUT are considered to be expressions of AND A sequence of discrete sentences are considered to be linked by AND s and formal analysis applies a recursive parenthesis rule with respect to sequences of simple propositions see more below about well formed formulas For example The assertion This cow is blue That horse is orange but this horse here is purple is actually a compound proposition linked by AND s This cow is blue AND that horse is orange AND this horse here is purple Simple propositions are declarative in nature that is they make assertions about the condition or nature of a particular object of sensation e g This cow is blue There s a coyote That coyote IS there behind the rocks Thus the simple primitive assertions must be about specific objects or specific states of mind Each must have at least a subject an immediate object of thought or observation a verb in the active voice and present tense preferred and perhaps an adjective or adverb Dog probably implies I see a dog but should be rejected as too ambiguous Example That purple dog is running This cow is blue Switch M31 is closed This cap is off Tomorrow is Friday For the purposes of the propositional calculus a compound proposition can usually be reworded into a series of simple sentences although the result will probably sound stilted Relationship between propositional and predicate formulas The predicate calculus goes a step further than the propositional calculus to an analysis of the inner structure of propositions It breaks a simple sentence down into two parts i its subject the object singular or plural of discourse and ii a predicate a verb or possibly verb clause that asserts a quality or attribute of the object s The predicate calculus then generalizes the subject predicate form where symbolizes concatenation stringing together of symbols into a form with the following blank subject structure predicate and the predicate in turn generalized to all things with that property Example This blue pig has wings becomes two sentences in the propositional calculus This pig has wings AND This pig is blue whose internal structure is not considered In contrast in the predicate calculus the first sentence breaks into this pig as the subject and has wings as the predicate Thus it 
asserts that object this pig is a member of the class set collection of winged things The second sentence asserts that object this pig has an attribute blue and thus is a member of the class of blue things One might choose to write the two sentences connected with AND as p W AND p B dd The generalization of this pig to a potential member of two classes winged things and blue things means that it has a truth relationship with both of these classes In other words given a domain of discourse winged things p is either found to be a member of this domain or not Thus there is a relationship W wingedness between p pig and T F W p evaluates to T F where T F is the set of the Boolean values true and false Likewise for B blueness and p pig and T F B p evaluates to T F So one now can analyze the connected assertions B p AND W p for its overall truth value i e B p AND W p evaluates to T F In particular simple sentences that employ notions of all some a few one of etc called logical quantifiers are treated by the predicate calculus Along with the new function symbolism F x two new symbols are introduced For all and There exists At least one of exists etc The predicate calculus but not the propositional calculus can establish the formal validity of the following statement All blue pigs have wings but some pigs have no wings hence some pigs are not blue Identity Tarski asserts that the notion of IDENTITY as distinguished from LOGICAL EQUIVALENCE lies outside the propositional calculus however he notes that if a logic is to be of use for mathematics and the sciences it must contain a theory of IDENTITY Some authors refer to predicate logic with identity to emphasize this extension See more about this below An algebra of propositions the propositional calculusThis section is written like a personal reflection personal essay or argumentative essay that states a Wikipedia editor s personal feelings or presents an original argument about a topic Please help improve it by rewriting it in an encyclopedic style June 2021 Learn how and when to remove this message An algebra and there are many different ones loosely defined is a method by which a collection of symbols called variables together with some other symbols such as parentheses and some sub set of symbols such as amp are manipulated within a system of rules These symbols and well formed strings of them are said to represent objects but in a specific algebraic system these objects do not have meanings Thus work inside the algebra becomes an exercise in obeying certain laws rules of the algebra s syntax symbol formation rather than in semantics meaning of the symbols The meanings are to be found outside the algebra For a well formed sequence of symbols in the algebra a formula to have some usefulness outside the algebra the symbols are assigned meanings and eventually the variables are assigned values then by a series of rules the formula is evaluated When the values are restricted to just two and applied to the notion of simple sentences e g spoken utterances or written assertions linked by propositional connectives this whole algebraic system of symbols and rules and evaluation methods is usually called the propositional calculus or the sentential calculus While some of the familiar rules of arithmetic algebra continue to hold in the algebra of propositions e g the commutative and associative laws for AND and OR some do not e g the distributive laws for AND OR and NOT Usefulness of propositional formulas Analysis In deductive reasoning philosophers 
rhetoricians and mathematicians reduce arguments to formulas and then study them usually with truth tables for correctness soundness For example Is the following argument sound Given that consciousness is sufficient for an artificial intelligence and only conscious entities can pass the Turing test before we can conclude that a robot is an artificial intelligence the robot must pass the Turing test Engineers analyze the logic circuits they have designed using synthesis techniques and then apply various reduction and minimization techniques to simplify their designs Synthesis Engineers in particular synthesize propositional formulas that eventually end up as circuits of symbols from truth tables For example one might write down a truth table for how binary addition should behave given the addition of variables b and a and carry in ci and the results carry out co and sum S Example in row 5 b a ci 1 0 1 the number 2 written as a binary number this is 102 where co 1 and S 0 as shown in the right most columns row b a ci b a ci co S0 0 0 0 0 0 01 0 0 1 1 0 12 0 1 0 1 0 13 0 1 1 2 1 04 1 0 0 1 0 15 1 0 1 2 1 06 1 1 0 2 1 07 1 1 1 3 1 1Propositional variables The simplest type of propositional formula is a propositional variable Propositions that are simple atomic symbolic expressions are often denoted by variables named p q or P Q etc A propositional variable is intended to represent an atomic proposition assertion such as It is Saturday p here the symbol means is assigned the variable named or I only go to the movies on Monday q Truth value assignments formula evaluations Evaluation of a propositional formula begins with assignment of a truth value to each variable Because each variable represents a simple sentence the truth values are being applied to the truth or falsity of these simple sentences Truth values in rhetoric philosophy and mathematics The truth values are only two TRUTH T FALSITY F An empiricist puts all propositions into two broad classes analytic true no matter what e g tautology and synthetic derived from experience and thereby susceptible to confirmation by third parties the verification theory of meaning Empiricists hold that in general to arrive at the truth value of a synthetic proposition meanings pattern matching templates must first be applied to the words and then these meaning templates must be matched against whatever it is that is being asserted For example my utterance That cow is blue Is this statement a TRUTH Truly I said it And maybe I am seeing a blue cow unless I am lying my statement is a TRUTH relative to the object of my perhaps flawed perception But is the blue cow really there What do you see when you look out the same window In order to proceed with a verification you will need a prior notion a template of both cow and blue and an ability to match the templates against the object of sensation if indeed there is one citation needed Truth values in engineering Engineers try to avoid notions of truth and falsity that bedevil philosophers but in the final analysis engineers must trust their measuring instruments In their quest for robustness engineers prefer to pull known objects from a small library objects that have well defined predictable behaviors even in large combinations hence their name for the propositional calculus combinatorial logic The fewest behaviors of a single object are two e g OFF ON open shut UP DOWN etc and these are put in correspondence with 0 1 Such elements are called digital those with a continuous range of behaviors are called 
analog Whenever decisions must be made in an analog system quite often an engineer will convert an analog behavior the door is 45 32146 UP to digital e g DOWN 0 by use of a comparator Thus an assignment of meaning of the variables and the two value symbols 0 1 comes from outside the formula that represents the behavior of the usually compound object An example is a garage door with two limit switches one for UP labelled SW U and one for DOWN labelled SW D and whatever else is in the door s circuitry Inspection of the circuit either the diagram or the actual objects themselves door switches wires circuit board etc might reveal that on the circuit board node 22 goes to 0 volts when the contacts of switch SW D are mechanically in contact closed and the door is in the down position 95 down and node 29 goes to 0 volts when the door is 95 UP and the contacts of switch SW U are in mechanical contact closed The engineer must define the meanings of these voltages and all possible combinations all 4 of them including the bad ones e g both nodes 22 and 29 at 0 volts meaning that the door is open and closed at the same time The circuit mindlessly responds to whatever voltages it experiences without any awareness of TRUTH or FALSEHOOD RIGHT or WRONG SAFE or DANGEROUS citation needed Propositional connectivesArbitrary propositional formulas are built from propositional variables and other propositional formulas using propositional connectives Examples of connectives include The unary negation connective If a displaystyle alpha is a formula then a displaystyle lnot alpha is a formula The classical binary connectives displaystyle land lor to leftrightarrow Thus for example if a displaystyle alpha and b displaystyle beta are formulas so is a b displaystyle alpha to beta Other binary connectives such as NAND NOR and XOR The ternary connective IF THEN ELSE Constant 0 ary connectives and alternately constants T F 1 0 etc The theory extension connective EQUALS alternately IDENTITY or the sign as distinguished from the logical connective displaystyle leftrightarrow Connectives of rhetoric philosophy and mathematics The following are the connectives common to rhetoric philosophy and mathematics together with their truth tables The symbols used will vary from author to author and between fields of endeavor In general the abbreviations T and F stand for the evaluations TRUTH and FALSITY applied to the variables in the propositional formula e g the assertion That cow is blue will have the truth value T for Truth or F for Falsity as the case may be The connectives go by a number of different word usages e g a IMPLIES b is also said IF a THEN b Some of these are shown in the table b only if ab IS SUFFICIENT FOR a b PRECISELY WHEN aa IS NECESSARY FOR b b IF AND ONLY IF a b IFF ainclusive OR IF b THEN a b IS NECESSARY AND SUFFICIENT FOR anegation negation conjunction disjunction implication biconditionalvariables NOT b NOT a b AND a b OR a b IMPLIES a b IS logically equivalent TO a f IS A tautology NEITHER a NOR b b stroke a exclusive ORb a b a b a b a b a b a f formula a NOR b b a variousF F T T F F T T T T T FF T T F F T T F T F T TT F F T F T F F T F T TT T F F T T T T T F F FEngineering connectives Engineering symbols have varied over the years but these are commonplace Sometimes they appear simply as boxes with symbols in them a and b are called the inputs and c is called the output In general the engineering connectives are just the same as the mathematics connectives excepting they tend to evaluate with 1 T and 
0 F This is done for the purposes of analysis minimization and synthesis of formulas by use of the notion of minterms and Karnaugh maps see below Engineers also use the words logical product from Boole s notion a a a and logical sum from Jevons notion a a a logical product logical sum half adder no carry exclusive ORrow number variables NOT NOT AND OR NAND NOR XORb 21 a 20 b a b a b amp a b a b amp a b a 0 0 0 1 1 0 0 1 1 01 0 1 1 0 0 1 1 0 12 1 0 0 1 0 1 1 0 13 1 1 0 0 1 1 0 0 0CASE connective IF THEN ELSE The IF THEN ELSE connective appears as the simplest form of CASE operator of recursion theory and computation theory and is the connective responsible for conditional goto s jumps branches From this one connective all other connectives can be constructed see more below Although IF c THEN b ELSE a sounds like an implication it is in its most reduced form a switch that makes a decision and offers as outcome only one of two alternatives a or b hence the name switch statement in the C programming language The following three propositions are equivalent as indicated by the logical equivalence sign IF counter is zero THEN go to instruction b ELSE go to instruction a c b amp c a IF counter is zero THEN go to instruction b AND IF It is NOT the case that counter is zero THEN go to instruction a c amp b c amp a Counter is zero AND go to instruction b OR It is NOT the case that counter is zero AND go to instruction a Thus IF THEN ELSE unlike implication does not evaluate to an ambiguous TRUTH when the first proposition is false i e c F in c b For example most people would reject the following compound proposition as a nonsensical non sequitur because the second sentence is not connected in meaning to the first Example The proposition IF Winston Churchill was Chinese THEN The sun rises in the east evaluates as a TRUTH given that Winston Churchill was Chinese is a FALSEHOOD and The sun rises in the east evaluates as a TRUTH In recognition of this problem the sign of formal implication in the propositional calculus is called material implication to distinguish it from the everyday intuitive implication The use of the IF THEN ELSE construction avoids controversy because it offers a completely deterministic choice between two stated alternatives it offers two objects the two alternatives b and a and it selects between them exhaustively and unambiguously In the truth table below d1 is the formula IF c THEN b AND IF NOT c THEN a Its fully reduced form d2 is the formula c AND b OR NOT c AND a The two formulas are equivalent as shown by the columns d1 and d2 Electrical engineers call the fully reduced formula the AND OR SELECT operator The CASE or SWITCH operator is an extension of the same idea to n possible but mutually exclusive outcomes Electrical engineers call the CASE operator a multiplexer d1 d2row c b a c b amp c a d1 c amp b c amp a d20 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 01 0 0 1 0 1 0 1 1 0 1 1 1 0 0 0 1 1 0 1 1 12 0 1 0 0 1 1 0 1 0 0 0 0 0 0 1 0 1 0 0 0 03 0 1 1 0 1 1 1 1 0 1 1 1 0 0 1 1 1 0 1 1 14 1 0 0 1 0 0 0 0 1 1 0 0 1 0 0 0 0 1 0 0 05 1 0 1 1 0 0 0 0 1 1 1 0 1 0 0 0 0 1 0 1 06 1 1 0 1 1 1 1 0 1 1 0 1 1 1 1 1 0 1 0 0 17 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 0 1 0 1 1IDENTITY and evaluation The first table of this section stars the entry logical equivalence to note the fact that Logical equivalence is not the same thing as identity For example most would agree that the assertion That cow is blue is identical to the assertion That cow is blue On the other hand logical equivalence sometimes 
appears in speech as in this example The sun is shining means I m biking Translated into a propositional formula the words become IF the sun is shining THEN I m biking AND IF I m biking THEN the sun is shining IF s THEN b AND IF b THEN s is written as s b amp b s or in an abbreviated form as s b As the rightmost symbol string is a definition for a new symbol in terms of the symbols on the left the use of the IDENTITY sign is appropriate s b amp b s s b dd Different authors use different signs for logical equivalence e g Suppes Goodstein Hamilton e g Robbin e g Bender and Williamson Typically identity is written as the equals sign One exception to this rule is found in Principia Mathematica For more about the philosophy of the notion of IDENTITY see Leibniz s law As noted above Tarski considers IDENTITY to lie outside the propositional calculus but he asserts that without the notion logic is insufficient for mathematics and the deductive sciences In fact the sign comes into the propositional calculus when a formula is to be evaluated In some systems there are no truth tables but rather just formal axioms e g strings of symbols from a set variables p1 p2 p3 and formula formation rules rules about how to make more symbol strings from previous strings by use of e g substitution and modus ponens the result of such a calculus will be another formula i e a well formed symbol string Eventually however if one wants to use the calculus to study notions of validity and truth one must add axioms that define the behavior of the symbols called the truth values T F or 1 0 etc relative to the other symbols For example Hamilton uses two symbols and when he defines the notion of a valuation v of any well formed formulas wffs A and B in his formal statement calculus L A valuation v is a function from the wffs of his system L to the range output T F given that each variable p1 p2 p3 in a wff is assigned an arbitrary truth value T F v A v A iv A B F if and only if v A T and v B F ii The two definitions i and ii define the equivalent of the truth tables for the NOT and IMPLICATION connectives of his system The first one derives F T and T F in other words v A does not mean v A Definition ii specifies the third row in the truth table and the other three rows then come from an application of definition i In particular ii assigns the value F or a meaning of F to the entire expression The definitions also serve as formation rules that allow substitution of a value previously derived into a formula v A B v A v B F T FF T TT F FT T T Some formal systems specify these valuation axioms at the outset in the form of certain formulas such as the law of contradiction or laws of identity and nullity The choice of which ones to use together with laws such as commutation and distribution is up to the system s designer as long as the set of axioms is complete i e sufficient to form and to evaluate any well formed formula created in the system More complex formulasAs shown above the CASE IF c THEN b ELSE a connective is constructed either from the 2 argument connectives IF THEN and AND or from OR and AND and the 1 argument NOT Connectives such as the n argument AND a amp b amp c amp amp n OR a b c n are constructed from strings of two argument AND and OR and written in abbreviated form without the parentheses These and other connectives as well can then be used as building blocks for yet further connectives Rhetoricians philosophers and mathematicians use truth tables and the various theorems to analyze and simplify their 
formulas Electrical engineering uses drawn symbols and connect them with lines that stand for the mathematicals act of substitution and replacement They then verify their drawings with truth tables and simplify the expressions as shown below by use of Karnaugh maps or the theorems In this way engineers have created a host of combinatorial logic i e connectives without feedback such as decoders encoders mutifunction gates majority logic binary adders arithmetic logic units etc Definitions A definition creates a new symbol and its behavior often for the purposes of abbreviation Once the definition is presented either form of the equivalent symbol or formula can be used The following symbolism Df is following the convention of Reichenbach Some examples of convenient definitions drawn from the symbol set amp and variables Each definition is producing a logically equivalent formula that can be used for substitution or replacement definition of a new variable c amp d Df s OR a amp b Df a b IMPLICATION a b Df a b XOR a amp b a amp b Df a b LOGICAL EQUIVALENCE a b amp b a Df a b Axiom and definition schemas The definitions above for OR IMPLICATION XOR and logical equivalence are actually schemas or schemata that is they are models demonstrations examples for a general formula format but shown for illustrative purposes with specific letters a b c for the variables whereas any variable letters can go in their places as long as the letter substitutions follow the rule of substitution below Example In the definition a b Df a b other variable symbols such as SW2 and CON1 might be used i e formally a Df SW2 b Df CON1 so we would have as an instance of the definition schema SW2 CON1 Df SW2 CON1 dd Substitution versus replacement Substitution The variable or sub formula to be substituted with another variable constant or sub formula must be replaced in all instances throughout the overall formula Example c amp d p amp c amp d but q1 amp q2 d Now wherever variable d occurs substitute q1 amp q2 c amp q1 amp q2 p amp c amp q1 amp q2 dd Replacement i the formula to be replaced must be within a tautology i e logically equivalent connected by or to the formula that replaces it and ii unlike substitution its permissible for the replacement to occur only in one place i e for one formula Example Use this set of formula schemas equivalences a 0 a a amp a 0 a b Df a b start with a aUse 1 to replace a with a 0 a 0 Use the notion of schema to substitute b for a in 2 a amp a 0 Use 2 to replace 0 with b amp b a b amp b see below for how to distribute a over b amp b etc Inductive definitionThe classical presentation of propositional logic see Enderton 2002 uses the connectives displaystyle lnot land lor to leftrightarrow The set of formulas over a given set of propositional variables is inductively defined to be the smallest set of expressions such that Each propositional variable in the set is a formula a displaystyle lnot alpha is a formula whenever a displaystyle alpha is and a b displaystyle alpha Box beta is a formula whenever a displaystyle alpha and b displaystyle beta are formulas and displaystyle Box is one of the binary connectives displaystyle land lor to leftrightarrow This inductive definition can be easily extended to cover additional connectives The inductive definition can also be rephrased in terms of a closure operation Enderton 2002 Let V denote a set of propositional variables and let XV denote the set of all strings from an alphabet including symbols in V left and right parentheses and all the 
logical connectives under consideration Each logical connective corresponds to a formula building operation a function from XXV to XXV Given a string z the operation E z displaystyle mathcal E lnot z returns z displaystyle lnot z Given strings y and z the operation E y z displaystyle mathcal E land y z returns y z displaystyle y land z There are similar operations E displaystyle mathcal E lor E displaystyle mathcal E to and E displaystyle mathcal E leftrightarrow corresponding to the other binary connectives The set of formulas over V is defined to be the smallest subset of XXV containing V and closed under all the formula building operations Parsing formulasThe following laws of the propositional calculus are used to reduce complex formulas The laws can be verified easily with truth tables For each law the principal outermost connective is associated with logical equivalence or identity A complete analysis of all 2n combinations of truth values for its n distinct variables will result in a column of 1 s T s underneath this connective This finding makes each law by definition a tautology And for a given law because its formula on the left and right are equivalent or identical they can be substituted for one another Example The following truth table is De Morgan s law for the behavior of NOT over OR a b a amp b To the left of the principal connective yellow column labelled taut the formula b a evaluates to 1 0 0 0 under the label P On the right of taut the formula b a also evaluates to 1 0 0 0 under the label Q As the two columns have equivalent evaluations the logical equivalence under taut evaluates to 1 1 1 1 i e P Q Thus either formula can be substituted for the other if it appears in a larger formula P taut Qb a b V a b amp a 0 0 1 0 0 0 1 1 0 1 1 00 1 0 0 1 1 1 1 0 0 0 11 0 0 1 1 0 1 0 1 0 1 01 1 0 1 1 1 1 0 1 0 0 1 Enterprising readers might challenge themselves to invent an axiomatic system that uses the symbols amp variables a b c the formation rules specified above and as few as possible of the laws listed below and then derive as theorems the others as well as the truth table valuations for amp and One set attributed to Huntington 1904 Suppes 204 uses eight of the laws defined below If used in an axiomatic system the symbols 1 and 0 or T and F are considered to be well formed formulas and thus obey all the same rules as the variables Thus the laws listed below are actually axiom schemas that is they stand in place of an infinite number of instances Thus x y y x might be used in one instance p 0 0 p and in another instance 1 q q 1 etc Connective seniority symbol rank In general to avoid confusion during analysis and evaluation of propositional formulas one can make liberal use of parentheses However quite often authors leave them out To parse a complicated formula one first needs to know the seniority or rank that each of the connectives excepting has over the other connectives To well form a formula start with the connective with the highest rank and add parentheses around its components then move down in rank paying close attention to the connective s scope over which it is working From most to least senior with the predicate signs x and x the IDENTITY and arithmetic signs added for completeness LOGICAL EQUIVALENCE IMPLICATION amp AND OR NOT x FOR ALL x x THERE EXISTS AN x IDENTITY arithmetic sum arithmetic multiply s arithmetic successor dd Thus the formula can be parsed but because NOT does not obey the distributive law the parentheses around the inner formula c amp d is 
mandatory Example d amp c w rewritten is d amp c w Example a amp a b a amp a b rewritten rigorously is has seniority a amp a b a amp a b has seniority a amp a b a amp a b amp has seniority both sides a amp a b a amp a b has seniority a amp a b a amp a b check 9 parenthesis and 9 parenthesis a amp a b a amp a b dd Example d amp c p amp c amp d c amp d p amp c p amp d rewritten is d amp c p amp c amp d c amp d p amp c p amp d dd Commutative and associative laws Both AND and OR obey the commutative law and associative law Commutative law for OR a b b a Commutative law for AND a amp b b amp a Associative law for OR a b c a b c Associative law for AND a amp b amp c a amp b amp c Omitting parentheses in strings of AND and OR The connectives are considered to be unary one variable e g NOT and binary i e two variable AND OR IMPLIES For example c amp d p amp c p amp d above should be written c amp d p amp c p amp d or possibly c amp d p amp c p amp d However a truth table demonstration shows that the form without the extra parentheses is perfectly adequate Omitting parentheses with regards to a single variable NOT While a where a is a single variable is perfectly clear a is adequate and is the usual way this literal would appear When the NOT is over a formula with more than one symbol then the parentheses are mandatory e g a b Distributive laws OR distributes over AND and AND distributes over OR NOT does not distribute over AND or OR See below about De Morgan s law Distributive law for OR c a amp b c a amp c b Distributive law for AND c amp a b c amp a c amp b De Morgan s laws NOT when distributed over OR or AND does something peculiar again these can be verified with a truth table De Morgan s law for OR a b a b De Morgan s law for AND a b a b Laws of absorption Absorption in particular the first one causes the laws of logic to differ from the laws of arithmetic Absorption idempotency for OR a a a Absorption idempotency for AND a amp a aLaws of evaluation Identity nullity and complement The sign as distinguished from logical equivalence alternately or symbolizes the assignment of value or meaning Thus the string a amp a symbolizes 0 i e it means the same thing as symbol 0 In some systems this will be an axiom definition perhaps shown as a amp a Df 0 in other systems it may be derived in the truth table below c taut ca a amp a 0 0 0 0 1 0 1 01 1 0 0 1 1 0Commutation of equality a b b a Identity for OR a 0 a or a F a Identity for AND a amp 1 a or a amp T a Nullity for OR a 1 1 or a T T Nullity for AND a amp 0 0 or a amp F F Complement for OR a a 1 or a a T law of excluded middle Complement for AND a amp a 0 or a amp a F law of contradictionDouble negative involution a aWell formed formulas wffs A key property of formulas is that they can be uniquely parsed to determine the structure of the formula in terms of its propositional variables and logical connectives When formulas are written in infix notation as above unique readability is ensured through an appropriate use of parentheses in the definition of formulas Alternatively formulas can be written in Polish notation or reverse Polish notation eliminating the need for parentheses altogether The inductive definition of infix formulas in the previous section can be converted to a formal grammar in Backus Naur form lt formula gt lt propositional variable gt lt formula gt lt formula gt lt formula gt lt formula gt lt formula gt lt formula gt lt formula gt lt formula gt lt formula gt It can be shown that any expression matched by the grammar has a 
balanced number of left and right parentheses and any nonempty initial segment of a formula has more left than right parentheses This fact can be used to give an algorithm for parsing formulas For example suppose that an expression x begins with displaystyle lnot Starting after the second symbol match the shortest subexpression y of x that has balanced parentheses If x is a formula there is exactly one symbol left after this expression this symbol is a closing parenthesis and y itself is a formula This idea can be used to generate a recursive descent parser for formulas Example of parenthesis counting This method locates as 1 the principal connective the connective under which the overall evaluation of the formula occurs for the outer most parentheses which are often omitted It also locates the inner most connective where one would begin evaluatation of the formula without the use of a truth table e g at level 6 start c amp d V p amp c amp d c amp d V p amp d V p amp c count 0 1 2 3 3 3 3 2 2 3 3 3 3 4 5 5 5 5 6 6 5 4 3 3 1 1 2 3 4 4 4 4 3 3 4 4 4 4 3 2 2 3 3 3 3 3 3 3 2 1 0Well formed formulas versus valid formulas in inferences The notion of valid argument is usually applied to inferences in arguments but arguments reduce to propositional formulas and can be evaluated the same as any other propositional formula Here a valid inference means The formula that represents the inference evaluates to truth beneath its principal connective no matter what truth values are assigned to its variables i e the formula is a tautology Quite possibly a formula will be well formed but not valid Another way of saying this is Being well formed is necessary for a formula to be valid but it is not sufficient The only way to find out if it is both well formed and valid is to submit it to verification with a truth table or by use of the laws Example 1 What does one make of the following difficult to follow assertion Is it valid If it s sunny but if the frog is croaking then it s not sunny then it s the same as saying that the frog isn t croaking Convert this to a propositional formula as follows IF a AND IF b THEN NOT a THEN NOT a where a represents its sunny and b represents the frog is croaking a amp b a b dd This is well formed but is it valid In other words when evaluated will this yield a tautology all T beneath the logical equivalence symbol The answer is NO it is not valid However if reconstructed as an implication then the argument is valid Saying it s sunny but if the frog is croaking then it s not sunny implies that the frog isn t croaking Other circumstances may be preventing the frog from croaking perhaps a crane ate it Example 2 from Reichenbach via Bertrand Russell If pigs have wings some winged animals are good to eat Some winged animals are good to eat so pigs have wings a b amp b a is well formed but an invalid argument as shown by the red evaluation under the principal implication W G arga b a gt b amp b gt a 0 0 0 1 0 0 0 1 00 1 0 1 1 1 1 0 01 0 1 0 0 0 0 1 11 1 1 1 1 1 1 1 1Reduced sets of connectivesThe engineering symbol for the NAND connective the stroke can be used to build any propositional formula The notion that truth 1 and falsity 0 can be defined in terms of this connective is shown in the sequence of NANDs on the left and the derivations of the four evaluations of a NAND b are shown along the bottom The more common method is to use the definition of the NAND from the truth table A set of logical connectives is called complete if every propositional formula is tautologically 
equivalent to a formula with just the connectives in that set There are many complete sets of connectives including displaystyle land lnot displaystyle lor lnot and displaystyle to lnot There are two binary connectives that are complete on their own corresponding to NAND and NOR respectively Some pairs are not complete for example displaystyle land lor The stroke NAND The binary connective corresponding to NAND is called the Sheffer stroke and written with a vertical bar or vertical arrow The completeness of this connective was noted in Principia Mathematica 1927 xvii Since it is complete on its own all other connectives can be expressed using only the stroke For example where the symbol represents logical equivalence p p p p q p q p q p q p amp q p q In particular the zero ary connectives displaystyle top representing truth and displaystyle bot representing falsity can be expressed using the stroke a a a displaystyle top equiv a a a displaystyle bot equiv top top IF THEN ELSE This connective together with 0 1 or F T or displaystyle bot displaystyle top forms a complete set In the following the IF THEN ELSE relation c b a d represents c b c a c amp b c amp a d c b a c 0 1 c c b 1 c b c c a c a c b c c amp b Example The following shows how a theorem based proof of c b 1 c b would proceed below the proof is its truth table verification Note c b is defined to be c b Begin with the reduced form c amp b c amp a Substitute 1 for a c amp b c amp 1 Identity c amp 1 c c amp b c Law of commutation for V c c amp b Distribute c V over c amp b c c amp c b Law of excluded middle c c 1 1 amp c b Distribute 1 amp over c b 1 amp c 1 amp b Commutivity and Identity 1 amp c c amp 1 c and 1 amp b b amp 1 b c b c b is defined as c b Q E D In the following truth table the column labelled taut for tautology evaluates logical equivalence symbolized here by between the two columns labelled d Because all four rows under taut are 1 s the equivalence indeed represents a tautology d taut drows c b a c amp b V c amp a c V b 0 1 0 0 1 0 0 0 1 1 0 1 1 1 1 0 1 02 3 0 1 1 0 0 1 1 1 0 1 1 1 1 0 1 14 5 1 0 1 1 0 0 0 0 1 0 1 1 0 1 0 06 7 1 1 1 1 1 1 1 0 1 0 1 1 0 1 1 1Normal formsAn arbitrary propositional formula may have a very complicated structure It is often convenient to work with formulas that have simpler forms known as normal forms Some common normal forms include conjunctive normal form and disjunctive normal form Any propositional formula can be reduced to its conjunctive or disjunctive normal form Reduction to normal form A truth table will contain 2n rows where n is the number of variables e g three variables p d c produce 23 rows Each row represents a minterm Each minterm can be found on the Hasse diagram on the Veitch diagram and on the Karnaugh map The evaluations of p shown in the truth table are not shown in the Hasse Veitch and Karnaugh diagrams these are shown in the Karnaugh map of the following section Reduction to normal form is relatively simple once a truth table for the formula is prepared But further attempts to minimize the number of literals see below requires some tools reduction by De Morgan s laws and truth tables can be unwieldy but Karnaugh maps are very suitable a small number of variables 5 or less Some sophisticated tabular methods exist for more complex circuits with multiple outputs but these are beyond the scope of this article for more see Quine McCluskey algorithm Literal term and alterm In electrical engineering a variable x or its negation x can be referred to as a literal A string 
Literal, term and alterm
In electrical engineering, a variable x or its negation ~x can be referred to as a literal. A string of literals connected by ANDs is called a term. A string of literals connected by ORs is called an alterm. Typically the literal ~x is abbreviated x′. Sometimes the "&" symbol is omitted altogether, in the manner of algebraic multiplication.
- Example: a, b, c, d are variables; ( a & ~b & ~c & d ) is a term, which can be abbreviated as ab′c′d.
- Example: p, q, r, s are variables; ( p ∨ ~q ∨ r ∨ ~s ) is an alterm, which can be abbreviated as p ∨ q′ ∨ r ∨ s′.
Minterms
In the same way that a 2^n-row truth table displays the evaluation of a propositional formula for all 2^n possible values of its variables, n variables produce a 2^n-square Karnaugh map (even though we cannot draw it in its full-dimensional realization for more than two or three variables). For example, 3 variables produce 2^3 = 8 rows and 8 Karnaugh squares; 4 variables produce 16 truth-table rows and 16 squares, and therefore 16 minterms. Each Karnaugh-map square and its corresponding truth-table evaluation represents one minterm.
Any propositional formula can be reduced to the logical sum (OR) of the active (i.e. "1"- or "T"-valued) minterms. When in this form the formula is said to be in disjunctive normal form. But even though it is in this form, it is not necessarily minimized with respect to either the number of terms or the number of literals.
In the following table, observe the peculiar numbering of the rows: (0, 1, 3, 2, 6, 7, 5, 4, 0). The first column is the decimal equivalent of the binary equivalent of the digits cba; in other words:
- Example: cba₂ = c·2² + b·2¹ + a·2⁰. For cba = 101 (c = 1, b = 0, a = 1): 101₂ = 1·2² + 0·2¹ + 1·2⁰ = 5₁₀.
This numbering comes about because as one moves down the table from row to row only one variable at a time changes its value. Gray code is derived from this notion. This notion can be extended to three- and four-dimensional hypercubes, called Hasse diagrams, where each corner's variables change only one at a time as one moves around the edges of the cube. Hasse diagrams (hypercubes) flattened into two dimensions are either Veitch diagrams or Karnaugh maps (these are virtually the same thing). When working with Karnaugh maps one must always keep in mind that the top edge "wraps around" to the bottom edge and the left edge wraps around to the right edge: the Karnaugh diagram is really a three- or four- or n-dimensional flattened object.

  decimal equivalent of (c, b, a) | c b a | minterm
               0                  | 0 0 0 | (~c & ~b & ~a)
               1                  | 0 0 1 | (~c & ~b & a)
               3                  | 0 1 1 | (~c & b & a)
               2                  | 0 1 0 | (~c & b & ~a)
               6                  | 1 1 0 | (c & b & ~a)
               7                  | 1 1 1 | (c & b & a)
               5                  | 1 0 1 | (c & ~b & a)
               4                  | 1 0 0 | (c & ~b & ~a)
               0                  | 0 0 0 | (~a & ~b & ~c)
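The "peculiar" ordering 0, 1, 3, 2, 6, 7, 5, 4 in the table above is the 3-bit reflected Gray code. A short Python sketch (not from the article; the function name gray_code and the bit assignments are mine) that generates it:

    # Reflected-binary (Gray) code: successive values differ in exactly one bit,
    # i.e. exactly one variable changes from row to row.
    def gray_code(n_bits):
        codes = [0]
        for bit in range(n_bits):
            codes += [code | (1 << bit) for code in reversed(codes)]
        return codes

    for code in gray_code(3):
        c, b, a = (code >> 2) & 1, (code >> 1) & 1, code & 1
        print(code, (c, b, a))
    # prints the decimal sequence 0, 1, 3, 2, 6, 7, 5, 4 with its (c, b, a) bits

The same construction, applied to 4 bits, gives the square numbering of a 4-variable Karnaugh map.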
Reduction by use of the map method (Veitch, Karnaugh)
Veitch improved the notion of Venn diagrams by converting the circles to abutting squares, and Karnaugh simplified the Veitch diagram by converting the minterms, written in their literal form (e.g. ~a~bc~d), into numbers. The method proceeds as follows:
Produce the formula's truth table. Number its rows using the binary equivalents of the variables (usually just sequentially, 0 through 2^n − 1 for n variables). Technically, the propositional function has been reduced to its (unminimized) disjunctive normal form: each row has its minterm expression, and these can be OR'd to produce the formula in its (unminimized) disjunctive normal form.
- Example: ( (c & d) ∨ (p & ~(c & ~d)) ) = q in disjunctive normal form is ( (p & d & c) ∨ (p & d & ~c) ∨ (p & ~d & ~c) ∨ (~p & d & c) ) = q. However, this formula can be reduced both in the number of terms (from 4 to 3) and in the total count of its literals (from 12 to 6).

  row | minterm        | p d c | (c & d) | (p & ~(c & ~d)) | q | active minterm
   0  | (~p & ~d & ~c) | 0 0 0 |    0    |        0        | 0 |
   1  | (~p & ~d & c)  | 0 0 1 |    0    |        0        | 0 |
   2  | (~p & d & ~c)  | 0 1 0 |    0    |        0        | 0 |
   3  | (~p & d & c)   | 0 1 1 |    1    |        0        | 1 | (~p & d & c)
   4  | (p & ~d & ~c)  | 1 0 0 |    0    |        1        | 1 | (p & ~d & ~c)
   5  | (p & ~d & c)   | 1 0 1 |    0    |        0        | 0 |
   6  | (p & d & ~c)   | 1 1 0 |    0    |        1        | 1 | (p & d & ~c)
   7  | (p & d & c)    | 1 1 1 |    1    |        1        | 1 | (p & d & c)

  q = ( (p & d & c) ∨ (p & d & ~c) ∨ (p & ~d & ~c) ∨ (~p & d & c) )

Create the formula's Karnaugh map. (Figure: steps in the reduction using a Karnaugh map; the final result is the OR, i.e. logical sum, of the three reduced terms.) Use the values of the formula (e.g. "p") found by the truth-table method and place them into their respective (associated) Karnaugh squares; these are numbered per the Gray-code convention. If values of "d" for "don't care" appear in the table, this adds flexibility during the reduction phase.
Reduce minterms. Minterms of adjacent (abutting) 1-squares (T-squares) can be reduced with respect to the number of their literals, and the number of terms will also be reduced in the process. Two abutting squares (2 x 1 horizontal or 1 x 2 vertical; even the edges represent abutting squares) lose one literal; four squares in a 4 x 1 rectangle (horizontal or vertical) or a 2 x 2 square (even the four corners represent abutting squares) lose two literals; eight squares in a rectangle lose 3 literals; etc. One seeks out the largest square or rectangles and ignores the smaller squares or rectangles contained totally within them. This process continues until all abutting squares are accounted for, at which point the propositional formula is said to be minimized. For example, squares 3 and 7 abut; these two abutting squares can lose one literal (e.g. "p" from squares 3 and 7).
Example: The map method usually is done by inspection. The following example expands the algebraic method to show the "trick" behind the combining of terms on a Karnaugh map. Minterms 3 and 7 abut, 7 and 6 abut, and 4 and 6 abut (because the table's edges wrap around). So each of these pairs can be reduced. Observe that by the Idempotency law ( A ∨ A ) = A, we can create more terms. Then by the association and distributive laws the variable to be eliminated can be paired within each term, and then removed with the law of excluded middle ( x ∨ ~x ) = 1. The following uses brackets [ and ] only to keep track of the terms; they have no special significance.
- Put the formula in disjunctive normal form, with the terms numbered by their minterms:
  q = ( [ (~p & d & c) ] ∨ [ (p & d & c) ] ∨ [ (p & d & ~c) ] ∨ [ (p & ~d & ~c) ] ) = ( [3] ∨ [7] ∨ [6] ∨ [4] )
- Idempotency (absorption) [ A ∨ A = A ]:
  q = ( [3 ∨ 7] ∨ [7 ∨ 6] ∨ [6 ∨ 4] )
- Associative law ( x ∨ (y ∨ z) ) = ( (x ∨ y) ∨ z ), writing the pairs out in full:
  q = ( [ (~p & d & c) ∨ (p & d & c) ] ∨ [ (p & d & c) ∨ (p & d & ~c) ] ∨ [ (p & d & ~c) ∨ (p & ~d & ~c) ] )
- Distributive law ( (x & y) ∨ (x & z) ) = ( x & (y ∨ z) ), pairing the variable to be eliminated in each bracket:
  q = ( [ (d & c) & (~p ∨ p) ] ∨ [ (p & d) & (c ∨ ~c) ] ∨ [ (p & ~c) & (d ∨ ~d) ] )
- Law of excluded middle ( x ∨ ~x ) = 1:
  q = ( [ (d & c) & 1 ] ∨ [ (p & d) & 1 ] ∨ [ (p & ~c) & 1 ] )
- Law of identity ( x & 1 ) = x, leading to the reduced form of the formula:
  q = ( (d & c) ∨ (p & d) ∨ (p & ~c) )
- Verify the reduction with a truth table:

  row | minterm        | p d c | (d & c) | (p & d) | (p & ~c) | q
   0  | (~p & ~d & ~c) | 0 0 0 |    0    |    0    |    0     | 0
   1  | (~p & ~d & c)  | 0 0 1 |    0    |    0    |    0     | 0
   2  | (~p & d & ~c)  | 0 1 0 |    0    |    0    |    0     | 0
   3  | (~p & d & c)   | 0 1 1 |    1    |    0    |    0     | 1
   4  | (p & ~d & ~c)  | 1 0 0 |    0    |    0    |    1     | 1
   5  | (p & ~d & c)   | 1 0 1 |    0    |    0    |    0     | 0
   6  | (p & d & ~c)   | 1 1 0 |    0    |    1    |    1     | 1
   7  | (p & d & c)    | 1 1 1 |    1    |    1    |    0     | 1

  The reduced formula is active on exactly the same rows (3, 4, 6, 7) as the original.
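The same verification can be done exhaustively in a few lines of Python. This is a sketch of my own (not the article's method); the lambda names original and reduced are illustrative.

    # Check that the reduced form agrees with the original on all 2^3 rows.
    from itertools import product

    original = lambda p, d, c: (c and d) or (p and not (c and not d))
    reduced  = lambda p, d, c: (d and c) or (p and d) or (p and not c)

    assert all(bool(original(p, d, c)) == bool(reduced(p, d, c))
               for p, d, c in product([0, 1], repeat=3))
    print("reduction verified: q = (d & c) ∨ (p & d) ∨ (p & ~c)")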
Impredicative propositions
Given the following examples-as-definitions, what does one make of the subsequent reasoning:
(1) "This sentence is simple."
(2) "This sentence is complex, and it is conjoined by AND."
Then assign the variable "s" to the left-most sentence "This sentence is simple". Define "compound" c = "not simple" ~s, and assign "c = ~s" to "This sentence is compound"; assign "j" to "It [this sentence] is conjoined by AND". The second sentence can be expressed as ( ~s & j ). If truth values are to be placed on the sentences c = ~s and j, then all are clearly FALSEHOODS: e.g. "This sentence is complex" is a FALSEHOOD (it is simple, by definition). So their conjunction (AND) is a falsehood. But when taken in its assembled form, the sentence is a TRUTH.
This is an example of the paradoxes that result from an impredicative definition, that is, when an object m has a property P but the object m is defined in terms of property P. The best advice for a rhetorician or one involved in deductive analysis is to avoid impredicative definitions, but at the same time be on the lookout for them, because they can indeed create paradoxes. Engineers, on the other hand, put them to work in the form of propositional formulas with feedback.
Propositional formula with feedback
The notion of a propositional formula appearing as one of its own variables requires a formation rule that allows the assignment of the formula to a variable. In general there is no stipulation, in either axiomatic or truth-table systems of objects and relations, that forbids this from happening.
The simplest case occurs when an OR formula becomes one of its own inputs, e.g. p = q. Begin with ( p ∨ s ) = q, then let p = q. Observe that q's "definition" depends on itself "q" as well as on "s" and the OR connective; this definition of q is thus impredicative. Either of two conditions can result: oscillation or memory.
It helps to think of the formula as a black box. Without knowledge of what is going on "inside" the formula-"box", from the outside it would appear that the output is no longer a function of the inputs alone. That is, sometimes one looks at q and sees 0 and at other times 1. To avoid this problem one has to know the state (condition) of the "hidden" variable p inside the box (i.e. the value of q fed back and assigned to p). When this is known, the apparent inconsistency goes away.
To understand (predict) the behavior of formulas with feedback requires the more sophisticated analysis of sequential circuits. Propositional formulas with feedback lead, in their simplest form, to state machines; they also lead to memories in the form of Turing tapes and counter-machine counters. From combinations of these elements one can build any sort of bounded computational model (e.g. Turing machines, counter machines, register machines, Macintosh computers, etc.).
Oscillation
In the abstract (ideal) case the simplest oscillating formula is a NOT fed back to itself: ~( p ) = q, with p = q. Analysis of an abstract (ideal) propositional formula in a truth table reveals an inconsistency for both the p = 1 and the p = 0 cases: when p = 1, q = 0, and this cannot be because p = q; ditto for when p = 0 and q = 1.

  p | ~p = q |
  0 |   1    | (q & ~p) inconsistent
  1 |   0    | (~q & p) inconsistent

Oscillation with delay
If a delay (ideal or non-ideal) is inserted in the abstract formula between p and q, then p will oscillate between 1 and 0: 101010...101... ad infinitum. If either the delay or the NOT is not abstract (i.e. not ideal), the type of analysis to be used will depend upon the exact nature of the objects that make up the oscillator; such things fall outside mathematics and into engineering.
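A minimal Python sketch (mine, not from the article) of the delayed oscillator just described; the variable names p, q and trace are illustrative.

    # A NOT fed back to itself through a unit delay: q = ~p, then the delay
    # hands q back as the next step's p, so the output oscillates 1, 0, 1, 0, ...
    p = 0                      # initial state of the "hidden" variable inside the box
    trace = []
    for _ in range(8):
        q = int(not p)         # the NOT formula
        trace.append(q)
        p = q                  # the unit delay: q appears at p one step later
    print(trace)               # [1, 0, 1, 0, 1, 0, 1, 0]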
Analysis requires a delay to be inserted and then the loop cut between the delay and the input "p". The delay must be viewed as a kind of proposition that has "qd" (q-delayed) as output for "q" as input. This new proposition adds another column to the truth table. The inconsistency is now between "qd" and "p"; the two rows in which qd = p are the two states, and the two inconsistent rows are the transients between them:

  qd | p | ~p = q |
   0 | 0 |   1    | state 1
   0 | 1 |   0    | (~qd & p) inconsistent
   1 | 0 |   1    | (qd & ~p) inconsistent
   1 | 1 |   0    | state 0

Memory
About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output "q" feeding back into "p". The next simplest is the "flip-flop" shown below the "once-flip". Analysis of these sorts of formulas can be done by either cutting the feedback path(s) or inserting (ideal) delay in the path. A cut path and an assumption that no delay occurs anywhere in the "circuit" results in inconsistencies for some of the total states (combinations of inputs and outputs); e.g. (p = 0, s = 1, r = 1) results in an inconsistency. When delay is present these inconsistencies are merely transient and expire when the delay(s) expire. The accompanying drawings are called state diagrams. (Figure: a clocked flip-flop memory; "c" is the clock and "d" is the data.)
Without delay, inconsistencies must be eliminated from a truth-table analysis. With the notion of "delay", this condition presents itself as a momentary inconsistency between the fed-back output variable q and p = q-delayed.
A truth table reveals the rows where inconsistencies occur between p = q-delayed at the input and q at the output. After "breaking" the feedback, the truth-table construction proceeds in the conventional manner. But afterwards, in every row, the output q is compared to the now-independent input p and any inconsistencies between p and q are noted (i.e. p = 0 together with q = 1, or p = 1 and q = 0); when the "line" is "remade" both are rendered impossible by the law of contradiction ~( p & ~p ). Rows revealing inconsistencies are either considered transient states or just eliminated as inconsistent and hence "impossible".
Once-flip memory
About the simplest memory results when the output of an OR feeds back to one of its inputs, in this case output q feeding back into p. Given that the formula is first evaluated (initialized) with p = 0 & q = 0, it will "flip" once when "set" by s = 1. Thereafter, output q will sustain q in the "flipped" condition (state q = 1). This behavior, now time-dependent, is shown by the state diagram to the right of the once-flip.

  p | s | ( s ∨ p ) = q |
  0 | 0 |       0       | state 0 with s = 0
  0 | 1 |       1       | (q & ~p) inconsistent
  1 | 0 |       1       | state 1 with s = 0
  1 | 1 |       1       | state 1 with s = 1

Flip-flop memory
The next simplest case is the "set-reset" flip-flop shown below the once-flip. Given that r = 0 & s = 0 and q = 0 at the outset, it is "set" (s = 1) in a manner similar to the once-flip. It however has a provision to "reset" q = 0 when r = 1. An additional complication occurs if both set = 1 and reset = 1. In this formula, the set = 1 forces the output q = 1, so when and if (s = 0 & r = 1) the flip-flop will be reset. Or, if (s = 1 & r = 0) the flip-flop will be set. In the abstract (ideal) instance in which s = 1 ⇒ s = 0 and r = 1 ⇒ r = 0 simultaneously, the formula q will be indeterminate (undecidable). Due to delays in "real" OR, AND and NOT, the result will be unknown at the outset but thereafter predictable.

  p | s | r | ( s ∨ ( p & ~r ) ) = q |
  0 | 0 | 0 |           0            | state 0 with s = 0 & r = 0
  0 | 0 | 1 |           0            | state 0 with s = 0 & r = 1
  0 | 1 | 0 |           1            | (q & ~p) inconsistent
  0 | 1 | 1 |           1            | (q & ~p) inconsistent
  1 | 0 | 0 |           1            | state 1 with s = 0 & r = 0
  1 | 0 | 1 |           0            | (~q & p) inconsistent
  1 | 1 | 0 |           1            | state 1 with s = 1 & r = 0
  1 | 1 | 1 |           1            | state 1 with s & r simultaneously 1
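The time-dependent behavior of the set-reset flip-flop can be sketched in Python by taking the feedback through a unit delay, as with the oscillator. This sketch is mine (not from the article); the function name sr_flipflop and the input sequence are illustrative.

    # q = ( s v (p & ~r) ), with the feedback p = q taken through a unit delay.
    def sr_flipflop(inputs, q=0):
        outputs = []
        for s, r in inputs:
            q = int(s or (q and not r))   # p is the previous step's q
            outputs.append(q)
        return outputs

    #                 set      hold     reset    hold     s & r simultaneously
    print(sr_flipflop([(1, 0), (0, 0), (0, 1), (0, 0), (1, 1)]))
    # [1, 1, 0, 0, 1]: q is flipped by s=1, held, cleared by r=1, held, set again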
Clocked flip-flop memory
The formula known as "clocked flip-flop" memory ("c" is the "clock" and "d" is the "data") is given below. It works as follows: when c = 0 the data d (either 0 or 1) cannot "get through" to affect output q; when c = 1 the data d "gets through" and output q "follows" d's value. When c goes from 1 to 0 the last value of the data remains "trapped" at output q, and as long as c = 0, d can change value without causing q to change.
- Example: ( ( c & d ) ∨ ( p & ~( c & ~d ) ) ) = q, but now let p = q: ( ( c & d ) ∨ ( q & ~( c & ~d ) ) ) = q
The state diagram is similar in shape to the flip-flop's state diagram, but with different labelling on the transitions.

  row | q d c | (c & d) | ( q & ~(c & ~d) ) | q | description
   0  | 0 0 0 |    0    |         0         | 0 | state 0 with d = 0 & c = 0: 0 is "trapped"
   1  | 0 0 1 |    0    |         0         | 0 | state 0 with d = 0 & c = 1: q = 0 is "following" d = 0
   2  | 0 1 0 |    0    |         0         | 0 | state 0 with d = 1 & c = 0: 0 is "trapped"
   3  | 0 1 1 |    1    |         0         | 1 | (q & ~p) inconsistent
   4  | 1 0 0 |    0    |         1         | 1 | state 1 with d = 0 & c = 0: 1 is "trapped"
   5  | 1 0 1 |    0    |         0         | 0 | (~q & p) inconsistent
   6  | 1 1 0 |    0    |         1         | 1 | state 1 with d = 1 & c = 0: 1 is "trapped"
   7  | 1 1 1 |    1    |         1         | 1 | state 1 with d = 1 & c = 1: q = 1 is "following" d = 1
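A Python sketch of the clocked flip-flop's "follow and trap" behavior, again with the feedback taken through a unit delay. The sketch and the function name clocked_flipflop are mine, not the article's; the formula is the one given above.

    # q = ( (c & d) v (q & ~(c & ~d)) ), feedback through a unit delay.
    def clocked_flipflop(inputs, q=0):
        outputs = []
        for c, d in inputs:
            q = int((c and d) or (q and not (c and not d)))
            outputs.append(q)
        return outputs

    #                       c=1: q follows d        c=0: d changes, q stays trapped
    print(clocked_flipflop([(1, 0), (1, 1), (0, 0), (0, 1), (0, 0)]))
    # [0, 1, 1, 1, 1]: q follows d while c=1, then the last value 1 stays trapped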
Historical development
Bertrand Russell (1912:74) lists three "laws of thought" that derive from Aristotle: (1) the law of identity: "Whatever is, is"; (2) the law of non-contradiction: "Nothing can both be and not be"; and (3) the law of excluded middle: "Everything must either be or not be."
- Example: Here O is an expression about an object's BEING or QUALITY:
  (1) Law of identity: O = O
  (2) Law of contradiction: ~( O & ~O )
  (3) Law of excluded middle: ( O ∨ ~O )
The use of the word "everything" in the law of excluded middle renders Russell's expression of this law open to debate. If restricted to an expression about BEING or QUALITY with reference to a finite collection of objects (a finite "universe of discourse"), the members of which can be investigated one after another for the presence or absence of the assertion, then the law is considered intuitionistically appropriate. Thus an assertion such as "This object must either BE or NOT BE (in the collection)" or "This object must either have this QUALITY or NOT have this QUALITY (relative to the objects in the collection)" is acceptable. See more at Venn diagram.
Although a propositional calculus originated with Aristotle, the notion of an algebra applied to propositions had to wait until the early 19th century. In an (adverse) reaction to the 2000-year tradition of Aristotle's syllogisms, John Locke's Essay concerning human understanding (1690) used the word semiotics ("theory of the use of symbols"). By 1826 Richard Whately had critically analyzed the syllogistic logic with a sympathy toward Locke's semiotics. George Bentham's work (1827) resulted in the notion of "quantification of the predicate", nowadays symbolized as ∀ ("for all"). A "row" instigated by William Hamilton over a priority dispute with Augustus De Morgan "inspired George Boole to write up his ideas on logic, and to publish them as MAL [Mathematical Analysis of Logic] in 1847" (Grattan-Guinness and Bornet 1997:xxviii).
About his contribution Grattan-Guinness and Bornet comment: "Boole's principal single innovation was [the] law [ xⁿ = x ] for logic: it stated that the mental acts of choosing the property x and choosing x again and again is the same as choosing x once... As consequence of it he formed the equations x·(1 − x) = 0 and x + (1 − x) = 1 which for him expressed respectively the law of contradiction and the law of excluded middle" (p. xxviiff). For Boole "1" was the universe of discourse and "0" was nothing.
Gottlob Frege's massive undertaking (1879) resulted in a formal calculus of propositions, but his symbolism is so daunting that it had little influence except on one person: Bertrand Russell. First as the student of Alfred North Whitehead, he studied Frege's work and suggested a (famous and notorious) emendation with respect to it (1902) around the problem of an antinomy that he discovered in Frege's treatment (cf. Russell's paradox). Russell's work led to a collaboration with Whitehead that, in the year 1912, produced the first volume of Principia Mathematica (PM). It is here that what we consider "modern" propositional logic first appeared. In particular, PM introduces NOT and OR and the assertion symbol ⊦ as primitives. In terms of these notions they define IMPLICATION → (def. *1.01: ~p ∨ q), then AND (def. *3.01: ~(~p ∨ ~q)), then EQUIVALENCE p ↔ q (*4.01: (p → q) & (q → p)).
Henry M. Sheffer (1921) and Jean Nicod demonstrated that only one connective, the "stroke" |, is sufficient to express all propositional formulas. Emil Post (1921) developed the truth-table method of analysis in his "Introduction to a general theory of elementary propositions"; he notes Nicod's stroke |. Whitehead and Russell added an introduction to their 1927 re-publication of PM, adding, in part, a favorable treatment of the "stroke".
Computation and switching logic
William Eccles and F. W. Jordan (1919) describe a "trigger relay" made from a vacuum tube. George Stibitz (1937) invents the binary adder using mechanical relays; he builds this on his kitchen table.
- Example: Given binary bits ai and bi and carry-in c_ini, their sum Si and carry-out c_outi are: ( ( ai XOR bi ) XOR c_ini ) = Si, and ( ( ai & bi ) ∨ ( c_ini & ( ai XOR bi ) ) ) = c_outi.
Alan Turing builds a multiplier using relays (1937-1938); he has to hand-wind his own relay coils to do this. Textbooks about "switching circuits" appear in the early 1950s. Willard Quine (1952 and 1955), E. W. Veitch (1952) and M. Karnaugh (1953) develop map methods for simplifying propositional functions. George H. Mealy (1955) and Edward F. Moore (1956) address the theory of sequential (i.e. switching-circuit) "machines". E. J. McCluskey and H. Schorr develop a method for simplifying propositional (switching) circuits (1962).
Footnotes
- Rosenbloom discusses this problem of implication at some length. Most philosophers and mathematicians just accept the material definition as given above, but some do not, including the intuitionists; they consider it a form of the law of excluded middle misapplied.
- Rosenbloom, and Kleene 1952:73-74, rank all 11 symbols.
Citations
- Kao, Eric J.; Genesereth, Michael (2017). "Propositional Logic". Introduction to Logic (3rd ed.). doi:10.1007/978-3-031-01801-5. ISBN 9783031018015.
- Hamilton 1978:1.
- Principia Mathematica (PM) p. 91 eschews "the" because they require a clear-cut "object of sensation"; they stipulate the use of "this" [italics added].
- Reichenbach p. 80.
- Tarski pp. 54-68. Suppes calls IDENTITY a "further rule of inference" and has a brief development around it; Robbin, Bender and Williamson, and Goodstein introduce the sign and its usage without comment or explanation. Hamilton p. 37 employs two signs with respect to the valuation of a formula in a formal calculus. Kleene p. 70 and Hamilton p. 52 place it in the predicate calculus, in particular with regard to the arithmetic of natural numbers.
- Empiricists eschew the notion of a priori (built-in, born-with) knowledge. "Radical reductionists such as John Locke and David Hume held that every idea must either originate directly in sense experience or else be compounded of ideas thus originating" (quoted from Quine, reprinted in 1996, The Emergence of Logical Empiricism, Garland Publishing Inc., http://www.marxists.org/reference/subject/philosophy/works/us/quine.htm).
- Neural-net modelling offers a good mathematical model for a comparator, as follows. Given a signal S and a threshold "thr", subtract "thr" from S and substitute this difference d into a sigmoid function: for large "gains" k, e.g. k = 100, 1/(1 + e^(−k·d)) = 1/(1 + e^(−k·(S − thr))) ≈ 0 or 1. For example, if "The door is DOWN" means "The door is less than 50% of the way up", then a threshold thr = 0.5 (corresponding to 0.5 × 5.0 = 2.50 volts) could be applied to a "linear" measuring device with an output of 0 volts when fully closed and +5.0 volts when fully open.
- In actuality, the digital "1" and "0" are defined over non-overlapping ranges, e.g. "1" = +5 (+0.2/−1.0) volts, "0" = 0 (+0.5/−0.2) volts. When a value falls outside the defined range(s) the value becomes "u" (unknown); e.g. +2.3 volts would be "u".
- While the notion of logical product is not so peculiar (e.g. 0·0 = 0, 0·1 = 0, 1·0 = 0, 1·1 = 1), the notion of (1 + 1 = 1) is peculiar; in fact (a "∨" b) = (a + b − a·b), where "∨" is the logical sum but + and − are the true arithmetic counterparts. Occasionally all four notions do appear in a formula: A AND B = 1/2·(A plus B minus (A XOR B)) (cf. p. 146 in John Wakerly 1978, Error Detecting Codes, Self-Checking Circuits and Applications, North-Holland, New York, ISBN 0-444-00259-6 pbk.).
- A careful look at its Karnaugh map shows that IF...THEN...ELSE can also be expressed, in a rather round-about way, in terms of two exclusive-ORs.
- Robbin p. 3.
- Rosenbloom 1950:30 and 54ff. Indeed, exhaustive selection between alternatives (mutual exclusion) is required by the definition that Kleene gives the CASE operator (Kleene 1952:229).
- The use of quote marks around the expressions is not accidental. Tarski comments on the use of quotes in his "18. Identity of things and identity of their designations; use of quotation marks", p. 58ff.
- Hamilton p. 37. Bender and Williamson p. 29 state: "In what follows we'll replace 'equals' with the symbol '⇔' (equivalence), which is usually used in logic. We use the more familiar '=' for assigning meaning and values."
- Reichenbach pp. 20-22, and follows the conventions of PM. The symbol =Df is in the metalanguage and is not a formal symbol; it has the following meaning: "by symbol 's' is to have the same meaning as the formula '(c & d)'".
- Rosenbloom 1950:32. Cf. Minsky 1967:75, section 4.2.3 "The method of parenthesis counting". Minsky presents a state machine that will do the job, and by use of induction (recursive definition) Minsky proves the "method" and presents a theorem as the result. A fully generalized "parenthesis grammar" requires an infinite state machine (e.g. a Turing machine) to do the counting.
- Robbin p. 7.
- Cf. Reichenbach p. 68 for a more involved discussion: "If the inference is valid and the premises are true, the inference is called conclusive."
- As well as the first three, Hamilton pp. 19-22 discusses logics built from only NAND, and NOR.
- Wickes 1967:36ff. Wickes offers a good example of 8 of the 2 x 4 (3-variable) maps and 16 of the 4 x 4 (4-variable) maps. As an arbitrary 3-variable map could represent any one of 2^8 = 256 2 x 4 maps, and an arbitrary 4-variable map could represent any one of 2^16 = 65,536 different formula evaluations, writing down every one is infeasible.
- This definition is given by Stephen Kleene. Both Kurt Gödel and Kleene believed that the classical paradoxes are uniformly examples of this sort of definition.
But Kleene went on to assert that the problem has not been solved satisfactorily, and impredicative definitions can be found in analysis. He gives as example the definition of the least upper bound (l.u.b.) u of M: given a Dedekind cut of the number line C and the two parts into which the number line is cut, i.e. M and (C − M), the l.u.b. u is defined in terms of the notion M, whereas M is defined in terms of C. Thus the definition of u, an element of C, is given in terms of the totality C, and this makes its definition impredicative. Kleene asserts that attempts to argue this away can be used to uphold the impredicative definitions in the paradoxes (Kleene 1952:43).
- McCluskey comments that "it could be argued that the analysis is still incomplete because the word statement 'The outputs are equal to the previous values of the inputs' has not been obtained"; he goes on to dismiss such worries because "English is not a formal language in a mathematical sense, [and] it is not really possible to have a formal procedure for obtaining word statements" (p. 185).
- More precisely, given enough "loop gain", either oscillation or memory will occur (cf. McCluskey p. 191-2). In abstract (idealized) mathematical systems adequate loop gain is not a problem.
- The notion of delay and the principle of "local causation", as caused ultimately by the speed of light, appears in Robin Gandy (1980), "Church's thesis and Principles for Mechanisms", in J. Barwise, H. J. Keisler and K. Kunen, eds., The Kleene Symposium, North-Holland Publishing Company (1980), pp. 123-148. Gandy considered this to be the most important of his principles: "Contemporary physics rejects the possibility of instantaneous action at a distance" (p. 135). Gandy was Alan Turing's student and close friend.
- McCluskey p. 194-5 discusses "breaking the loop" and inserts "amplifiers" to do this; Wickes (p. 118-121) discusses inserting delays. McCluskey p. 195ff discusses the problem of "races" caused by delays.
References
- Rosenbloom, Paul (1950). The Elements of Mathematical Logic. Mineola, New York: Dover Publications, Inc. ISBN 0-486-44617-4.
- Kleene, Stephen (1952). Introduction to Metamathematics. Amsterdam: North-Holland Publishing Company.
- Bender, Edward A., and Williamson, S. Gill (2005). A Short Course in Discrete Mathematics. Dover Publications, Mineola, NY. ISBN 0-486-43946-1. This text is used in a lower-division two-quarter computer science course at UC San Diego.
- Enderton, H. B. (2002). A Mathematical Introduction to Logic. Harcourt/Academic Press. ISBN 0-12-238452-0.
- Goodstein, R. L. (Pergamon Press 1963, 1966; Dover edition 2007). Boolean Algebra. Dover Publications, Inc., Mineola, New York. ISBN 0-486-45894-6. Emphasis on the notion of "algebra of classes", with set-theoretic symbols for NOT, IMPLIES, etc.; later Goodstein replaces these with &, ∨, ~, → (respectively) in his treatment of "Sentence Logic", pp. 76-93.
- Ivor Grattan-Guinness and Gérard Bornet (1997). George Boole: Selected Manuscripts on Logic and its Philosophy. Birkhäuser Verlag, Basel. ISBN 978-0-8176-5456-6 (Boston).
- A. G. Hamilton (1978). Logic for Mathematicians. Cambridge University Press, Cambridge UK. ISBN 0-521-21838-1.
- E. J. McCluskey (1965). Introduction to the Theory of Switching Circuits. McGraw-Hill Book Company, New York. No ISBN. Library of Congress Catalog Card Number 65-17394. McCluskey was a student of Willard Quine and developed some notable theorems with Quine and on his own. For those interested in the history, the book contains a wealth of references.
- Marvin L. Minsky (1967). Computation: Finite and Infinite Machines. Prentice-Hall, Inc., Englewood Cliffs, N.J. No ISBN. Library of Congress Catalog Card Number 67-12342. Useful especially for computability, plus good sources.
- Joel W. Robbin (1969; 1997 Dover edition).
Mathematical Logic: A First Course. Dover Publications, Inc., Mineola, New York. ISBN 0-486-45018-X (pbk.).
- Patrick Suppes (1957; 1999 Dover edition). Introduction to Logic. Dover Publications, Inc., Mineola, New York. ISBN 0-486-40687-3 (pbk.). This book is in print and readily available. On his page 204, in a footnote, he references his set of axioms to E. V. Huntington, "Sets of Independent Postulates for the Algebra of Logic", Transactions of the American Mathematical Society, Vol. 5 (1904), pp. 288-309.
- Alfred Tarski (1941; 1995 Dover edition). Introduction to Logic and to the Methodology of Deductive Sciences. Dover Publications, Inc., Mineola, New York. ISBN 0-486-28462-X (pbk.). This book is in print and readily available.
- Jean van Heijenoort (1967; 3rd printing with emendations 1976). From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931. Harvard University Press, Cambridge, Massachusetts. ISBN 0-674-32449-8 (pbk.). Translations/reprints of Frege (1879), Russell's letter to Frege (1902) and Frege's letter to Russell (1902), Richard's paradox (1905), and Post (1921) can be found here.
- Alfred North Whitehead and Bertrand Russell (1927 2nd edition; paperback edition to *53, 1962). Principia Mathematica. Cambridge University Press. No ISBN. In the years between the first edition of 1912 and the 2nd edition of 1927, H. M. Sheffer (1921) and M. Jean Nicod (no year cited) brought to Russell's and Whitehead's attention that what they considered their primitive propositions (connectives) could be reduced to a single one, nowadays known as the "stroke" or NAND (NOT-AND, NEITHER ... NOR ...). Russell and Whitehead discuss this in their "Introduction to the Second Edition" and make the definitions as discussed above.
- William E. Wickes (1968). Logic Design with Integrated Circuits. John Wiley & Sons, Inc., New York. No ISBN. Library of Congress Catalog Card Number 68-21185. Tight presentation of engineering's analysis and synthesis methods; references McCluskey 1965. Unlike Suppes, Wickes' presentation of "Boolean algebra" starts with a set of postulates of a truth-table nature and then derives the customary theorems from them (p. 18ff).
External links
- Media related to Propositional formula at Wikimedia Commons