
Logic


In computing, logic also refers to the sequence of operations performed by hardware or software.

Roughly speaking, logic is the study of prescriptive systems of reasoning, that is, systems proposed as guides for how people (and perhaps other intelligent beings, machines, or systems) ought to reason. Logic says which forms of inference are valid and which are not. Traditionally, logic is studied as a branch of philosophy, but it can also be considered a branch of mathematics and computer science. How people actually reason is usually studied under other headings, including cognitive psychology. Logic is traditionally divided into deductive reasoning, concerned with what follows logically from given premises, and inductive reasoning, concerned with how we can go from some number of observed events to a reliable generalization.

As a science, logic defines the structure of statement and argument and devises formulae by which these are codified. Implicit in a study of logic is the understanding of what makes a good argument and what arguments are fallacious.

Philosophical logic deals with formal descriptions of natural language. Most philosophers assume that the bulk of "normal" proper reasoning can be captured by logic, if one can find the right method for translating ordinary language into that logic.

Following are more specific discussions of some systems of logic. See also: list of topics in logic.


Aristotelian logic

Aristotelian logic was pioneered by Aristotle, whose writings contain the earliest systematic study of reasoning. Aristotle and his followers held that two of the most important principles of logic are the law of non-contradiction and the law of excluded middle. This kind of logic is now given various names to distinguish it from more recent systems, e.g., Aristotelian logic or classical two-valued logic.

The law of non-contradiction states that no proposition is both true and false, and the law of excluded middle states that a proposition must be either true or false. Together, these laws require exactly two mutually exclusive truth values: a proposition is either true or false, but never both at the same time.

Some have considered classical logic to be just like a mathematical theory, and in particular have taken the laws of non-contradiction and excluded middle to be simply axioms of the theory, assumed without proof. In fact this is not so:

A better way to look at these laws is that, without them, the logic still remains valid, but a whole lot of illogic becomes valid as well. Thus, those laws are simply filters for stripping away the illogic, and leaving only the part that doesn't depend on them—the logic.

Formal logic

See also Propositional calculus

Formal logic, also called symbolic logic, is concerned primarily with the structure of reasoning. Formal logic deals with the relationships between concepts and provides a way to compose proofs of statements. In formal logic, concepts are rigorously defined, and sentences are translated into a precise, compact, and unambiguous symbolic notation.

Descartes, who was both a mathematician and a philosopher, had the idea of keeping the method of algebra while going beyond the material of traditional mathematics, embracing the general science of whatever thought finds, so that philosophy should become a kind of universal mathematics. This sort of generalization of the use of symbols to analogous theories is characteristic of mathematics. The idea of a universal mathematics was cultivated especially by Gottfried Wilhelm Leibniz. Though modern logic is really due to Boole and De Morgan, Leibniz was the first to have a distinct plan for a system of mathematical logic. That this is so appears from research, much of it quite recent, into Leibniz's unpublished work.

Some examples of symbolic notation are:

p: 1 + 2 = 3

This defines p as the proposition "1 + 2 = 3", which is true.

Two propositions can be combined using the operations of conjunction, disjunction or conditional. These are called binary logical operators. Such combined propositions are called compound propositions. For example,

p: 1 + 1 = 2 and "logic is the study of reasoning."

Here "and" forms a conjunction; the two propositions it joins may be entirely unrelated to each other.
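The binary operators above can be sketched directly in Python's boolean expressions; this is a minimal illustration, with the second proposition simply taken to be true, as in the text's example.

```python
# A minimal sketch of binary logical operators on truth values in Python.
# The two propositions are unrelated, as in the text's example.
p = (1 + 1 == 2)             # "1 + 1 = 2" -- a true proposition
q = True                     # "logic is the study of reasoning", taken as true

conjunction = p and q        # true only when both propositions are true
disjunction = p or q         # true when at least one is true
conditional = (not p) or q   # "if p then q": false only when p is true and q is false

print(conjunction, disjunction, conditional)
```

All three compound propositions are true here, since both components are true.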

In mathematics and computer science, one may want to state a proposition depending on some variables:

p: n is an odd integer.

This proposition can be either true or false according to the variable n.

A proposition with free variables is called a propositional function with domain of discourse D. To form an actual proposition, one uses quantifiers: the universal quantifier ("for every n") or the existential quantifier ("for some n"). For example,

for all n in D, P(n).

This can be written also as:

\forall n\in D, P(n)
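Over a finite domain, the two quantifiers can be sketched with Python's all() and any(); the domain chosen here is an illustrative assumption.

```python
# Quantifiers over a finite domain of discourse D, sketched with Python's
# all() and any(). P is the propositional function "n is an odd integer".
D = range(1, 10)

def P(n):
    return n % 2 == 1                    # "n is an odd integer"

universal = all(P(n) for n in D)         # "for all n in D, P(n)"
existential = any(P(n) for n in D)       # "for some n in D, P(n)"

print(universal, existential)            # False True: 2 is even, but 1 is odd
```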

In mathematical analysis, it has been standard since Weierstrass to express quantifications such as for all ... there exists and there exists ... such that for all (and more complex analogues) in words rather than symbols; this may also be done for clarity in certain cases.

There is one circumstance of particular interest, namely, that the algebra in question, like logic, is susceptible of two distinct interpretations, the parallelism between them being almost perfect, according as the letters represent concepts or propositions. Doubtless we can, with Boole, reduce the two interpretations to one, by considering the concepts on the one hand and the propositions on the other as corresponding to assemblages or classes; since a concept determines the class of objects to which it is applied (and which in logic is called its extension), and a proposition determines the class of the instances or moments of time in which it is true (and which by analogy can also be called its extension). Accordingly predicate calculus and propositional calculus become reduced to but one, the calculus of classes or, as Leibniz called it, the theory of the whole and part, of that which contains and that which is contained. But as a matter of fact, predicate calculus and propositional calculus present certain differences which prevent their complete identification from the formal point of view and consequently their reduction to a single calculus of classes.

In particular, the principle of assertion (a = 1) = a is peculiar to propositional calculus, and is interpreted as follows: to state a proposition is to affirm the truth of that proposition. Clearly, this formula is not susceptible of a conceptual interpretation, for, if a is a concept, (a = 1) is a proposition, and we would then have a logical equality between a concept and a proposition, which is absurd. From this formula combined with the law of non-contradiction we deduce the law of bivalence. In fact, propositional calculus is equivalent to the calculus of classes when the classes can possess only the two values 0 and 1.

The equivalence of an implication and a disjunction, (a \Rightarrow b) \Leftrightarrow (\bar{a} \vee b), is no less fundamental to propositional calculus, as it makes it possible to reduce secondary, tertiary, etc. propositions to primary propositions, or even to sums of elementary propositions.
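This equivalence can be verified mechanically by exhausting the four truth assignments, a two-line truth-table check:

```python
from itertools import product

# Truth-table check of the equivalence (a => b) <=> (not a or b).
# Material implication a => b is false only when a is true and b is false.
def implies(a, b):
    return not (a and not b)

for a, b in product([False, True], repeat=2):
    assert implies(a, b) == ((not a) or b)   # holds for all four assignments
```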

Accordingly we have in reality three distinct calculi, or, in the part common to all, three different interpretations of the same calculus. In any case one must not forget that the logical value and the deductive sequence of the formulas does not in the least depend upon the interpretations which may be given them. These interpretations shall serve only to render the formulas intelligible, to give them clearness and to make their meaning at once obvious, but never to justify them. They may be omitted without destroying the formal rigidity of the system.

In order not to favor either interpretation one might say that the letters represent terms; these terms may be either concepts or propositions according to the case in hand.

Mathematical logic

Main article: Mathematical logic

Mathematical logic is the use of formal logic to study mathematical reasoning. At the beginning of the twentieth century, philosophical logicians including Frege and Russell attempted to prove that mathematics could be entirely reduced to logic. They held that in discovering the "logical form" of a sentence, you were somehow revealing the "right" way to say it, or uncovering some previously hidden essence. The reduction failed, but in the process, logic took on much of the notation and methodology of mathematics, and nowadays logic is accepted as an accurate way to describe mathematical reasoning.

Philosophical logic

Philosophical logic is essentially a continuation of the traditional discipline that was called "Logic" before it was supplanted by the invention of Mathematical logic. It is concerned with the elucidation of ideas such as reference, predication, identity, truth, quantification, existence, and others. Philosophical logic has a much greater concern with the connection between natural language and logic. See Philosophical logic.

Predicate logic

See also First-order predicate calculus

Gottlob Frege, in his Begriffsschrift, discovered a way to rearrange many sentences to make their logical form clear, to show how sentences relate to one another in certain respects. Prior to Frege, formal logic had not been successful beyond the level of sentential logic: it could represent the structure of sentences composed of other sentences using such words as "and", "or", and "not," but it could not break sentences down into smaller parts. It could not show how "Cows are animals" entails "Parts of cows are parts of animals."

Sentential logic explains the workings of words such as "and", "but", "or", "not", "if-then", "if and only if", and "neither-nor". Frege expanded logic to include words such as "all", "some", and "none". He showed how we can introduce variables and "quantifiers" to rearrange sentences.

\forall x (H(x)\Rightarrow M(x))
\exists x (H(x)\wedge V(x)).

Frege treats simple sentences without subject nouns as predicates and applies them to "dummy objects" (x). The logical structure in discourse about objects can then be operated on according to the rules of sentential logic, with some additional details for adding and removing quantifiers. Frege's work started contemporary formal logic.
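The two quantified formulas above can be model-checked over a small domain. The predicates H (human), M (mortal), and V (virtuous) and the domain itself are illustrative assumptions, not part of Frege's text.

```python
# A small model check of the two quantified formulas above over a toy domain.
# The predicates and domain are illustrative assumptions.
domain = ["socrates", "plato", "fido"]
H = {"socrates", "plato"}             # H(x): x is human
M = {"socrates", "plato", "fido"}     # M(x): x is mortal
V = {"plato"}                         # V(x): x is virtuous

all_h_are_m = all((x not in H) or (x in M) for x in domain)  # forall x (H(x) => M(x))
some_h_is_v = any((x in H) and (x in V) for x in domain)     # exists x (H(x) and V(x))

print(all_h_are_m, some_h_is_v)       # True True in this model
```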

Frege adds to sentential logic (1) the vocabulary of quantifiers (upside-down A, backward E) and variables, (2) a semantics that explains that the variables denote individual objects and the quantifiers have something like the force of "all" or "some" in relation to those objects, and (3) methods for using these in language. To introduce an "All" quantifier, you assume an arbitrary variable, prove something that must hold true of it, and then show that the proof did not depend on which object the variable denoted. An "All" quantifier can be removed by applying the sentence to any particular object at all. A "Some" (exists) quantifier can be added to a sentence true of any object at all; it can be removed in favor of a term about which you are not already presupposing any information.

Computability logic

Computability logic is a very recent initiative, introduced in 2003 by G. Japaridze (Introduction to computability logic, Annals of Pure and Applied Logic 123, pp. 1-99). It is a formal theory of computability in the same sense that classical logic is a formal theory of truth. It understands logical operators as operations on computational tasks, formulas as schemata of such tasks, and validity of a formula as the existence of a machine that always successfully accomplishes the tasks the formula represents.

Multi-valued logic

The logics discussed above are all "bivalent" or "two-valued"; that is, the semantics for each of these languages will assign to every sentence either the value "True" or the value "False."

Systems which do not always make this distinction are known as non-Aristotelian logics, or multi-valued logics.

In the early 20th century Jan Łukasiewicz investigated the extension of the traditional true/false values to include a third value, "possible".

Logics such as fuzzy logic have since been devised with an infinite number of "degrees of truth", e.g., represented by a real number between 0 and 1. Bayesian probability can be interpreted as a system of logic where probability is the subjective truth value.
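One common fuzzy-logic convention takes AND as min, OR as max, and NOT as 1 - x over degrees in [0, 1]; this is a sketch of that one convention (other t-norms exist), with example degrees chosen for illustration.

```python
# One common fuzzy-logic convention (a sketch; other t-norms exist):
# truth degrees are real numbers in [0, 1]; AND is min, OR is max,
# and NOT is 1 - x.
def f_and(a, b): return min(a, b)
def f_or(a, b): return max(a, b)
def f_not(a): return 1.0 - a

warm, humid = 0.7, 0.4        # degrees of truth, not probabilities
print(f_and(warm, humid))     # 0.4
print(f_or(warm, humid))      # 0.7
print(f_not(warm))            # 0.3, up to floating-point rounding
```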

Logic and computers

Logic is extensively used in the fields of artificial intelligence and computer science.

In the 1950s and 1960s, researchers predicted that when human knowledge could be expressed using logic with mathematical notation, it would be possible to create a machine that reasons, or artificial intelligence. This turned out to be more difficult than expected because of the complexity of human reasoning. Logic programming is an attempt to make computers do logical reasoning, and the Prolog programming language is commonly used for it.

In symbolic logic and mathematical logic, proofs by humans can be computer-assisted. Using automated theorem proving, machines can find and check proofs, as well as work with proofs too lengthy to write out by hand.
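The simplest flavor of machine-checked propositional reasoning is exhausting every truth assignment; real theorem provers are far more sophisticated, but the brute-force idea can be sketched in a few lines.

```python
from itertools import product

# Toy tautology checker: a propositional formula is a tautology iff it is
# true under every assignment of truth values to its variables.
def is_tautology(formula, n_vars):
    return all(formula(*vals) for vals in product([False, True], repeat=n_vars))

def implies(a, b):
    return (not a) or b

def peirce(p, q):                # Peirce's law: ((p => q) => p) => p
    return implies(implies(implies(p, q), p), p)

print(is_tautology(peirce, 2))   # True: Peirce's law is a classical tautology
```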

In computer science, Boolean algebra is the basis of hardware design, as well as much software design.

Logic puzzles

A large class of elementary logic puzzles can be solved using the laws of Boolean algebra and truth tables. Familiarity with Boolean algebra and its simplification process is a prerequisite for understanding the following examples.

Example

On Keikei Island there live two kinds of people: knights, who always tell the truth, and knaves, who always lie.

John and Bill are residents of Keikei Island.

Example 1

John says: We are both knaves.

Who is who?

Example 2

John: If Bill is a knave then I'm a knight.

Bill: We are different.

Who is who?

Example 3

Logician: Are you both knights? John: Yes or No. Logician: Are you both knaves? John: Yes or No.

Who is who?

Solution to Example 1

We can use Boolean algebra to deduce who's who as follows:

Let J be true if John is a knight and let B be true if Bill is a knight. Now, either John is a knight and what he said was true, or John is not a knight and what he said was false. Translating that into Boolean algebra, we get:


(J \wedge (J' \wedge B')) \vee (J' \wedge (J' \wedge B')') = \mbox{true}

Simplification process:

(J \wedge (J' \wedge B')) \vee (J' \wedge (J' \wedge B')')

\mbox{false} \vee (J' \wedge (J' \wedge B')'); \qquad J \wedge J' = \mbox{false}

J' \wedge (J' \wedge B')'; \qquad \mbox{false} \vee X = X

J' \wedge (J \vee B); \qquad \mbox{by De Morgan's theorem}

(J' \wedge J) \vee (J' \wedge B); \qquad \mbox{distributing}

J' \wedge B = \mbox{true}; \qquad J' \wedge J = \mbox{false}

Therefore John is a knave and Bill is a knight. Although most people can solve this puzzle without Boolean algebra, the example still illustrates how Boolean algebra can be applied to logic puzzles.
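The same answer can be found by brute force: try every knight/knave assignment and keep those where John's statement is true exactly when John is a knight.

```python
from itertools import product

# Brute-force solution to Example 1. Knights speak the truth, knaves lie,
# so John's statement "we are both knaves" must be true iff John is a knight.
solutions = []
for john_knight, bill_knight in product([True, False], repeat=2):
    statement = (not john_knight) and (not bill_knight)  # "we are both knaves"
    if statement == john_knight:                          # consistency check
        solutions.append((john_knight, bill_knight))

print(solutions)   # [(False, True)]: John is a knave, Bill is a knight
```

The same loop, with the statement expression swapped out, solves Examples 2 and 3 as well.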
