Logic, from the Greek λογική (logiké),^{[1]} is defined by the Penguin Encyclopedia as "the formal systematic study of the principles of valid inference and correct reasoning".^{[2]} As a discipline, logic dates back to Aristotle, who established its fundamental place in philosophy. It became part of the classical trivium, a fundamental part of a classical education, and is now an integral part of disciplines such as mathematics, computer science, and linguistics.
Logic concerns the structure of statements and arguments, in formal systems of inference and natural language. Topics include validity, fallacies and paradoxes, reasoning using probability and arguments involving causality and time. Logic is also commonly used today in argumentation theory.^{[3]}
The concept of logical form is central to logic: it is generally held that the validity of an argument is determined by its logical form, not by its content. Traditional Aristotelian syllogistic logic and modern symbolic logic are examples of formal logics.
These families generally give logic a similar structure: to establish the relation of the sentences in a topic of interest to their representation in logic through the analysis of logical form and semantics, and to present an account of inference relating these formal propositions.^{[8]}
Logic is generally accepted to be formal, in that it aims to analyse and represent the form (or logical form) of any valid argument type. The form of an argument is displayed by representing its sentences in the formal grammar and symbolism of a logical system to make its content usable in formal inference.
This is known as showing the logical form of the argument. It is necessary because indicative sentences of ordinary language show a considerable variety of form and complexity that makes their use in inference impractical. It requires, first, ignoring those grammatical features which are irrelevant to logic (such as gender and declension if the argument is in Latin), replacing conjunctions which are not relevant to logic (such as 'but') with logical conjunctions like 'and' and replacing ambiguous or alternative logical expressions ('any', 'every', etc.) with expressions of a standard type (such as 'all', or the universal quantifier ∀).
Second, certain parts of the sentence must be replaced with schematic letters. Thus, for example, the expression 'all As are Bs' shows the logical form which is common to the sentences 'all men are mortals', 'all cats are carnivores', 'all Greeks are philosophers' and so on.
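Using the universal quantifier mentioned above, this shared form can be written in modern notation as:

```latex
\forall x\,(A(x) \rightarrow B(x))
```

so that, for instance, 'all men are mortals' becomes $\forall x\,(\mathrm{Man}(x) \rightarrow \mathrm{Mortal}(x))$, with the schematic letters replaced by concrete predicates.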
That the concept of form is fundamental to logic was already recognized in ancient times. Aristotle uses variable letters to represent valid inferences in the Prior Analytics, leading Jan Łukasiewicz to say that the introduction of variables was 'one of Aristotle's greatest inventions'. According to the followers of Aristotle (such as Ammonius), only the logical principles stated in schematic terms belong to logic, and not those given in concrete terms. The concrete terms 'man', 'mortal', etc., are analogous to the substitution values of the schematic placeholders 'A', 'B', 'C', which were called the 'matter' (Greek 'hyle') of the inference.
The fundamental difference between modern formal logic and traditional or Aristotelian logic lies in their differing analysis of the logical form of the sentences they treat.
Deductive reasoning concerns what follows necessarily from given premises. However, inductive reasoning—the process of deriving a reliable generalization from observations—has sometimes been included in the study of logic. Correspondingly, we must distinguish between deductive validity and inductive validity (called "cogency"). An inference is deductively valid if and only if there is no possible situation in which all the premises are true and the conclusion false.
The notion of deductive validity can be rigorously stated for systems of formal logic in terms of the well-understood notions of semantics. Inductive validity on the other hand requires us to define a reliable generalization of some set of observations. The task of providing this definition may be approached in various ways, some less formal than others; some of these definitions may use mathematical models of probability. For the most part this discussion of logic deals only with deductive logic.
Among the important properties that logical systems can have are consistency (no theorem contradicts another), soundness (the proof rules never allow a false inference from true premises), and completeness (every logically valid formula is provable).
Some logical systems do not have all three properties. As an example, Kurt Gödel's incompleteness theorems show that no standard formal system of arithmetic can be both consistent and complete.^{[7]} At the same time, his completeness theorem shows first-order predicate logic with equality, when not extended by specific axioms into an arithmetic formal system, to be complete and consistent.^{[9]}
Logic arose (see below) from a concern with correctness of argumentation. Modern logicians usually wish to ensure that logic studies just those arguments that arise from appropriately general forms of inference. For example, Thomas Hofweber writes in the Stanford Encyclopedia of Philosophy that logic "does not, however, cover good reasoning as a whole. That is the job of the theory of rationality. Rather it deals with inferences whose validity can be traced back to the formal features of the representations that are involved in that inference, be they linguistic, mental, or other representations".^{[10]}
By contrast, Immanuel Kant argued that logic should be conceived as the science of judgment, an idea taken up in Gottlob Frege's logical and philosophical work, where thought (German: Gedanke) is substituted for judgement (German: Urteil). On this conception, the valid inferences of logic follow from the structural features of judgements or thoughts.
The earliest sustained work on the subject of logic is that of Aristotle.^{[11]} In contrast with other traditions, Aristotelian logic became widely accepted in science and mathematics, ultimately giving rise to the formally sophisticated systems of modern logic.
Several ancient civilizations have employed intricate systems of reasoning and asked questions about logic or propounded logical paradoxes. In India, the Nasadiya Sukta of the Rigveda (RV 10.129) contains ontological speculation in terms of various logical divisions that were later recast formally as the four circles of catuṣkoṭi: "A", "not A", "Neither A nor not A", and "Both not A and not not A".^{[12]} The Chinese philosopher Gongsun Long (ca. 325–250 BC) proposed the paradox "One and one cannot become two, since neither becomes two."^{[13]} Also, the Chinese 'School of Names' is recorded as having examined logical puzzles such as "A White Horse is not a Horse" as early as the fifth century BCE.^{[14]} In China, however, the tradition of scholarly investigation into logic was repressed by the Qin dynasty following the legalist philosophy of Han Feizi.
Logic in Islamic philosophy also contributed to the development of modern logic, which included the development of "Avicennian logic"^{[15]} as an alternative to Aristotelian logic. Avicenna's system of logic was responsible for the introduction of hypothetical syllogism,^{[16]} temporal modal logic,^{[17]}^{[18]} and inductive logic.^{[19]}^{[20]} The rise of the Asharite school, however, limited original work on logic in Islamic philosophy, though it did continue into the 15th century and had a significant influence on European logic during the Renaissance.
In India, innovations in the scholastic school, called Nyaya, continued from ancient times into the early 18th century, though it did not survive long into the colonial period. In the 20th century, Western philosophers like Stanislaw Schayer and Klaus Glashoff have tried to explore certain aspects of the Indian tradition of logic.
During the later medieval period, major efforts were made to show that Aristotle's ideas were compatible with Christian faith. During the later period of the Middle Ages, logic became a main focus of philosophers, who would engage in critical logical analyses of philosophical arguments.
The syllogistic logic developed by Aristotle predominated until the mid-nineteenth century, when interest in the foundations of mathematics stimulated the development of symbolic logic (now called mathematical logic). In 1854, George Boole published An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities, introducing symbolic logic and the principles of what is now known as Boolean logic. In 1879, Frege published Begriffsschrift, which inaugurated modern logic with the invention of quantifier notation. In 1903, Alfred North Whitehead and Bertrand Russell published Principia Mathematica^{[6]} on the foundations of mathematics, attempting to derive mathematical truths from axioms and inference rules in symbolic logic. In 1931, Gödel raised serious problems with the foundationalist program, and logic ceased to focus on such issues.
The development of logic since Frege, Russell, and Wittgenstein had a profound influence on the practice of philosophy and the perceived nature of philosophical problems (see Analytic philosophy), and on the philosophy of mathematics. Logic, especially sentential logic, is implemented in computer logic circuits and is fundamental to computer science. Logic is commonly taught by university philosophy departments, often as a compulsory discipline.
The Organon was Aristotle's body of work on logic, with the Prior Analytics constituting the first explicit work in formal logic, introducing the syllogistic. The parts of syllogistic logic, also known as term logic, were the analysis of judgements into propositions consisting of two terms related by one of a fixed number of relations, and the expression of inferences by means of syllogisms: two propositions sharing a common term as premises, and a conclusion that was a proposition involving the two unrelated terms from the premises.
Aristotle's work was regarded in classical times and from medieval times in Europe and the Middle East as the very picture of a fully worked out system. It was not alone: the Stoics proposed a system of propositional logic that was studied by medieval logicians; nor was the perfection of Aristotle's system undisputed; for example the problem of multiple generality was recognised in medieval times. Nonetheless, problems with syllogistic logic were not seen as being in need of revolutionary solutions.
Today, some academics claim that Aristotle's system is generally seen as having little more than historical value (though there is some current interest in extending term logics), regarded as made obsolete by the advent of propositional logic and the predicate calculus. Others use Aristotle in argumentation theory to help develop and critically question argumentation schemes that are used in artificial intelligence and legal arguments.
A propositional calculus or logic (also a sentential calculus) is a formal system in which formulae representing propositions can be formed by combining atomic propositions using logical connectives, and a system of formal proof rules allows certain formulae to be established as "theorems".
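As a minimal sketch of these ideas (the tuple representation and function names here are illustrative, not any standard library's API), a propositional formula can be encoded as nested tuples and established as a "theorem" semantically, by checking it under every truth assignment rather than by formal proof rules:

```python
from itertools import product

# A formula is nested tuples: ('atom', name), ('not', f),
# ('and', f, g), ('or', f, g), or ('implies', f, g).
def evaluate(formula, assignment):
    """Truth value of a formula under an assignment of atoms to booleans."""
    op = formula[0]
    if op == 'atom':
        return assignment[formula[1]]
    if op == 'not':
        return not evaluate(formula[1], assignment)
    if op in ('and', 'or', 'implies'):
        left = evaluate(formula[1], assignment)
        right = evaluate(formula[2], assignment)
        if op == 'and':
            return left and right
        if op == 'or':
            return left or right
        return (not left) or right   # material implication
    raise ValueError(f"unknown connective: {op}")

def is_tautology(formula, atoms):
    """True if the formula holds under every assignment to its atoms."""
    return all(
        evaluate(formula, dict(zip(atoms, values)))
        for values in product([True, False], repeat=len(atoms))
    )

# Modus ponens, ((p implies q) and p) implies q, is a tautology:
p, q = ('atom', 'p'), ('atom', 'q')
modus_ponens = ('implies', ('and', ('implies', p, q), p), q)
```

For classical propositional logic this semantic check agrees with provability, by soundness and completeness, though it enumerates all 2^n assignments rather than constructing a proof.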
Predicate logic is the generic term for symbolic formal systems such as firstorder logic, secondorder logic, manysorted logic, and infinitary logic.
Predicate logic provides an account of quantifiers general enough to express a wide set of arguments occurring in natural language. Aristotelian syllogistic logic specifies a small number of forms that the relevant part of the involved judgements may take. Predicate logic allows sentences to be analysed into subject and argument in several additional ways, thus allowing predicate logic to solve the problem of multiple generality that had perplexed medieval logicians.
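A standard illustration of multiple generality (the sentence and predicate names are a textbook example, not drawn from the sources above) is 'every boy loves some girl', whose two readings differ only in the order of the quantifiers:

```latex
\forall x\,\bigl(\mathrm{Boy}(x) \rightarrow \exists y\,(\mathrm{Girl}(y) \land \mathrm{Loves}(x,y))\bigr)
\qquad \text{versus} \qquad
\exists y\,\bigl(\mathrm{Girl}(y) \land \forall x\,(\mathrm{Boy}(x) \rightarrow \mathrm{Loves}(x,y))\bigr)
```

Syllogistic, which allows only one quantified term per proposition, cannot distinguish the two.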
The development of predicate logic is usually attributed to Gottlob Frege, who is also credited as one of the founders of analytical philosophy, but the formulation of predicate logic most often used today is the first-order logic presented in Principles of Mathematical Logic by David Hilbert and Wilhelm Ackermann in 1928. The analytical generality of predicate logic allowed the formalisation of mathematics, drove the investigation of set theory, and allowed the development of Alfred Tarski's approach to model theory. It provides the foundation of modern mathematical logic.
Frege's original system of predicate logic was second-order, rather than first-order. Second-order logic is most prominently defended (against the criticism of Willard Van Orman Quine and others) by George Boolos and Stewart Shapiro.
In languages, modality deals with the phenomenon that subparts of a sentence may have their semantics modified by special verbs or modal particles. For example, "We go to the games" can be modified to give "We should go to the games", "We can go to the games", and perhaps "We will go to the games". More abstractly, we might say that modality affects the circumstances in which we take an assertion to be satisfied.
The logical study of modality dates back to Aristotle, who was concerned with the alethic modalities of necessity and possibility, which he observed to be dual in the sense of De Morgan duality. While the study of necessity and possibility remained important to philosophers, little logical innovation happened until the landmark investigations of Clarence Irving Lewis in 1918, who formulated a family of rival axiomatizations of the alethic modalities. His work unleashed a torrent of new work on the topic, expanding the kinds of modality treated to include deontic logic and epistemic logic. The seminal work of Arthur Prior applied the same formal language to treat temporal logic and paved the way for the marriage of the two subjects. Saul Kripke discovered (contemporaneously with rivals) his theory of frame semantics, which revolutionised the formal technology available to modal logicians and gave a new graph-theoretic way of looking at modality that has driven many applications in computational linguistics and computer science, such as dynamic logic.
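The graph-theoretic character of frame semantics can be sketched in a few lines (the worlds, relation, and function names below form an illustrative toy, not a standard library): a model is a set of worlds, an accessibility relation between them, and a valuation saying which atoms hold where; necessity quantifies over all accessible worlds, possibility over some.

```python
# A toy Kripke model: an accessibility relation between worlds, and a
# valuation saying which atomic propositions hold at which world.
access = {'w1': {'w2', 'w3'}, 'w2': {'w2'}, 'w3': set()}
valuation = {'w1': {'p'}, 'w2': {'p'}, 'w3': set()}

def holds(world, formula):
    """Truth of a modal formula (nested tuples) at a world."""
    op = formula[0]
    if op == 'atom':
        return formula[1] in valuation[world]
    if op == 'not':
        return not holds(world, formula[1])
    if op == 'box':      # necessity: true at every accessible world
        return all(holds(v, formula[1]) for v in access[world])
    if op == 'diamond':  # possibility: true at some accessible world
        return any(holds(v, formula[1]) for v in access[world])
    raise ValueError(f"unknown operator: {op}")
```

At w1, "possibly p" holds (p is true at the accessible world w2) while "necessarily p" fails (p is false at w3); at a world with no successors, "necessarily p" holds vacuously.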
The motivation for the study of logic in ancient times was clear: it is so that one may learn to distinguish good from bad arguments, and so become more effective in argument and oratory, and perhaps also to become a better person. Half of the works of Aristotle's Organon treat inference as it occurs in an informal setting, side by side with the development of the syllogistic, and in the Aristotelian school, these informal works on logic were seen as complementary to Aristotle's treatment of rhetoric.
This ancient motivation is still alive, although it no longer takes centre stage in the picture of logic; typically dialectical logic will form the heart of a course in critical thinking, a compulsory course at many universities.
Argumentation theory is the study and research of informal logic, fallacies, and critical questions as they relate to every day and practical situations. Specific types of dialogue can be analyzed and questioned to reveal premises, conclusions, and fallacies. Argumentation theory is now applied in artificial intelligence and law.
Mathematical logic really refers to two distinct areas of research: the first is the application of the techniques of formal logic to mathematics and mathematical reasoning, and the second, in the other direction, the application of mathematical techniques to the representation and analysis of formal logic.^{[21]}
The earliest use of mathematics and geometry in relation to logic and philosophy goes back to the ancient Greeks such as Euclid, Plato, and Aristotle.^{[22]} Many other ancient and medieval philosophers applied mathematical ideas and methods to their philosophical claims.^{[23]}
The boldest attempt to apply logic to mathematics was undoubtedly the logicism pioneered by philosopher-logicians such as Gottlob Frege and Bertrand Russell: the idea was that mathematical theories were logical tautologies, and the programme was to show this by means of a reduction of mathematics to logic.^{[6]} The various attempts to carry this out met with a series of failures, from the crippling of Frege's project in his Grundgesetze by Russell's paradox, to the defeat of Hilbert's program by Gödel's incompleteness theorems.
Both the statement of Hilbert's program and its refutation by Gödel depended upon their work establishing the second area of mathematical logic, the application of mathematics to logic in the form of proof theory.^{[24]} Despite the negative nature of the incompleteness theorems, Gödel's completeness theorem, a result in model theory and another application of mathematics to logic, can be understood as showing how close logicism came to being true: every rigorously defined mathematical theory can be exactly captured by a first-order logical theory; Frege's proof calculus is enough to describe the whole of mathematics, though not equivalent to it. Thus we see how complementary the two areas of mathematical logic have been.
If proof theory and model theory have been the foundation of mathematical logic, they have been but two of the four pillars of the subject. Set theory originated in the study of the infinite by Georg Cantor, and it has been the source of many of the most challenging and important issues in mathematical logic, from Cantor's theorem, through the status of the Axiom of Choice and the question of the independence of the continuum hypothesis, to the modern debate on large cardinal axioms.
Recursion theory captures the idea of computation in logical and arithmetic terms; its most classical achievements are the undecidability of the Entscheidungsproblem by Alan Turing, and his presentation of the Church–Turing thesis.^{[25]} Today recursion theory is mostly concerned with the more refined problem of complexity classes — when is a problem efficiently solvable? — and the classification of degrees of unsolvability.^{[26]}
Philosophical logic deals with formal descriptions of natural language. Most philosophers assume that the bulk of "normal" proper reasoning can be captured by logic, if one can find the right method for translating ordinary language into that logic. Philosophical logic is essentially a continuation of the traditional discipline that was called "Logic" before the invention of mathematical logic. Philosophical logic has a much greater concern with the connection between natural language and logic. As a result, philosophical logicians have contributed a great deal to the development of nonstandard logics (e.g., free logics, tense logics) as well as various extensions of classical logic (e.g., modal logics), and nonstandard semantics for such logics (e.g., Kripke's technique of supervaluations in the semantics of logic).
Logic and the philosophy of language are closely related. Philosophy of language has to do with the study of how our language engages and interacts with our thinking. Logic has an immediate impact on other areas of study. Studying logic and the relationship between logic and ordinary speech can help a person better structure their own arguments and critique the arguments of others. Many popular arguments are filled with errors because so many people are untrained in logic and unaware of how to correctly formulate an argument.
Logic cut to the heart of computer science as it emerged as a discipline: Alan Turing's work on the Entscheidungsproblem followed from Kurt Gödel's work on the incompleteness theorems, and the notion of general purpose computers that came from this work was of fundamental importance to the designers of the computer machinery in the 1940s.
In the 1950s and 1960s, researchers predicted that when human knowledge could be expressed using logic with mathematical notation, it would be possible to create a machine that reasons, or artificial intelligence. This turned out to be more difficult than expected because of the complexity of human reasoning. In logic programming, a program consists of a set of axioms and rules. Logic programming systems such as Prolog compute the consequences of the axioms and rules in order to answer a query.
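The flavour of computing consequences from axioms and rules can be sketched with a toy forward-chaining engine over ground facts (real Prolog instead answers queries by backward chaining with unification over variables; the fact strings and rule format here are illustrative):

```python
def consequences(facts, rules):
    """Close a set of ground facts under rules of the form
    (set_of_premises, conclusion), applying them until nothing new appears."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and premises <= derived:
                derived.add(conclusion)
                changed = True
    return derived

facts = {'human(aristotle)'}
rules = [
    ({'human(aristotle)'}, 'mortal(aristotle)'),
    ({'mortal(aristotle)'}, 'dies(aristotle)'),
]
```

Starting from the single axiom and the two rules, the engine derives the chained conclusions in turn, illustrating how a logic-programming system computes answers from axioms and rules.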
Today, logic is extensively applied in the fields of artificial intelligence, and computer science, and these fields provide a rich source of problems in formal and informal logic. Argumentation theory is one good example of how logic is being applied to artificial intelligence. The ACM Computing Classification System in particular regards:
Furthermore, computers can be used as tools for logicians. For example, in symbolic logic and mathematical logic, proofs by humans can be computerassisted. Using automated theorem proving the machines can find and check proofs, as well as work with proofs too lengthy to be written out by hand.
Just as we have seen there is disagreement over what logic is about, so there is disagreement about what logical truths there are.
The logics discussed above are all "bivalent" or "twovalued"; that is, they are most naturally understood as dividing propositions into true and false propositions. Nonclassical logics are those systems which reject bivalence.
Hegel developed his own dialectic logic that extended Kant's transcendental logic but also brought it back to ground by assuring us that "neither in heaven nor in earth, neither in the world of mind nor of nature, is there anywhere such an abstract 'either–or' as the understanding maintains. Whatever exists is concrete, with difference and opposition in itself".^{[27]}
In 1910, Nicolai A. Vasiliev rejected the law of excluded middle and the law of contradiction, and proposed the law of excluded fourth and a logic tolerant to contradiction. In the early 20th century, Jan Łukasiewicz investigated the extension of the traditional true/false values to include a third value, "possible", so inventing ternary logic, the first multi-valued logic.
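Łukasiewicz's three-valued tables can be written down directly (representing the values as 0, 1/2, and 1 is a common convention; the function names are ours):

```python
from fractions import Fraction

HALF = Fraction(1, 2)  # the third value, "possible"

def l_not(a):
    return 1 - a

def l_and(a, b):
    return min(a, b)

def l_or(a, b):
    return max(a, b)

def l_implies(a, b):
    # Łukasiewicz implication: full truth unless the antecedent
    # exceeds the consequent, in which case the gap is deducted.
    return min(Fraction(1), 1 - a + b)
```

Notably, the law of excluded middle fails here: if a is "possible", then a or not-a evaluates to 1/2 rather than to true.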
Logics such as fuzzy logic have since been devised with an infinite number of "degrees of truth", represented by a real number between 0 and 1.^{[28]}
Intuitionistic logic was proposed by L.E.J. Brouwer as the correct logic for reasoning about mathematics, based upon his rejection of the law of the excluded middle as part of his intuitionism. Brouwer rejected formalisation in mathematics, but his student Arend Heyting studied intuitionistic logic formally, as did Gerhard Gentzen. Intuitionistic logic has come to be of great interest to computer scientists, as it is a constructive logic, and is hence a logic of what computers can do.
Modal logic is not truth-conditional, and so it has often been proposed as a nonclassical logic. However, modal logic is normally formalised with the principle of the excluded middle, and its relational semantics is bivalent, so this inclusion is disputable.
What is the epistemological status of the laws of logic? What sort of argument is appropriate for criticising purported principles of logic? In an influential paper entitled "Is logic empirical?"^{[29]} Hilary Putnam, building on a suggestion of W.V. Quine, argued that in general the facts of propositional logic have a similar epistemological status as facts about the physical universe, for example as the laws of mechanics or of general relativity, and in particular that what physicists have learned about quantum mechanics provides a compelling case for abandoning certain familiar principles of classical logic: if we want to be realists about the physical phenomena described by quantum theory, then we should abandon the principle of distributivity, substituting for classical logic the quantum logic proposed by Garrett Birkhoff and John von Neumann.^{[30]}
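The classical principle at issue is the distributive law,

```latex
a \land (b \lor c) \;=\; (a \land b) \lor (a \land c),
```

which fails in the Birkhoff–von Neumann quantum logic, where propositions are modelled by closed subspaces of a Hilbert space rather than by a Boolean algebra.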
Another paper by the same name by Sir Michael Dummett argues that Putnam's desire for realism mandates the law of distributivity.^{[31]} Distributivity of logic is essential for the realist's understanding of how propositions are true of the world in just the same way as he has argued the principle of bivalence is. In this way, the question, "Is logic empirical?" can be seen to lead naturally into the fundamental controversy in metaphysics on realism versus antirealism.
It is obvious that the notion of implication formalised in classical logic does not comfortably translate into natural language by means of "if… then…", due to a number of problems called the paradoxes of material implication.
The first class of paradoxes involves counterfactuals, such as "If the moon is made of green cheese, then 2+2=5", which are puzzling because natural language does not support the principle of explosion. Eliminating this class of paradoxes was the reason for C. I. Lewis's formulation of strict implication, which eventually led to more radically revisionist logics such as relevance logic.
The second class of paradoxes involves redundant premises, falsely suggesting that we know the succedent because of the antecedent: thus "if that man gets elected, granny will die" is materially true if granny happens to be in the last stages of a terminal illness, regardless of the man's election prospects. Such sentences violate the Gricean maxim of relevance, and can be modelled by logics that reject the principle of monotonicity of entailment, such as relevance logic.
Hegel was deeply critical of any simplified notion of the Law of Non-Contradiction. It was based on Leibniz's idea that this law of logic also requires a sufficient ground to specify from what point of view (or time) one says that something cannot contradict itself: a building, for example, both moves and does not move; the ground for the first is our solar system, for the second the earth. In Hegelian dialectic, the law of non-contradiction, of identity, itself relies upon difference and so is not independently assertable.
Closely related to questions arising from the paradoxes of implication comes the suggestion that logic ought to tolerate inconsistency. Relevance logic and paraconsistent logic are the most important approaches here, though the concerns are different: a key consequence of classical logic and some of its rivals, such as intuitionistic logic, is that they respect the principle of explosion, which means that the logic collapses if it is capable of deriving a contradiction. Graham Priest, the main proponent of dialetheism, has argued for paraconsistency on the grounds that there are, in fact, true contradictions.^{[32]}
The philosophical vein of various kinds of skepticism contains many kinds of doubt and rejection of the various bases upon which logic rests, such as the idea of logical form, correct inference, or meaning, typically leading to the conclusion that there are no logical truths. Observe that this is opposite to the usual views in philosophical skepticism, where logic directs skeptical enquiry to doubt received wisdoms, as in the work of Sextus Empiricus.
Friedrich Nietzsche provides a strong example of the rejection of the usual basis of logic: his radical rejection of idealisation led him to reject truth as "a mobile army of metaphors, metonyms, and anthropomorphisms—in short ... metaphors which are worn out and without sensuous power; coins which have lost their pictures and now matter only as metal, no longer as coins".^{[33]} His rejection of truth did not lead him to reject the idea of either inference or logic completely, but rather suggested that "logic [came] into existence in man's head [out] of illogic, whose realm originally must have been immense. Innumerable beings who made inferences in a way different from ours perished".^{[34]} Thus there is the idea that logical inference has a use as a tool for human survival, but that its existence does not support the existence of truth, nor does it have a reality beyond the instrumental: "Logic, too, also rests on assumptions that do not correspond to anything in the real world".^{[35]}


Logic is the science of reasoning. Logic helps people decide whether something is true or false.
A popular example, given by Aristotle:

All men are mortal.
Aristotle is a man.
Therefore, Aristotle is mortal.
$\land$ is read like "and", meaning both of the two. $\lor$ is read like "or", meaning at least one of the two. $\Rightarrow$ is read like "implies", or "If ... then ...". $\lnot$ is read like "not", or "it is not the case that ...". Parentheses ( and ) are added for clarity; this means that what is in parentheses should be looked at before the things outside.
This is the same example using logic symbols: (Aristotle is a man $\land$ all men are mortal) $\Rightarrow$ Aristotle is mortal.
And this is the same example using general terms: (a is a B $\land$ all Bs are Cs) $\Rightarrow$ a is a C.
Finally, those talking about logic talk about logic clauses. A clause is simply something like "Aristotle is human" or "all humans are mortal". Clauses have a truth value; they are either true or false, but not both. Mistakes in logic are called "fallacies".
A logical proof is a series of logical clauses that use ideas which are already proven to be either true or false ("Aristotle is mortal," "all mortals die") to prove that another logical clause is true or false ("Aristotle has died or will die").
There are statements that are always true.
$(a \lor \lnot a)$ is always true. It is called a tautology. (For example: "It rains, or it does not rain.")
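That a formula is a tautology can be checked mechanically by trying every combination of truth values (a small sketch; the helper name is ours):

```python
from itertools import product

def always_true(formula, n):
    """Check a boolean function of n variables against all 2**n inputs."""
    return all(formula(*values) for values in product([True, False], repeat=n))

# "It rains, or it does not rain" -- a tautology:
excluded_middle = lambda a: a or not a
```

Here always_true(excluded_middle, 1) is True, while a bare variable such as lambda a: a is not always true.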
Logic is used by computers in what is called an algorithm. An algorithm is sort of like a cooking recipe; it tells the computer what to do and when to do it.
Logic is used in mathematics. People who study math create proofs that use logic to show that math facts are correct. There is an area of mathematics called mathematical logic that studies logic using mathematics.
Logic is also studied in philosophy.