On Information

Concepts in Information Management – blog by Ronald Fuller

Comments on the Decline of Logic Education

I recently asked a number of logicians, historians of logic, and logic enthusiasts the following question: 


When exactly and why did logic stop being a core requirement for every single educated person in the west and become seen as a technical, niche elective that only a tiny fraction of educated people know anything about?


I received two responses in the Ontolog Forum:


John Sowa said: "I blame Bertrand Russell. He wanted schools to stop teaching traditional logic and replace it with symbolic logic.  He got 50% of what he asked for."


Chris Menzel said: "I attribute this far more to the utter havoc wrought upon higher education by conservative, specifically brainless Republican, politicians. They've managed to transmogrify our once glorious system of state universities to a collection of education 'dealerships' whose purpose is to provide a 'service' to their 'clients' that guarantees them a high paying job in business or industry. The on-going gutting of the liberal arts has been a sad consequence of this."


I received numerous responses to the same question at Academia.edu:


John Corcoran said: "I have never given this any thought, but it is an interesting question. One thing to bear in mind is that over the years logic got competition from 'critical thinking' and kindred subjects."  


The discussion below is from John Corcoran's session titled CORCORAN ON LOGIC TEACHING IN THE 21ST CENTURY at Academia.edu:


Ronald Fuller
When exactly and why did logic stop being a core requirement for every single educated person in the west and become a technical, niche elective that only a tiny fraction of educated people know anything about? John Sowa blames Bertrand Russell, Chris Menzel blames Republican politicians (both views expressed recently in the Ontolog Forum). Others blame the progressive education movement. This seems to be an important question, but it's hard to find anything written about it. What do you say?



H. E. Baber
My university has just dropped the logic requirement. Why? Because it doesn't sell. I'm sitting here at academic assembly after a rousing speech by our dean telling us we had to worry about the possibility of a decline in enrollment and so had to work ever harder to sell ourselves by offering gimmicky junk and renaming courses with sexy names, etc. It isn't the politicians, or the progressive education movement: it is the market.



Edward Macierowski
I'd like to sketch an answer to Fuller's question: "When exactly and why did logic stop being a core requirement for every single educated person in the west and become a technical, niche elective that only a tiny fraction of educated people know anything about? ... This seems to be an important question but it's hard to find anything written about it."


First, let's look at the terms of the argument: "core requirement" and "elective" seem to presuppose an education system built upon something like "majors" and "minors," instead of a largely fixed undergraduate curriculum. How might such a division into "majors" and "minors" have developed? The old-fashioned American liberal arts college, usually of religious origin, was generally concerned with educating students in the moral, intellectual, and theological virtues with a view to cultivating a citizen able to make competent practical judgments. The bachelor of arts degree was generally a terminal degree, except for those called to one of the learned professions--law, medicine, or preaching. Accordingly, a common curriculum, largely dealing with classical and modern literatures, philosophy or theology, and perhaps mathematics, seemed to provide an encyclopedic common vision of the whole.


Under the influence of the impressive scientific progress of the specialized German research universities, Johns Hopkins University and then most others began to adopt the German model of specialization. Unlike the Germans, however, who provided education in the liberal arts at the Gymnasium, the Americans left the liberal arts to be studied, more or less, in the bachelor's programs. With the new emphasis on specialized research and the creation of the Ph.D. degree, undergraduate education began to be re-oriented from teaching the liberal arts to preparing students for research specialties. The elective system introduced under President Eliot of Harvard was widely, and eventually almost universally, adopted. I believe it is against this background that we see the development of "majors" and "minors." In 1945, with Harvard's "red book," a proposal for "general education" was introduced, perhaps to accommodate the needs of large numbers of returning veterans now eligible to attend college.


Against this background, we can appreciate the tensions between general education (before selecting a "major") and the specialized education involved in a "major" subject. Sometimes the notion of "liberal arts" was identified with some part of "general education," which might come to mean just about anything except what a student chose to "major" in. Accordingly, in the worst case, the notion of "liberal arts" could be defined negatively as "whatever does not belong to my major," or even voluntaristically as "whatever I want aside from my major." In this setting, the notion of "liberal arts" would be evacuated of any specific intellectual content.


Here I'd like to call attention to a textbook by Professor John A. Oesterle of the University of Notre Dame, "Logic: The Art of Defining and Reasoning." The first edition came out in 1953, the second in 1963. One of the exercises at the end of chapter 19 calls for expanding an argument into explicit categorical syllogisms and checking it for accuracy. Let's consider this one:


"Every liberal art is necessary, for every liberal art is ordered to acquiring knowledge. Logic is a liberal art. Therefore, logic is necessary."


Obviously, the missing major premise for the prosyllogism is "Everything ordered to acquiring knowledge is necessary." Professor Oesterle felt no need to justify the minor premise of his main argument: "Logic is a liberal art."
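As an aside not in Oesterle's text: both syllogisms in the expansion have the classical form Barbara (All M are P; All S are M; therefore All S are P), and the validity of that form can be checked mechanically by searching every interpretation over a small finite universe for a counterexample. The function name and universe size below are illustrative choices, not anything from the textbook:

```python
from itertools import product

def syllogism_valid(universe_size=3):
    """Check Barbara (All M are P; All S are M; therefore All S are P)
    by enumerating every extension of S, M, and P over a finite universe
    and searching for a model where the premises hold but the conclusion fails."""
    universe = range(universe_size)
    extensions = list(product([False, True], repeat=universe_size))
    for S, M, P in product(extensions, repeat=3):
        all_M_are_P = all(P[i] for i in universe if M[i])
        all_S_are_M = all(M[i] for i in universe if S[i])
        all_S_are_P = all(P[i] for i in universe if S[i])
        if all_M_are_P and all_S_are_M and not all_S_are_P:
            return False  # counterexample found: the form would be invalid
    return True  # no counterexample: the form holds in every model tested

print(syllogism_valid())  # True
```

Of course, exhaustive search over a finite universe is only a sanity check, not a proof for all universes; the traditional theory of the syllogism establishes Barbara's validity once and for all.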


Why not? At Notre Dame, he could take it for granted as a matter of common knowledge that the medieval liberal arts would have included a "trivium" of grammar, rhetoric, and logic, and a "quadrivium" of arithmetic, geometry, music, and astronomy. This indicates that logic does have a definite intellectual content.


When people talk about the “liberal arts,” are they using this term univocally or equivocally? Most people in the United States today, following the 1945 Harvard "Report on General Education in a Free Society," have come to regard “liberal arts” education as “general education,” what might crudely be described as “any course I take outside my major.” Some of my students, however, were surprised by my calling to their attention a larger tradition that includes a trivium of studies called grammar, rhetoric, and logic, and a quadrivium of arithmetic, geometry, music, and astronomy. The notion that the “liberal or freeing” arts have a definite content suggests that the word is used equivocally when people use it to mean “anything but my major subject.”


These students wanted to know where they could learn more about the classical liberal arts, understood as having a definite content: verbal and mathematical skills. So I propose here, for those who are interested, some places where they can learn more about these arts. The shortest competent treatment I have found in English is the New Catholic Encyclopedia article “Liberal Arts” by Father Benedict Ashley, O.P. This article provides an “executive summary” of the content, history, and controversies about the liberal arts. It also provides cross-references to related key topics covered in the NCE. This article was so good that it was included in the new edition without change. Since then, however, the important lecture “History and the Liberal Arts” that it mentioned has been published in Jacob Klein: Lectures and Essays (Annapolis, 1985; ISBN 0-9603690-2-3).


For those who want a more comprehensive study of the issue, see Conway and Ashley’s article “The Liberal Arts in St. Thomas Aquinas,” published in The Thomist vol. 22 (1959).


For a massive treatment of the liberal arts tradition in the Middle Ages, see "Arts Libéraux et Philosophie au Moyen Âge" (Montreal and Paris, 1969).



John Corcoran
Many thanks for your well-researched and thoughtful contribution.

As a personal aside, I was an undergraduate engineering major at Hopkins in the late 1950s. Almost every student took logic as an elective. The textbook was the 1934 Cohen-Nagel masterpiece. When I started teaching logic in the mid-1960s I used Mates 1965 first, then Lemmon 1965, then Tarski 1946 until I realized that beginning students needed a more liberal-arts approach before they could appreciate the brilliance of Mates, Lemmon, Tarski, etc. My solution was to use selections from Cohen-Nagel, Lemmon, and Tarski all available in inexpensive paperbacks.

If I remember correctly, the only course required of all Hopkins students was English Composition, which was largely aimed at polishing students' prose so the graduates would not embarrass themselves by using "uneducated" language.



Bhupinder Singh Anand
Ronald Fuller aptly notes:


"When exactly and why did logic stop being a core requirement for every single educated person in the west and become a technical, niche elective that only a tiny fraction of educated people know anything about? ... This seems to be an important question but it's hard to find anything written about it."


Yes, it is indeed hard to accept that, for over a hundred years now, logic has been taught:


(a) to an academic elite (see the remarks below on the teaching of Set Theory as a core requirement in mathematics and logic), as intellectually challenging philosophical science fiction;

instead of:


(b) to the laity as the scientific assignment of truth values to the grammatically correct propositions of a language of adequate representation and unambiguous communication, whose evolution is dictated by the needs of a pragmatic philosophy responsible for abstracting a coherent perspective of the external world for applied scientists (whose concern is our sensory observations of a 'common' external world).


Inevitably, such teaching is difficult to justify economically, and to sustain as a core educational requirement, in a globalising and increasingly egalitarian educational environment.




For instance, in a brief but provocative review of what they term "the enduring evolution of logic" over the ages, the authors of Oxford University Press's recently released 'A Dictionary of Logic', philosophers Thomas Ferguson and Graham Priest, take to task the manner, which they view as Kant-influenced, in which logic is taught as a first course in most places in the world:


"... as usually ahistorical and somewhat dogmatic. This is what logic is; just learn the rules. It is as if Frege had brought down the tablets from Mount Sinai: the result is God-given, fixed, and unquestionable."




However, Ferguson and Priest reveal their conventional heritage by concluding that:


"Logic provides a theory, or set of theories, about what follows from what, and why. And like any theoretical inquiry, it has evolved, and will continue to do so. It will surely produce theories of greater depth, scope, subtlety, refinement—and maybe even truth."


It is not obvious whether the 'maybe' is prescient optimism, or a tongue-in-cheek exit throwaway!




For, if anything, the developments in logic since 1931 have---seemingly in gross violation of the hallowed principle of Ockham's razor, and of its crude but highly effective modern avatar, KISS---indeed produced a 'plethora of theories of great depth, scope, subtlety, and refinement'.


These, however, seem to have more in common with the cynical twentieth-century emphasis on a subjective, unverifiable 'truth' than with the concept of an objective, evidence-based 'truth' that centuries of philosophers and mathematicians strenuously struggled to differentiate and express.


A struggle whose dilemma was reflected so eloquently in this nineteenth century satirical quote:


"When I use a word," Humpty Dumpty said, in rather a scornful tone, "it means just what I choose it to mean—neither more nor less."


"The question is," said Alice, "whether you can make words mean so many different things."


"The question is," said Humpty Dumpty, "which is to be master—that’s all."


... Lewis Carroll (Charles L. Dodgson), 'Through the Looking-Glass', chapter 6, p. 205 (1934 ed.). First published in 1872.


... http://www.bartleby.com/73/2019.html




It was, indeed, an epic struggle, which culminated in the nineteenth-century standards of rigour successfully imposed---in no small measure by the works of Augustin-Louis Cauchy and Karl Weierstrass---on verifiable interpretations of mathematical propositions concerning infinite processes involving real numbers.


A struggle, moreover, which should have culminated equally successfully in similar twentieth-century standards---on the verifiable interpretation of mathematical propositions containing references to infinite computations involving integers---that Alan Turing sought to impose upon philosophical and mathematical discourse in 1936.




For it follows from Turing's 1936 reasoning that where quantification is not, or cannot be, explicitly defined in formal logical terms---e.g., the classical expression of the Liar paradox as 'This sentence is a lie'---a paradox cannot per se be considered as posing serious linguistic or philosophical concerns.


Of course, as reflected implicitly in Kurt Goedel's seminal 1931 paper on undecidable arithmetical propositions, it would be a matter of serious concern if the word 'This' in the English sentence 'This sentence is a lie' could validly be viewed as implying that:


(i) there is a constructive infinite enumeration of English language sentences;


(ii) to each of which a truth-value can be constructively assigned by the rules of a two-valued logic; and,


(iii) in which 'This' refers uniquely to a particular sentence in the enumeration.




However, Turing's constructive perspective had the misfortune of being subverted by a knee-jerk, anti-establishment culture that was---and apparently remains to this day---overwhelmed and misled by Goedel's powerful Platonic---and essentially unverifiable---mathematical and philosophical 1931 interpretation of his own work (i.e., the construction of an arithmetical proposition that is formally unprovable, but undeniably true under any definition of 'truth' in any interpretation of arithmetic over the natural numbers).


Otherwise, I believe that Turing could easily have provided the necessary constructive interpretations of arithmetical truth---sought by David Hilbert for establishing the consistency of number theory finitarily---which are addressed by the following paper, due to appear in the December 2016 issue of 'Cognitive Systems Research':


'The Truth Assignments That Differentiate Human Reasoning From Mechanistic Reasoning: The evidence-based argument for Lucas’ Gödelian thesis'






The paper endorses the implicit orthodoxy of an Ockham's razor influenced perspective---which Ferguson and Priest apparently find wanting---that logic is simply a deterministic set of rules that must constructively assign the truth values of 'truth/falsity' to the sentences of a language.


It is a view expressed earlier, as the key to a possible resolution of the EPR paradox, in the following paper that was presented at the workshop on 'Emergent Computational Logics' at UNILOG'2015, Istanbul, Turkey: 'Algorithmically Verifiable Logic vis a vis Algorithmically Computable Logic: Could resolving EPR need two complementary Logics?'




The paper answered the question 'What is logic?' by introducing the explicit definition:


A finite set Lambda of rules is a Logic of a formal mathematical language L if, and only if, Lambda constructively assigns unique truth-values:


(a) Of provability/unprovability to the formulas of L; and


(b) Of truth/falsity to the sentences of the Theory T(U) which is defined semantically by the Lambda-interpretation of L over a structure U.
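As an illustrative aside, not taken from the paper itself: the spirit of this definition, a finite set of rules that constructively assigns a unique truth value to every sentence under a given structure, can be sketched for a toy propositional language standing in for L. The function name, the tuple encoding of sentences, and the example structure below are all assumptions of the sketch:

```python
def evaluate(sentence, structure):
    """Constructively assign a unique truth value to each sentence of a toy
    propositional language, playing the role of the Lambda-interpretation of L
    over a structure U. Sentences are nested tuples: ('atom', name),
    ('not', s), ('and', s1, s2), ('or', s1, s2), ('implies', s1, s2)."""
    op = sentence[0]
    if op == "atom":
        return structure[sentence[1]]  # truth value supplied by the structure
    if op == "not":
        return not evaluate(sentence[1], structure)
    if op == "and":
        return evaluate(sentence[1], structure) and evaluate(sentence[2], structure)
    if op == "or":
        return evaluate(sentence[1], structure) or evaluate(sentence[2], structure)
    if op == "implies":
        return (not evaluate(sentence[1], structure)) or evaluate(sentence[2], structure)
    raise ValueError(f"unknown connective: {op}")

# A structure U assigning truth values to the atoms:
U = {"p": True, "q": False}
print(evaluate(("implies", ("atom", "p"), ("atom", "q")), U))  # False
```

The point of the sketch is only that such rules are deterministic and algorithmically computable: every well-formed sentence receives exactly one truth value under a given structure, with no appeal to intuition.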


It showed that such a definitional rule-based approach to 'logic' and 'truth' allows us to:


* Equate the provable formulas of the first-order Peano Arithmetic PA with the PA formulas that can be evidenced as 'true' under an algorithmically computable interpretation of PA over the structure N of the natural numbers;


* Adequately represent some of the philosophically troubling abstractions of the physical sciences mathematically;


* Interpret such representations unambiguously; and


* Conclude further:


* First that the concept of infinity is an emergent feature of any mechanical intelligence whose true arithmetical propositions are provable in the first-order Peano Arithmetic; and


* Second that discovery and formulation of the laws of quantum physics lies within the algorithmically computable logic and reasoning of a mechanical intelligence whose logic is circumscribed by the first-order Peano Arithmetic.




The broader significance of Ferguson and Priest's criticism is seen if we note how the seemingly innocuous perception of Set Theory---as the preferred language of mathematical communication---taught to an aspiring physical scientist is at variance with the reality of what a physical scientist actually requires from a language of adequate representation and unambiguous communication.


For instance, in a recent post---Large Countable Ordinals (Part 1)---on his Azimuth Blog, mathematical physicist John Baez confesses to a passionate urge to write a series of blog posts---which might even eventually yield a book---about the infinite, reflecting both his fascination with, and frustration at, the challenges involved in formally denoting, and talking meaningfully about, different sizes of infinity in a Set Theory such as ZFC:


"I love the infinite. ... It may not exist in the physical world, but we can set up rules to think about it in consistent ways, and then it’s a helpful concept. ... Cantor’s realization that there are different sizes of infinity is ... part of the everyday bread and butter of mathematics."






However, whether we can, and even whether we should, think at the university level about the "different sizes of infinity" that don't "exist in the physical world" in "consistent ways", and to what extent such a concept is "helpful" to students, are issues that need to be addressed from an objective, evidence-based, computational perspective in addition to the conventional self-evident, intuition-based, classical perspective on formal axiomatic theories.


REASON: It follows from the above-cited CSR paper that ZF (which is ZFC without the axiom of Choice) axiomatically postulates the existence of an infinite set (a completed infinity) which cannot be evidenced as 'true' under any putative interpretation of ZF.


Since the paper shows that the first order Peano Arithmetic PA is a language of adequate representation and unambiguous communication of mathematically expressible propositions, it would follow in Russell's colourful phraseology that:


(i) In the first-order Peano Arithmetic PA we always know what we are talking about, even though we may not always know whether it is true or not;


(ii) In the first-order Set Theory we never know what we are talking about, so the question of whether or not it is true is only of notional interest.


Which raises the issue not only of whether we can think about the different sizes of infinity in a consistent way, but also of the extent to which we may need to justify that teaching such a concept is helpful to an emerging student of mathematics. Kind regards, Bhup

