Argumentation theory is the interdisciplinary study of how conclusions can be supported or undermined by premises through logical reasoning. With historical origins in logic, dialectic, and rhetoric, argumentation theory includes the arts and sciences of civil debate, dialogue, conversation, and persuasion. It studies rules of inference, logic, and procedural rules in both artificial and real-world settings.[1][2]
Argumentation includes various forms of dialogue such as deliberation and negotiation, which are concerned with collaborative decision-making procedures.[3] It also encompasses eristic dialogue, the branch of social debate in which victory over an opponent is the primary goal, and didactic dialogue used for teaching.[2] This discipline also studies the means by which people can express and rationally resolve or at least manage their disagreements.[4]
Argumentation is a daily occurrence, such as in public debate, science, and law.[5] In law, for example, judges, parties, and prosecutors use argumentation in presenting and testing the validity of evidence. Argumentation scholars also study the post hoc rationalizations by which organizational actors try to justify decisions they have made irrationally.
Argumentation is one of four rhetorical modes (also known as modes of discourse), along with exposition, description, and narration.
Some key components of argumentation are:
For example, consider the following exchange, illustrating the No true Scotsman fallacy:
In this dialogue, the proposer first offers a premise, the interlocutor challenges it, and the proposer then modifies the premise in a way designed only to evade the challenge.
Typically an argument has an internal structure, comprising the following:
An argument has one or more premises and one conclusion.
Often classical logic is used as the method of reasoning so that the conclusion follows logically from the assumptions or support. One challenge is that, in classical logic, anything follows logically from an inconsistent set of assumptions (the principle of explosion). Therefore, it is common to insist that the set of assumptions be consistent. It is also good practice to require the set of assumptions to be the minimal set, with respect to set inclusion, necessary to infer the consequent. Such arguments are called MINCON arguments, short for minimal consistent. Such argumentation has been applied to the fields of law and medicine.
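The MINCON idea can be made concrete with a toy propositional sketch. The Python code below is our own minimal illustration, not drawn from the MINCON literature: the two-variable vocabulary, the encoding of formulas as Boolean functions, and all helper names are invented for the example. It brute-forces truth assignments to check that a premise set is consistent, entails the conclusion, and is minimal with respect to set inclusion.

from itertools import combinations, product

# Toy propositional language: formulas are functions from a truth
# assignment (a dict of variable -> bool) to a bool.
VARS = ["rains", "wet"]

def all_assignments():
    for values in product([True, False], repeat=len(VARS)):
        yield dict(zip(VARS, values))

def consistent(premises):
    # A set of formulas is consistent if some assignment satisfies all of them.
    return any(all(p(a) for p in premises) for a in all_assignments())

def entails(premises, conclusion):
    # Premises entail the conclusion if no assignment makes every premise
    # true while making the conclusion false.
    return all(conclusion(a) for a in all_assignments()
               if all(p(a) for p in premises))

def is_mincon(premises, conclusion):
    # MINCON: consistent, entails the conclusion, and no proper subset
    # of the premises already entails it.
    if not consistent(premises) or not entails(premises, conclusion):
        return False
    return not any(entails(subset, conclusion)
                   for n in range(len(premises))
                   for subset in combinations(premises, n))

rains = lambda a: a["rains"]
rain_makes_wet = lambda a: (not a["rains"]) or a["wet"]  # rains -> wet
wet = lambda a: a["wet"]

# {rains, rains -> wet} is a minimal consistent support for "wet".
print(is_mincon([rains, rain_makes_wet], wet))  # True

Real MINCON systems work with richer logics and proof procedures; exhaustive enumeration like this is feasible only for toy vocabularies.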
A non-classical approach to argumentation investigates abstract arguments, where 'argument' is considered a primitive term, so no internal structure of arguments is taken into account.[citation needed]
In its most common form, argumentation involves an individual and an interlocutor or opponent engaged in dialogue, each contending differing positions and trying to persuade each other, but there are various types of dialogue:[6]
Argumentation theory had its origins in foundationalism, a theory of knowledge (epistemology) in the field of philosophy. It sought to find the grounds for claims in the forms (logic) and materials (factual laws) of a universal system of knowledge. The dialectical method was made famous by Plato and his use of Socrates critically questioning various characters and historical figures. But argument scholars gradually rejected Aristotle's systematic philosophy and the idealism in Plato and Kant. They questioned and ultimately discarded the idea that argument premises take their soundness from formal philosophical systems. The field thus broadened.[7]
One of the original contributors to this trend was the philosopher Chaïm Perelman, who together with Lucie Olbrechts-Tyteca introduced the French term la nouvelle rhétorique in 1958 to describe an approach to argument which is not reduced to application of formal rules of inference. Perelman's view of argumentation is much closer to a juridical one, in which rules for presenting evidence and rebuttals play an important role.
Karl R. Wallace's seminal essay, "The Substance of Rhetoric: Good Reasons" in the Quarterly Journal of Speech (1963) 44, led many scholars to study "marketplace argumentation" – the ordinary arguments of ordinary people. The seminal essay on marketplace argumentation is Ray Lynn Anderson's and C. David Mortensen's "Logic and Marketplace Argumentation" Quarterly Journal of Speech 53 (1967): 143–150.[8][9] This line of thinking led to a natural alliance with later developments in the sociology of knowledge.[10] Some scholars drew connections with recent developments in philosophy, namely the pragmatism of John Dewey and Richard Rorty. Rorty has called this shift in emphasis "the linguistic turn".
In this new hybrid approach argumentation is used with or without empirical evidence to establish convincing conclusions about issues which are moral, scientific, epistemic, or of a nature that science alone cannot answer. Out of pragmatism and many intellectual developments in the humanities and social sciences, "non-philosophical" argumentation theories grew which located the formal and material grounds of arguments in particular intellectual fields. These theories include informal logic, social epistemology, ethnomethodology, speech acts, the sociology of knowledge, the sociology of science, and social psychology. These new theories are not non-logical or anti-logical. They find logical coherence in most communities of discourse. These theories are thus often labeled "sociological" in that they focus on the social grounds of knowledge.
In general, the label "argumentation" is used by communication scholars such as (to name only a few) Wayne E. Brockriede, Douglas Ehninger, Joseph W. Wenzel, Richard Rieke, Gordon Mitchell, Carol Winkler, Eric Gander, Dennis S. Gouran, Daniel J. O'Keefe, Mark Aakhus, Bruce Gronbeck, James Klumpp, G. Thomas Goodnight, Robin Rowland, Dale Hample, C. Scott Jacobs, Sally Jackson, David Zarefsky, and Charles Arthur Willard, while the term "informal logic" is preferred by philosophers, stemming from University of Windsor philosophers Ralph H. Johnson and J. Anthony Blair. Harald Wohlrapp developed a criterion for validity (Geltung, Gültigkeit) understood as freedom from objections.
Trudy Govier, Douglas N. Walton, Michael Gilbert, Harvey Siegel, Michael Scriven, and John Woods (to name only a few) are other prominent authors in this tradition. Over the past thirty years, however, scholars from several disciplines have commingled at international conferences such as those hosted by the University of Amsterdam (the Netherlands) and the International Society for the Study of Argumentation (ISSA). Other international conferences include the biennial conference held at Alta, Utah, sponsored by the (US) National Communication Association and the American Forensics Association, and conferences sponsored by the Ontario Society for the Study of Argumentation (OSSA).
Some scholars (such as Ralph H. Johnson) construe the term "argument" narrowly, as exclusively written discourse or even discourse in which all premises are explicit. Others (such as Michael Gilbert) construe the term "argument" broadly, to include spoken and even nonverbal discourse, for instance the degree to which a war memorial or propaganda poster can be said to argue or "make arguments". The philosopher Stephen Toulmin has said that an argument is a claim on our attention and belief, a view that would seem to authorize treating, say, propaganda posters as arguments. The dispute between broad and narrow theorists is of long standing and is unlikely to be settled. The views of the majority of argumentation theorists and analysts fall somewhere between these two extremes.
The study of naturally occurring conversation arose from the field of sociolinguistics. It is usually called conversation analysis (CA). Inspired by ethnomethodology, it was developed in the late 1960s and early 1970s principally by the sociologist Harvey Sacks and, among others, his close associates Emanuel Schegloff and Gail Jefferson. Sacks died early in his career, but his work was championed by others in his field, and CA has now become an established force in sociology, anthropology, linguistics, speech-communication and psychology.[11] It is particularly influential in interactional sociolinguistics, discourse analysis and discursive psychology, as well as being a coherent discipline in its own right. Recently CA techniques of sequential analysis have been employed by phoneticians to explore the fine phonetic details of speech.
Empirical studies and theoretical formulations by Sally Jackson and Scott Jacobs, and several generations of their students, have described argumentation as a form of managing conversational disagreement within communication contexts and systems that naturally prefer agreement.
The basis of mathematical truth has been the subject of long debate. Frege in particular sought to demonstrate (see Gottlob Frege, The Foundations of Arithmetic, 1884, and Begriffsschrift, 1879) that arithmetical truths can be derived from purely logical axioms and therefore are, in the end, logical truths.[12] The project was developed by Russell and Whitehead in their Principia Mathematica. If an argument can be cast in the form of sentences in symbolic logic, then it can be tested by the application of accepted proof procedures. This was carried out for arithmetic using Peano axioms, and the foundation most commonly used for most modern mathematics is Zermelo-Fraenkel set theory, with or without the Axiom of Choice. Be that as it may, an argument in mathematics, as in any other discipline, can be considered valid only if it can be shown that it cannot have true premises and a false conclusion.
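The validity condition stated above has a standard symbolic rendering; the following LaTeX formulation is textbook material rather than a quotation from the works cited. An argument with premises P_1, ..., P_n and conclusion C is valid exactly when the premises entail the conclusion, that is, when the premises together with the negated conclusion cannot all be true:

% Standard textbook formulation of deductive validity:
\{ P_1, \dots, P_n \} \models C
\quad\Longleftrightarrow\quad
P_1 \land \cdots \land P_n \land \lnot C \ \text{is unsatisfiable}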
Perhaps the most radical statement of the social grounds of scientific knowledge appears in Alan G. Gross's The Rhetoric of Science (Cambridge: Harvard University Press, 1990). Gross holds that science is rhetorical "without remainder",[13] meaning that scientific knowledge itself cannot be seen as an idealized ground of knowledge. Scientific knowledge is produced rhetorically, meaning that it has special epistemic authority only insofar as its communal methods of verification are trustworthy. This thinking represents an almost complete rejection of the foundationalism on which argumentation was first based.
Interpretive argumentation is a dialogical process in which participants explore and/or resolve interpretations, often of a text in any medium that contains significant ambiguity in meaning.
Interpretive argumentation is pertinent to the humanities, hermeneutics, literary theory, linguistics, semantics, pragmatics, semiotics, analytic philosophy and aesthetics. Topics in conceptual interpretation include aesthetic, judicial, logical and religious interpretation. Topics in scientific interpretation include scientific modeling.
Legal arguments are spoken presentations to a judge or appellate court by a lawyer, or by parties when representing themselves, of the legal reasons why they should prevail. Oral argument at the appellate level accompanies written briefs, which also advance the argument of each party in the legal dispute. A closing argument, or summation, is the concluding statement of each party's counsel reiterating the important arguments for the trier of fact, often the jury, in a court case. A closing argument occurs after the presentation of evidence.
A judicial opinion or legal opinion is, in certain jurisdictions, a written explanation by a judge or group of judges that accompanies an order or ruling in a case, laying out the rationale (justification) and legal principles for the ruling.[14] It states the decision reached to resolve the dispute. A judicial opinion usually includes the reasons behind the decision.[14] Where there are three or more judges, it may take the form of a majority opinion, minority opinion or a concurring opinion.[15]
Political arguments are used by academics, media pundits, candidates for political office and government officials. Political arguments are also used by citizens in ordinary interactions to comment about and understand political events.[16] The rationality of the public is a major question in this line of research. Political scientist Samuel L. Popkin coined the expression "low information voters" to describe most voters who know very little about politics or the world in general.
In practice, a "low information voter" may not be aware of legislation that their representative has sponsored in Congress. A low-information voter may base their ballot-box decision on a media sound bite, or a flier received in the mail. It is possible for a media sound bite or campaign flier to present a political position for the incumbent candidate that completely contradicts the legislative action taken in the Capitol on behalf of the constituents. It may only take a small percentage of the overall voting group who base their decision on the inaccurate information to form a voter bloc large enough to swing an overall election result. When this happens, the constituency at large may have been duped or fooled. Nevertheless, the election result is legal and confirmed. Savvy political consultants will take advantage of low-information voters and sway their votes with disinformation and fake news because it can be easier and sufficiently effective. Fact-checkers have emerged in recent years to help counter the effects of such campaign tactics.
Psychology has long studied the non-logical aspects of argumentation. For example, studies have shown that simple repetition of an idea is often a more effective method of argumentation than appeals to reason. Propaganda often utilizes repetition.[17] "Repeat a lie often enough and it becomes the truth" is a law of propaganda often attributed to the Nazi politician Joseph Goebbels. Nazi rhetoric has been studied extensively as, inter alia, a repetition campaign.
Empirical studies of communicator credibility and attractiveness, sometimes labeled charisma, have also been tied closely to empirically occurring arguments. Such studies bring argumentation within the ambit of persuasion theory and practice.
Some psychologists such as William J. McGuire believe that the syllogism is the basic unit of human reasoning. They have produced a large body of empirical work around McGuire's famous title "A Syllogistic Analysis of Cognitive Relationships". A central line of this way of thinking is that logic is contaminated by psychological variables such as "wishful thinking", in which subjects confound the likelihood of predictions with the desirability of the predictions. People hear what they want to hear and see what they expect to see. If planners want something to happen they see it as likely to happen. If they hope something will not happen, they see it as unlikely to happen. Thus smokers think that they personally will avoid cancer, promiscuous people practice unsafe sex, and teenagers drive recklessly.
Stephen Toulmin and Charles Arthur Willard have championed the idea of argument fields, the former drawing upon Ludwig Wittgenstein's notion of language games (Sprachspiel), the latter drawing from communication and argumentation theory, sociology, political science, and social epistemology. For Toulmin, the term "field" designates discourses within which arguments and factual claims are grounded.[18] For Willard, the term "field" is interchangeable with "community", "audience", or "readership".[19] Similarly, G. Thomas Goodnight has studied "spheres" of argument and sparked a large literature created by younger scholars responding to or using his ideas.[20] The general tenor of these field theories is that the premises of arguments take their meaning from social communities.[21]
The most influential theorist has been Stephen Toulmin, the Cambridge-educated philosopher and educator,[22] best known for his Toulmin model of argument. What follows below is a sketch of his ideas.
Throughout many of his works, Toulmin pointed out that absolutism (represented by theoretical or analytic arguments) has limited practical value. Absolutism is derived from Plato's idealized formal logic, which advocates universal truth; accordingly, absolutists believe that moral issues can be resolved by adhering to a standard set of moral principles, regardless of context. By contrast, Toulmin contends that many of these so-called standard principles are irrelevant to real situations encountered by human beings in daily life.
To develop his contention, Toulmin introduced the concept of argument fields. In The Uses of Argument (1958), Toulmin claims that some aspects of arguments vary from field to field, and are hence called "field-dependent", while other aspects of argument are the same throughout all fields, and are hence called "field-invariant". The flaw of absolutism, Toulmin believes, lies in its unawareness of the field-dependent aspect of argument; absolutism assumes that all aspects of argument are field invariant.
In Human Understanding (1972), Toulmin suggests that anthropologists have been tempted to side with relativists because they have noticed the influence of cultural variations on rational arguments. In other words, the anthropologist or relativist overemphasizes the importance of the "field-dependent" aspect of arguments, and neglects or is unaware of the "field-invariant" elements. In order to provide solutions to the problems of absolutism and relativism, Toulmin attempts throughout his work to develop standards that are neither absolutist nor relativist for assessing the worth of ideas.
In Cosmopolis (1990), he traces philosophers' "quest for certainty" back to René Descartes and Thomas Hobbes, and lauds John Dewey, Wittgenstein, Martin Heidegger, and Richard Rorty for abandoning that tradition.
Arguing that absolutism lacks practical value, Toulmin aimed to develop a different type of argument, called practical arguments (also known as substantial arguments). In contrast to absolutists' theoretical arguments, Toulmin's practical argument is intended to focus on the justificatory function of argumentation, as opposed to the inferential function of theoretical arguments. Whereas theoretical arguments make inferences based on a set of principles to arrive at a claim, practical arguments first find a claim of interest, and then provide justification for it. Toulmin believed that reasoning is less an activity of inference, involving the discovery of new ideas, and more a process of testing and sifting already existing ideas—an act achievable through the process of justification.
Toulmin believed that for a good argument to succeed, it needs to provide good justification for a claim. This, he believed, would ensure that it stands up to criticism and earns a favourable verdict. In The Uses of Argument (1958), Toulmin proposed a layout containing six interrelated components for analyzing arguments:
The first three elements, claim, ground, and warrant, are considered as the essential components of practical arguments, while the second triad, qualifier, backing, and rebuttal, may not be needed in some arguments.
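Toulmin's layout lends itself to a simple data representation. The Python sketch below is an illustrative encoding of our own (the class and field layout are not Toulmin's), instantiated with the well-known "Harry was born in Bermuda" example discussed in the criticism below.

from dataclasses import dataclass
from typing import Optional

# Illustrative encoding of Toulmin's six-part layout; the field names
# follow the components named above, but the class itself is our own.
@dataclass
class ToulminArgument:
    claim: str                       # the conclusion being argued for
    ground: str                      # the facts appealed to (the data)
    warrant: str                     # the rule licensing the step from ground to claim
    qualifier: Optional[str] = None  # strength of the claim ("presumably", "certainly")
    backing: Optional[str] = None    # support for the warrant itself
    rebuttal: Optional[str] = None   # conditions under which the claim would fail

harry = ToulminArgument(
    claim="Harry is a British subject",
    ground="Harry was born in Bermuda",
    warrant="people born in Bermuda are generally British subjects",
    qualifier="presumably",
    backing="British nationality statutes",
    rebuttal="unless both of Harry's parents were aliens",
)
print(f"{harry.ground}; so, {harry.qualifier}, {harry.claim}.")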
When Toulmin first proposed it, this layout of argumentation was based on legal arguments and intended to be used to analyze the rationality of arguments typically found in the courtroom. Toulmin did not realize that this layout could be applicable to the field of rhetoric and communication until his works were introduced to rhetoricians by Wayne Brockriede and Douglas Ehninger. Their Decision by Debate (1963) streamlined Toulmin's terminology and broadly introduced his model to the field of debate.[24] Only after Toulmin published Introduction to Reasoning (1979) were the rhetorical applications of this layout mentioned in his works.
One criticism of the Toulmin model is that it does not fully consider the use of questions in argumentation.[25] The Toulmin model assumes that an argument starts with a fact or claim and ends with a conclusion, but ignores an argument's underlying questions. In the example "Harry was born in Bermuda, so Harry must be a British subject", the question "Is Harry a British subject?" is ignored, which also neglects to analyze why particular questions are asked and others are not. (See Issue mapping for an example of an argument-mapping method that emphasizes questions.)
Toulmin's argument model has inspired research on, for example, goal structuring notation (GSN), widely used for developing safety cases,[26] and argument maps and associated software.[27]
In 1972, Toulmin published Human Understanding, in which he asserts that conceptual change is an evolutionary process. In this book, Toulmin attacks the account of conceptual change that Thomas Kuhn had set out in his seminal work The Structure of Scientific Revolutions (1962). Kuhn believed that conceptual change is a revolutionary process (as opposed to an evolutionary process), during which mutually exclusive paradigms compete to replace one another. Toulmin criticized the relativist elements in Kuhn's thesis, arguing that mutually exclusive paradigms provide no ground for comparison, and that Kuhn made the relativists' error of overemphasizing the "field variant" while ignoring the "field invariant" or commonality shared by all argumentation or scientific paradigms.
In contrast to Kuhn's revolutionary model, Toulmin proposed an evolutionary model of conceptual change comparable to Darwin's model of biological evolution. Toulmin states that conceptual change involves the process of innovation and selection. Innovation accounts for the appearance of conceptual variations, while selection accounts for the survival and perpetuation of the soundest conceptions. Innovation occurs when the professionals of a particular discipline come to view things differently from their predecessors; selection subjects the innovative concepts to a process of debate and inquiry in what Toulmin considers a "forum of competitions". The soundest concepts will survive the forum of competitions as replacements or revisions of the traditional conceptions.
From the absolutists' point of view, concepts are either valid or invalid regardless of contexts. From the relativists' perspective, one concept is neither better nor worse than a rival concept from a different cultural context. From Toulmin's perspective, the evaluation depends on a process of comparison, which determines whether one concept provides greater explanatory power than its rival concepts.
Scholars at the University of Amsterdam in the Netherlands have pioneered a rigorous modern version of dialectic under the name pragma-dialectics. The intuitive idea is to formulate clear-cut rules that, if followed, will yield reasonable discussion and sound conclusions. Frans H. van Eemeren, the late Rob Grootendorst, and many of their students and co-authors have produced a large body of work expounding this idea.
The dialectical conception of reasonableness is given by ten rules for critical discussion, all being instrumental for achieving a resolution of the difference of opinion (from Van Eemeren, Grootendorst, & Snoeck Henkemans, 2002, pp. 182–183). The theory postulates this as an ideal model, and not something one expects to find as an empirical fact. The model can, however, serve as an important heuristic and critical tool for testing how reality approximates this ideal and point to where discourse goes wrong, that is, when the rules are violated. Any such violation will constitute a fallacy. Albeit not primarily focused on fallacies, pragma-dialectics provides a systematic approach to deal with them in a coherent way.
Van Eemeren and Grootendorst identified four stages of argumentative dialogue. These stages can be regarded as an argument protocol. In a somewhat loose interpretation, the stages are as follows:[citation needed]
Confrontation stage: the parties establish that they have a difference of opinion.
Opening stage: the parties agree on the roles of protagonist and antagonist, and on shared starting points and discussion rules.
Argumentation stage: the protagonist defends the standpoint and the antagonist elicits further argumentation through critical reactions.
Concluding stage: the parties assess the result, determining whether the difference of opinion has been resolved and in whose favour.
Van Eemeren and Grootendorst provide a detailed list of rules that must be applied at each stage of the protocol.[citation needed] Moreover, in the account of argumentation given by these authors, there are specified roles of protagonist and antagonist in the protocol which are determined by the conditions which set up the need for argument.
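As a toy illustration, the Python sketch below (our own simplification, not van Eemeren and Grootendorst's formal model) encodes the four stages listed above and enforces one simple protocol rule: a discussion may never return to an earlier stage.

from enum import Enum, auto

# Toy rendering of the four-stage pragma-dialectical protocol; the stage
# names follow the model, everything else is illustrative.
class Stage(Enum):
    CONFRONTATION = auto()  # a difference of opinion is identified
    OPENING = auto()        # roles and shared starting points are agreed
    ARGUMENTATION = auto()  # the protagonist defends, the antagonist critiques
    CONCLUDING = auto()     # the parties establish the result

def run_discussion(moves):
    """Print a sequence of (stage, speech act) moves, rejecting any move
    that returns to an earlier stage."""
    reached = Stage.CONFRONTATION
    for stage, act in moves:
        if stage.value < reached.value:
            raise ValueError(f"illegal move: {act!r} returns to {stage.name}")
        reached = stage
        print(f"{stage.name:<13} {act}")

run_discussion([
    (Stage.CONFRONTATION, "P: 'This policy is fair.' A expresses doubt."),
    (Stage.OPENING, "Both agree to argue from the published statistics."),
    (Stage.ARGUMENTATION, "P cites the statistics; A asks critical questions."),
    (Stage.CONCLUDING, "A retracts the doubt; the dispute is resolved."),
])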
Douglas N. Walton developed a distinctive philosophical theory of logical argumentation built around a set of practical methods to help a user identify, analyze and evaluate arguments in everyday conversational discourse and in more structured areas such as debate, law and scientific fields.[28] There are four main components: argumentation schemes,[29] dialogue structures, argument mapping tools, and formal argumentation systems. The method uses the notion of commitment in dialogue as the fundamental tool for the analysis and evaluation of argumentation rather than the notion of belief.[6] Commitments are statements that the agent has expressed or formulated, and has pledged to carry out, or has publicly asserted. According to the commitment model, agents interact with each other in a dialogue in which each takes its turn to contribute speech acts. The dialogue framework uses critical questioning as a way of testing plausible explanations and finding weak points in an argument that raise doubt concerning the acceptability of the argument.
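Computational treatments of this model typically track each participant's commitments in a "commitment store" that speech acts update. The Python sketch below is our own minimal bookkeeping illustration (the class, method names, and example dialogue are invented), not Walton's formalism.

# Minimal commitment-store sketch inspired by the commitment model.
class Participant:
    def __init__(self, name):
        self.name = name
        self.commitments = set()  # statements publicly asserted or conceded

    def assert_(self, statement):  # trailing underscore: `assert` is a keyword
        # Asserting a statement adds it to the speaker's commitment store.
        self.commitments.add(statement)
        print(f"{self.name} asserts: {statement}")

    def challenge(self, other, statement):
        # A critical question: the other party must defend or retract.
        print(f"{self.name} challenges {other.name}: why {statement!r}?")

    def retract(self, statement):
        # A retraction removes the statement, e.g. after a failed defence.
        self.commitments.discard(statement)
        print(f"{self.name} retracts: {statement}")

alice, bob = Participant("Alice"), Participant("Bob")
bob.assert_("the bridge is safe")
alice.challenge(bob, "the bridge is safe")
bob.retract("the bridge is safe")  # Bob cannot defend the claim
print(bob.commitments)  # set() -- Bob is no longer committed to it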
Walton's logical argumentation model took a view of proof and justification different from analytic philosophy's dominant epistemology, which was based on a justified true belief framework.[30] In the logical argumentation approach, knowledge is seen as a form of belief commitment firmly fixed by an argumentation procedure that tests the evidence on both sides, and uses standards of proof to determine whether a proposition qualifies as knowledge. In this evidence-based approach, knowledge must be seen as defeasible.
Efforts have been made within the field of artificial intelligence to perform and analyze argumentation with computers. Argumentation has been used to provide a proof-theoretic semantics for non-monotonic logic, starting with the influential work of Dung (1995). Computational argumentation systems have found particular application in domains where formal logic and classical decision theory are unable to capture the richness of reasoning, domains such as law and medicine. In Elements of Argumentation, Philippe Besnard and Anthony Hunter show how classical logic-based techniques can be used to capture key elements of practical argumentation.[33]
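Dung-style abstract argumentation can be illustrated concretely. The Python sketch below is our own minimal implementation of the standard definitions, not code from the cited works: a framework is a set of arguments plus an attack relation, and the grounded extension is computed by iterating the characteristic function from the empty set to a fixpoint.

# Minimal Dung-style abstract argumentation framework (our own sketch of
# the standard definitions, not code from the cited works).
def defended(framework, candidate, arg):
    """`arg` is defended by `candidate` if every attacker of `arg` is
    itself attacked by some member of `candidate`."""
    attackers = {a for (a, b) in framework["attacks"] if b == arg}
    return all(any((c, a) in framework["attacks"] for c in candidate)
               for a in attackers)

def grounded_extension(framework):
    # Iterate the characteristic function from the empty set to a fixpoint.
    extension = set()
    while True:
        new = {arg for arg in framework["arguments"]
               if defended(framework, extension, arg)}
        if new == extension:
            return extension
        extension = new

af = {
    "arguments": {"a", "b", "c"},
    "attacks": {("a", "b"), ("b", "c")},  # a attacks b; b attacks c
}
print(grounded_extension(af))  # {'a', 'c'}: a is unattacked and defends c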
Within computer science, the ArgMAS workshop series (Argumentation in Multi-Agent Systems), the CMNA workshop series,[34] and the COMMA Conference,[35] are regular annual events attracting participants from every continent. The journal Argument & Computation[36] is dedicated to exploring the intersection between argumentation and computer science. ArgMining is a workshop series dedicated specifically to the related argument mining task.[37]
Data from the collaborative structured online argumentation platform Kialo has been used to train and to evaluate natural language processing AI systems, most commonly BERT and its variants.[48] These uses include argument extraction, conclusion generation,[40][additional citation(s) needed] argument quality assessment,[49] machine generation of or participation in argumentative debate,[44][46][47] surfacing the most relevant previously overlooked viewpoints or arguments,[44][46] argumentative writing support[42] (including sentence attackability scores),[50] automatic real-time evaluation of how truthful or convincing a sentence is (similar to fact-checking),[50] language-model fine-tuning[51][47] (including for chatbots),[52][53] argument impact prediction, and argument classification and polarity prediction.[54][55]
At the start of Topics VIII.5, Aristotle distinguishes three types of dialogue by their different goals: (1) the truly dialectical debate, which is concerned with training (gumnasia), with critical examination (peira), or with inquiry (skepsis); (2) the didactic discussion, concerned with teaching; and (3) the competitive (eristic, contentious) type of debate in which winning is the only concern.
Toulmin's 1958 work is essential in the field of argumentation.
In place of the traditional epistemological view of knowledge as justified true belief, we argue that artificial intelligence and law needs an evidence-based epistemology.