
Philosophy of Language

Introduction | History of the Philosophy of Language | The Nature of Language | The Nature of Meaning | Intentionality | Reference | Sentence Composition | Learning and Thought | Formal vs. Informal Approaches
 
Introduction

Philosophy of Language is the reasoned inquiry into the origins of language, the nature of meaning, the usage and cognition of language, and the relationship between language and reality. It overlaps to some extent with the study of Epistemology, Logic, Philosophy of Mind and other fields (including linguistics and psychology), although for many Analytic Philosophers it is an important discipline in its own right.

It asks questions like "What is meaning?", "How does language refer to the real world?", "Is language learned or is it innate?" and "How does the meaning of a sentence emerge out of its parts?"

History of the Philosophy of Language

Early inquiry into language can be traced back as long ago as 1500 B.C. in India, long before any systematic description of language, and there were various schools of thought discussing linguistic issues in early medieval Indian philosophy (roughly between the 5th and 10th Centuries A.D.).

In the Western tradition, the early work was covered, as usual, by Plato, Aristotle and the Stoics of Ancient Greece. Plato generally considered that the names of things are determined by nature, with each phoneme (the smallest structural unit that distinguishes meaning) representing basic ideas or sentiments, and that convention only has a small part to play. Aristotle held that the meaning of a predicate (the way a subject is modified or described in a sentence) is established through an abstraction of the similarities between various individual things (a theory later known as Nominalism). His assumption that these similarities are constituted by a real commonality of form, however, also makes him a proponent of Moderate Realism.

The Stoic philosophers made important contributions to the analysis of grammar, distinguishing five parts of speech: nouns, verbs, appellatives, conjunctions and articles. What they called the lektón (the meaning, or sense, of every term) gave rise to the important concept of the proposition of a sentence (its ability to be considered an assertion, which can be either true or false).

The Scholastics of the Medieval era were greatly interested in the subtleties of language and its usage, provoked to some extent by the necessity of translating Greek texts into Latin, with Peter Abelard, William of Ockham and John Duns Scotus meriting particular mention. They considered Logic to be a "science of language", and anticipated many of the most interesting problems of modern Philosophy of Language, including the phenomena of vagueness and ambiguity, the doctrines of proper and improper suppositio (the interpretation of a term in a specific context), and the study of categorematic and syncategorematic words and terms.

Linguists of the Renaissance period were particularly interested in the idea of a philosophical language (or universal language), spurred on by the gradual discovery in the West of Chinese characters and Egyptian hieroglyphs.

Language finally began to play a more central role in Western philosophy in the late 19th Century, and even more so in the 20th Century, especially after the posthumous publication in 1916 of the "Cours de linguistique générale" of Ferdinand de Saussure (1857 - 1913). For a time, in 20th Century Analytic Philosophy and Ordinary Language Philosophy circles, philosophy as a whole was understood to be purely a matter of Philosophy of Language.

The Nature of Language

One of the most fundamental questions asked in Philosophy of Language is "What is language (in general terms)?" According to semiotics (the study of sign processes in communication, and of how meaning is constructed and understood), language is the mere manipulation and use of symbols in order to draw attention to signified content, in which case humans would not be the sole possessors of language skills.

Linguistics is the field of study that asks what distinguishes one particular language from another: what is it that makes "English" English? What is the difference between Spanish and French? Linguists like Noam Chomsky (1928 - ), a figure who has come to define 20th Century linguistics, have emphasized the role of "grammar" and syntax (the rules that govern the structure of sentences) as a characteristic of any language. Chomsky believes that humans are born with an innate understanding of what he calls "universal grammar" (a set of linguistic principles shared by all humans), and that a child's exposure to a particular language just triggers this antecedent knowledge.
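
As a minimal illustration of what such structural rules amount to, the toy grammar below generates simple English sentences by recursively rewriting categories into sub-categories and words. (The categories and vocabulary here are invented for the sketch, and are not drawn from Chomsky's own work.)

```python
import random

# A toy context-free grammar: each rule rewrites a category into a
# sequence of sub-categories and/or plain words.
GRAMMAR = {
    "S":  [["NP", "VP"]],                  # a sentence is NP + VP
    "NP": [["the", "N"]],
    "VP": [["V"], ["V", "NP"]],
    "N":  [["child"], ["language"], ["sentence"]],
    "V":  [["learns"], ["hears"]],
}

def generate(category="S"):
    """Expand a category by recursively applying a randomly chosen rule."""
    if category not in GRAMMAR:            # a plain word: stop here
        return [category]
    expansion = random.choice(GRAMMAR[category])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))                # e.g. "the child hears the sentence"
```

Every string such a grammar produces is "grammatical" by definition, which is the sense in which rules of this kind can be said to characterize a language.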

Chomsky begins with the study of people's internal languages (what he calls "I-languages"), which are based upon certain rules that generate grammars and which may apply across the field of all languages, supported in part by the conviction that there is no clear, general and principled difference between one language and the next. Other approaches, which he dubs "E-languages", have tried to explain a language as usage within a specific speech community, with a specific set of well-formed utterances in mind.

Translation and interpretation present other problems to philosophers of language. In the 1950s, W.V. Quine argued for the indeterminacy of meaning and reference based on the principle of radical translation (e.g. when faced with translating the language of a previously undocumented, primitive tribe). He claimed that, in such a situation, it is impossible in principle to be absolutely certain of the meaning or reference that a speaker of the primitive tribe's language attaches to an utterance, and, since the references are indeterminate, there are many possible interpretations, no one of which is more correct than the others.

The resulting view is called Semantic Holism, a type of Holism which holds that meaning is not something that is associated with a single word or sentence, but can only be attributed to a whole language (if at all). Quine's disciple, Donald Davidson (1917 - 2003), extended this argument further to the notion of radical interpretation, that the meaning that an individual ascribes to a sentence can only be determined by attributing meanings to many, perhaps all, of the individual's assertions as well as his mental states and attitudes.

The Nature of Meaning

As we have seen, then, the answer to the question, "What is meaning?", is not immediately obvious.

"Meaning" can be described as the content carried by the words or signs exchanged by people when communicating through language. Arguably, there are two essentially different types of linguistic meaning: conceptual meaning (which refers to the definitions of words themselves, and the features of those definitions, which can be treated using semantic feature analysis) and associative meaning (which refers to the individual mental understandings of the speaker, and which may be connotative, collocative, social, affective, reflected or thematic).

There are several different approaches to explaining what a linguistic "meaning" is:

  • Idea theories: which claim that meanings are purely mental contents provoked by signs. This approach is mainly associated with the British Empiricist tradition of John Locke, George Berkeley and David Hume, although interest in it has been renewed by some contemporary theorists under the guise of semantic internalism.
  • Truth-conditional theories: which hold meaning to be the conditions under which an expression may be true or false. This tradition goes back to Gottlob Frege, although there has also been much modern work in this area (a toy illustration of this approach follows this list).
  • Use theories: which understand meaning to involve or be related to speech acts and particular utterances, not the expressions themselves. This approach was pioneered by Ludwig Wittgenstein and his Communitarian view of language.
  • Reference theories (or semantic externalism): which view meaning to be equivalent to those things in the world that are actually connected to signs. Tyler Burge (1946 - ) and Saul Kripke (1940 - ) are the best known proponents of this approach.
  • Verificationist theories: which associate the meaning of a sentence with its method of verification or falsification. This Verificationist approach was adopted by the Logical Positivists of the early 20th Century.
  • Pragmatist theories: which maintain that the meaning or understanding of a sentence is determined by the consequences of its application. This approach was favored by C.S. Peirce and other early 20th Century Pragmatists.
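
To give the flavor of the truth-conditional approach, the sketch below models the meaning of a sentence as a function from a state of the world to a truth value: to know the meaning is to know under which conditions the sentence comes out true. (The "worlds" here are just invented dictionaries of facts, used purely for illustration.)

```python
# The meaning of "snow is white", modelled as its truth conditions:
# a function from a state of the world to a truth value.
def snow_is_white(world):
    return world["snow_colour"] == "white"

actual_world  = {"snow_colour": "white"}
strange_world = {"snow_colour": "green"}

print(snow_is_white(actual_world))    # True: the truth condition is met
print(snow_is_white(strange_world))   # False: the truth condition fails
```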
Intentionality

Another important concept in the Philosophy of Language is that of intentionality, sometimes defined as "aboutness". Some things are about other things (e.g. a belief can be about icebergs, but an iceberg is not about anything; a book or a film can be about Paris, but Paris itself is not about anything), and intentionality is the term for this feature that certain mental states have of being directed at objects and states of affairs in the real world. Thus, our beliefs, fears, hopes and desires are intentional, in that they must have an object.

The term was initially coined by the Scholastics in the Middle Ages, but was revived in the 19th Century by the philosopher and psychologist Franz Brentano (1838 - 1917), an important predecessor of the school of Phenomenology. Brentano claimed that all and only mental phenomena exhibit intentionality, which he saw as proof that mental phenomena could not be the same thing as, or a species of, physical phenomena (often called Brentano's irreducibility thesis).

Later philosophers of language, such as J.L. Austin (1911 - 1960) and John Searle (1932 - ), have posed the question: how does the mind, and the language that we use, impose intentionality on objects that are not intrinsically intentional? How do mental states represent, and how do they make objects represent, the real world? Austin's solution lies in his theory of illocutionary acts, and Searle's related solution in his theory of speech acts, in which language is seen as a form of action and human behavior, so that by saying something, we actually do something. Combining this idea with intentionality, Searle concludes that actions themselves have a kind of intentionality.

Reference

How language interacts with the world, what philosophers call reference, has interested many philosophers of language over the years.

John Stuart Mill believed in a type of direct reference theory, whereby the meaning of an expression lies in what it points out in the world. He identified two components to consider for most terms of a language: denotation (the literal meaning of a word or term) and connotation (the subjective cultural and/or emotional coloration attached to a word or term). According to Mill, proper names (such as those of people or places) have only a denotation and no connotation, and a sentence which refers to a mythical creature, for example, has no meaning (and is neither true nor false) because it has no referent in the real world.

Gottlob Frege was an advocate of a mediated reference theory, which posits that words refer to something in the external world, but insists that there is more to the meaning of a name than simply the object to which it refers. Frege divided the semantic content of every expression (including sentences) into two components: Sinn (usually translated as "sense") and Bedeutung ("meaning", "denotation" or "reference"). The sense of a sentence is the abstract, universal and objective thought that it expresses, as well as the mode of presentation of the object that it refers to. The reference of an expression is the object or objects in the real world that it picks out, and the reference of a whole sentence is a truth-value (the True or the False). Senses determine reference, and names that refer to the same object can have different senses.

Bertrand Russell, like Frege, was also a Descriptivist of sorts, in that he held that the meanings (or semantic contents) of names are identical to the descriptions associated with them by speakers and a contextually appropriate description can be substituted for the name. But he held that the only directly referential expressions are what he called "logically proper names" such as "I", "now", "here", and other indexicals (terms which symbolically point to or indicate some state of affairs). He described proper names of people or places as abbreviated definite descriptions (the name standing in for a more detailed description of who or what the person or place really is), and considered them not to be meaningful on their own and not directly referential.

Saul Kripke (1940 - ) has argued against Descriptivism on the grounds that names are rigid designators and refer to the same individual in every possible world in which that individual exists.

Sentence Composition

Philosophical semantics (the study or science of meaning in language) tends to focus on the principle of compositionality in order to explain the relationship between meaningful parts and whole sentences. The principle asserts that a sentence can be understood on the basis of the meaning of the parts of the sentence (words or morphemes) along with an understanding of its structure (syntax or logic). Therefore, the meaning of a complex expression is determined by the meanings of its constituent expressions and the rules used to combine them.

Functions can also be used to describe the meaning of a sentence: a propositional function is an operation of language that takes an entity (or subject) as an input and outputs a semantic fact (or proposition).
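
Here is a minimal sketch of both ideas, in which word meanings live in a lexicon, a verb denotes a propositional function, and the meaning of the whole sentence is computed from the meanings of its parts by applying the verb's function to the subject's denotation. (The names and the set of runners are invented for the illustration.)

```python
runners = {"Ann"}                       # an invented fact about the world

lexicon = {
    "Ann":  "Ann",                      # a name denotes an entity
    "Bob":  "Bob",
    "runs": lambda x: x in runners,     # a verb denotes a propositional
}                                       # function: entity -> proposition

def meaning(sentence):
    """Compose a 'Subject Verb' sentence's meaning from its parts:
    apply the verb's propositional function to the subject's denotation."""
    subject, verb = sentence.split()
    return lexicon[verb](lexicon[subject])

print(meaning("Ann runs"))   # True
print(meaning("Bob runs"))   # False
```

The combination rule here is simply function application; compositionality is the claim that some such fixed set of rules, driven by the structure of the sentence, suffices to determine its meaning.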

There are two general methods of understanding the relationship between the parts of a linguistic string and how it is put together (both are illustrated in the sketch after this list):

  • Syntactic trees: focus on the words of a sentence with the grammar of the sentence in mind.
  • Semantic trees: focus on the role of the meaning of the words and how those meanings combine.
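
Continuing the toy sentence from above, the sketch below keeps the two perspectives apart: the syntactic tree records only how the grammar groups the words, while semantic evaluation walks that same tree and combines the children's meanings bottom-up by function application. (The tree shape and the meanings are, again, invented for the illustration.)

```python
# "Ann runs" as a syntactic tree: (label, child, ...)
syntax_tree = ("S", ("NP", "Ann"), ("VP", "runs"))

runners = {"Ann"}
word_meaning = {"Ann": "Ann", "runs": lambda x: x in runners}

def semantic_value(node):
    """Compute each node's meaning from its children's meanings, bottom-up."""
    if isinstance(node, str):                   # a leaf: look the word up
        return word_meaning[node]
    _, *children = node
    values = [semantic_value(child) for child in children]
    if len(values) == 1:                        # unary node (NP, VP): pass up
        return values[0]
    predicate = next(v for v in values if callable(v))
    argument = next(v for v in values if not callable(v))
    return predicate(argument)                  # binary node (S): apply

print(semantic_value(syntax_tree))              # True
```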
Learning and Thought

There are three main schools of thought on the issue of language learning:

  • Behaviourism: which holds that the bulk of language is learned via conditioning.
  • Hypothesis testing: which holds that learning occurs through the postulation and testing of hypotheses, through the use of the general faculty of intelligence.
  • Innatism: which holds that at least some of the syntactic settings are innate and hardwired, based on certain modules of the mind.

There are also varying notions of the structure of the brain when it comes to language:

  • Connectionist models emphasize the idea that a person's lexicon and their thoughts operate in a kind of distributed, associative network.
  • Nativist models assert that there are specialized devices in the brain that are dedicated to language acquisition.
  • Computational models emphasize the notion of a representational language of thought, and the logic-like, computational processing that the mind performs over these representations.
  • Emergentist models focus on the notion that natural faculties are complex systems that emerge out of simpler biological parts.
  • Reductionist models attempt to explain higher level mental processes in terms of the basic low-level neurophysiological activity of the brain.

There are three main contentions regarding the relationship between language and thought:

  • Edward Sapir (1884 - 1939), Benjamin Whorf (1897 - 1941) and Michael Dummett (1925 - 2011), among others, maintain that language is analytically prior to thought.
  • Paul Grice (1913 - 1988) and Jerry Fodor (1935 - 2017), on the other hand, believe that thought and mental content have priority over language, and that spoken and written language derive their intentionality and meaning from an internal language encoded in the mind, especially given that the structure of thoughts and the structure of language seem to share a compositional, systematic character.
  • A third school of thought maintains that there is no way of explaining one without the other.
Formal vs. Informal Approaches

Most philosophers have been more or less skeptical about formalizing natural languages, and thus allowing the use of formal logic to analyze and understand them, although some, including Alfred Tarski (1901 - 1983), Rudolf Carnap (1891 - 1970), Richard Montague (1930 - 1971) and Donald Davidson (1917 - 2003), have developed formal languages, or formalized parts of natural language, for investigation. Some, like Paul Grice (1913 - 1988), have even denied that there is a substantial conflict between logic and natural language.

However, in the 1950s and 1960s, the Ordinary Language Philosophy movement, whose main proponents were P.F. Strawson (1919 - 2006), J.L. Austin (1911 - 1960) and Gilbert Ryle (1900 - 1976), stressed the importance of studying natural language without regard to the truth-conditions of sentences and the references of terms. They believed that language is something entirely different to logic, and that any attempts at formalization using the tools of logic were doomed to failure. Austin developed a theory of speech acts, which described the kinds of things which can be done with a sentence (assertion, command, inquiry, exclamation) in different contexts of use on different occasions, and Strawson argued that the truth-table semantics of the logical connectives do not capture the meanings of their natural language counterparts.
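
The kind of divergence Strawson had in mind can be seen in the truth table of the material conditional, the standard logical reading of "if p then q" (the example sentence below is a stock illustration, not Strawson's own): the conditional is defined as true whenever p is false, so "if the moon is made of cheese, then Paris is in France" comes out true in logic, even though an ordinary English speaker would hesitate to assert it at all.

```python
# Truth table for the material conditional "p -> q", logically
# equivalent to "(not p) or q". Note that it is True in every
# row where p is False, however unrelated p and q may be.
print("p      q      p -> q")
for p in (True, False):
    for q in (True, False):
        print(f"{p!s:<6} {q!s:<6} {(not p) or q}")
```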


