Encyclopedia of Language & Linguistics (Second Edition), 2006, Pages 564-579

Formal Semantics

G Chierchia, Università degli Studi di Milano-Bicocca, Milan, Italy
© 2006 Elsevier Ltd. All rights reserved.

Introduction

Semantics, in its most general form, is the study of how a system of signs or symbols (i.e., a language of some sort) carries information about the world. One can think of a language as constituted by a lexicon (an inventory of morphemes or words) and a combinatorial apparatus according to which complex expressions, including, in particular, sentences, can be built up. Semantics deals with the procedures that enable users of a language to attach an interpretation to its arrays of symbols. Formal semantics studies such procedures through formally explicit mathematical means.

The history of semantics is nearly as long and complex as the history of human thought; witness, e.g., the early debates on the natural vs. conventional character of language among the pre-Socratic philosophers. The history of formal semantics is nearly as daunting as it is intertwined with the development of logic. In its modern incarnation, it is customary to locate its inception in the work of logicians such as Frege, Russell, and Tarski. A particularly important and relatively recent turning point is constituted by the encounter of this logico-philosophical tradition with structural and generative approaches to the study of human languages, especially (though by no means exclusively) those influenced by N. Chomsky. The merger of these two lines of research (one brewing within logic, the other within linguistics) has led formal semantics to become a central protagonist in the empirical study of natural language. The research paradigm that has emerged has proven to be quite fruitful, both in terms of breadth and depth of results and in terms of the role it is playing in the investigation of human cognition. The present work reviews some of the basic assumptions of modern formal semantics of natural language and illustrates its workings through a couple of examples, with no pretence of completeness.

Semantics vs. Lexicography

One of the traditional ideas about semantics is that it deals with the meaning of words. The main task of semantics is perceived as the compilation of dictionaries (semantics as lexicography). To this, people often add the task of investigating the history of words. Such a history can teach us about cultural development. One might even hope to arrive at the true meaning of a word through its history. Compiling dictionaries or reconstructing how particular words have changed over time are worthy tasks; but they are not what formal semantics is about. Lexicography, philology, and related disciplines vs. semantics as conceived here constitute complementary enterprises. They all, of course, deal with language. But the main goal of semantics is to investigate how we can effortlessly understand a potential infinity of expressions (words, phrases, sentences). To do that, we have to go beyond the level of single words.

It may be of use to point to the kind of considerations that have led semantics to move the main focus of investigation away from single word meanings and their development. For one thing, it can be doubted that word histories shed light on how words are synchronically (i.e., at a given point in time) understood and used. People use words effectively in total ignorance of their history (a point forcefully made by one of the founding fathers of modern linguistics, namely F. de Saussure). To make this point more vividly, take the word money. An important word indeed; where does it come from? What does its history reveal about the true meaning of money? It comes from Latin moneta, the past participle feminine of the verb moneo 'to warn/to advise.' Moneta was one of the canonical attributes of the Roman goddess Juno; Juno moneta is 'the one who advises.' What has Juno to do with money? Is it perhaps that her capacity to advise extends to finances? No. It so happens that in ancient Rome, the mint was right
next to the temple of Juno. So people metonymically transferred Juno's attribute to what was coming out of the mint. A fascinating historical fact that tells us something as to how word meanings may evolve; but it reveals no deep link between money and the capacity to advise. This example is not meant to downplay the interest of historical investigations on word meanings; it is just an illustration of how linguistic history affects only marginally the way in which a community actually understands its lexicon.

There is a second kind of consideration suggesting that the scope of semantics cannot be confined to the study of word meanings. Do words in isolation have clearly identifiable meanings? Take any simple word, say the concrete, singular, common noun dog. What does it mean? Some possible candidates are: the dog-kind, the concept of dog, the class of individual dogs . . . And the list can go on. How do we choose among these possibilities? Note, moreover, that all these hypotheses attempt to analyze the meaning of the word dog by tacking onto it notions (kind, concept, class . . .) that are in and of themselves in need of explication. If we left it at that, we wouldn't go far. Looking at dictionary definitions is no big help either. If we look up the entry for dog, typically we will find something like:

(1) A highly variable carnivorous domesticated mammal (Canis familiaris) prob. descended from the common wolf.

Indeed, if someone doesn't know the meaning of the word dog and knows what carnivorous and mammal mean, then (1) may be of some practical help. But clearly to understand (1), we must rely on our understanding of whole phrases and the words occurring in them. Words which, in turn, need a definition to be understood. And so on, in a loop. This problem is sometimes called the problem of the circularity of the lexicon. To put it differently, (1) is of help only if the capacity to use and interpret language is already taken for granted. But it is precisely such capacity that we want to study.

The limitation of a purely word-based perspective on the investigation of meaning is now widely recognized. Frege summarized it in a nice motto: "only in the context of a sentence do words have meaning." His insight is that complete sentences are linguistic units that can sort of stand on their own (more so than any other linguistic units). They can, as it were, express self-contained thoughts. We are more likely, therefore, to arrive at the meaning of single words (and of phrases in between words and complete sentences) via a process of abstraction from the contribution that words make to sentence meaning, rather than the other way around. This is so because sentence meaning is somehow more readily accessible (being, as it were, more complete) than the meaning of words in isolation.

These are some reasons, then, why the perspective of modern semantics is so different from and complementary to lexicography and philology; such perspective is much more directly tied to the investigation of the universal laws of language (language universals) and of the psychological mechanisms underlying such laws. Understanding the function, use, etc., of a single word presupposes a whole, complex cognitive apparatus. It is, therefore, an arrival point more than a starting point. It seems thus reasonable to start by asking what it is to understand a sentence.

The main thesis we wish to put forth is that to understand a sentence involves understanding its relations to the other sentences of the language. Each sentence carries information. Such information will be related to that of other sentences while being unrelated to that of yet others. In communicating, we rely on our spontaneous (and unconscious) knowledge of these relations.

The Notion of Synonymy and Its Problems

Imagine watching a Batman movie in which the caped hero fights the Riddler, one of his eternal foes. The Riddler has scattered around five riddles with clues to his evil plans. Batman has managed to find and solve four of them. We could report this situation in any of the following ways:

(2a) Batman has found all of the five clues but one.
(2b) Batman has found four out of the five clues.
(2c) Four of the five clues have been found by Batman.

These sentences are good paraphrases of each other. One might say that they have roughly the same information content; or that they describe the same state of affairs; or that they are (nearly) synonymous. (I will be using these modes of speaking interchangeably.) To put it differently, English speakers know that there is a tight connection between what the sentences in (2a), (2b), and (2c) mean. This is a kind of knowledge they have a priori, i.e., regardless of what actually goes on. Just by looking at (2a) vs., say, (2b) and grasping what they convey, we immediately see that they have roughly the same informational content.

This is what we mean when we say that understanding a sentence involves understanding which other sentences count as good paraphrases and which don't. Thus, knowing a language is to know which sentences in that language count as synonymous. Semantics is (among other things) the study of synonymy. Two synonymous sentences (and, more generally, two synonymous expressions) can always
be used interchangeably. This last informal characterization can be turned into a precise definition along the following lines.

(3a) Suppose one utters any complex expression A containing a subexpression a. If one can replace in A a with a different expression b, without changing the overall communicative import of A, then a and b are synonymous.
(3b) a is synonymous with b =df in the utterance of any expression A containing a, a can be replaced with b without changing the communicative import of the utterance (salva significatione).

For example, in uttering (2a) (our A), we can replace the subcomponent that comes after Batman has found, namely all of the five clues but one (our a), with four out of the five clues (our b) and convey exactly the same information. Hence, these two expressions must be synonymous (and, in fact, so are the whole sentences).

This looks promising. It paves the way for the following setup for semantics. Speakers have intuitions of whether two expressions can be replaced with each other while keeping information content unchanged. For any two sentences a and b, they spontaneously know whether they can be substituted for each other (i.e., whether b can be used to paraphrase a). Because the sentences of a language are potentially infinite, it is impossible for speakers to memorize synonymous sentences one by one (for that clearly exceeds what our memory can do). Hence, they must recognize synonymy by rule, by following an algorithm of some sort. The task of semantics, then, becomes characterizing such an algorithm.

There is a problem, however. Sameness of communicative import is a more or less thing, much like translation. In many contexts, even sentences as close as those in (2a) and (2b) could not be replaced felicitously with each other. Here is a simple example. The discourse in (4a) is natural and coherent. The one in (4b) much less so:

(4a) Batman has found all of the five clues but one, which is pinned on his back.
(4b) ?? Batman has found four out of the five clues, which is pinned on his back.
(modeled after a famous example by B. Partee)

Clearly in (4a) we cannot replace Batman has found all of the five clues but one with Batman has found four out of the five clues while keeping unaltered the overall communicative effect. This means that if we define synonymy as in (3a) and (3b), then (2a) and (2b) cannot be regarded as synonymous after all. Yet they clearly share a significant part of their informational content. What is it that they share?

In fact, it has been argued that if (3a) and (3b) are how we define synonymy, then there simply are no two sentences that qualify as such. Here is a classical argument that purports to show this (based on Mates (1950)). Take the following two sentences:

(5a) Billy has a dog.
(5b) Billy has a highly variable carnivorous domesticated mammal prob. descended from the common wolf.

Are these two sentences synonymous? Hardly. They are clearly semantically related. But they surely do not have the same communicative import. Nor can one replace the other in every context. For example, (6a) could describe a true state of affairs, while (6b) might not:

(6a) Molly believes that Billy has a dog.
(6b) Molly believes that Billy has a highly variable carnivorous domesticated mammal prob. descended from the common wolf.

This shows that in contexts like Molly believes that __ we cannot simply replace a word with its dictionary definition. And if dictionary definitions don't license synonymy, then what does?

The problem can be couched in the following terms. Any normal speaker of English perceives a strong semantic connection among the sentences in (2a), (2b), and (2c), or (4a) and (4b). So strong that one might feel tempted to talk about synonymy. Yet when we try to make the notion of synonymy precise, we run into serious problems. Such a notion appears to be elusive and graded (a more or less thing); so much so that people have been skeptical about the possibility of investigating synonymy through precise, formal means.

A fundamental breakthrough has been identifying relatively precise criteria for assessing semantic relations. The point is that perfect synonymy simply does not exist. No two sentences can be always replaced with each other. The notion of synonymy has to be deconstructed into a series of more basic semantic relations. We need to find a reliable source for classifying such relations, and, we will argue, such a source lies in the notions of truth and reference. Consider the sentences in (2a), (2b), and (2c) again. Assume that the noun phrase the five clues in (2a) and (2b) refers to the same clues (i.e., we are talking about a particular episode in a particular story). Then, could it possibly happen that, say, (2a) is true and (2b) false? Evidently not: no one in his right mind could assert (2a) while simultaneously contending that (2b) is false. If (2a) is true, (2b) also must be true. And, in fact, vice versa: if (2b) is true, then (2a) also must be. When this
happens, i.e., when two sentences are true in the same set of circumstances, we say that they have the same truth conditions.

Notice that sameness of truth conditions does not coincide with or somehow require sameness of communicative import (too elusive a notion), nor substitutivity in any context whatsoever (a condition too difficult to attain). Our proposal is to replace such exceedingly demanding notions with a series of truth-based notions, while keeping the same general setup we sketched in connection with synonymy: for any pair of sentences, speakers have intuitions about whether they are true under the same conditions or not. They can judge whether they are true in the same (real or hypothetical) circumstances or not. Because the sentences of our language are infinite, this capacity must be somehow based on a computational resource. Speakers must be able to compare the truth conditions associated with sentences via an algorithm of some sort. The task of semantics is to characterize such an algorithm. The basic notion changes (synonymy is replaced with sameness of truth conditions), but the setup of the problem stays the same.

Truth and Semantic Competence

Let us elaborate on the proposal sketched at the end of the previous section. Information is transmitted from one agent to another (the 'illocutionary agents') in concrete communicative situations ('speech acts'). No two such situations are alike. And consequently, no two pieces of information that are transmitted through them are alike. In Groundhog Day, a movie with the actor Bill Murray, the protagonist gets trapped into going through the same day over and over. He wakes up and his day starts out in the same way (with the alarm clock ringing at 7 a.m. on Groundhog Day); as he walks outside, he meets the same waitress who greets him in the same way ("weather so-so, today"). Yet this sentence, though being the same day after day, and being uttered in circumstances as identical as they can conceivably be, clearly conveys a different sense or information unit on each occasion of its use (the hearer going from noticing that something is fishy about this verbatim repetition, to the painful discovery of the condemnation to live through Groundhog Day for eternity).

Ultimately, we want to understand how communication takes place. But we cannot nail down every aspect of a speech act, just as we cannot know (not even in principle, I believe) every aspect of the physical or mental life of a particular human being. At the same time, while speech acts are unique events, there is much that is regular and invariant about them; that is what can be fruitfully investigated. One family of such invariants concerns form: similar sound patterns may be used in different speech acts. Another family of invariants concerns content: similar states of affairs may be described through a variety of expressions. The notion of truth is useful in describing the latter phenomenon. A pair of sentences may be judged as being necessarily true in the same circumstances. This is so, for example, for (5a) vs. (5b). Yet, such sentences clearly differ in many other respects. One is much more long-winded than the other; it uses rarer words, which are typical of high, formal registers. So in spite of having the same truth conditions, such sentences may well be used in different ways. Having the same truth conditions is generally regarded as a semantic fact; being able to be used in different ways is often regarded as a pragmatic fact. While this gives us a clue as to the role of these two disciplines (both of which deal with meaning broadly construed), the exact division of labor between semantics and pragmatics remains the object of controversy.

Truth conditions are a tool for describing semantic invariants, structural regularities across communicative situations. Whenever I utter a declarative sentence, I typically do so with the intention to communicate that its truth conditions are satisfied (which of course raises the question of nondeclaratives, emotive expressions, and the like; see, for example, textbooks such as Chierchia and McConnell-Ginet (2000) or Heim and Kratzer (1998); cf. also Kratzer (1999) for a discussion of relevant issues). Truth conditions depend on the reference (or denotation) of words and the way they are put together, i.e., they are compositionally projected via the reference of words (or morphemes). If I say to you, as we are watching a movie, "Batman has found all of the five clues but one," you understand me because you sort of know (or guess) who Batman is, what sort of things clues are, what finding something is, what number the word five refers to; you also understand the "all . . . but . . ." construction. The reference/denotation of words is set (and is modified, as words may change their denotation in time) through use, in complex ways we cannot get into within the limits of the present work. The denotation of complex expressions (e.g., of a verb phrase such as [VP found five clues]) and the truth conditions of sentences are set by rule (the semantic component of grammar). Semantic rules presumably work like syntactic rules: they display variation as well as a common core, constitutive of universal grammar. Insofar as semantics is concerned, what is important for our purposes is that truth conditions can be compositionally specified. This paves the way for an algorithmic approach to meaning.
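To give a feel for what such an algorithm might look like, here is a minimal Python sketch (the miniature model, the lexicon, and all names in it are invented for illustration; nothing here is the article's own formalism). Word denotations are fixed in a small model, and the truth value of a simple quantified sentence is then computed by rule from the denotations of its parts:

```python
# A toy compositional fragment. The "model" fixes the denotations of words;
# the rules below compute the denotations of complex expressions from them.
DOMAIN = {"Leo", "Gianni", "Maria"}          # individuals under discussion
ITALIAN = {"Leo", "Gianni"}                  # denotation of the noun 'Italian'
VOTED_FOR_B = {"Leo", "Gianni"}              # who voted for B in this model

# Determiner denotations: relations between a noun set and a predicate set.
def every(restrictor, predicate):
    return restrictor <= predicate           # all members of N are in P

def most(restrictor, predicate):
    return len(restrictor & predicate) > len(restrictor) / 2

# Composition rule for a [Det Noun VP] sentence: apply the determiner
# denotation to the denotations of the noun and the verb phrase.
def sentence(det, noun, vp):
    return det(noun, vp)

every_italian_voted = sentence(every, ITALIAN, VOTED_FOR_B)
leo_voted = "Leo" in VOTED_FOR_B

print(every_italian_voted, leo_voted)   # True True
```

In this toy model 'Every Italian voted for B' comes out true, and so does 'Leo voted for B'; checking such compositionally computed truth values across arbitrary models is one way to cash out the idea of comparing truth conditions by algorithm.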
We already remarked that sentences are formed by composing morphemes together via a limited number of syntactic operations. So to arrive at the truth condition of an arbitrary sentence, we can start from the contribution of the words (their reference). Then, for each way of putting words together, there will be a way of forming the reference of complex expressions, and so on until we arrive at the truth condition of the target sentence.

So far, we have discussed sentences that have the same truth conditions (such as those in (2a), (2b), and (2c)); but this is not the only semantic relation that can be characterized in terms of the notion of truth. Consider the following examples.

(7a) Every Italian voted for B.    (7a') Most Italians voted for B.
(7b) Leo voted for B.    (7b') Leo voted for B.

Sentence (7a) is related to (7b) in a way that differs from the relation between (7a') vs. (7b'). Here is the difference. If (7a) is true, and Leo is Italian, then (7b) has to be true, too; this is clearly not so for (7a') vs. (7b'): (7a') may be true without (7b') being true. If whenever A is true, B also must be, we say that A entails B (B's meaning is part of A's meaning). Two sentences with the same truth conditions entail each other (i.e., they hold a symmetric relation); when entailment goes only one way (as from (7a) to (7b)), we have an asymmetric relation.

Entailment is pervasive. Virtually all semantic intuitions are related to it. As an illustration, consider the pair of sentences in (8a) and (8b).

(8a) John promised Bill to take him to the station.
(8b) John ordered Bill to take him to the station.

Pronouns, like him in (8a) and (8b), take their denotation from the context; they can take it from the extralinguistic context (a person salient in the visual environment, a person the speaker points at, etc.) or from the linguistic context (e.g., from NPs that occur in the same discourse; John or Bill in (8a) and (8b)); one widespread terminology is to speak of indexical uses in the first case and of anaphoric uses in the second. We can conceptualize this state of affairs by viewing pronouns as context-dependent items, incomplete without pointers of some sort. Now we shall focus on the anaphoric interpretation of (8a) vs. (8b).

(9a) John promised Bill that John would take Bill to the station.
(9b) John ordered Bill that Bill should take John to the station.

These appear to be the only options. That is to say, sentence (8a) cannot convey something like 'John promised Bill that Bill should take John to the station.' The point of this example is that we have intuitions that govern how the denotation of a pronoun is to be reconstructed out of contextual clues; such intuitions tell us that (8a) and (8b), though structurally so similar, allow for a distinct range of interpretive options. At the basis of intuitions of this sort, we again see entailment at work: on its anaphoric construal, (8a) entails (9a).

Another important set of truth-based semantic relations are presuppositions. Consider the contrast between the sentences in (10a) and (10b).

(10a) Fred stole the cookies.
(10b) It was Fred who stole the cookies.

There is a noticeable semantic contrast between (10a) and (10b). How can we characterize it? Clearly the two sentences are true in the same circumstances (they entail each other). Yet they differ semantically. Such a difference can perhaps be caught by looking at what happens by embedding (10a) and (10b) in a negative context.

(11) So, what happened this morning?
(11a) Everything went well. Fred didn't steal the cookies; he played with his toys.
(11b) ?? Everything went well. It wasn't Fred who stole the cookies.

The answer in (11a) is natural. The one in sentence (11b) would sound more natural as an answer to

(12a) Who stole the cookies?
(12b) It wasn't Fred.

The difference between the question in (11) and the one in (12a) is that the latter (but not the former) tends to presuppose that cookies were stolen. In other terms, the situation seems to be the following. Both sentences in (10a) and (10b) entail:

(13) Someone stole the cookies.

If either (10a) or (10b) is true, then (13) must also be true. Furthermore, sentence (13) must be true for (10b) to be denied felicitously. The illocutionary agents must take for granted the truth of (13) to assert, deny, or otherwise use sentence (10b), as the naturalness of the following continuations for (13) illustrates:

(14) Someone stole the cookies . . .
(14a) It was Fred.
(14b) It wasn't Fred.
(14c) Was it Fred?
(14d) If it was Fred, he is going to get it . . .
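One common way to make these truth-based relations concrete (a simplifying sketch in Python, not the article's own formalism; the tiny space of circumstances is invented for illustration) is to model each sentence by the set of circumstances in which it is true: entailment is then set inclusion, and a presupposition is, roughly, an entailment that survives denial:

```python
# Circumstances differ only in who (if anyone) stole the cookies.
WORLDS = [None, "Fred", "Mary"]

def worlds_where(condition):
    """The set of circumstances in which a sentence is true."""
    return {w for w in WORLDS if condition(w)}

fred_stole = worlds_where(lambda w: w == "Fred")         # like (10a)
someone_stole = worlds_where(lambda w: w is not None)    # like (13)

def entails(a, b):
    return a <= b    # whenever A is true, B is true

# Both (10a) and the cleft (10b) entail (13):
print(entails(fred_stole, someone_stole))    # True

# Plain denial is fine in a no-theft world; the cleft's denial is not,
# because the cleft keeps requiring that someone stole the cookies.
plain_denial = worlds_where(lambda w: w != "Fred")
cleft_denial = worlds_where(lambda w: w is not None and w != "Fred")

print(entails(plain_denial, someone_stole))  # False
print(entails(cleft_denial, someone_stole))  # True
```

The asymmetry in the last two lines mirrors the contrast between (11a) and (11b): on this sketch, the presupposition of the cleft projects through negation, while a plain entailment does not.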
This brings us to the identification of presupposing as a distinctive semantic relation: a sentence A presupposes B if the truth of B must be taken for granted in order to felicitously assert, deny, etc., A. Presuppositions are quite important in language. So much so that there are distinctive syntactic constructions (such as those in (10b), known as cleft sentences) specifically keyed to them.

Let me illustrate the wealth of semantic relations and their systematic character by means of another example, which will bring us to the interface between semantics and pragmatics. Consider:

(15a) Who stole the cookies?
(15b) Fred looks mischievous.
(15c) Fred stole the cookies.

If to a question such as (15a), I reply with (15b), I do suggest/convey something like (15c). Sentence (15c) clearly is not part of the literal meaning of (15b) (however hard defining such a notion might be). Yet, in the context of the dialogue in (15a), (15b), and (15c), speakers will converge in seeing that (15c) is strongly suggested by (15b). Here, too, we have, thus, a systematic semantic intuition. The suggestion in (15c) can be retracted; that is, one can continue (15b) with '. . . but I know he didn't do it'. However, in the absence of such an explicit correction, illocutionary agents upon hearing (15b) will tend to infer (15c). This phenomenon has been studied by H. P. Grice (1989), who dubbed it implicature. His proposal is that it arises through interaction of the core meaning assigned to sentences by rule with principles that govern conversational exchanges. The basic idea is that for conversational exchanges to be successful they have to be basically cooperative acts; cooperating means that one sticks to relevant topics, one only gives information believed to be truthful, one gives no more and no less than what is relevant, etc. Applying this to the case at hand, in a situation in which question (15a) is topical, answering (15b) would seem to be blatantly irrelevant; the hearer, however, tends to interpret it as relevant and sets in motion an inferential process that tends to link it to some piece of information that does address the topical question; such a link is to be found with the help of the information available in the context to the illocutionary agents (e.g., in the common knowledge that if people commit a mischief, such as stealing cookies, they may well look mischievous, etc.). Thus, this type of semantic judgment (the implicature) appears to be best accounted for in terms of the interaction between grammar and general conditions on reasonable language use (that fall under the scope of pragmatics).

Sometimes it is not immediately clear whether something is a matter of conventionalized meaning or pragmatics. To illustrate, consider the oscillation in meaning of a word like or. It can be illustrated with the following examples. Consider first (16a):

(16a) If I got it right, either John or Mary will be hired.
(16b) If I got it right, either John or Mary but not both will be hired.

Normally, one tends to interpret (16a) as truth-conditionally equivalent to (16b); i.e., the disjunction in (16a) is interpreted exclusively (as incompatible with the simultaneous truth of each disjunct). However, this is not always so. Contrast (16a) with (17a).

(17a) If either John or Mary are hired, we'll celebrate.
(17b) (?) If John or Mary (but not both) are hired, we'll celebrate.
(17c) If John or Mary or possibly both are hired, we'll celebrate.

The most natural interpretation of (17a) is not the exclusive one (namely (17b), which is somewhat odd pragmatically); rather it is the inclusive one, made explicit in (17c). (Notice that the emphatic word either is present both in (16a) and (17a); in spite of this, the interpretation of or shifts.) We might see in these phenomena a lexical ambiguity of disjunction. Words expressing disjunction, we may feel inclined to conclude, have a varying interpretation, as it happens with words such as bank or lap ('sit on my lap' vs. 'he swam three laps'). We may assume that such interpretations are always in principle available, but then we select the one most suitable to the context of the speech act. While this seems prima facie possible, there are reasons to doubt it. In particular, true lexical ambiguities are resolved across languages (in Italian, there are two different words for the two senses of lap). Ambiguities are never universal. The meaning shift of or, per contra, seems to be universal: in every language, disjunction appears to have a similar oscillation in meaning. A convincing case for two lexically distinct disjunctions, one inclusive, the other exclusive, has not been made (sometimes it has been proposed that Latin vel vs. aut is just that; for arguments against this, cf., e.g., Jennings (1994)). Moreover, other areas of the lexicon have been found that display a similar behavior (e.g., the number words). This strongly suggests that a different explanation for such behavior should be found. Grice himself has proposed that the phenomenon under discussion is to be accounted for in terms of the interaction between semantics and pragmatics. The idea is that the basic meaning of or is the inclusive one, as it is the most liberal interpretation; the exclusive construal arises as an implicature, i.e., a pragmatic enrichment, albeit a generalized one.
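The difference between the two construals, and the sense in which the inclusive one is the most liberal, can be shown with a small Python sketch (illustrative only; this is the standard truth-tabular point, not an analysis from the article):

```python
# Candidate meanings for 'or' as truth functions on the two disjuncts.
def inclusive_or(p, q):
    return p or q        # false only when both disjuncts are false

def exclusive_or(p, q):
    return p != q        # true only when exactly one disjunct is true

cases = [(True, True), (True, False), (False, True), (False, False)]
for p, q in cases:
    print(p, q, inclusive_or(p, q), exclusive_or(p, q))

# Inclusive 'or' is the more liberal meaning: every case that verifies the
# exclusive reading also verifies the inclusive one, but not vice versa.
assert all(inclusive_or(p, q) for p, q in cases if exclusive_or(p, q))
```

The two construals come apart only in the both-true case, which is exactly the extra content ('not both') that the Gricean account adds as an implicature rather than building into the lexical meaning of or.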
pragmatic enrichment, albeit a generalized one. The advantage of this move is that it would explain the oscillation in meaning of disjunction without positing a covert ambiguity. We will come back to how the generalized implicature associated with or might come about in the later section ‘The Semantics/Pragmatics Interface’.

Wrapping up, the picture that emerges is roughly the following. In using language, speakers display complex forms of spontaneous knowledge. They put together words in certain ways and not others. This is how knowledge of syntax manifests itself. They also accept certain paraphrases and not others, draw certain inferences and not others, etc. It turns out to be possible/useful to categorize the latter in three major families of semantic relations.

(18a) Entailment-based (entailment, mutual entailment, contradictoriness, analyticity, etc.)
(18b) Presupposition-based (presupposition, question/answer pairs, etc.)
(18c) Implicature-based (generalized implicature, particularized implicature, etc.)

All of them can be readily defined in terms of the notion of truth:

(19a) A entails B = for any conceivable situation s, if A is true in s, B is also true in s.
(19b) A presupposes B = to use A appropriately in a situation s, the truth of B must be taken for granted by the illocutionary agents in s.
(19c) A implicates B = use of A in a situation s suggests, everything else being equal, that B is true in s.

The definitions in (19a), (19b), and (19c) can be readily associated with ‘operational’ tests that enable speakers to assess whether a given relation obtains or not. For example, to check whether (20a) entails (20b), you might check whether you could sincerely assert (20a) while denying (20b), viz. whether you could sincerely and felicitously utter something like (20c):

(20a) It is indeed odd that Mary is home.
(20b) Mary is home.
(20c) It is indeed odd that Mary is home, even if she in fact isn't.

To the extent that you can't really say something like (20c), you are entitled to conclude that (20a) entails (20b). It is useful, in these cases, to use contrast sets such as (21a) and (21b).

(21a) It is indeed conceivable that Mary is at home.
(21b) It is indeed conceivable that Mary is home, even if she in fact isn't.

The semantic relations in (18a), (18b), and (18c) can be viewed as intuitions of semantic relatedness speakers have about sentences of their own language, as judgments that may be elicited, and the like. By analogy with well-formedness judgments, there are some cases in which things are not so clear and we may not be sure whether, say, a certain entailment holds or not. In such a case, more complex arguments, indirect evidence of various sorts, or psycholinguistic experimentation may be called for (see, e.g., Crain and Thornton (1998) on experimental methodologies for truth-based semantic judgments). But in indefinitely many cases, simple introspection yields relatively straightforward judgments. The capacity for making such judgments is constitutive of our semantic competence. Such a competence cannot be simply a thesaurus, a store of pairs of sentences, with the relative judgment tacked on, for the number of judgments speakers can make on the fly is potentially infinite. Semantic competence must be a computational device of some sort. Such a device, given an arbitrary pair of sentences <A, B>, must be able to determine in principle whether A entails B, presupposes it, etc. The task of semantics is to characterize the general architecture of such a computational device. While there are many foundational controversies that permeate the field, there is a broad convergence that this is roughly the form that the problem of meaning takes within modern formal semantics.

Semantic Modeling

In the present section I will sketch how a (necessarily, much simplified) calculus of semantic relations may look. Suppose you have a lexicon of the following form:

(22a) N: John, Bill, dog, cat, table, ...
(22b) V: runs, smokes, drinks, ...
(22c) DET: the, a, some, every, no ...

Think of syntax as a device that combines lexical entries by merging them in complex phrases and assigning them a syntactic analysis that can be represented by tree diagrams or labeled bracketings of the following form:

(23a) [VP John smokes]
(23b) [DP every boy]
(23c) [VP [DP every boy] smokes]

I assume, without being able to justify it, that lexical items have phrasal projections. In particular, VP is the phrasal projection of V and constitutes a clausal nucleus composed of the verb and its arguments linked in a predicative structure. Such a nucleus forms the
innermost skeleton of the sentence (I will have to ignore matters pertaining to inflection, agreement, tense, and the like). The lexical features of verbs are crucial in determining the characteristics of clausal nuclei. DP is the phrasal projection of D, and it is constituted by a determiner and a (common) noun. Clausal nuclei can be formed by merging a verb with a (proper) name or a DP, as indicated. In the spirit of the discussion in the section on Truth and Semantic Competence, semantics assigns recursive truth conditions to sentences in terms of the reference assigned to lexical entries. There are several ways to do this. Ultimately, the choice one makes on the exact format of interpretive rules has far-reaching consequences for our understanding of grammar. However, our choices here are only in small part dictated by our current understanding of semantics in universal grammar; for the major part, they result from considerations such as ease of exposition, keeping prerequisites at a minimum, and the like. To get started, we should assign a reference (or denotation, terms we will use interchangeably) to lexical entries. To do so, we assume we have a certain domain Ds = {a, b, c, ...} at each given discourse situation s that constitutes our universe of discourse. A discourse situation can be thought of as the time at which the utterance takes place. A domain is just a set of individuals, pragmatically selected (e.g., those salient to the illocutionary agents). Interpretations are relative to an utterance situation s and the corresponding domain of discourse Ds. Reference of proper nouns, for example, is suitably chosen from the domain of discourse. Suppose, for example, that a and b are salient humans in our universe of discourse; then we might have:

(24) For any conceivably relevant utterance situation s, the name John denotes a in s; the name Bill denotes b in s ...

It doesn't matter how a or b are characterized (via a description, an act of indication, etc.) to the extent that one successfully succeeds in linking the noun to its bearer. Also, it is useful to have a uniform category-neutral notation for semantic values; we will use for this the double bar notation || ||; accordingly, for any expression a, ||a||s will be the semantic value of a in situation s. Thus, (24) can be abbreviated as:

(25) ||John||s = a (where a ∈ Ds, the domain of discourse at s)

(Technically, || || can be viewed as a function from expressions and situations into denotations; so sometimes we will speak of the interpretation function.) The denotation of a simple (intransitive) verb such as those in (22b) can be thought of as a function that for each (appropriate) individual in the domain of discourse tells us whether that individual performs a certain action or not. Here is an example:

(26) smokes in a situation s denotes a function smokes that applies to animate individuals and returns truth values. If a is such an individual, then smokes(a) returns ‘true’ (which we represent as the number 1) if that individual performs the action of smoking in s (where smoking involves ...); otherwise smokes(a) returns 0 (i.e., ‘false’).

If a is not animate (e.g., if a is a stone and s is a ‘normal’ situation), then smokes(a) is not defined (lacks a value). The final part in definition (26) reflects the fact that sentences like (27a) and (27b), out of the blue, are (equally) strange: smoking normally requires its subject argument to be animate.

(27a) That stone smokes.
(27b) That stone doesn't smoke.

The deviance of sentences like (27a) and (27b) has been variously characterized as a violation of selectional restrictions or as sortal deviance. Here we are couching the relevant phenomenon in presuppositional terms (to illustrate a further application of such a concept). The fact that sentences of this sort remain deviant across negation may be taken as evidence that the verb smoke imposes an animacy presupposition on its arguments (see, e.g., Chierchia and McConnell-Ginet (2000) for more discussion). A definition like (26) can be stated more compactly:

(28) ||smokes||s = smokes, where for each a in Ds, smokes(a) is defined iff a is animate in s; if defined, smokes(a) = 1 if a smokes in s (where smoking involves ...); smokes(a) = 0, otherwise.

The definition of (or constraints on) smoking (i.e., the dots in (28)) can be elaborated further in several ways by refining our lexical analysis of the verb smoke. Although much progress has been made on this score, many important issues remain open (including, e.g., whether a presuppositional treatment of selectional restrictions is ultimately viable). What is important, from the point of view of compositional semantics, is the logical type or semantic category of the denotation of a verb like smoke. Such verbs are treated here as functions from individuals into truth values. These are called characteristic functions; they divide the (relevant portion of) the domain of discourse of the utterance situation in two: the things that satisfy the verb from those that don't. Characteristic functions correspond to sets (which might be called the extension of the function), as the following example illustrates:
(29) Let the universe of discourse be {a, b, c, d}; let a, b, and c be people. Of these, let a and b smoke in s while a but not b also smokes in a different situation s'. We can represent all this as follows:

smokes = [a → 1, b → 1, c → 0]    corresponding extension: {a, b}
smokes' = [a → 1, b → 0, c → 0]    corresponding extension: {a}

As is evident from the example, sets and characteristic functions are structurally isomorphic (encode the same information). In what follows it will be useful on occasion to switch back and forth between these two concepts. Use of characteristic functions as a formal rendering of verb meanings is useful in giving truth conditions for simple subject predicate sentences:

(30a) A sentence of the form [VP N V] is true in s iff ||V||s(||N||s) = 1
Example:
(30b) [VP Bill drinks] is true in s iff ||drinks||s(||Bill||s) = 1

The truth conditions of any sentence with the syntactic structure specified in (30a) boil down to applying a characteristic function to an individual (and thereby ascertaining whether that individual belongs to the set that constitutes the extension). To find out whether Bill in fact drinks in s, we need factual information about the situation obtaining in s. To understand the sentence, we don't. We merely need to know its truth conditions, which in the case of simple subject–predicate sentences are an instruction to check the value of a characteristic function for the argument specified by the subject. The rules in (30a) and (30b) can be reformulated more compactly as in (31):

(31) ||[VP N V]||s = ||V||s(||N||s)

This can be viewed as the kernel of a predication rule (that tells us how subjects and predicates combine semantically).

Everything so far looks like a formally explicit (and perhaps somewhat pedantic) way of sketching a denotational, information-oriented semantics, and the reader may get the feeling of not yet finding striking insights on what meaning is. In order to grasp the potential of this method, one needs to look at a little more of its computational apparatus. So let us turn now to DPs. Things here are definitely more challenging. DPs are constituents formed by a determiner plus a common noun. Common nouns can be given, at least in first approximation, the same analysis as (intransitive) verbs, i.e., the meaning of, say, cat can be thought of as a characteristic function that selects those entities that are cats out of the universe of discourse (or, equivalently, we can say that cat identifies a class/set across situations). But what about things like no cat or every cat, which are the typical constituents one finds in, e.g., subject position and the like? What does no cat denote? And, even worse, what do no or every or some denote? Our program is to assign a denotation to lexical entries and then to define in terms of it truth conditions for sentences. So we must find suitable denotations of Ds and DPs.

To address questions of this sort, we apply a heuristic that goes naturally with our general setup: whenever the denotation of an expression is not directly accessible to your intuition, look at what that expression contributes to the truth conditions of the sentences it occurs in (the epistemological primacy of sentences, again). So, consider for example:

(32) No boy smokes.

We know/assume/conjecture that boy and smoke denote characteristic functions and that sentences contribute truth values (i.e., they are true or false, as the case may be, in different situations). We may think of no as a function, too. As is evident from (32), such a function combines first with a characteristic function/set (corresponding to the noun); then the result combines with a second characteristic function (corresponding to the verb) to yield a truth value. Schematically, here is what we have:

(33) no(boys)(smokes) = 1 or 0

Now we can look at our intuitions. When is (32) true? The answer is pretty clear. When among the boys, nobody smokes. Or, equivalently, when the class of boys (i.e., the extension of boys) has no member in common with the smokers (i.e., the extension of smokes), (32) is true. In set talk, the intersection between the boys and the smokers must be empty:

(34) no(boys)(smokes) = 1 iff BOYs ∩ SMOKEs = ∅

(where BOYs, SMOKEs are the extensions corresponding to boys, smokes, respectively). This is perfectly general. Replace boy/smokes with any other noun/verb. The contribution of no stays constant: no(N)(V) is true just in case no member of the extension of N is in V. We thus discover that no has a perfectly sensible (if abstract) denotation: a function that encodes a relation between sets. Our contention here is that speakers behave as if they had such a function in mind (or something similar to it) in using no.
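The relational analysis of no in (33) and (34) can be made concrete with a small executable sketch. The Python fragment below is mine, not the article's: the set names BOY and SMOKE are hypothetical stand-ins for the extensions BOYs and SMOKEs, and the determiner is rendered as a curried function that takes the noun's extension first and the verb phrase's extension second.

```python
# A sketch of (33)-(34): ||no|| takes the noun extension, then the verb
# phrase extension, and returns 1 (true) iff the two sets do not intersect.
def no(noun_ext):
    def applied_to(vp_ext):
        # (34): no(boys)(smokes) = 1 iff BOYs ∩ SMOKEs = ∅
        return 1 if (noun_ext & vp_ext) == set() else 0
    return applied_to

# Hypothetical extensions in a situation s:
BOY = {"a", "b"}
SMOKE = {"c", "d"}

print(no(BOY)(SMOKE))       # 1: 'no boy smokes' is true in s
print(no(BOY)({"a", "d"}))  # 0: here a boy is among the smokers
```

The same schema, two extensions in, one truth value out, carries over to the other determiners, each checking a different set-theoretic relation.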
The next step is to see that all determiners express relations among sets (characteristic functions), just like no does. Here are a few examples, along with some comments.

(35a) Some
(35a.i) Example: some boy smokes
(35a.ii) Truth conditions: some(boys)(smokes) = 1 iff BOYs ∩ SMOKEs ≠ ∅
(35a.iii) Comment: some is the contrary of no; some boy smokes is true just in case you can find someone among the boys who is also among the smokers; i.e., the intersection between the class of boys and the class of smokers must be non-empty. The indefinite article a can be analyzed along similar lines.
(35b) Every
(35b.i) Example: every boy smokes
(35b.ii) Truth conditions: every(boys)(smokes) = 1 iff BOYs ⊆ SMOKEs
(35b.iii) Comment: every expresses the subset relation: every boy smokes is true just in case all the members of the class of boys also belong to the class of smokers
(35c) Most
(35c.i) Example: Most boys smoke
(35c.ii) most(boys)(smokes) = 1 iff the number of members of BOYs ∩ SMOKEs is bigger than half the number of members of BOYs.
(35c.iii) Comment: most involves actual counting. Most boys smoke is true just in case the number of boys who smoke (i.e., the intersection of the boys with the smokers) is greater than half the number of boys (i.e., more than half of the boys are smokers).
(35d) The
(35d.i) Example: The blond boy smokes.
(35d.ii) Truth conditions: the(blond boys)(smokes) is defined only if there is exactly one blond boy in s. Whenever defined, the(blond boys)(smokes) = every(blond boys)(smokes).
(35d.iii) Comment: this reflects the fact that the blond boy smokes is only interpretable in situations in which the universe of discourse contains just one blond boy. If there is more than one blond boy or if there is no blond boy, we wouldn't really know what to make of the sentence. So the is a presuppositional determiner; it presupposes the existence and uniqueness of the common noun extension. (This analysis of the goes back to Frege.)

In spite of the sketchiness of these remarks (that neglect important details of particular determiners), it should be evident that the present line of analysis is potentially quite effective. A class of words and phrases important and tendentially stable across many languages falls into place: determiners ultimately express natural relations between sets (the set associated with the common noun and the set associated with the verb phrase). Our denotational perspective seems to meet rather well the challenge that seemingly denotationless items pose. It is useful to see what becomes of our rule of predication (viz. (31) above). Evidently such a rule needs to be split into two (main) subcases, depending on whether the subject is a simple N (a proper name) or a complex DP. Here is an exemplification of the two cases:

(36a) Mary smokes.
(36b) No boy smokes.

In case (36a), we have semantically two pieces: an individual (whomever Mary denotes) and a characteristic function (smokes); so the latter applies to the former. In case (36b) the two pieces are: a complex function (namely no(boys)) that looks for a characteristic function to yield a truth value, and, as before, the characteristic function smokes; in this case the former applies to the latter. In either case, the end result is a truth value. So our predication rule becomes:

(37a) ||[VP N V]||s = ||V||s(||N||s)
(37b) ||[VP DP V]||s = ||DP||s(||V||s)

This suggests that the core rule of semantic composition is functional application. Consider for example an ungrammatical sentence of the form:

(38) *[VP boy smokes]

Such a sentence, as things stand, would be generated by our (rudimentary) syntax. However, when we try to interpret it, we find two characteristic functions of individuals, neither of which can apply to the other. Hence, the sentence is uninterpretable, which explains its ungrammaticality. There are languages like, for example, Russian or Hindi where singular common nouns without a determiner can occur in subject position:

(39a) Russian: mal'cik kurit (boy smokes) 'the boy smokes'
(39b) Hindi: kamre meN cuuha ghuum rahaa hai (room in mouse moving is) 'a mouse is moving in the room' (from Dayal 2004)

Notice that (39a) is the verbatim translation of (38) and is grammatical in Russian. The line we are taking suggests that in such languages it must be possible to turn some covert forms of common nouns into
argumental DPs, i.e., things that can semantically combine with predicates; for example, it is conceivable that in a language without articles, like Russian, the semantic functions associated with the articles can be applied covertly (as part of the interpretive procedure), so as to rescue the semantic mismatch that would otherwise ensue. This may, in turn, involve the presence of a phonologically null determiner (for alternative developments of this line of analysis, as well as details concerning the available interpretations, see, e.g., Chierchia (1998), Longobardi (2001), and Dayal (2004)).

The picture that emerges is the following. The basic mode of syntactic composition is merge, or some analogously simple operation that puts together two constituents (subject to parametrization pertaining to, e.g., word order, case, etc.). The basic mode of semantic composition is apply: constituents are compositionally analyzed as functions (of more or less complex semantic type) and arguments (individuals or other functions); so whenever we find a function and an argument of the appropriate sort, we simply apply the former to the latter. If things go wrong at any level, the derivation crashes and the result is ungrammatical. The semantic side of this process has come to be known as ‘type driven interpretation,’ the main idea being that the semantic categories of functions and arguments drive the interpretation process.

The present approach directly yields a computationally tractable theory of entailment and presupposition. We have defined entailment roughly as follows: a sentence S entails a sentence S' iff whenever S is true, S' is also true. The apparatus we have developed allows us to prove whether a certain entailment holds or not. Let me show, as an illustration, that (40a) entails (40b) but not vice versa.

(40a) Every scientist smokes.
(40b) Every mathematician smokes.

To show this we need to assume that if one is a mathematician, one is a scientist; i.e.,

(41) For every individual a,
(41a) if mathematicians(a) = 1, then scientists(a) = 1, or, equivalently:
(41b) MATHEMATICIANs ⊆ SCIENTISTs

Consider now the semantics of (40a), according to our analysis. It is the following:

(42) every(scientists)(smokes)

In virtue of (35b), this is tantamount to

(43) SCIENTISTs ⊆ SMOKEs

This being so, every subset of the set of scientists must also be included among the smokers (by elementary set theoretic considerations). Since, in particular, mathematicians are scientists, it follows that

(44) MATHEMATICIANs ⊆ SMOKEs

But this is just the semantics of (40b). So, if (40a) is true in s, then (40b) must also be true in s. Evidently, this reasoning goes through no matter which situation we are in. Hence, (40a) does entail (40b). On the other hand, it is easy to conceive of a situation in which (44), and hence (40b), hold, but say some economist doesn't smoke; in such a situation, (43) would fail to obtain. Hence, (40b) does not entail (40a).

A fully parallel way of reasoning can be put forth for presuppositions. We said that S presupposes S' iff S' must be taken for granted in every situation in which S is asserted, denied, etc. This can be cashed in as follows. We can say that for S to be true or false (i.e., to have a semantic value that makes it suitable for assertion or denial), S' must be known to be true in the utterance situation by the illocutionary agents, i.e., S can be true or false in s iff S' is true in s. Using this definition (known as the ‘semantic’ definition of presupposition), we can formally prove (though we will not do so here) that, for example, (45a) presupposes (45b):

(45a) The blond boy smokes.
(45b) There is exactly one blond boy around.

The general point of these examples is the following. Intuitions about entailment and the like are a priori; speakers have them just by inspecting the meaning of the relevant sentences. In the present setup, this central fact is captured as follows. Semantics can be viewed as a set of axioms that (a) determines the interpretation of lexical entries and (b) assigns truth conditions to sentences. Such apparatus yields a calculus of entailment (and other semantic relations) that reemerge as theorems of semantics. We have not formalized each single step of the derivation (relying on the readers' patience and understanding of elementary set theory); but such a formalization is, evidently, feasible. We not only thereby gain in clarity. We also obtain a device that constitutes a reasonable (and falsifiable) model of speakers' linguistic abilities. The claim is not that the specific rules we have given are actually implemented in the speakers' mind. The claim is that speakers, to the extent that they can be said to compute entailments, must be endowed with computational facilities that bear a structural resemblance to the ones sketched here. This, in turn, paves the way for inspecting the architecture of our linguistic abilities
ever more closely. Without excessive optimism and in full awareness of the controversies that permeate the field, this seems to constitute a step in the right direction.

One further remark on the general picture that emerges from the sketch above cannot be avoided. Our approach to meaning is denotational: we assign a denotation to words and morphemes and (in terms of such denotations) truth conditions to sentences. This can be understood in several ways, of which I will present two much simplified extremes. We can take truth condition assignment as a way of exposing the link between language and the world, which is, arguably, the ultimate goal of semantics. Words/morphemes are actually mapped into aspects of the world (e.g., names are mapped into actual individuals); sentences are symbolic structures that code through their fine structure how things may be arranged in the world. However, it is also possible to view things somewhat differently. What really matters, it can be argued, is not the actual mapping between words and aspects of reality and between sentences and the conditions under which they, in fact, are true. What we do is give a form or recipe or potential for actual truth conditions; we merely constrain the form that truth conditions may take. What we get out of this is what really matters: a calculus of semantic relations (entailment, presupposition, etc.). Unlike what happens in, say, pure logic, such a calculus is not a normative characterization of sound reasoning; it is an empirically falsifiable characterization of semantic competence (i.e., of what speakers take to follow from what, when). Under the latter view, truth conditions (or truth condition potentials, or whatever it is that we map sentences on) are a ladder we climb on to understand the working of semantic relations, i.e., relations that concern the information content of linguistic expressions.

It is evident that we are not going to settle these issues here. As a small consolation (but also, if you wish, as evidence of the maturity of the field), I hope to have given the reader reasons to believe that progress is possible even if such foundational issues remain open.

We haven't discussed implicatures and other pragmatically driven intuitions about meaning. To understand the full scope of the present proposal, it is important to do so. This requires extending a bit what we have done so far.

The Semantics/Pragmatics Interface

In the section Truth and Semantic Competence, we mentioned implicatures, a broad and varied type of meaning relations. We will elaborate by looking more closely at the oscillation in the meaning of or. The purpose is to illustrate how generalized implicatures come about and how this bears on the view of semantics sketched in the preceding section on Semantic Modeling.

The first step is to attempt a semantic analysis of or. To this we now turn. Imagine we extend our grammar by introducing coordination and negation along the following lines:

(46a.i) [VP John doesn't smoke]
(46a.ii) [VP NEG VP]
(46b.i) [[VP John smokes] and/or [VP Bill smokes]]
(46b.ii) [VP and/or VP]

The syntax of negation and coordination poses many thorny questions we simply cannot address here. Although for our purposes any number of assumptions concerning syntax might do, let us maintain, again without much justification, that a negative sentence like (46a.i) has the structure in (46a.ii), out of which the observed word order is derived by moving the subject left from the inner VP. Furthermore, we will assume that coordinated sentences, whether disjunctive or conjunctive, such as (46b.i), are obtained through schemas such as (46b.ii). Insofar as semantics is concerned, the introduction of negation, conjunction, disjunction, etc., poses problems similar to that of determiners. The relevant expressions are function words, and it is not obvious how to analyze them in denotational terms. This question, however, can be addressed in much the same way as we have done with the determiners: by looking at what the relevant elements contribute to the truth conditions of the sentences they occur in. For sentential operators, we can draw on a rich logical tradition. In the attempt to characterize the notion of valid inference, logicians have discussed extensively propositional connectives (like not, and, or), and the outcome is an analysis of such elements as truth functions or, equivalently, in terms of ‘truth tables.’ For example, the contribution of negation to meaning can be spelled out in terms of conditions of the following sort:

(47a) John doesn't smoke is true in s iff John smokes is false in s
(47b) ||NEG VP||s = 1 iff ||VP||s = 0
(47c)  VP   NEG VP
        1     0
        0     1

In (47c) we display in the form of a truth table the semantics given in (47b). Essentially, this says that in uttering a negation like (47a), the speaker intends to convey the falsity of the corresponding positive sentence. By the same token, conjunctions can be
analyzed as in (48a), (48b), and (48c), and disjunction As the readers can verify by comparing (49a), (49b),
as in (49a), (49b), and (49c): and (49c) with (50), the two interpretations of
or differ only in case (i); if both disjuncts are true,
(48a) John smokes and Bill smokes is true if
both John smokes and Bill smokes
the whole disjunction is true on the inclusive inter-
are. pretation and false on the exclusive one.
(48b) || [VP1 and VP2]||s ¼ 1 iff || VP1||s ¼ || So, the thesis that or is ambiguous can be given a
VP2||s ¼ 1 precise form. There are two homophonous ors in
(48c) VP1 VP2 [ VP1 and VP2] English. One is interpreted as in (48a), (48b), and
(48c.i)   1  1  1
(48c.ii)  1  0  0
(48c.iii) 0  1  0
(48c.iv)  0  0  0

(49a) John smokes or Bill smokes is true if either John smokes or Bill smokes or both are true.
(49b) ||[VP1 or VP2]||s = 1 iff either ||VP1||s = 1 or ||VP2||s = 1 or both
(49c)     VP1  VP2  [VP1 or VP2]
(49c.i)    1    1    1
(49c.ii)   1    0    1
(49c.iii)  0    1    1
(49c.iv)   0    0    0

This is the way in which such connectives are analyzed in classical (Boolean) logic. Such an analysis has proven extremely fruitful for many purposes. Moreover, there is little doubt that the analysis in question is ultimately rooted in the way in which negation, etc., works in natural language; such an analysis indeed captures at least certain natural uses of the relevant words. What is unclear and much debated is whether such an analysis stands a chance as a full-fledged (or nearly so) analysis of the semantics of the corresponding English words. There are plenty of cases where this seems prima facie unlikely; so much so that many people have concluded that while Boolean operators may be distilled out of language via a process of abstraction, they actually reflect normative principles of good reasoning more than the actual semantics of the corresponding natural language constructions. Of the many ways in which this problem might be illustrated, I will choose the debate on the interpretation of or.

The interpretation of or provided in (49a), (49b), and (49c) is the inclusive one: in case both disjuncts turn out to be true, the disjunction as a whole is considered true. As we saw, this seems adequate for certain uses but not for others. The exclusive or can be analyzed along the following lines:

(50) Exclusive or
          VP1  VP2  [VP1 or VP2]
(50.i)     1    1    0
(50.ii)    1    0    1
(50.iii)   0    1    1
(50.iv)    0    0    0

On the ambiguity view, then, or has two interpretations, one as in (48c), the other as in (50). Illocutionary agents choose among these options on pragmatic grounds. They go for the interpretation that is best suited to the context. Determining which one that is will involve knowing things like the topic of the conversation (e.g., are we talking about a single job or more than one), the purpose of the conversational exchange, the intentions of the speaker, etc.

We mentioned that Grice proposed an alternative view, however. We are now in a position to spell it out more clearly. If you look closely at the two truth tables in (49a), (49b), and (49c) vs. (50), you'll notice that in all the cases in which the exclusive or comes out true (namely case (ii) and case (iii)), the inclusive one does, too; i.e., in our terms, [p or-exclusive q] entails [p or-inclusive q]. The former is, thus, stronger, more informative than the latter in the following precise sense: it rules out more cases. If you get the information that [p or-exclusive q] holds, you know that case (ii) or case (iii) may obtain, but case (i) and case (iv) are ruled out. If you know instead that [p or-inclusive q] obtains, you know that you might be in case (i), (ii), or (iii); only case (iv) is ruled out. Your degree of uncertainty is higher. So or-exclusive is more restrictive; or-inclusive is more general (more liberal, we said).

Things being so, suppose for a moment that or in English is unambiguously inclusive (i.e., its interpretation is the more general, less restrictive of the two); this does not rule out at all the possibility that we are in case (ii) or case (iii). The exclusive construal, in other words, might arise as a special case of pragmatic strengthening. It is as if we silently add to, say, (51a) something like (51b).

(51a) John or Mary will be hired.
(51b) (. . . but not both)

The silent addition of (51b) to (51a) might be justified through a reasoning of the following sort:

(52) The speaker said (51a); let us assume she is being cooperative and not hiding on purpose any relevant information. This entails that she has no evidence that both John and Mary have been hired, for otherwise she would have said so. Assuming, moreover, that she is well-informed about the facts, this furthermore entails that she thinks that in fact (51b) holds.
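The entailment just discussed can also be checked mechanically. The following is a minimal sketch in Python (the code and its function names are mine, added for illustration; nothing like it appears in the article): it verifies that [p or-exclusive q] entails [p or-inclusive q] but not conversely.

```python
from itertools import product

def or_inclusive(p, q):
    # truth table (49c): true unless both disjuncts are false
    return p or q

def or_exclusive(p, q):
    # truth table (50): true iff exactly one disjunct is true
    return p != q

# the four cases (i)-(iv): both true, first only, second only, both false
cases = list(product([True, False], repeat=2))

# every case verifying the exclusive or also verifies the inclusive one ...
assert all(or_inclusive(p, q) for p, q in cases if or_exclusive(p, q))

# ... but not vice versa: in case (i), where both disjuncts are true,
# only the inclusive construal holds
assert or_inclusive(True, True) and not or_exclusive(True, True)
```

Counting cases makes the difference in strength concrete: the exclusive reading leaves two of the four cases open, the inclusive reading three.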
Encyclopedia of Language & Linguistics (Second Edition), 2006, Pages 564-579

Formal Semantics 577

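The strengthening reasoning in (52) can likewise be given a toy mechanical rendering. In the Python sketch below (the modeling choices are mine, not the article's), the four truth-value cases play the role of candidate states of affairs, and the cooperative-speaker step strikes out those in which the stronger alternative John and Mary will be hired holds:

```python
from itertools import product

# Candidate cases: truth-value assignments to 'John will be hired' (p)
# and 'Mary will be hired' (q).
worlds = list(product([True, False], repeat=2))

assertion = lambda p, q: p or q    # (51a), construed inclusively
alternative = lambda p, q: p and q # the stronger claim she did not make

# Cases compatible with the bare, inclusive assertion (51a):
literal = {w for w in worlds if assertion(*w)}

# Gricean step (52): a cooperative, well-informed speaker who believed
# the stronger alternative would have asserted it; since she did not,
# strike the cases where the alternative holds.
strengthened = {w for w in worlds if assertion(*w) and not alternative(*w)}

# The strengthened meaning covers exactly cases (ii) and (iii),
# i.e., the exclusive table (50).
assert strengthened == {(True, False), (False, True)}
```

The outcome coincides with the exclusive truth table, which is the Gricean point: exclusivity need not be written into the lexical meaning of or.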
So in this view, the base interpretation (viz. (51a)) is enriched through an inferential process that draws on principles of rational conversational exchanges and on factual knowledge about the context. The relation between (51a) and (51b) can thus be analyzed as a case of implicature (cf. on this, e.g., Horn (1989), Levinson (2000), and references therein).

The debate on how the two interpretations of or come about is important and shows different ways in which semantics is taken to interact with broader considerations pertaining to communication. Whether the two interpretations of or are a matter of ambiguity or arise as an implicature, I want to point out a generalization concerning their distribution, which I think shows something important concerning how language works. I will argue that the cases in which or is construed preferentially inclusively are (1) predictable, and (2) determined by structure. Then, I will put forth a hypothesis as to why this is so.

We have seen that a sentence like (16a), repeated here as (53a), is interpreted as in (53b), namely exclusively:

(53a) If I got it right, either John or Mary will be hired.
(53b) If I got it right, either John or Mary but not both will be hired.

Now take the consequent (i.e., the main clause) in the conditional in (53a) and move it to the antecedent, and the interpretation tends to shift:

(54a) If either John or Mary are hired, we'll celebrate.
(54b) If John or Mary or both are hired, we'll celebrate.

So, moving a disjunction from the consequent to the antecedent seems to have a systematic effect on the interpretation of or. The same holds for the pair in (55a) and (55b):

(55a) Every student will either take an exam or write a paper.
(55b) Every student who either takes an exam or writes a paper will satisfy the requirements.

In (55a), or is within the VP, which corresponds to the second argument of every, according to the analysis sketched in the section Semantic Modeling. Its preferred interpretation is exclusive. In (55b), or is in a relative clause which is part of the subject NP (namely, the first argument of every according to the analysis in Semantic Modeling). Its preferred interpretation is clearly inclusive.

A further class of contexts that displays a similar effect are negation and negative verbs. Compare (56a) and (56b):

(56a) I believe that either John or Mary will be hired.
(56b) I really doubt that either John or Mary will be hired.

Sentence (56a) is likely to get the interpretation 'I believe that either John or Mary but not both will be hired.' Sentence (56b), on the other hand, does not have a parallel reading. It rather means 'I really disbelieve that John and Mary stand a chance.'

The list could go on. But these examples should suffice to instill in the reader the idea that there is a systematic effect of structure on the interpretation of or. A doubt might linger, though, as to whether it is really in the nature of structure to have this impact. Take, for example, the pair in (55a) and (55b). Is it the position of the disjunction that makes a difference? Or is it rather our knowledge of how classes normally work?

This is a legitimate question. Noveck et al. (2002) address it experimentally. They designed a reasoning task in which logically naïve subjects are asked to judge whether a certain inference is sound or not. For example, subjects were asked to judge whether one can infer (57c) from (57a) and (57b):

(57a) If there is an A, then there is a B or a C.
(57b) There is an A.
therefore:
(57c) There aren't both a B and a C.

Subjects were told that this was about inferences that could be drawn (on the basis of the given premises) concerning letters written on the back of a certain blackboard. What would your answer be? The experimental subjects overwhelmingly accepted the inference from (57a) and (57b) to (57c). What is interesting is that in terms of classical Boolean logic (which takes or to be inclusive) this inference is invalid. It is only valid if or in (57a) is interpreted exclusively. At the same time, subjects rejected inferences of the following form:

(58a) If there is an A, then there is a B and a C.
(58b) There is an A.
therefore:
(58c) There is a B or a C.

Again, this seems to make sense only if or in (58c) is interpreted exclusively. Things change dramatically if or is embedded in the antecedent of a conditional:

(59a) If there is an A or a B, then there is a C.
(59b) There is an A; there is also a B.
therefore:
(59c) There is a C.

Subjects overwhelmingly accepted this inference as valid. But this is only possible if or in (59a) is
construed inclusively. Our raw intuition thus finds experimental confirmation, one that passes all due controls (the inferences were mixed with others containing other connectives and quantifiers, so that subjects were not conditioned to devise an answering strategy, the order of presentation was duly varied, etc.). What is interesting is that these experiments only involved meaningless letters A, B, C . . ., so scripts, contextual clues, and knowledge of the world can hardly be imputed any role in the outcome. If there is a systematic effect on the interpretation of or, this must be due to the meaning of conditionals, of disjunction, and to the positioning of the latter. Nothing else is at play.

The reader may wonder how one manages to find out which structures affect the interpretation of or. The answer is that such structures were familiar from another phenomenon: the licensing of Negative Polarity Items (NPIs). NPIs are lexical items like any or ever that seem to require the presence of a negative element:

(60a) * There is any cake left
(60b) There isn't any cake left.

NPIs are acceptable in the contexts that favor the inclusive interpretation of or over the exclusive one:

(61a) * If we are in luck, there are any cookies left
(61b) If there are any cookies left, we are in luck.

(62a) * Everyone had any cookies left
(62b) Everyone who had any cookies left shared them.

This correlation is striking, for the two phenomena (the distribution of any and of inclusive vs. exclusive or) seem to have little in common.

The next question is whether the relevant contexts have some common property. The answer seems to be positive and, surprisingly, points in the direction of a rather abstract, entailment-based property. Positive contexts typically license inferences that go from sets to supersets. For example, (63a) entails (63b) and not vice versa.

(63a) There are Marlboros.
(63b) There are cigarettes.

The set of cigarettes is a superset of the set of Marlboros; so the entailment goes from a set to its supersets. Negation reverses this pattern: (64b) entails (64a) and not vice versa:

(64a) There aren't any Marlboros.
(64b) There aren't any cigarettes.

Now the VP portion of a sentence with every (i.e., its second argument) patterns with (63a) and (63b):

(65a) Everyone had Marlboros.
(65b) Everyone had cigarettes.

Sentence (65a) entails sentence (65b) and not vice versa. So does the consequent of a conditional:

(66a) If you open the drawer, you'll find Marlboros.
(66b) If you open the drawer, you'll find cigarettes.

But the NP argument of every (its first argument) inverts this pattern just like negation, as we saw in the Semantic Modeling section:

(67a) Everyone who had Marlboros shared them.
(67b) Everyone who had cigarettes shared them.

Here it is (67b) that entails (67a) and not vice versa. The same applies to the antecedent of conditionals:

(68a) If you smoke Marlboros, you'll be fined.
(68b) If you smoke cigarettes, you'll be fined.

Sentence (68b) entails (68a); on the other hand, (68a) could be true without (68b) necessarily being true (in a town in which Marlboros but no other brand is banned).

In conclusion, the contexts that favor the inclusive interpretation of or share a semantic property that has to do with entailment patterns: they all license entailments from sets to their subsets. Such a property has come to be seen as the property of being downward entailing (where down refers to the directionality of the entailment, from sets to smaller ones). If this characterization is correct, this means that speakers, to the extent that they interpret or as shown, must differentiate such contexts, and hence must be able to compute the entailments associated with the relevant structure.

The next question is why or tends to be interpreted inclusively in downward entailing structures. I will only hint at what strikes me as a highly plausible answer. As we saw above, in plain unembedded contexts, exclusive or is stronger than (i.e., asymmetrically entails) inclusive or. The set of cases in which exclusive or is true is a subset of the set of cases in which the inclusive one is true. We evidently prefer, everything else being equal, to go for the strongest of two available interpretations. Now, negation and, in fact, all downward entailing structures, as we just saw, reverse this pattern. Under negation, first becomes last; i.e., strongest becomes weakest. In the case of disjunction, the negation of inclusive or is stronger than (i.e., entails) the negation of exclusive or. I'll leave it to the readers to persuade themselves that this is so. Now why is this observation relevant? Suppose we go for the strongest of two alternatives (i.e., we maximize informativeness, everything else being equal); for disjunction, in downward-entailing
contexts inclusive or is the strongest interpretation; in nondownward-entailing contexts exclusive or is the strongest. This explains the observed behavior in terms of a rather simple principle that optimizes information content on the basis of the available expressive resources.

So pragmatic strengthening (via a generalized implicature) correlates harmoniously with the entailment properties of various elements.

Conclusions

We have sketched a view of semantic competence as the implicit knowledge a speaker has of how the information content of various expressions is related. We have proposed to classify the hosts of semantic relations in three major families: entailment-based, presupposition-based, and implicature-based. Given two sentences, speakers can judge whether they entail each other or not, whether they presuppose each other or not, and so on; and they can do so with finite cognitive resources. We have sketched a denotational semantics that accounts for such a competence (i.e., provides a model for it). Our semantics takes the form of a calculus in which entailments, presuppositions, and even (certain) implicatures re-emerge as theorems. Such a model is formal in the sense of being explicit (building on the tradition of logic and model theory). It is, however, also substantive, in that it models a human cognitive capacity (i.e., the ability to semantically relate sentences to each other). We have seen two simple applications of this approach, to the analysis of determiners and connectives. We have also discussed a case of pragmatic enrichment. What we found is that the interpretation of or as exclusive or inclusive follows a pattern sensitive to downward entailingness (much like what happens with negative polarity items). If this is so, then entailment patterns are not simply an invention of logicians or linguists. They must be constitutive, in an unconscious form, of the spontaneous knowledge that endows speakers with their linguistic abilities.

See also: Boole and Algebraic Semantics; Compositionality: Semantic Aspects; Implicature; Monotonicity and Generalized Quantifiers; Presupposition; Propositional and Predicate Logic: Linguistic Aspects; Quantifiers: Semantics.

Bibliography

Chierchia G (1998). 'Reference to kinds across languages.' Natural Language Semantics 6, 339-405.
Chierchia G & McConnell-Ginet S (2000). Meaning and grammar (2nd edn.). Cambridge, MA: MIT Press.
Crain S & Thornton R (1998). Investigations in Universal Grammar. Cambridge, MA: MIT Press.
Dayal V (2004). 'Number marking and (in)definiteness in kind-terms.' Linguistics and Philosophy 27, 393-450.
Grice H P (1989). Studies in the way of words. Cambridge, MA: Harvard University Press.
Heim I & Kratzer A (1998). Semantics in generative grammar. Oxford: Blackwell.
Horn L (1989). A natural history of negation. Chicago: University of Chicago Press.
Jennings R E (1994). The genealogy of disjunction. Oxford: Oxford University Press.
Kratzer A (1999). 'Beyond ouch and oops: how expressive and descriptive meaning interact.' Paper presented at the Cornell Conference on Context Dependency, unpublished manuscript, Amherst, MA.
Levinson S (2000). Presumptive meanings. Cambridge, MA: MIT Press.
Longobardi G (2001). 'How comparative is semantics? A unified parametric theory of bare nouns and proper names.' Natural Language Semantics 9, 335-369.
Mates B (1950). 'Synonymity.' In Meaning and interpretation. University of California Publications in Philosophy, 25.
Noveck I, Chierchia G, Chevaux F, Guelminger R & Sylvestre E (2002). 'Linguistic-pragmatic factors in interpreting disjunctions.' Thinking and Reasoning 8, 297-326.
