
1 Formal semantics
Barbara H. Partee

1.1 Formal semantics: what is it?
1.2 The history of formal semantics
1.3 Central principles, issues, and points of divergence
1.4 From Montague Grammar to contemporary formal semantics
1.5 The increasing role of context: formal semantics and pragmatics
1.6 Formal semantics and psychology: the Frege–Chomsky conflict
1.7 Other issues

1.1 Formal semantics: what is it?

Formal semantics is an approach to semantics, the study of meaning, with
roots in logic, the philosophy of language, and linguistics. The word formal
in “formal semantics” is opposed to informal and reflects the influence of
logic and mathematics in the rise of scientific approaches to philosophy and
to linguistics in the twentieth century. Distinctive characteristics of this
approach (see also Pagin, Chapter 3) have been truth conditions as a central
part of meaning; (usually) a model-theoretic conception of semantics; and
the methodological centrality of the Principle of Compositionality: “The
meaning of a whole is a function of the meanings of its parts and their
mode of syntactic combination.” Most formal semantics is model-theoretic,
relating linguistic expressions to model-theoretically constructed semantic
values cast in terms of truth, reference, and possible worlds or situations
(hence, formal semantics is not “formal” in the sense of Hilbert, 1922).
And most formal semanticists treat meaning as mind-independent (and
abstract), not as concepts “in the head”; formal semanticists distinguish
semantics from knowledge of semantics (Lewis, 1975b; Cresswell, 1978).

Downloaded from https://www.cambridge.org/core. Caltech Library, on 03 May 2018 at 12:50:52, subject to the Cambridge Core terms of use, available at
https://www.cambridge.org/core/terms. https://doi.org/10.1017/CBO9781139236157.002
This sets formal semantics apart from approaches which view semantics
as relating a sentence principally to a representation on another linguistic
“level” (logical form) (May, 1985) or a representation in an innate “language
of thought” (Fodor, 1975) or “conceptual representation” (Jackendoff,
1992). The formal semanticist could accept such representations as an
aspect of semantics but would insist on asking what the model-theoretic
semantic interpretation of the given representation-language is (Lewis,
1970). Kamp’s Discourse Representation Theory is an exception, since as
noted in Section 1.3.3 below, it includes as essential an intermediate level
of representation with claimed psychological reality. Formal semantics is
centrally concerned with compositionality at the syntax–semantics inter-
face (see Sailer, Chapter 21), how the meanings of larger constituents are
built up from the meanings of their parts on the basis of their syntactic
structure.
The most important figure in the history of formal semantics was
undoubtedly Richard Montague (1930–1971), whose seminal works in this
area date from the late 1960s and the beginning of the 1970s. Other impor-
tant contributors will also be discussed below. Since the 1980s formal
semantics has been a core area of linguistic theory; important contributions
also continue to come from philosophy, logic, cognitive science, and compu-
tational linguistics.
In the last thirty years formal semanticists have become increasingly
concerned with issues at the interface between semantics and pragmatics,
including context-dependence, information structure, and the semantics/
pragmatics of dialogue (see Asher, Chapter 4; Ginzburg, Chapter 5;
Schlenker, Chapter 22; Vallduví, Chapter 23). These broadening concerns
have led to a range of newer approaches that treat meaning as something
more than truth conditions, but still including truth conditions, possibly
derivatively, as a central part of what semantics is to capture.
In this chapter we briefly trace the history of formal semantics (Sec-
tion 1.2) and discuss some of its central principles, debated issues, and
divergences within the field (Section 1.3). Since issues concerning the
syntax–semantics interface are so crucial to the central working hypothe-
sis of compositionality, we include some brief case studies relating to the
syntax–semantics interface in Section 1.4; fuller treatments of related issues
will be found in Brasoveanu and Farkas, Chapter 8, and Sailer, Chapter 21.
In Section 1.5 we describe the increasing attention to the role of context
and to language use and the consequent blending of formal semantics and
formal pragmatics (see also Schlenker, Chapter 22). In Section 1.6 we come
back to the foundational question of whether meanings are in the head and
how formal semantics, which has traditionally rested on the assumption
that they are not, connects to cognitive science and studies of language
processing and language acquisition. In the final Section 1.7, we mention
some of the relatively recent contributions of formal semanticists to issues
of language universals and language typology.


1.2 The history of formal semantics

1.2.1 Semantics in linguistics before 1970


Linguistics had partly different beginnings and different emphases in
Europe and in America, growing in considerable part out of philological-
historical work in Europe and out of that plus anthropological studies in
America. Semantics in early linguistics was mainly lexical; lexical semantics
and principles of semantic change and semantic drift were important for
historical and comparative linguistics. Structuralism arose first in Europe,
and de Saussure was influential for structuralism, for putting synchronic
grammar into the foreground, and for conceiving of grammar as connecting
form and meaning. Bühler’s Sprachtheorie (1934) included an early treatment
of indexicality and perspective shift. Jespersen made lasting contributions
to semantics as well as syntax (1924); while in the Netherlands, Evert Beth
was laying foundations (1947, 1963) for the cooperation among logicians
and linguists that has made the Netherlands one of the major contributors
to the development of formal semantics.
Semantics was rather neglected in early and mid-twentieth-century
American linguistics, for several reasons. Early American anthropological
linguistics depended on fieldwork, where phonetics came first, then phonol-
ogy, morphology, perhaps a little syntax, and usually no semantics beyond
a dictionary. Another reason was the influence of logical positivism and
behaviorism: meaning was viewed as an unobservable aspect of language,
not fit for scientific study. And the Chomskyan revolution concentrated on
syntax: there was much talk of the creativity of language and of language
as a window on the mind, but it was all about syntax, investigating finite
recursive mechanisms that could characterize the infinite class of sentences
of a natural language, and trying to solve the mystery of first-language
acquisition.
In 1954, the philosopher Yehoshua Bar-Hillel wrote an article in Language
urging cooperation between linguists and logicians and arguing that
advances in both fields made the time ripe for combining forces to work
on syntax and semantics together. However, Chomsky immediately replied
(1955), arguing that the artificial languages invented by logicians were
too unlike natural languages for any methods the logicians had developed
to have any chance of being useful for developing linguistic theory. Real
cooperation of the sort that Bar-Hillel urged began only after Montague’s
work.
The first efforts to add semantics to Chomsky’s generative grammar were
made by Katz and Fodor (1963), who were the first to use the term composi-
tionality (although in most of their work they speak rather of projection rules),
proposing rules that first interpret underlying phrase-markers, bottom-up,
and then interpret the result of applying transformations. A little later
Katz and Postal (1964) proposed that all semantically relevant information
should actually be contained in the underlying phrase-markers and that
transformations should preserve meaning. The Katz–Postal hypothesis was
adopted in Chomsky (1965) as part of what Chomsky later dubbed “the
standard theory,” with Deep Structure as the input to semantics.
But very soon the “discovery” of quantifiers spoiled the illusion of
meaning-preservingness for many transformations; I illustrate with a few
key examples without giving the rules or the structures. In each case the
(a) example illustrates the apparently meaning-preserving transformation
when the NPs are proper names, but in the (b) example, application of the
same rule does not preserve meaning. We will come back to some of these
examples in discussing the syntax–semantics interface in formal semantics.

(1) a. John wants John to win ⇒ John wants to win
    b. Everyone wants everyone to win ⇒ [!?] Everyone wants to win

(2) a. John voted for John ⇒ John voted for himself
    b. Every candidate voted for every candidate ⇒ [!?] Every candidate voted for himself

(3) a. Three is even or three is odd ⇒ Three is even or odd
    b. Every number is even or every number is odd ⇒ [!?] Every number is even or odd
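The failure of meaning preservation in (3) can be checked mechanically. The following sketch (my own illustration, not from the chapter) model-checks both readings of (3b) over a toy three-element domain of numbers:

```python
# The transformation maps a disjunction of two universally quantified
# sentences onto a single universal over a disjoined predicate; over a
# mixed domain the two come apart.

domain = [1, 2, 3]

def even(n):
    return n % 2 == 0

def odd(n):
    return n % 2 == 1

# (3b) source: "Every number is even or every number is odd"
source = all(even(n) for n in domain) or all(odd(n) for n in domain)

# (3b) output of the transformation: "Every number is even or odd"
derived = all(even(n) or odd(n) for n in domain)

print(source, derived)  # False True: the rule did not preserve meaning
```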

The discovery that quantifiers, negation, and conjunctions were not prop-
erly interpretable at the level of Deep Structure as conceived in 1965 was one
important factor in the genesis of the “Linguistic Wars”. Generative Seman-
ticists (Lakoff, McCawley, Ross, and others, cf., Huck and Goldsmith, 1995)
responded by making Deep Structure “deeper”, closer to “logical form”,
which for linguists then meant more closely resembling first-order predi-
cate logic. Interpretive Semanticists (Jackendoff, Chomsky) kept the syntax
closer to the “standard theory” and added additional interpretive mecha-
nisms at various syntactic levels.
During all this time, the notion of semantic interpretation was in a rather
primitive state. Katz, Fodor, and Postal worked with “semantic markers”
modeled on phonological distinctive features, treating sentence meanings
as bundles of semantic markers. The Generative Semanticists added first-
order logic and uninterpreted but supposedly universal predicates and oper-
ators such as “CAUSE” and “BECOME”. The reaction of philosophers of lan-
guage was most notably formulated by David Lewis (1970).

But we can know the Markerese translation of an English sentence without
knowing the first thing about the meaning of the English sentence:
namely, the conditions under which it would be true. Semantics with no
treatment of truth conditions is not semantics. (Lewis, 1970, p. 1)

To linguists, concern with truth looked puzzling. Linguists were trying to
figure out mental representations that could underlie linguistic competence.
"Actual truth" was (correctly) considered irrelevant, and truth conditions
were not understood or appreciated.

1.2.2 Semantics in logic and philosophy before 1970


In the meantime great progress was made in semantics in logic and phi-
losophy of language. The greatest foundational figure for formal seman-
tics is Gottlob Frege (1848–1925). He made crucial advances to the analysis
of variables and quantifiers, and introduced the distinction between Sinn
and Bedeutung, sense and reference, the precursor of the modern intension–
extension distinction. Frege is also credited with the Principle of Compo-
sitionality,1 a cornerstone of formal semantics, and a principle quite uni-
versally followed in the design of the formal languages of logic, with a few
interesting exceptions.2

The Principle of Compositionality: the meaning of a complex expression is a
function of the meanings of its parts and of the way they are syntactically
combined.

Further advances were made by Russell, Carnap, and others. Wittgenstein
(1922) first articulated the idea that "To know the meaning of a sentence is
to know what is the case if it is true" (Tractatus, 4.024). Tarski (1901–1983)
formalized model theory inside set theory and provided the first formal model-
theoretic semantics for logical languages (Tarski, 1944; Feferman and Fefer-
man, 2004); his goals concerned metalogic and the avoidance of semantic
paradoxes, not semantics itself (Etchemendy, 1988).
Around that time, there was a major dispute within philosophy, the
“Ordinary Language” vs. “Formal Language” war, concerning the role of
natural language in philosophical argumentation, including the question
of whether the analysis of natural language could be a source of philo-
sophical insights, or whether natural language was too unruly and needed
to be replaced by a suitably constructed formal language for purposes
of exact argumentation. The Ordinary Language philosophers, including
late Wittgenstein, Austin, Ryle, and Strawson, were a new generation who
rejected the formal approach and urged closer attention to the functions
of ordinary language and its uses, including much more attention to lan-
guage in context, i.e., to pragmatics. In his critique of Russell in “On Refer-
ring”, Strawson (1950) said, “Neither Aristotelian nor Russellian rules give
the exact logic of any expression of ordinary language; for ordinary lan-
guage has no exact logic.” Russell rejected Strawson’s critique, but added “I
agree, however, with Mr. Strawson’s statement that ordinary language has
no logic" (1957). It is noteworthy that both sides in this "war" (as well as
Chomsky) were in agreement that logical methods of formal language analysis
did not apply to natural languages.

1 Not without some controversy; see Janssen (1983). And see Hodges (2001) for a discussion of the relation between compositionality and contextuality in Frege, and Pelletier (2001) for a third evaluation.
2 It has been observed in a number of works (I learned it from Tarski, p.c., 1971) that the usual semantics for the quantifiers of first-order logic in terms of satisfaction and assignments is not strictly compositional.

The response of some formally oriented philosophers was to try to analyze
ordinary language better, including its context-dependent features. Within
philosophical logic, the foundational work of Frege, Carnap, and Tarski
had led to a flowering of work on modal logic and on tense logic, on con-
ditionals, on referential opacity, and on other philosophically interesting
natural language phenomena. Quine had rejected modal and intensional
notions as incurably unclear, but Kripke (1959), Kanger (1957a,b), and Hin-
tikka (1962) revolutionized the field by providing a model-theoretic possible-
worlds semantics for modal logic. And first Reichenbach (1947) and then
Prior (1967) made great progress on the development of the logic of tenses,
a notorious source of context-dependence in natural languages; Thomason
(1996) identifies Prior as an important contributor to “natural language
semantics logicism”.
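The possible-worlds semantics for modal logic just mentioned can be sketched in a few lines; the model below is my own toy example, not one from the literature. The key clause is Kripke's: "necessarily p" is true at a world w iff p is true at every world accessible from w.

```python
# A toy Kripke model: worlds, an accessibility relation, and a
# valuation saying at which worlds the atomic sentence p holds.
access = {
    "w1": {"w1", "w2"},
    "w2": {"w2"},
    "w3": {"w1", "w3"},
}
valuation = {"p": {"w1", "w2"}}

def true_at(formula, w):
    op = formula[0]
    if op == "atom":
        return w in valuation[formula[1]]
    if op == "not":
        return not true_at(formula[1], w)
    if op == "box":  # necessity: truth at all worlds accessible from w
        return all(true_at(formula[1], v) for v in access[w])
    raise ValueError(op)

p = ("atom", "p")
print(true_at(("box", p), "w1"))  # True: w1 accesses only p-worlds
print(true_at(("box", p), "w3"))  # False: w3 accesses w3, where p fails
```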
Paul Grice (1913–1988) contributed in a quite different way to the eventual
resolution of the war. His work on conversational implicatures (Grice, 1975)
showed that explanatory pragmatic principles can allow semantics to be
simpler, so that the apparent gap between the “logicians’ meaning,” even in
terms of standard, extensional first-order logic, of words like the and if and
or and their ordinary-language meaning might be much less than had been
supposed. And although not all of his proposals for the semantics of such
words have survived, his methodological lesson and his pragmatic principles
became highly influential for formal and informal semantics alike. Once his
lessons sunk in, it became obligatory in studying semantics to also think
about pragmatics, although whether and how pragmatics “belongs in the
grammar” has remained controversial for decades (Horn, 2006).

1.2.3 Montague’s work


Montague had been an important contributor to these developments in
philosophical logic. Montague was a student of Tarski’s, and at UCLA was a
teacher and then a colleague of David Kaplan, co-authored a logic textbook
with his colleague Donald Kalish, and was an active part of a strong logic
group. As a logician, Montague built on the Frege–Tarski–Carnap tradition
of model-theoretic semantics of logic and developed an intensional logic
with a rich type theory and a possible-worlds model-theoretic semantics,
incorporating certain aspects of “formal pragmatics”, including the treat-
ment of “indexical” words and morphemes like I, you and the present tense
(Montague, 1968, 1970b). This was accomplished in part by treating both
worlds and times as components of “indices”, then generalizing from times
to “points of reference”, that is, complexes of relevant features of contexts,
and treating intensions as functions from indices (not just possible worlds)
to extensions. He generalized intensional notions such as property, proposi-
tion, individual concept, into a fully typed intensional logic, extending the
work of Church (1951), Carnap (1956), and Kaplan (1964), putting together
the function-argument structure common to type theories since Russell
with the treatment of intensions as functions from indices to extensions.
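The idea of intensions as functions from indices to extensions can be given a minimal sketch (the model and names below are my own illustration, not Montague's notation): an intension maps each index to an extension, and an indexical like I motivates enriching world-time pairs into fuller "points of reference" that include a speaker coordinate.

```python
# Intensions as functions from indices to extensions (toy model).

def president(index):
    """Intension of 'the president': the office-holder varies by world."""
    world, time = index
    return {"w0": "A", "w1": "B"}[world]

def speaker_of(index):
    """Intension of the indexical 'I': it needs a richer index with a
    speaker coordinate -- one motivation for generalizing from
    world-time pairs to 'points of reference'."""
    world, time, speaker = index
    return speaker

print(president(("w0", "t0")))         # extension at <w0, t0>: 'A'
print(speaker_of(("w0", "t0", "bp")))  # extension fixed by the context: 'bp'
```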
In the late 1960s, Montague turned to the project of “universal grammar”,
which for him meant a theory of syntax and semantics encompassing both
formal and natural languages, a groundbreaking project that became Mon-
tague Grammar,3 and to which his last three papers were devoted, with plans
for a book-length treatment interrupted by his untimely death in 1971.
That project evidently grew out of his work on the development of a
higher-order typed intensional language suitable for doing philosophy. His
paper “On the Nature of Certain Philosophical Entities” (NCPE) (Montague,
1969) contains a great deal that can be considered as much a matter of
semantics as of philosophy, and foreshadows some of his work in his three
final “language” papers. An important passage in that paper with respect to
Montague’s program occurs on pages 154–156, explaining his change from
believing that philosophy should be done in the framework of set theory
to believing that it should be done in the framework of intensional logic,
and announcing his claim that he has constructed an adequate intensional
logic.

One system of intensional logic now exists which fully meets the objec-
tions of Quine and others, which possesses a simple structure as well as
a close conformity to ordinary language, and concerning the adequacy of
which I believe no serious doubts can be entertained. (Montague, 1969,
p. 156)

This big "framework" change in Montague's approach to logic and philosophy
is described and discussed in Cocchiarella (1981).
His attitude toward his work on natural language was ambivalent. On the
one hand, he considered it worthwhile to demonstrate that “The syntax and
semantics of certain not insignificant fragments of English can be treated
just as formally and precisely as those of the first-order predicate calculus,
and in very much the same manner” (Montague, in Staal, 1969, p. 274). How-
ever, at the same time, he asserted that “it would appear more important to
extend the scope of constructed systems than to discover the exact rules
of natural languages” (Staal, 1969, p. 275). As Thomason (1996) notes, Mon-
tague’s quest for a “formal philosophy” grounded on his intensional logic
remains unfulfilled and possibly quixotic, and his legacy is ironically rather
in the “rather easy and not very important”4 project of the analysis of ordi-
nary language.

3 The term entered the Oxford English Dictionary in 2002, the first citation being to Rodman (1972), a collection of papers by participants in a seminar taught at UCLA by the author.
4 This unpublished quotation from Montague's notes, as well as evidence that Montague might have later revised his "rather easy" assessment, is discussed in Partee (2011, 2013).


Montague’s work on the formal treatment of natural languages is all in
his last three papers, “English as Formal Language” (EFL) (1970a), “Universal
Grammar” (UG) (1970b), and “The Proper Treatment of Quantification in
Ordinary English” (PTQ) (1973b). The one that had the most impact on lin-
guists and on the subsequent development of formal semantics was PTQ:
short, but densely packed (see Partee, 1975, 1997b; Janssen, 1994). Montague
Grammar has often meant what Montague did in the fragment in PTQ and
the extensions of PTQ by linguists and philosophers in the 1970s and 1980s.
But it is the broader algebraic framework of UG that constitutes Montague’s
theory of grammar.
Before Montague, linguists took as the task of semantics the explication
of ambiguity, semantic anomaly, and synonymy: the key questions were
how many readings a sentence has, and which sentences share readings.
The individuation of “readings” had always been problematic, though, and
there was often crucial disagreement about data. Intuitions about “read-
ings” undoubtedly rested in part on judgments concerning truth condi-
tions, but truth conditions were never explicitly discussed. The methods
used in early linguistic semantics primarily involved lexical decomposition
in terms of semantic features, plus hypothesized abstract tree structures dis-
playing scope relations and other aspects of semantic structure. The introduc-
tion of truth conditions as the basic semantic property of a sentence that
a semantic theory should capture profoundly affected the adequacy criteria
for semantics and led to a great expansion of semantic research.
Montague’s (1970b) paper, UG, contains the most general statement of
Montague’s formal framework for the description of language. The central idea
is that anything that should count as a grammar should be able to be cast
in the following form: the syntax is an algebra, the semantics is an algebra,
and there is a homomorphism mapping elements of the syntactic algebra
onto elements of the semantic algebra. In the PTQ grammar for a fragment
of English, the syntax is not explicitly presented as an algebra, but if it were
transformed into one, the elements would be the analysis trees.
The choice for the semantic elements is totally free, as long as they make
up an algebra. The semantic elements, or semantic values as they are often
called, could be taken to be the model-theoretic constructs of possible-
worlds semantics as in Montague’s fragments of English and most “classical”
formal semantics, or the file change potentials of Heim (1982), or
the game strategies of game-theoretical semantics, or the simple exten-
sional domains of first-order logic, or hypothesized psychological concepts,
or expressions in a “language of thought”, or anything else; what is con-
strained is not the “substance” of the semantics but some properties of its
structure and of its relation to syntactic structure. It is the homomorphism
requirement, which is in effect the compositionality requirement, that pro-
vides the most important constraint on UG in Montague’s sense, and it is
therefore appropriate that compositionality is frequently at the heart of con-
troversies concerning formal semantics.
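The homomorphism requirement can be illustrated with a deliberately tiny sketch; the operations F1/G1 and the lexicon below are my own invention, echoing PTQ's analysis trees rather than reproducing them. Each syntactic operation is paired with exactly one semantic operation, and interpretation commutes with building:

```python
# Syntactic algebra: elements are analysis trees, built here by a
# single operation F1 (subject-predicate combination).
def F1(subj, pred):
    return ("F1", subj, pred)

# Semantic algebra: individuals, sets of individuals, truth values;
# G1 is the semantic operation matched with F1 (predication).
def G1(subj_val, pred_val):
    return subj_val in pred_val

lexical_meaning = {"John": "j", "runs": {"j", "m"}, "smokes": {"m"}}

def h(tree):
    """The homomorphism: h(F1(a, b)) = G1(h(a), h(b))."""
    if isinstance(tree, str):
        return lexical_meaning[tree]
    op, a, b = tree
    assert op == "F1"
    return G1(h(a), h(b))

t = F1("John", "runs")
print(h(t))                              # True
print(h(t) == G1(h("John"), h("runs")))  # True: the homomorphism property
print(h(F1("John", "smokes")))           # False
```

Note that nothing here fixes what the semantic values are; only the algebraic structure and its relation to the syntax are constrained, which is the point made in the text above.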


Stokhof (2006) summarizes two important characteristics of Montague’s
theory as defined in UG and illustrated in PTQ:

a. Semantics is syntax-driven, syntax is semantically motivated. (Compositionality)
b. Semantics is model-theoretic.

A methodological principle implicit in Chomskyan syntax in the 1960s,
encouraged although not required by the Katz–Postal hypothesis that meaning
is determined at Deep Structure, and carried to extremes in Generative
Semantics, was the principle that sameness of meaning should be reflected
in sameness of syntactic Deep Structure. However, from the perspective of
Montague Grammar, sameness of meaning does not require identity at any
syntactic level (see Thomason, 1976): semantics is model-theoretic, not rep-
resentational, not “syntactic”.
In formal semantics, the core of sameness of meaning is sameness of truth
conditions; and it does not require sameness on any syntactic level for two
sentences to end up having the same truth conditions. Thus, formal seman-
tics removed much of the motivation for that aspect of Generative Seman-
tics. Its semantic goals could apparently be met, even exceeded, in a formal
semantic approach, offering greater explicitness and compatible with more
“conservative”, less abstract, syntax. Thus, the rise of Montague Grammar
was one factor in the decline of Generative Semantics and the fading away
of the linguistic wars.
Details of Montague’s analyses have in many cases been superseded, but
in overall impact, PTQ was as profound for semantics as Chomsky’s Syntactic
Structures was for syntax. Emmon Bach (1989, p. 8) summed up their cumu-
lative innovations thus: Chomsky’s Thesis was that English can be described
as a formal system; Montague’s Thesis was that English can be described as
an interpreted formal system.

1.2.4 Other contemporaneous work


While Montague clearly occupies a preeminent position in the history of
formal semantics, he did not work in a vacuum. Works that influenced his
thinking, as evidenced in his papers and in his seminars from 1967 and
later, include, among others, Quine (1960b), Geach (1962, 1967) for puz-
zles of intensionality; Frege (1892b), Davidson (1965, 1967a, 1970), Kripke
(1963), and various works of and/or conversations with Alfred Tarski, David
Lewis, David Kaplan, Dana Scott, Rudolf Carnap, Alonzo Church, Yehoshua
Bar-Hillel, J. F. Staal, Terence Parsons, and Barbara Partee, and several of his
students including Hans Kamp, Dan Gallin, and Michael Bennett.
Independently, on the other side of the country, Donald Davidson and Gil
Harman were both at Princeton from 1967 to 1969, interacting intensely,
optimistic about the potential fruitfulness of linguistics–philosophy interac-
tions and about the prospects of Generative Semantics, with its underlying
structures somewhat resembling the structures of first-order logic. Together
they produced some exciting conferences and the influential edited collection
(Davidson and Harman, 1972). Davidson was also influential in urging
a truth-conditional semantics for natural language, arguing from learnabil-
ity that a finitely specifiable compositional semantics for natural languages
must be possible (Davidson, 1967b). Davidson differed strongly from Mon-
tague in wanting to stay with first-order logic and to eschew such model-
theoretic constructs as possible worlds and intensions. One of Davidson’s
contributions to semantics was the idea that the interpretation of most ordi-
nary sentences includes an existentially quantified event argument, mak-
ing sentences partly analogous to indefinite NPs (Davidson, 1967a); after the
work of Terence Parsons (1990), the event argument became widely adopted.
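Davidson's event argument can be sketched with a toy model (the event records and helper below are my own illustration): "Jones buttered the toast slowly" comes out as existential quantification over events, roughly ∃e[butter(e, jones, toast) ∧ slow(e)], so dropping the adverb just drops a conjunct and the entailment to "Jones buttered the toast" falls out.

```python
# Toy Davidsonian model: sentences existentially quantify over events.
events = [
    {"pred": "butter", "agent": "jones", "theme": "toast", "manner": "slow"},
    {"pred": "run", "agent": "mary", "theme": None, "manner": None},
]

def exists_event(**conditions):
    """exists e satisfying every stated condition; unstated conditions
    (like a dropped adverb) simply impose no constraint."""
    return any(all(e.get(k) == v for k, v in conditions.items())
               for e in events)

# "Jones buttered the toast slowly"
print(exists_event(pred="butter", agent="jones", theme="toast", manner="slow"))
# "Jones buttered the toast" -- entailed, since it checks one conjunct fewer
print(exists_event(pred="butter", agent="jones", theme="toast"))
```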
There were major early contributions to formal semantics in Europe start-
ing in the early 1970s. Renate Bartsch had come to UCLA to work with Mon-
tague just about at the time of his death; she and I had many fruitful dis-
cussions, but much more significant was her collaboration with Theo Venne-
mann, which began then at UCLA and continued in Germany (Bartsch, 1972;
Bartsch and Vennemann, 1972b). Arnim von Stechow was an early and influ-
ential contributor to the rise of formal semantics in Germany and Europe, as
well as to Generative Grammar and the integration of semantics and syntax
(Kratzer et al., 1974; von Stechow, 1974). A number of formal semanticists5
in other European countries point to von Stechow as the source of their
earliest acquaintance with Montague’s work. And influential contributions
came from C. L. Hamblin in Australia (Hamblin, 1973) and M. J. Cresswell in
New Zealand (Cresswell, 1973).
The earliest international conference on formal semantics (construed
broadly) of natural language was organized by Ed Keenan at Cambridge
University in 1973; eighteen of the twenty-five published contributions
(Keenan, 1975) were by European scholars, including Östen Dahl, Hans
Kamp, Peter Seuren, John Lyons, Renate Bartsch, Arnim von Stechow, Franz
von Kutschera, Carl Heidrich, and Theo Vennemann. The biennial Amster-
dam Colloquia, still a major forum for new results in formal semantics,
started up in the mid-1970s and became international in the late 1970s.
In the 1970s there were four textbooks on Montague grammar published
in Germany, the last and best being Link (1979); all four were reviewed in
Zimmermann (1981).
Other research that was not “Montague Grammar” in a narrow sense but
which also fed into the development of formal semantics included Terence
Parsons’s combinatorial-based formal semantics of English (1972), David
Lewis’s influential paper (1970), which included a compositionally inter-
preted transformational fragment with a categorial grammar base, the work
of the New Zealand philosopher and logician M. J. Cresswell (1973), who had

5 Personal communication from several semanticists in Scandinavia and elsewhere; more details will be included in (Partee, in preparation).

been a participant in UCLA’s famous “logic year” in 1967–1968, and the work
of the American linguist Edward Keenan (1971a,b).
And there were of course other important early contributors to the devel-
opment of formal semantics as well (for more on the early history of for-
mal semantics, see Cocchiarella, 1981; Thomason, 1996; Partee, 1997b, 2011;
Abbott, 1999; Cresswell, 2006; Stokhof, 2006; Janssen, 2011).

1.3 Central principles, issues, and points of divergence

1.3.1 The Principle of Compositionality and the theory-dependence of its key terms
At the heart of formal semantics are the principle that truth conditions
form a core aspect of meaning and the methodologically central Principle
of Compositionality.
For Montague, the compositionality requirement was the requirement of
a homomorphism between syntax and semantics. The theory spelled out in
Montague’s UG (Montague, 1970b) requires that a grammar should take the
following form: the syntax is an algebra, the semantics is an algebra, and
there is a homomorphism mapping elements of the syntactic algebra onto
elements of the semantic algebra. Montague’s very general and formally
very explicit definition leaves a great deal of freedom as to the nature of
these algebras. For a logical language, the elements of the syntactic algebra
can be the well-formed expressions. However, for a natural language, ambi-
guity makes that impossible, since the homomorphism requirement means
that each element of the syntactic algebra must be mapped onto a unique
element of the semantic algebra. So for a natural language, the elements
of the syntactic algebra are usually taken to be expressions together with
disambiguating structural descriptions, typically trees of some sort.
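Montague's homomorphism requirement can be illustrated with a minimal sketch (a Python toy, not Montague's own formalism; the two-word fragment, lexicon, and operation names are invented for illustration): each syntactic operation building a derivation tree corresponds to exactly one semantic operation, so interpretation is defined by recursion on derivations.

```python
# Toy illustration (not Montague's formalism): a syntactic algebra of
# derivation trees and a homomorphic semantic interpretation.
# Hypothetical mini-fragment: "John walks", "John doesn't walk".

DOMAIN_FACTS = {"walk": {"John"}}   # stipulated model: John walks

# Syntactic algebra: derivation trees built by named operations.
def combine_s(subj, vp):            # syntactic operation F1
    return ("S", subj, vp)

def negate(s):                      # syntactic operation F2
    return ("NEG", s)

# The homomorphism sends each syntactic operation to exactly one
# semantic operation, computed by recursion on the derivation tree.
def interpret(tree):
    if isinstance(tree, str):                      # lexical item
        if tree in DOMAIN_FACTS:                   # predicate -> a set
            return DOMAIN_FACTS[tree]
        return tree                                # name -> an individual
    op = tree[0]
    if op == "S":                                  # G1: set membership
        return interpret(tree[1]) in interpret(tree[2])
    if op == "NEG":                                # G2: truth-value flip
        return not interpret(tree[1])

s1 = combine_s("John", "walk")
print(interpret(s1))            # True
print(interpret(negate(s1)))    # False
```

Ambiguity is no problem on this encoding: two derivation trees that happen to yield the same string are distinct elements of the syntactic algebra, so each gets its own interpretation.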
Differences among approaches can often be traced to three crucial theory-
dependent terms in the Principle of Compositionality: “The meaning of a
whole is a function of the meanings of its parts and the way they are syntac-
tically combined.” We discuss meanings in Section 1.3.2, the means by which
meanings compose in Section 1.3.3, and parts, i.e., syntax, in Section 1.3.4,
with more on the syntax–semantics interface in Section 1.4. The concept of
function common to almost all semantic theories is the familiar set-theoretic
one, but theories may differ in what further constraints they add on allow-
able functions.

1.3.2 What are meanings?


David Lewis provided a famous strategy for thinking about what meanings
are: “In order to say what a meaning is, we may first ask what a mean-
ing does, and then find something that does that” (1970, p. 22). There are

Downloaded from https://www.cambridge.org/core. Caltech Library, on 03 May 2018 at 12:50:52, subject to the Cambridge Core terms of use, available at
https://www.cambridge.org/core/terms. https://doi.org/10.1017/CBO9781139236157.002
14 B A R B A R A H . PA R T E E

different proposals about what to count as meanings (or as the linguistically


relevant aspects of meanings) within formal semantics.
An extensional language is one in which compositionality holds for exten-
sions: the extension of the whole is a function of the extensions of the parts
and the way they are syntactically combined. In a simple case for such a
language, the extension of a sentence is a truth value, the extension of a
predicate expression (common noun phrase, simple adjective, verb phrase,
predicative prepositional phrase) is a set of individuals; the extension of a
proper name or a pronoun can be an individual. (See any formal semantics
textbook, such as Gamut, 1991; Heim and Kratzer, 1998; and Chierchia and
McConnell-Ginet, 2000).
Natural languages are at least intensional: the extension of the whole in
some cases depends on intensions as well as extensions of parts, and the
intension of the whole depends on the intensions of the parts.6
Within a decade, the importance of context-dependent expressions was
increasingly recognized and led to increasing integration of formerly “prag-
matic” matters into formal semantics (or “formal semantics and pragmat-
ics”). Kaplan (1979) introduced the character of an expression, a function
from contexts to intensions, and gave an account on which a sentence like I
am here now is always true when spoken but is not a necessary truth. Kamp
(1981b) and Heim (1982) formalized Karttunen’s notion of a discourse referent
introduced into the local context by the use of an indefinite NP (Karttunen,
1976) (see Brasoveanu and Farkas, Chapter 8) and built on Stalnaker’s notion
of a common ground established and constantly updated during the course of
a conversation, resulting in two different but similar theories often collec-
tively referred to as dynamic semantics,7 treating meaning as a function from
contexts to contexts. (See Section 1.5 below and Schlenker, Chapter 22.) An
alternative conception of dynamic semantics was developed by Groenendijk
and Stokhof (1990, 1991) and Veltman (1996), building in part on work by
van Benthem, and with further linguistic development by Chierchia (1995a),
and others.
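A minimal sketch of the context-change idea, under the simplifying Stalnakerian assumption that a context is just a set of candidate worlds (the common ground), ignoring discourse referents; the worlds and propositions are invented for illustration.

```python
# Minimal sketch of context-change semantics (Stalnaker-style):
# a context is a set of candidate worlds (the common ground), and a
# sentence's dynamic meaning maps a context to the subset of worlds
# in which the sentence is true. Worlds here are hypothetical dicts.
worlds = [
    {"rain": True,  "wind": True},
    {"rain": True,  "wind": False},
    {"rain": False, "wind": True},
]

def update(context, proposition):
    """Assert a proposition: intersect the common ground with it."""
    return [w for w in context if proposition(w)]

cg = list(worlds)                       # initial common ground
cg = update(cg, lambda w: w["rain"])    # assert "It is raining"
cg = update(cg, lambda w: w["wind"])    # assert "It is windy"
print(cg)                               # [{'rain': True, 'wind': True}]
```

Each assertion shrinks the common ground; the "meaning" of a sentence on this picture is the update function itself, a map from contexts to contexts.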

6 Because any two sentences with the same truth conditions have the same intension on standard possible-worlds analyses, intensions have often been said to be too weak to serve as meanings. There are various proposals for treating some constructions as hyperintensional and making use of a richer concept of structured meanings to handle them, an idea that has roots in Carnap's intensional isomorphism (Carnap, 1956) and in Lewis (1970) and was further developed in the pioneering work of Cresswell and of von Stechow (Cresswell, 1975, 1985; Cresswell and von Stechow, 1982; von Stechow, 1982; see also Duží et al., 2010). Montague formalized intensions of sentences as functions from possible worlds and variable assignments to truth values. And more generally, the intensions of all other categories of well-formed expressions are formalized as functions from possible worlds and variable assignments to the corresponding extensions.
7 The common description of the Kamp–Heim approach as involving a dynamic conception of meaning may be challenged (M. Stokhof, p.c.) as a simplification or as misleading. In DRT, at least, it is the "construction rules" that build discourse representations that are dynamic; the model-theoretic interpretation, which involves the embedding of a discourse representation into a model, is static and classical. In the Amsterdam dynamic semantics of Groenendijk, Stokhof, Veltman, et al., classically constructed syntactic descriptions are dynamically interpreted.


In an independent refinement, Barwise and Perry (1983) and Kratzer
(1989) argued for replacing possible worlds by (possible) situations, which for
Kratzer are parts of possible worlds, enabling more fine-grained analysis of
meanings (see Kratzer, 2011a).
There are other formalizations of what meanings are in contemporary
formal semantics; see, for example, Hintikka's game-theoretic approach
(Hintikka and Sandu, 1997) and "Glue Semantics" (Asudeh and Crouch, 2002).
Another important approach is that of constructive type theory, as developed
by Per Martin-Löf and applied to natural language semantics by Aarne
Ranta (Ranta, 1995).8 See also Pagin, Chapter 3, in this respect.

1.3.3 How do meanings compose?


How are meanings put together? How does the compositional mapping from
syntax to semantics work? The question of what sorts of functions are used
to put meanings of parts together is inextricably linked to the questions of
what meanings are, and of what count as syntactic parts. Frege (1892b) took
the basic semantic combining operation to be function-argument application:
some meanings are construed as functions that apply to other meanings.
With a syntax such as categorial grammar providing the relevant part-
whole structure, especially in the case of a simple extensional language,
Frege’s function-argument principle could be enough; with other kinds of
syntax, other operations may be needed as well. (See Sailer, Chapter 21.) In
the fragment in Montague’s PTQ, the simple grammatical relations are all
treated as intensional and are all interpreted by function-argument appli-
cation (the function applies to the intension of the argument to give the
extension of the result); some other special syntactic rules, especially ones
that are partly “transformation-like,” have more complex corresponding
interpretation rules. Other possible approaches to meaning composition
include unification (Bouma, 1988; Carpenter, 1992) and Glue Semantics,
already mentioned. The possibility of type-shifting processes and various
kinds of coercion complicates the picture and complicates the definition of
compositionality (Pagin and Westerståhl, 2010a,b; Partee, 2007).
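Function-argument application, and one simple type-shift of the kind just mentioned, can be sketched directly (a toy encoding; "lift", mapping an individual to the generalized quantifier of its properties, is one standard example from the type-shifting literature, and the lexical facts here are invented).

```python
# Function-argument application as the basic mode of composition,
# plus one type-shifting operator ("lift", from individuals to
# generalized quantifiers). Toy encodings: individuals are strings,
# one-place predicates are functions from individuals to booleans.

def walk(x):                 # predicate meaning, type <e,t>
    return x in {"Mary", "Pat"}

# Direct function-argument application: walk applied to an individual.
print(walk("Mary"))          # True

# LIFT: shift an individual (type e) to a generalized quantifier
# (type <<e,t>,t>), the set of properties the individual has,
# encoded as a function from predicates to truth values.
def lift(individual):
    return lambda pred: pred(individual)

mary_gq = lift("Mary")
print(mary_gq(walk))         # True -- same truth conditions, shifted type
```

The lifted meaning applies to the predicate instead of being its argument; truth conditions are preserved, which is what makes such shifts attractive, and also what complicates the bookkeeping for compositionality.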
Formal semanticists also differ on whether a level of semantic representation
is hypothesized to mediate between syntax and model-theoretic interpre-
tation. Montague’s own work exemplified both direct model-theoretic interpre-
tation (1970b) and two-stage interpretation via translation into a language
of Intensional Logic (1973b). Many but not all formal semanticists make
use of an intermediate semantic representation in some formal language,
either as a convenience (“for perspicuity,” as Montague put it) or with the
hypothesis that it might represent a “linguistically real” level of representa-
tion. Either approach can be compositional: a two-stage interpretation pro-
cedure is compositional if the syntax-to-semantic-representation mapping

8 I am grateful to Martin Stokhof for bringing Ranta's work to my attention.


rules are compositional and the model-theoretic semantics of the representation language is also compositional. When those conditions are met, the
intermediate language is, from a model-theoretic perspective, in principle
eliminable. Linguists who hypothesize that it has some psychological reality
would still want to keep such a level: it may represent an aspect of the means
by which humans compute the mapping between sentences and their mean-
ings. But it is a major challenge to find empirical evidence for or against such
a hypothesis.
It is worth noting that it is possible to advocate direct model-theoretic
interpretation without being anti-psychologistic, via the notion of mental
models (Johnson-Laird, 1983). But approaches using mentally represented
“formulas” (logical forms, conceptual representations) and computations on
such formulas, as advocated by Jackendoff for many years, are preferred by
many Chomskyan linguists.
Kamp in his Discourse Representation Theory (DRT) (1981b) proposed Dis-
course Representation Structures (DRSs) as a non-eliminable intermediate
level of representation, with claimed psychological reality: Kamp hypoth-
esized that his DRSs could be a common medium playing a role both in
language interpretation and as objects of propositional attitudes. Kamp
argued against full compositionality; he was challenged by Groenendijk
and Stokhof (1991), who argued that a fully compositional dynamic semantics
could accomplish what Kamp could do with DRT. Muskens (1993) proposed
a reconciliation with his Compositional Discourse Representation Theory.

1.3.4 Part-whole structure: syntax


Logicians typically specify the syntax of a formal language as a recursive def-
inition; in that case the requirement of compositionality as homomorphism
can be satisfied by giving the semantics in the form of a parallel recursive
definition. The “derivation trees” that correspond to the steps in applying a
recursive definition become the elements of the syntactic and semantic alge-
bras, and the homomorphism requirement says that each syntactic deriva-
tion must be mapped onto a unique semantic derivation.
The simplest linguistic examples take the elements of the syntactic alge-
bra to be (sub)trees generated by a context-free grammar, and semantic inter-
pretation to specify how the interpretation of a given subtree is computed
from the interpretations of its immediate “daughter” subtrees; in this case
the algebraic conception can be taken as compatible with and an improve-
ment on what Fodor and Katz, and Katz and Postal, were aiming at with
their Projection Rules.
Some formal semanticists argue in favor of having a monostratal grammar
for the syntactic component, that is, a grammar with a single level of syntax
and no transformations, such as Generalized Phrase Structure Grammar
(GPSG), Head-Driven Phrase Structure Grammar (HPSG), Lexical-Functional
Grammar (LFG), Categorial Grammar, Tree-Adjoining Grammar (TAG).


Others who work with a Chomskyan syntax take the syntactic input to
the semantic component to be a specified level of semantically relevant
syntactic representation called LF (a term meant to suggest “logical form”,
but defined purely theory-internally).
The relation between the preceding issues and syntax shows up clearly
in debates about direct compositionality: some linguists argue that a directly
compositional model-theoretic semantics can apply to non-abstract surface
structures (see the debates in Barker and Jacobson, 2007), without abstract
syntactic representations, movement rules, or a level of Logical Form. Advo-
cates of direct compositionality use an enriched arsenal of semantic com-
bining rules, including not only function-argument application but also
function composition and a number of type-shifting operators. There may
or may not be an inevitable tradeoff between optimizing syntax and opti-
mizing semantics; it is a sign of progress that many linguists work on syntax
and semantics with equal concern for both.

1.4 From Montague Grammar to contemporary formal semantics

1.4.1 Issues in combining formal semantics with syntax


The earliest works in Montague Grammar followed Montague’s pattern of
giving the syntax as a simultaneous recursive definition of expressions of
all syntactic categories, and the semantics as a corresponding recursive def-
inition of the interpretations of those expressions. In the first papers by
linguists and philosophers, collected in Rodman (1972),9 the form of the
syntax was close to Montague’s; Partee took on the challenge of trying to
combine Montague Grammar (MG) with Transformational Grammar (TG)
(Partee, 1973b, 1975), trying to recast transformational rules into recursive
definitions that could be interpreted compositionally. On Partee’s approach
to the MG–TG combination, a syntactic derivation worked bottom-up with
phrase structure rules as the basic building rules and with transformations
applying whenever their conditions are met; Partee (1980) suggested that a
grammar thus organized might be able to meet a constraint that all gen-
erated expressions be well formed, unlike the common practice in then-
current transformational grammar to generate tree structures that had to
be subjected to obligatory transformational rules to produce well-formed
expressions.
The task of putting MG and TG together was made difficult by the fact that
within TG, the “building blocks”, or “kernel sentences”, were closed sen-
tences, while in MG, the syntax includes “indexed pronouns” (Montague’s
he0, he1, . . .), interpreted as variables. (See later uses of PRO, pro, and traces in

9 That first collection resulted from Partee's winter–spring 1972 seminar on Montague Grammar at UCLA, and contains papers by Partee, Bennett, Bartsch, Rodman, Delacruz, and others.


generative grammar.) One of Montague's innovations was the use of lambda
abstraction as the central device involved in the interpretation of variable-
binding constructions, possible only if well-formed expressions can contain
elements interpreted as variables. And one immediate obstacle to synthesis
was the existence of various “deletion rules” in TG. In classical TG, (4a) was
derived from something like (4b) by “Equi-NP Deletion”.

(4) a. Mary was eager to win.
    b. [S Mary was eager for [S Mary to win]]

But given the principle of compositionality, and given the way MG works
by building up the meanings of constituents from the meanings of their
subconstituents, this derivation seemed to present a problem. The syntactic
derivation uses deletion, but the semantic derivation cannot: there is no
permissible operation that would “delete” a piece of a meaning of an already
composed subpart. Recall the discussion in Section 1.2.1 of the consequences
of analyses like (4b) for sentences like (5a). The presumed Deep Structure (5b)
would clearly give the wrong meaning.

(5) a. Everyone was eager to win.
    b. [S everyone was eager for [S everyone Tns win]]

The MG–TG resolution suggested in Partee (1973b, 1975) was that the
“underlying” subject in the embedded sentence should be a bindable
variable.10 Partee followed Montague’s line and bound it by lambda abstrac-
tion to make a VP type, as in (6a), assuming that the complement of the
adjective eager is a VP. Others have proposed an S type for the infinitive, with
the variable bound by the lambda abstract associated with the higher quan-
tifier, as in (6b). In this very simple example, the VP in (6a) could alternatively
just be base-generated and interpreted directly; Partee’s “Derived VP rule”
was motivated by VPs like to see herself or to be elected, which she derived trans-
formationally from open sentences like she0 sees her0 self and he1 elects her0 .

(6) a. [[to win]] = λx[win(x)]
    b. alternatively: everyone (λx[x was eager for [x to win]])
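A toy rendering of the analysis in (6) (the individuals and eagerness facts are invented for illustration): abstracting over the embedded subject yields a property that the quantifier binds, so the quantifier distributes over individuals rather than appearing twice as in the bad "deep structure" (5b).

```python
# Sketch of the lambda-abstraction analysis in (6), with a toy model.
# DOMAIN and the eagerness facts are stipulated for illustration.
DOMAIN = {"Ann", "Bob"}
eager_to = {("Ann", "win"), ("Bob", "win")}   # both eager to win

def eager(x, vp):
    """'x was eager [VP]' in the toy model."""
    return (x, vp) in eager_to

# (6b)-style analysis: everyone (λx[x was eager for [x to win]]).
everyone = lambda P: all(P(x) for x in DOMAIN)
print(everyone(lambda x: eager(x, "win")))    # True

# With a complement not everyone is eager about, the quantifier
# correctly distributes over individuals:
print(everyone(lambda x: eager(x, "swim")))   # False
```

The crucial point is that the embedded subject is a bound variable: the quantifier's property argument is evaluated once per individual, never applied to the quantifier itself.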

In Chomskyan syntax, a corresponding change was eventually made. The
first step, influenced by Reinhart (1976, 1983), involved replacing the "iden-
tical NP” by the special null element PRO, interpreted as a bound variable.
A considerably later step, probably influenced by Heim (1982), introduced
functional heads that could be interpreted as abstraction operators rather
than assuming that indexed quantifiers themselves were responsible for
binding. Other syntactic theories, like GPSG, HPSG, and LFG, and mod-
ern versions of Categorial Grammar, were developed after the quantifier
and binding issues had become well known, and their design included
mechanisms to deal with those problems.

10 A similar proposal had already been made within Generative Semantics by McCawley (1968b).


There were other obstacles to combining Transformational Grammar
and Montague Grammar, since some transformational rules were neither
meaning-preserving nor easily reformulated as rules interpretable by a uni-
formly meaning-changing function, but the problem just described is a good
example of an important problem in principle, whose resolution requires
rethinking parts of the architecture of one theory or the other. In any case,
the goal of combining TG with MG lost some of its urgency as various lin-
guists began to realize that with a powerful semantics to do some of the
work, some kinds of transformations – possibly all – might be eliminated.
Dowty (1978) argued in favor of eliminating all “governed transforma-
tions” (those whose application depended on the presence of a particular
lexical class, such as the easy-class of adjectives implicated in the rule of
“Tough-movement” mapping [structures corresponding to] It’s tough to solve
this problem to This problem is tough to solve) and replacing them by lexical
rules which in effect transform the argument structure of a given lexical
item (with or without some morphological change) and spell out the corre-
sponding change in semantic interpretation.
Gazdar (1982) proposed that all transformations could be eliminated,
so that the syntax could be context-free. He and colleagues developed his
approach into Generalized Phrase Structure Grammar (GPSG) (Gazdar et al.,
1985). Transformations were replaced by meta-rules that extended the
grammar; for instance, in place of the passive transformation, a meta-rule
specified how for each phrase structure rule generating a phrase containing
a transitive verb and an object, plus possibly more constituents, a phrase
structure rule should be added generating a corresponding passive verb
phrase. And for each such meta-rule, there was a uniform semantic meta-
rule mapping the compositional interpretation rules for the original phrase
structure rules onto compositional interpretation rules for the derived
phrase structure rule. So instead of a transformation mapping active sen-
tences to passive ones, there would be a meta-rule mapping certain VP rules
onto other VP rules, with corresponding semantics. This was the first of
several proposals for a non-transformational syntax with a compositional
formal semantics. Extended Categorial Grammar (Bach, 1981, 1984) was
developed around the same time, and HPSG followed soon after (Pollard
and Sag, 1994). LFG had been invented a little earlier (Kaplan and Bresnan,
1982); a compositional formal semantics interpretive component for it was
proposed by Halvorsen (1983), and in the intervening years there have been
other proposals such as the earlier mentioned Glue Semantics.
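The passive meta-rule idea can be sketched as follows (an illustrative encoding, not Gazdar's actual notation; the verb relation, domain, and helper names are stipulated for this toy): a single function maps the semantics of an active VP rule uniformly onto the semantics of the corresponding passive VP rule.

```python
# Toy sketch of a GPSG-style meta-rule, semantic half only.
# A transitive verb meaning is a set of agent-patient pairs.
find = {("Mary", "h1")}                    # stipulated: Mary found h1
domain = {"Mary", "Pat", "h1"}

# Semantics for the active rule VP -> V NP.
def active_vp_sem(verb, obj):
    return lambda subj: (subj, obj) in verb

# The meta-rule: "x was V-ed" holds of x iff some individual in the
# domain stands in the verb relation to x. The derived semantics is
# computed uniformly from the active-rule semantics.
def passive_metarule(active_sem):
    def passive_sem(verb):
        return lambda subj: any(active_sem(verb, subj)(agent)
                                for agent in domain)
    return passive_sem

was_found = passive_metarule(active_vp_sem)(find)
print(active_vp_sem(find, "h1")("Mary"))   # True: "Mary found h1"
print(was_found("h1"))                     # True: "h1 was found"
print(was_found("Pat"))                    # False
```

Because the passive semantics is computed from the active semantics rather than stated independently, the active/passive relation is captured without any transformation mapping one sentence onto another.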

1.4.2 Case study: noun phrases, quantifiers, and relative clauses


Much of the discussion in this article has been at an abstract level; in this
section we will look at some examples that illustrate a number of issues.
One of the reasons that so many philosophers and linguists had agreed
earlier that linguistic structure and logical structure were very different
was the apparent mismatch between the syntax and semantics of noun
phrases (NPs).11 Russell considered it an illogicality of English that expres-
sions semantically so different as Jones, a philosopher, every student, no man,
and the king have largely the same syntactic distribution,12 and thus evi-
dently belong to the same syntactic category (NP).
A major legacy of PTQ was the very important and influential analy-
sis of noun phrases as uniformly denoting generalized quantifiers. Part of
the appeal of this analysis for linguists was that it captured the impor-
tant semantic differences among NPs headed by different determiners, as
in generative semantics treatments, while at the same time giving all NPs
a similar syntactic structure and an interpretation of the same semantic
type, interpreting them all as generalized quantifiers, denoting sets of prop-
erties of entities (see Westerståhl, Chapter 7). Because most linguists had
earlier known nothing about type theory, certainly nothing about general-
ized quantifiers, those who wanted to capture meaning at Deep Structure
had been led to posit abstract deep structures that resembled first-order
logic; dependence on first-order logic had made it impossible for linguists
to imagine giving an explicit semantic interpretation for the or a/an or every
or no that didn’t require decomposition into formulas with quantifiers and
connectives, more or less like the translations one finds in logic textbooks.
The Generative Semanticists embraced such structures and made under-
lying structure look more like first-order logic (Lakoff, 1971a,b; McCawley,
1970), while Chomsky and Jackendoff rejected putting such logical decom-
positions into the syntax and devised various proposals for how some sort of
semantic component could interpret the combination of deep and surface
structure (Chomsky, 1971; Jackendoff, 1972). One can speculate that the rift
might never have grown so large if linguists had known about generalized
quantifiers earlier; the productive teamwork of Barwise and Cooper (1981) is
a classic early example of how formal properties and linguistic constraints
and explanations can be fruitfully explored in tandem with the combined
insights and methodologies of model theory and linguistics, and general-
ized quantifiers have continued to be a fertile domain for further linguisti-
cally insightful work exploiting formal tools (Peters and Westerståhl, 2006;
Szabolcsi, 2010; Keenan and Paperno, 2012).
In recent decades the differences among different noun phrase interpre-
tations, including differences among referential, predicative, and quantifi-
cational uses, have led many semanticists to a less uniform treatment of the
semantics of NPs, to the exploration of type-shifting mechanisms to help
keep the best properties of a uniform analysis while doing justice to the
apparent flexibility of natural language interpretation (Partee, 1986; Hen-
driks, 1993; Szabolcsi, 1997), and to innovative proposals for many aspects

11 I use the older term NP in a broad sense, to include the contemporary syntactic categories NP and DP.
12 If I say Scott was a man, that is a statement of the form "x was a man", and it has Scott for its subject. However, if I say the author of Waverley was a man, that is not a statement of the form "x was a man" and does not have the author of Waverley for its subject. (Russell, 1905, p. 488)


of NP (or DP) interpretation. Recent and current work focuses on such top-
ics as the mass-count distinction, plurality, weak and strong quantification,
indefinites, vagueness and gradability among modifiers and quantifiers,
modal and temporal aspects of NP and DP interpretation, and more. (See
Dekker and Zimmermann, Chapter 6; Westerståhl, Chapter 7; Brasoveanu
and Farkas, Chapter 8; Nouwen, Chapter 9; and Cohen, Chapter 10.) Quan-
tification and related issues in the interpretation of NPs and DPs have also
proven to be extremely fertile ground for cross-linguistic and typological
studies in formal semantics (Bach et al., 1995; Chierchia, 1998b; Kratzer,
2005; Matthewson, 2001; von Fintel and Matthewson, 2008).
Here we give an illustration of the methods of formal semantics and their
impact on the resolution of the “mismatch” between logical and linguis-
tic structure by considering one aspect of the analysis of restrictive relative
clauses like that Pat had lost in (7a) and (7b) and their interaction with the
semantics of various quantifiers and determiners.

(7) a. Mary found a hat that Pat had lost.
    b. Mary found every hat that Pat had lost.

In the 1960s, there were debates about whether the relative clause com-
bines with the common noun phrase (today’s NP), as in structure (8), or with
the full DP a hat, every hat, as in structure (9).

(8) Mary found [DP a/every [NP [NP hat] [that [Pat had lost]]]]

(9) Mary found [DP [DP a/every [NP hat]] [that [Pat had lost]]]

There were also debates about the semantics of the relative clause, with
some arguing that in (7a) that Pat had lost means and Pat had lost it, whereas in
(7b) it means if Pat had lost it, creating tension between the uniform surface
structure of that Pat had lost in (7a) and (7b) and the very different “underly-
ing” semantic interpretations posited for them (see Stockwell et al., 1973),
inspired by the structure of their translations into first-order logic, as in
(10a) and (10b).

(10) a. ∃x(hat(x) & lost(Pat, x) & found(Mary, x))
     b. ∀x((hat(x) & lost(Pat, x)) → found(Mary, x))

The formal semantics perspective suggests searching for a unitary syntax
and meaning for that Pat had lost and locating the semantic difference
between (7a) and (7b) in the semantics of a and every. The solution (due
to Quine (1960b) and Montague (1973b)) requires structure (8): the noun
and relative clause denote sets, and their combination denotes the inter-
section of those two sets. Then the phrase hat that Pat had lost denotes the
set (11a), whose characteristic function is denoted in the lambda-calculus
by (11b).

(11) a. {x : x is a hat and Pat had lost x}
     b. λx.hat(x) & lost(Pat, x)


Different theories of the semantics of determiners give different technical implementations of the rest of the solution, but that first step settles
both the syntactic question and the core of the semantics. Nouns and com-
mon noun phrases (NPs) denote sets (or their characteristic functions), and
restrictive relative clauses also denote sets (or their characteristic functions);
in extensional type theory, they are both of type ⟨e, t⟩. Restrictive relative
clauses uniformly combine with an NP to give a new NP; the semantics is
just set intersection (12) or its equivalent in the lambda calculus as shown
above in (11b).

(12) ⟦[NP NP REL]⟧ = ⟦NP⟧ ∩ ⟦REL⟧

In the treatment of DPs as generalized quantifiers, the determiner or quantifier is interpreted as a function that applies to a set and gives as a result a generalized quantifier of type ⟨⟨e, t⟩, t⟩, a set of sets of individuals.13 In the
classic treatments of Montague and of Barwise and Cooper, the interpreta-
tion of a DP of the form a NP is the set of all those sets that have a non-empty
intersection with the NP set, and the interpretation of a DP of the form every
NP is the set of all subsets of the set denoted by NP.
As a result, sentence (7a) asserts that the set of hats that Pat had lost and
the set of hats that Mary found overlap; (7b) says that the set of hats that
Pat had lost is a subset of the set of hats that Mary found. Thus the apparent
difference in the interpretation of the relative clause in the two DPs turns
out to be the predictable result of the semantic difference between the two
determiners; there is no need to give the relative clause a non-uniform inter-
pretation, and no reason to give the DPs syntactic structures resembling the
formulas of first-order logic. See Partee (1995) for a fuller argument, and Bar-
wise and Cooper (1981) for a fuller treatment of the classical formal seman-
tics of determiners.
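The truth conditions just described can be made concrete in a short computational sketch. The Python fragment below is purely illustrative (the toy domain and all names are invented for the example), but it implements the Barwise–Cooper meanings of a and every directly as set-theoretic tests, together with set intersection for the restrictive relative clause.

```python
# A toy extensional model; the domain and the set names (HATS, LOST_BY_PAT,
# FOUND_BY_MARY) are illustrative assumptions, not from the text.
HATS = {"h1", "h2", "h3"}
LOST_BY_PAT = {"h1", "h2"}     # the hats Pat had lost
FOUND_BY_MARY = {"h1", "h3"}   # the hats Mary found

# Noun + restrictive relative clause: set intersection, as in (11a)/(12).
hat_that_pat_had_lost = HATS & LOST_BY_PAT

# Determiners as generalized-quantifier builders, Barwise-Cooper style:
# "a NP" holds of a predicate set P iff the NP set and P overlap;
# "every NP" holds of P iff the NP set is a subset of P.
def a(np_set):
    return lambda p: bool(np_set & p)

def every(np_set):
    return lambda p: np_set <= p

# (7a) Mary found a hat that Pat had lost: the two sets overlap (h1).
print(a(hat_that_pat_had_lost)(FOUND_BY_MARY))      # True
# (7b) Mary found every hat that Pat had lost: h2 was not found.
print(every(hat_that_pat_had_lost)(FOUND_BY_MARY))  # False
```

The apparent meaning difference between the two relative clauses thus falls out of the two determiner functions alone; the relative clause contributes the same set in both cases.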

1.4.3 Varieties of syntax and semantics combinations


As noted above, formal semantics can in principle be combined with many
kinds of syntax, but different syntactic frameworks may require differences
in how the semantics is structured if compositionality is to be observed. We
illustrate with a thumbnail review of how quantifier scope ambiguities have
been handled in a number of different approaches, omitting most details.
Quantifier scope ambiguity is a major challenge for compositionality. A
sentence like (13) is semantically ambiguous, but there has never been much
evidence for it being syntactically ambiguous; the challenge is to try to find
an analysis which is both compositional and syntactically well motivated.

13
Within generalized quantifier theory, this functional treatment of determiners can be equivalently replaced
by a treatment of determiners as denoting a relation between two sets; the two approaches are
interdefinable. The relational interpretation is often logically more perspicuous, the functional treatment
more faithful to natural language compositional structure.

Formal semantics 23

(13) Every student read one book.

There were at least six kinds of solutions proposed from the 1960s to the
1980s, even before the introduction of choice functions and various non-
quantificational analyses of indefinites, and in the ensuing decades more
proposals have been made than I can list.14

Generative Semantics
The first serious attempts to account for quantifier scope ambiguity in gen-
erative grammar came from Generative Semantics; there was great progress
in uncovering some of the principles that govern possible quantifier scope,
bound variable anaphora, and related phenomena. Classic works include
Bach (1968); Lakoff (1971b); McCawley (1970). Generative Semanticists pro-
posed underlying structures that looked similar to the structure of first-
order logic, plus a transformation of Quantifier Lowering so that a quan-
tifier that starts as a sentence operator can end up as a determiner on an
NP. The actual data about scope possibilities were controversial, as pointed
out by Carden (1976), who was an early advocate of and pioneer in experi-
mental methods in semantics. The perceived need to constrain derivations
so that scope in English corresponds to surface c-command (now considered
incorrect for English but probably correct for some languages) led to trans-
derivational constraints (Lakoff, 1973), observed by Langendoen (2001) to be
an early example of optimality-theoretic-like devices.

Interpretive Semantics
Interpretive Semanticists, led by Jackendoff and Chomsky, maintained an
“autonomous syntax” and argued that different semantic phenomena were
to be accounted for at various different syntactic levels: argument structure
is determined at Deep Structure, but quantifier scope and variable bind-
ing may depend on structures at various levels, possibly including surface
structure. Classic works include Chomsky (1971); Jackendoff (1972). Cooper
and Parsons (1976) showed how a basic version of the scope mechanisms of
Generative Semantics, Interpretive Semantics, and Montague Grammar (see
below) were intertranslatable.

Montague’s Quantifying-In rule
Montague’s rule-by-rule approach is illustrated by the rule of “Quantifying-
In” in Montague (1973b), which combines an NP with a sentence S by

14
Martin Stokhof asks how this proliferation of proposals sits with the assumed empirical nature of the
problem. Indeed, the quantifier scope problem is puzzling precisely because there is no independent
debate about the syntax; in this way it differs from constructions for which the demands of compositionality
can help to constrain choices among syntactic analyses, as with the attachment of relative clauses or the
internal structure of comparative constructions. The choices in this case are between different kinds of
theoretical apparatus in the syntax and semantics, and evaluation is of necessity theory-wide more than
construction-specific.


substituting that NP for an indexed pronoun (interpreted as a variable) in S and substituting appropriate pronouns for further occurrences of the same
indexed pronouns. The given NP is semantically interpreted as a generalized
quantifier, and that generalized quantifier takes as argument the property
expressed by a representation obtained by lambda-abstracting over the
corresponding variable in the formula corresponding to S. For sentence (13),
there are different derivations that differ in the order in which the two NPs
are quantified into the structure, with correspondingly different scopes in
the semantics. On this approach, a single surface syntactic structure may
be derived via distinct derivations; compositionality is homomorphism
between syntactic and semantic derivations, not between some levels of
syntactic and semantic representations. But as Cooper and Parsons (1976)
showed, one could algorithmically convert a Montague-style analysis into a
Generative Semantics or Interpretive Semantics analysis, and vice versa (in
this particular domain).
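The semantic effect of the two derivation orders can be sketched as follows. The toy model and all names are illustrative assumptions (and generalized quantifiers are modeled extensionally as functions on properties, setting Montague's intensional apparatus aside), but the two orders of application mirror the two quantifying-in derivations for (13).

```python
# A toy model for (13); the domain and the READ relation are illustrative
# assumptions (each student read a different book).
STUDENTS = {"s1", "s2"}
BOOKS = {"b1", "b2"}
READ = {("s1", "b1"), ("s2", "b2")}

# Generalized-quantifier meanings for "every student" and "one book"
# (treating "one" as a simple existential, as in (10a)-style logic).
every_student = lambda p: all(p(x) for x in STUDENTS)
one_book = lambda p: any(p(y) for y in BOOKS)

# Derivation 1: "every student" quantified in last, hence wide scope:
# for every student x there is some book y such that x read y.
reading1 = every_student(lambda x: one_book(lambda y: (x, y) in READ))

# Derivation 2: "one book" quantified in last, hence wide scope:
# there is a single book y that every student read.
reading2 = one_book(lambda y: every_student(lambda x: (x, y) in READ))

print(reading1, reading2)  # True False: the two derivations diverge
```

In this model the surface-scope reading is true and the inverse-scope reading false, so the two derivations of the same surface string yield genuinely distinct propositions.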

Cooper storage
Robin Cooper (1975) proposed an approach that would treat sentence (13) as
syntactically unambiguous while analyzing it as semantically ambiguous,
and maintaining a quasi-compositional semantics. His idea was to amend
the syntax–semantics interface so as to non-deterministically derive a set
of interpretations for each syntactic structure; interpretation was composi-
tional apart from the element of non-determinism. In the process of seman-
tic interpretation (bottom-up, like the syntactic derivation), when you hit a
quantifier phrase, you optionally “store” it, and then you may “retrieve” it
from storage when you hit a suitable higher node, such as an S or a VP, at
which point you interpret the combination of the NP with the S or VP as
in Montague’s treatment. Scope islands represent points in the derivation
where the storage register must be empty. It is of interest that the mono-
stratal syntactic theory GPSG (Gazdar et al., 1985) uses a context-free gram-
mar with a semantics that is quasi-compositional in just this way: straight-
forwardly compositional but with the use of Cooper storage to interpret
sentences like (13).
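The store-and-retrieve idea can be rendered in a small sketch. The rendering below is an illustrative simplification of Cooper's mechanism, not his formalism (assignments are modeled as dicts, and retrieval as a single abstraction-plus-application step), and the model is invented for the example.

```python
# A toy rendering of Cooper storage for sentence (13).
STUDENTS, BOOKS = {"s1", "s2"}, {"b1", "b2"}
READ = {("s1", "b1"), ("s2", "b2")}

every_student = lambda p: all(p(x) for x in STUDENTS)
one_book = lambda p: any(p(y) for y in BOOKS)

# Bottom-up interpretation with both NPs stored: the clause meaning is an
# open proposition, evaluated relative to an assignment g.
core = lambda g: (g["x"], g["y"]) in READ
store = [("x", every_student), ("y", one_book)]  # the storage register (for illustration)

def retrieve(var, gq, meaning):
    # Retrieval at a suitable node: abstract over var, apply the stored GQ.
    return lambda g: gq(lambda v: meaning({**g, var: v}))

# Two retrieval orders from the same (unambiguous) syntax give two scopes:
surface = retrieve("x", every_student, retrieve("y", one_book, core))
inverse = retrieve("y", one_book, retrieve("x", every_student, core))
print(surface({}), inverse({}))  # True False
```

The single syntactic structure thus yields a set of interpretations, one per retrieval order, which is the non-deterministic element in Cooper's otherwise compositional interface.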

Quantifier Raising
Later versions of Chomskyan generative grammar added a syntactic level of
“LF” or “Logical Form”, intended to provide the syntactic input to semantic
interpretation. One early proponent of such a level was Robert May (May,
1977, 1985). His syntactic rule of Quantifier Raising, roughly the inverse of the
Generative Semanticists’ Quantifier Lowering, produces a derived syntactic
structure in which the various quantifiers are adjoined to the clause that
represents their immediate scope. In this and a number of other respects,
the LF approach may be approximately regarded as “Generative Semantics
upside down”.


Type-shifting
David Dowty, whose dissertation followed the Generative Semantics
approach but who soon became a leading Montague Grammarian and for-
mal semanticist, suggested in the 1970s that many transformations, espe-
cially “governed” ones, should be eliminated and replaced by lexical rules
(Dowty, 1978, 1979). This is a very early example of the idea that much of
the “grammar” is really contained in the lexicon, and rules that map lex-
ical items onto related lexical items can often be interpreted semantically
in terms of diathesis changes. Herman Hendriks (Hendriks, 1988) applied
this perspective to quantifier scope ambiguity. His analysis does not require
any “movement”; he derives alternative readings via type-shifting of verbs
and other functors so as to change relative “function-argument command”
relations, extending the kinds of “function-argument flip-flop” shifts intro-
duced by Partee and Rooth (1983). Relative scope is then reflected in local
function-argument hierarchical structure rather than requiring some level
with attachment to higher S or VP nodes. Using a type-shifting approach to
quantifier ambiguity rather than movement rules is one good example of
direct compositionality, as discussed in Section 1.3.4.
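The flavor of such a shift can be illustrated with a minimal sketch of object-argument lifting in the spirit of the Partee–Rooth "function-argument flip-flop"; the toy model and the exact form of the shift are illustrative assumptions, not Hendriks's full system.

```python
# A sketch of argument lifting: shift a transitive verb so that its object
# slot takes a generalized quantifier, giving the object narrow scope
# inside the VP with no movement. Model and names are invented.
STUDENTS, BOOKS = {"s1", "s2"}, {"b1", "b2"}
READ = {("s1", "b1"), ("s2", "b2")}

# A transitive verb of type <e,<e,t>>: object argument first, then subject.
read = lambda y: lambda x: (x, y) in READ

def lift_object(verb):
    # The type-shift: the lifted verb takes a GQ where it took an entity.
    return lambda gq: lambda x: gq(lambda y: verb(y)(x))

one_book = lambda p: any(p(y) for y in BOOKS)

read_one_book = lift_object(read)(one_book)  # a VP meaning of type <e,t>
print(read_one_book("s1"))  # True: s1 read some book
```

Relative scope is here encoded in which functor has been shifted to take which argument, i.e., in local function-argument structure, rather than in attachment to a higher node.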

Underspecification
For Montague, relative scope was captured at the level of derivation trees.
Muskens (2004), influenced by work in computational linguistics on under-
specification of the parsing process, and by Reyle’s work (1993) on under-
specification in Discourse Representation Theory, takes that idea a step
further to provide a formalism that underspecifies syntactic and semantic
ambiguity analogously, with the help of descriptions in the object language.
Muskens provides underspecified derivation trees with constrained possibil-
ities for how they may be completed. Each corresponding complete deriva-
tion tree generates the given sentence with one of its possible readings. One
of the appeals of this approach, not so far taken up in the psycholinguis-
tic literature as far as I know, is its potential for solving the problem of the
“psychological reality” of quantifier scope ambiguity. Not only is there no
perceived syntactic ambiguity in a sentence like (13), but there is little evi-
dence of the kind of combinatorial explosion of ambiguity that is otherwise
predicted for sentences with more than two quantifiers, and little evidence
that ordinary speakers are very sensitive to the large numbers of quanti-
fier scope readings that linguistic analyses have classically predicted. Under-
specification might very well be a psychologically as well as computationally
reasonable approach: the information is “there”, but it need not always be
“computed” in actual processing.

Pseudoscope
All of the approaches discussed above were designed to account for quanti-
fier scope ambiguity in its “classical” form, with the distinct readings the


same as in standard predicate logic. The differences all concerned how the
syntax–semantics interface works to generate the sentences and derive the
readings. But starting in the 1980s, with some even earlier suggestions,
the nature of the readings was challenged, particularly but not only for
indefinite NPs like a book. The relation between semantic wide scope and
semantic or pragmatic notions of specificity was questioned by linguists
in several frameworks (Fodor and Sag, 1982), and the idea that indefinite
NPs are always to be interpreted via existential quantifiers was particularly
strongly challenged in the work of Kamp and Heim discussed in Section 1.5.2
below. The importance of non-quantificational analyses of indefinites was
emphasized in Kratzer (1998b); see also Brasoveanu and Farkas, Chapter 8.

1.5 The increasing role of context: formal semantics and pragmatics

1.5.1 Early views on the autonomy of context-independent semantics
The term pragmatics is due to the philosopher and semiotician Charles Mor-
ris (1938). Within semiotics, the general science of signs, Morris distin-
guished three branches: syntactics (now syntax), the study of “the formal
relation of signs to one another”; semantics, the study of “the relations
of signs to the objects to which the signs are applicable” (their designata);
and pragmatics, the study of “the relations of signs to interpreters” (Mor-
ris, 1938, p. 6). Semantics in the work of Formal Language philosophers
such as Russell is pure semantics, just as is the semantics of first-order
logic; no consideration was paid to indexicals or demonstratives. The need
for attention to context-dependence in interpretation was one of the con-
cerns of the Ordinary Language philosophers.15 One of the first aspects of
context-dependence on which logicians made progress was tense, where
Prior and Reichenbach were pioneers. And as noted in Section 1.2.3, Mon-
tague was an important contributor to these developments, developing a
higher order typed intensional logic that unified tense logic and modal
logic, and more generally unified “formal pragmatics” with intensional
logic.
In early Montague Grammar (e.g. as reported in Partee, 1984a), context-
dependence was recognized and treated by the use of free variables. Mon-
tague’s own PTQ included indexed pronouns like he3; these were normally
replaced by ordinary gendered pronouns in the course of a derivation as
part of some rule that involved variable binding, but they could also remain

15
As Martin Stokhof has reminded me, the kind of context-dependence of interpretation that Ordinary
Language philosophers were concerned with was much wider ranging than the type of
context-dependence that is exhibited by indexicals. I neglect the deeper issues here.


free, in which case it was suggested that they could provide the interpre-
tation for pronouns without antecedents (demonstratives, indexicals, and
anaphoric pronouns without sentence-internal antecedents), to be inter-
preted by a variable assignment considered as part of a context of use. Other
early examples of “free variables” to be interpreted via assignment functions
coming from the context of use came from work on tense and aspect that
appealed to one or sometimes two or more temporal indices corresponding
to “now” and possibly to additional “reference times” (Kamp, 1971, 1979;
Vlach, 1973). Implicit arguments were often treated as covert free variables,
and a free relation variable was suggested as part of the interpretation of
genitive constructions like Mary’s team (Barker, 1995; Partee, 1997a).
In such work, the assumption was that the output of semantic interpreta-
tion is something like a Kaplanian “character,” a function from contexts to
an intension. This makes the semantics autonomous in the sense that the
compositional interpretation process is context-independent, and a com-
plete sentence can be interpreted semantically, and then “handed over” to
a pragmatic component to fill in the values of the context-dependent vari-
ables. For tense and other indexical expressions, such an approach seems
perfectly appropriate: we understand the invariant meaning of I am alone
here now, and we know how to take contextual information to fill in the val-
ues of speaker, time, and place to arrive at a specific proposition.
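This two-stage picture can be sketched concretely. The fragment below is a toy rendering of a Kaplanian character for I am alone here now; the representation of contexts as (speaker, time, place) triples and of worlds as functions from time–place pairs to sets of occupants is an illustrative assumption.

```python
# A toy Kaplanian "character": a function from a context of use to an
# intension (a function from worlds to truth values).
def character(context):
    speaker, time, place = context
    def intension(world):
        # True in a world iff the speaker is the only person at that
        # time and place in that world.
        return world(time, place) == {speaker}
    return intension

# One candidate world: Ann is alone in the office at noon.
def w1(time, place):
    return {"ann"} if (time, place) == ("noon", "office") else set()

# The context of use fills in speaker, time, and place, yielding a
# specific proposition that can then be evaluated at any world.
proposition = character(("ann", "noon", "office"))
print(proposition(w1))  # True
```

The compositional semantics delivers `character` without consulting the context; only afterwards does the pragmatic component supply the triple that fixes the proposition expressed.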
But by the early 1980s, evidence of more complex interactions between
semantic interpretation and context-dependence was accumulating to the
point where Lewis’s dictum, “In order to say what a meaning is, we may
ask what a meaning does, and then find something that does that” (1970,
p. 22), called for considering some aspects of context-dependence, tradition-
ally thought of as pragmatics, as integral parts of meaning. What is gener-
ally referred to as dynamic semantics was a first result.

1.5.2 The beginnings of dynamic semantics


A landmark innovation in the integration of context-dependence into
semantics came with the dynamic semantics16 of Kamp (1981b) and Heim
(1982). Consider the contrasting mini-discourses in (14) and (15):

(14) A baby was crying. It was hungry.

(15) Every baby was crying. #It was hungry. (“#” means “anomalous”.)

On the Kamp–Heim theory, an indefinite NP like a baby in (14) introduces a “novel discourse referent” into the context, and the pronoun it in the second
sentence of (14) can be indexed to that same discourse referent, whose “life
span” can be a whole discourse, not only a single sentence. The discourse

16
An even earlier example of interpretation via a context that is systematically built from denotations of
expressions can be found in Groenendijk and Stokhof (1979).


referent introduced by an essentially quantificational NP like every baby in (15), however, cannot extend beyond its clause, so the pronoun it in (15) is
anomalous. The Kamp–Heim theory also includes an account of the famous
“donkey”-sentences of Geach, variants of which are given in (16a) and (16b).

(16) a. If a farmer owns a donkey, he always beats it.


b. Every farmer who owns a donkey beats it.

These sentences had previously resisted compositional analysis, including with the tools of Montague Grammar and its early extensions. On Kamp’s
and Heim’s theories, the indefinite a donkey introduces a discourse refer-
ent into the local context, but has no quantificational force of its own; it
ends up being bound by the “unselective quantifiers” always in (16a) and
every in (16b). The theories involve the interdependent areas of quantifi-
cation and anaphora, and the relation of sentence semantics to discourse
semantics and pragmatics, giving rise to much new work in these areas. See
Brasoveanu and Farkas, Chapter 8.
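The contrast between (14) and (15) can be made concrete in a dynamic-semantics sketch. The fragment below is in the relational spirit of later formalizations such as Dynamic Predicate Logic rather than Kamp's or Heim's own systems, and the tiny model and all names are illustrative assumptions.

```python
# A context is a list of assignments (dicts from discourse referents to
# entities); sentences denote context updates.
BABIES = {"b1"}
CRYING = {"b1"}
HUNGRY = {"b1"}

def indef(var, restr):
    # "a N": introduce a novel discourse referent var, one value per entity.
    return lambda ctx: [{**g, var: d} for g in ctx for d in restr]

def pred(var, ext):
    # Test: keep only assignments whose value for var satisfies the predicate.
    return lambda ctx: [g for g in ctx if g[var] in ext]

def seq(u1, u2):
    # Sequencing two sentences is composition of updates.
    return lambda ctx: u2(u1(ctx))

# (14) "A baby was crying. It was hungry." The referent "x" introduced in
# the first sentence is still live when the cross-sentence pronoun picks it up.
d14 = seq(seq(indef("x", BABIES), pred("x", CRYING)), pred("x", HUNGRY))
print(d14([{}]))  # a non-empty output context: the discourse is true

def every(var, restr, scope):
    # "every N" tests each input assignment; var does not survive the test.
    return lambda ctx: [g for g in ctx
                        if all(scope([{**g, var: d}]) for d in restr)]

# (15) "Every baby was crying." is true, but "x" is closed off, so a
# following pred("x", HUNGRY) would find no antecedent (a KeyError here).
print(every("x", BABIES, pred("x", CRYING))([{}]))  # [{}]
```

The different "life spans" of the two discourse referents thus fall out of the update types: the indefinite extends the assignments, while the universal merely tests them.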

1.5.3 Further developments in formal pragmatics and semantics


In subsequent work in formal semantics, the treatment of context-
dependent phenomena has been a central concern. Other aspects of formal
pragmatics such as the treatment of presupposition and focus-sensitivity
have also long been recognized as central to the study of meaning, so that
“formal semantics and pragmatics” has increasingly come to be seen as a
single field, as described in Kadmon (2001) and as evidenced by the online
journal Semantics and Pragmatics founded in 2007. The integration of prag-
matics and semantics has been further supported with arguments that the
interpretation of pragmatic implicatures occurs interspersed with semantic
interpretation (Chierchia, 2004).
Information structure has also played an important role in the develop-
ment of formal pragmatics; notions such as topic, focus, presupposition,
and discourse structure have been central concerns of linguists and philoso-
phers of many different theoretical inclinations since before Montague’s
work, and have loomed large in discussions of pragmatics and semantics
and the border between them (see Beaver, 1997; Karttunen and Peters, 1979;
Roberts, 1998; Rooth, 1992; von Heusinger and Turner, 2006). The formal-
ization of Gricean implicatures and Potts’s (2005) proposal to treat conven-
tional implicatures as part of semantics have also led to active research
areas.
Semantics and pragmatics are both involved in recent work on Optima-
lity-theoretic semantics and bidirectional OT, as well as in game theoretic
approaches; these and other approaches to formal pragmatics are not well
represented in this book, but see Hintikka and Sandu (1997), Chierchia
(2004), Benz et al. (2005), Blutner et al. (2006), von Heusinger and Turner
(2006). (See also Schlenker, Chapter 22.)


1.6 Formal semantics and psychology: the Frege–Chomsky conflict

The relation of linguistics to psychology is nowhere more controversial than in semantics. While it is easy enough to argue that the syntax of a person’s
language is directly a function of what is in that person’s head, the paral-
lel claim for semantics is highly controversial. The clash between Frege and
Chomsky on the issue of “psychologism” and the notion of semantic com-
petence has led to a conflict in the foundations of formal semantics, some-
times expressed as the question of whether semantics is a branch of mathe-
matics or a branch of psychology (Partee, 1979, 1988b; Thomason, 1974).
Frege, whose ideas were part of the foundation of Tarski’s and Montague’s
work, took an anti-psychologistic view of meanings, and so did many other
logicians and philosophers, including Russell (sometimes), Tarski, Carnap,
and Montague. Their tradition of “objective” (though abstract) meanings
contrasts with the psychologistic view of meanings “in the head” (Fodor,
Lakoff, Jackendoff, and all psychologists). The psychologistic view fits into
the Chomskyan tradition of focusing on linguistic competence, defined in
terms of the implicit knowledge of a native speaker.
Most formal semanticists who are linguists are very much concerned with
human semantic competence. Some formal semanticists have advocated fol-
lowing David Lewis (1975b) in distinguishing semantics from knowledge of
semantics, making semantic competence interestingly different from syn-
tactic competence (Partee, 1988b).
In recent decades the question of whether meanings are “in the head”
has taken an interesting turn as a result of work in the philosophy of mind.
The early arguments by Kripke and Putnam (Kripke, 1972; Putnam, 1975)
against a purely “internalist” account of meaning were arguments based
on a narrow view of psychological states common at the time, a view later
epitomized by Fodor’s methodological solipsism (1980). However, philosophers
working on the philosophy of mind in subsequent decades have increasingly
argued that much of psychology, including perception as well as belief and
other psychological states, needs to be viewed relationally, concerning the
individual and her environment (Burge, 1992, 2003; Stalnaker, 1989). As
Stalnaker puts it, meanings may be “in the head” in the way that footprints
are “in the sand”; it’s a perfectly reasonable way of speaking, but a footprint
clearly isn’t a footprint without a certain relation to causal history, and
meanings probably aren’t either. What is semantic competence? For formal
semanticists, it is common to take the fundamental characterization of
semantic competence to involve knowledge of truth conditions: given a
sentence in a context, and given idealized omniscience about the facts con-
cerning some possible situation, a competent speaker can judge whether
the sentence is true or false in that situation. From that basic competence,
allowing idealizations about computational capacity, it follows that a


competent native speaker can also make judgments about entailment relations between sentences; so idealized semantic competence is widely
considered to consist in knowledge of truth conditions and entailment
relations of sentences of the language.
Many linguists have resisted the relevance of truth conditions and entail-
ment relations to natural language semantics, and some still do. Some of
the objections to truth conditions are countered by arguing that part of
human semantic competence is the matching of sentences with their truth
conditions relative to possible worlds (including fictional worlds), with no
necessary reference to the “actual world”.
Entailment is a central semantic concern in logic and remains so in for-
mal semantics. Given the distinction between meaning and knowledge of
meaning, there is no contradiction between taking entailment relations as
central to meaning while acknowledging that human language users are
not logically omniscient and hence do not always know whether a given
sentence of their language entails another. In this sense it may not be unrea-
sonable to say that speakers can’t know their language; logical fallibility
may be considered a sort of “performance limitation”. The real problem
arises if the semantic value of a proposition is taken to be its truth con-
ditions, and propositions are analyzed as the objects of verbs like believe;
then we get the unwelcome conclusion that sentences with the same truth
conditions should be substitutable in belief-contexts. This is the problem of
hyperintensional contexts (see Pagin, Chapter 3). And it is this problem that
has motivated richer notions of meaning such as the structured meanings
mentioned earlier.
Cognitive semanticists replace concern with logical entailment by con-
cern with human inference; formal semanticists see the relation of entail-
ment to actual human inference as indirect. However, semanticists of many
stripes agree about the importance of revising the formal logics invented by
logicians to model the “natural logic(s)” implicit in the semantics of natural
languages.
A number of linguists still think of semantics in terms of a “level of rep-
resentation” of expressions analogous to a syntactic or phonological level. A
representational view of semantics is quite congenial to the popular compu-
tational theory of mind. But David Lewis’s objection that semantics without
truth conditions is not semantics remains in force, so that formal semanti-
cists consider such representational views at best incomplete. The contrast-
ing model-theoretic view that is dominant in formal semantics takes seman-
tic interpretation to relate expressions to elements of models (possibly men-
tal models; see Johnson-Laird, 1983) defined in terms of constituents such as
possible situations, entities, properties, truth values. Whether some level of
representation plays an intermediate role in interpretation, as in the two-
stage method of interpretation discussed in Section 1.3.3, remains an open
question that so far is not obviously an empirical one.


For further issues in the relation of formal semantics to cognitive science, see Baggio, Stenning and van Lambalgen, Chapter 24.

1.7 Other issues

1.7.1 Natural language metaphysics, typology and universals


The issue of “Natural Language Metaphysics” (Bach, 1986b) is an important
foundational area that connects cognitive issues with formal semantics and
linguistic typology. What presuppositions concerning the constitution and
structure of the world as humans conceive it are built into human lan-
guages, and how, and which are universal? These questions may concern
both semantic structure and semantic content, from the semantic differ-
ence between nouns and verbs to the content of color terms or whether
time is discrete or continuous. Their investigation may challenge the lines
between semantic knowledge and other kinds of knowledge. Formal seman-
tics, following the logical tradition, initially employed relatively “austere”
model structures; recent investigations, particularly into lexical semantics,
have led to richer model structures.
Formal semantics has steadily expanded its scope to include work on a
wide variety of languages, and semantic fieldwork is an increasingly active
and well-developed field. Semanticists have made significant contributions
in recent decades to the study of linguistic universals and linguistic typol-
ogy (Bach et al., 1995; Bittner, 1994; Keenan and Paperno, 2012; Matthewson,
2010; von Fintel and Matthewson, 2008). Typological investigations, espe-
cially but not only in the domains of quantification and of tense, aspect
and mood, have included the distribution of nominal vs. adverbial quan-
tification, the semantics of languages that lack a determiner category, and
the range of tense and aspect systems. Recent work on the formal seman-
tics of sign languages has led to enriched perspectives on the semantics
of demonstratives, variable binding, propositional attitudes, quotation, and
other domains (Quer, 2005; Schlenker, 2011e).

1.7.2 Experimental, corpus-based, and computational formal semantics and pragmatics
Respected early work on applications of formal semantics to computa-
tional linguistic tasks included some in North America (Gawron et al., 1982;
Halvorsen, 1983; Hobbs and Rosenschein, 1978; Schubert and Pelletier, 1982)
and some in Europe (Cooper, 1987; Hirst, 1986; Scha, 1976). One of the most
successful and large-scale efforts was the Rosetta project led by Jan Landsber-
gen at Philips Eindhoven in the 1980s (Rosetta, 1994), which grew out of ear-
lier work by Landsbergen and by Scha. Recently, there have been advances in
methods for making formal semantics computationally tractable, and inter-
est in computational formal semantics has increased to the point that there

Downloaded from https://www.cambridge.org/core. Caltech Library, on 03 May 2018 at 12:50:52, subject to the Cambridge Core terms of use, available at
https://www.cambridge.org/core/terms. https://doi.org/10.1017/CBO9781139236157.002
32 BARBARA H. PARTEE

are textbooks and courses in the field (Blackburn and Bos, 2003, 2005; Bunt
and Muskens, 1999). A few companies have begun to offer computational
linguistic products that use formal semantics, and others are exploring the
possibility of doing so. For some kinds of natural language processing prob-
lems, such as the solving of the kinds of logic puzzles found on the Graduate
Record Exam (GRE), formal methods can offer important advantages and
have been implemented (Lev et al., 2004). Connecting computational linguis-
tics, psychology, and formal semantics, there is current research aimed at
combining the best of statistical/probabilistic and formal approaches, exem-
plified by the work of Joshua Tenenbaum’s Computational Cognitive Science
group at the Massachusetts Institute of Technology (Tenenbaum et al., 2011).
There has always been an interest in acquisition and processing of seman-
tics, often together with syntax, and in recent decades there has been a
significant amount of research in these areas specifically connected with
formal semantics and pragmatics (see, for instance, Chambers et al., 2002;
Chierchia et al., 2004; Crain et al., 1996; Crain and Thornton, 2011; Frazier, 1999; Papafragou and Musolino, 2003; Philip, 1995). The emerging area of “experimental pragmatics” is represented by Noveck and Sperber (2007) and Sauerland and Yatsushiro (2009).
Computational and experimental methods in and applications of formal
semantics and pragmatics are developing very quickly, and neither this
introductory chapter nor this handbook can do full justice to them.

Acknowledgments
I am grateful to many colleagues for suggestions and responses to queries
along the way, including Reinhard Muskens, Richmond Thomason, Nino
Cocchiarella, Robert Stalnaker, and Seth Yalcin. I am particularly grate-
ful for extensive helpful comments from a referee who agreed to be non-
anonymous so that I could thank him publicly, Martin Stokhof.

